Sample records for methodology high performance

  1. Seven Performance Drivers.

    ERIC Educational Resources Information Center

    Ross, Linda

    2003-01-01

    Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…

  2. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    DOT National Transportation Integrated Search

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  3. Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates

    ERIC Educational Resources Information Center

    Im, Piljae

    2009-01-01

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to easily and accurately evaluate energy efficient measures for K-5 schools, which would contribute to the…

  4. Analytical methodology for safety validation of computer controlled subsystems. Volume 1 : state-of-the-art and assessment of safety verification/validation methodologies

    DOT National Transportation Integrated Search

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  5. Design Methodology for Multi-Element High-Lift Systems on Subsonic Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Pepper, R. S.; vanDam, C. P.

    1996-01-01

    The choice of a high-lift system is crucial in the preliminary design process of a subsonic civil transport aircraft. Its purpose is to increase the allowable aircraft weight or decrease the aircraft's wing area for a given takeoff and landing performance. However, the implementation of a high-lift system into a design must be done carefully, for it can improve the aerodynamic performance of an aircraft but may also drastically increase the aircraft empty weight. If designed properly, a high-lift system can improve the cost effectiveness of an aircraft by increasing the payload weight for a given takeoff and landing performance. This is why the design methodology for a high-lift system should incorporate aerodynamic performance, weight, and cost. The airframe industry has experienced rapid technological growth in recent years which has led to significant advances in high-lift systems. For this reason many existing design methodologies have become obsolete since they are based on outdated low Reynolds number wind-tunnel data and can no longer accurately predict the aerodynamic characteristics or weight of current multi-element wings. Therefore, a new design methodology has been created that reflects current aerodynamic, weight, and cost data and provides enough flexibility to allow incorporation of new data when it becomes available.

  6. Standardized Laboratory Test Requirements for Hardening Equipment to Withstand Wave Impact Shock in Small High Speed Craft

    DTIC Science & Technology

    2017-02-06

    The engineering rationale, assumptions, and methodology for transitioning craft acceleration data to laboratory shock test requirements are summarized, and example requirements are presented for… Methodologies for Small High-Speed Craft Structure, Equipment, Shock Isolation Seats, and Human Performance At-Sea, 10th Symposium on High…

  7. Evaluating Multi-Input/Multi-Output Digital Control Systems

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood T.; Mukhopadhyay, Vivek

    1994-01-01

    A controller-performance-evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems was developed. Procedures identify potentially destabilizing controllers and confirm satisfactory performance of stabilizing ones. The methodology is generic and can be used in many types of multi-loop digital-controller applications, including digital flight-control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. It is also applicable to other complex, highly dynamic digital controllers, such as those in high-performance robot systems.

  8. Performance evaluation in full-mission simulation - Methodological advances and research challenges. [in air transport operations]

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Kanki, Barbara G.; Helmreich, Robert L.

    1989-01-01

    The crew-factors research program at NASA Ames has developed a methodology for studying the impact of a variety of variables on the effectiveness of crews flying realistic but high workload simulated trips. The validity of investigations using the methodology is enhanced by careful design of full-mission scenarios, performance assessment using converging sources of data, and recruitment of representative subjects. Recently, portions of this methodology have been adapted for use in assessing the effectiveness of crew coordination among participants in line-oriented flight training.

  9. The 2014 Michigan Public High School Context and Performance Report Card

    ERIC Educational Resources Information Center

    Spalding, Audrey

    2014-01-01

    The 2014 Michigan Public High School Context and Performance Report Card is the Mackinac Center's second effort to measure high school performance. The first high school assessment was published in 2012, followed by the Center's 2013 elementary and middle school report card, which used a similar methodology to evaluate school performance. The…

  10. Surrogate based wind farm layout optimization using manifold mapping

    NASA Astrophysics Data System (ADS)

    Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester

    2016-09-01

    The high computational cost associated with high fidelity wake models such as RANS or LES is the primary bottleneck to performing direct high fidelity wind farm layout optimization (WFLO) with accurate CFD based wake models. Therefore, a surrogate based multi-fidelity WFLO methodology (SWFLO) is proposed. The surrogate model is built using an SBO method referred to as manifold mapping (MM). As a verification, optimization of the spacing between two staggered wind turbines was performed using the proposed surrogate based methodology, and its performance was compared with that of direct optimization using the high fidelity model. Significant reduction in computational cost was achieved using MM: a maximum computational cost reduction of 65%, while arriving at the same optimum as direct high fidelity optimization. The similarity between the responses of the models and the number and position of the mapping points strongly influence the computational efficiency of the proposed method. As a proof of concept, realistic WFLO of a small 7-turbine wind farm is performed using the proposed surrogate based methodology. Two variants of the Jensen wake model with different decay coefficients were used as the fine and coarse models. The proposed SWFLO method arrived at the same optimum as the fine model with very few fine model simulations.
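
    For illustration only, the sketch below shows the general shape of such a surrogate-based multi-fidelity loop in Python: a cheap wake model is corrected with a low-order fit of its discrepancy from a handful of expensive evaluations, and the corrected model is re-optimized until the optimum stops moving. The additive discrepancy correction, the toy wake models, and all numbers are assumptions for the sketch, not the manifold-mapping update or the Jensen/RANS models used in the record above.

    ```python
    # Hedged sketch of a surrogate-based multi-fidelity layout optimization loop.
    # The correction is a simple additive discrepancy fit, not the actual
    # manifold-mapping update; both "wake models" are toy stand-ins.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def coarse_value(s):
        # cheap model: wake recovery minus a spacing penalty (numbers illustrative)
        return (1.0 - 0.5 * np.exp(-s / 4.0)) - 0.015 * s

    def fine_value(s):
        # expensive model: same trend, different wake decay (stands in for a fine CFD run)
        return (1.0 - 0.6 * np.exp(-s / 5.0)) - 0.015 * s

    samples, values = [], []

    def corrected(s):
        # coarse model plus a low-order fit of the fine-coarse discrepancy
        if len(samples) < 2:
            return coarse_value(s)
        p = np.polyfit(samples, values, 1)
        return coarse_value(s) + np.polyval(p, s)

    spacing = 6.0                      # initial turbine spacing (rotor diameters)
    for _ in range(6):
        samples.append(spacing)
        values.append(fine_value(spacing) - coarse_value(spacing))   # one fine-model run
        res = minimize_scalar(lambda s: -corrected(s), bounds=(2.0, 14.0), method="bounded")
        if abs(res.x - spacing) < 1e-3:
            break
        spacing = res.x

    print(f"multi-fidelity optimum spacing ~ {spacing:.2f} (fine-model runs: {len(samples)})")
    ```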

  11. High-throughput fabrication and screening improves gold nanoparticle chemiresistor sensor performance.

    PubMed

    Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard

    2015-02-09

    Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays, a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram-per-liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which, in combination with random forests analysis, was able to predict individual analyte concentrations with overall root-mean-square errors ranging from 8 to 17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications for future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening to improve the performance of gold nanoparticle based chemiresistors for both new and existing applications.
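
    As a rough illustration of the quantitation step described above, the sketch below fits a multi-output random-forest regressor that maps simulated sensor-array responses to the five BTEXN concentrations. The synthetic response model, sensor count, and noise level are assumptions; the sketch does not reproduce the study's measured data or its reported 8-17 μg/L errors.

    ```python
    # Hedged sketch: random-forest regression from chemiresistor array responses
    # to individual analyte concentrations (synthetic data, illustrative only).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(0)
    n_samples, n_sensors, n_analytes = 400, 12, 5        # 5 analytes: B, T, E, X, N
    conc = rng.uniform(0, 100, size=(n_samples, n_analytes))        # ug/L
    mixing = rng.uniform(0.2, 1.0, size=(n_analytes, n_sensors))    # assumed sensor sensitivities
    resp = conc @ mixing + rng.normal(0, 5, size=(n_samples, n_sensors))

    X_tr, X_te, y_tr, y_te = train_test_split(resp, conc, random_state=0)
    model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    pred = model.predict(X_te)

    rmse = np.sqrt(mean_squared_error(y_te, pred, multioutput="raw_values"))
    print("per-analyte RMSE (ug/L):", np.round(rmse, 1))
    ```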

  12. High Performance Work System, HRD Climate and Organisational Performance: An Empirical Study

    ERIC Educational Resources Information Center

    Muduli, Ashutosh

    2015-01-01

    Purpose: This paper aims to study the relationship between high-performance work system (HPWS) and organizational performance and to examine the role of human resource development (HRD) Climate in mediating the relationship between HPWS and the organizational performance in the context of the power sector of India. Design/methodology/approach: The…

  13. Energy index decomposition methodology at the plant level

    NASA Astrophysics Data System (ADS)

    Kumphai, Wisit

    Scope and method of study. The dissertation explores the use of a high level energy intensity index as a facility-level energy performance monitoring indicator, with the goal of developing a methodology for an economically based energy performance monitoring system that incorporates production information. The performance measure closely monitors energy usage, production quantity, and product mix and determines production efficiency as part of an ongoing process that would enable facility managers to keep track of, and in the future predict, when to perform a recommissioning process. The study focuses on the use of the index decomposition methodology and explores several high level (industry, sector, and country level) energy utilization indexes, namely Additive Log Mean Divisia, Multiplicative Log Mean Divisia, and Additive Refined Laspeyres. One level of index decomposition is performed. The indexes are decomposed into intensity and product mix effects. These indexes are tested on a flow shop brick manufacturing plant model in three different climates in the United States. The indexes obtained are analyzed by fitting an ARIMA model and testing for dependency between the two decomposed indexes. Findings and conclusions. The results show that the Additive Refined Laspeyres index decomposition methodology is suitable for use in a flow shop, non-air-conditioned production environment as an energy performance monitoring indicator. It is likely that this research can be further expanded into predicting when to perform a recommissioning process.
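
    For reference, the additive Log Mean Divisia (LMDI-I) decomposition named above splits a change in total energy use into intensity and product-mix (structure) effects. A standard two-factor form, written here under the assumption that total production activity is held fixed (textbook LMDI notation, not wording from the dissertation), is

    ```latex
    \Delta E = E^{T}-E^{0}
      = \underbrace{\sum_i L\!\left(E_i^{T},E_i^{0}\right)\ln\frac{I_i^{T}}{I_i^{0}}}_{\text{intensity effect}}
      + \underbrace{\sum_i L\!\left(E_i^{T},E_i^{0}\right)\ln\frac{S_i^{T}}{S_i^{0}}}_{\text{product-mix effect}},
    \qquad
    L(a,b)=\frac{a-b}{\ln a-\ln b},
    ```

    where E_i is the energy used for product i, I_i its energy intensity, S_i its share of total production, and L the logarithmic mean.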

  14. Manufacturing Advantage: Why High-Performance Work Systems Pay Off.

    ERIC Educational Resources Information Center

    Appelbaum, Eileen; Bailey, Thomas; Berg, Peter; Kalleberg, Arne L.

    A study examined the relationship between high-performance workplace practices and the performance of plants in the following manufacturing industries: steel, apparel, and medical electronic instruments and imaging. The multilevel research methodology combined the following data collection activities: (1) site visits; (2) collection of plant…

  15. Overcoming barriers to high performance seismic design using lessons learned from the green building industry

    NASA Astrophysics Data System (ADS)

    Glezil, Dorothy

    NEHRP's Provisions currently govern conventional seismic resistant design. Although these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in the structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems like base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic resistant design has been challenging. One of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) is currently facing.

  16. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by the issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault tolerant simulation.

  17. Multiscale Thermo-Mechanical Design and Analysis of High Frequency and High Power Vacuum Electron Devices

    NASA Astrophysics Data System (ADS)

    Gamzina, Diana

    A methodology for performing thermo-mechanical design and analysis of high frequency and high average power vacuum electron devices is presented. This methodology results in a "first-pass" engineering design directly ready for manufacturing. The methodology includes establishment of thermal and mechanical boundary conditions, evaluation of convective film heat transfer coefficients, identification of material options, evaluation of temperature and stress field distributions, assessment of microscale effects on the stress state of the material, and fatigue analysis. The feature size of vacuum electron devices operating in the high frequency regime of 100 GHz to 1 THz is comparable to the microstructure of the materials employed for their fabrication. As a result, the thermo-mechanical performance of a device is affected by the local material microstructure. Such multiscale effects on the stress state are considered in the range of scales from about 10 microns up to a few millimeters. The design and analysis methodology is demonstrated on three separate microwave devices: a 95 GHz 10 kW cw sheet beam klystron, a 263 GHz 50 W long pulse wide-bandwidth sheet beam travelling wave tube, and a 346 GHz 1 W cw backward wave oscillator.

  18. Enhanced High Performance Power Compensation Methodology by IPFC Using PIGBT-IDVR

    PubMed Central

    Arumugom, Subramanian; Rajaram, Marimuthu

    2015-01-01

    Currently, power systems are controlled by conventional devices without high-speed control and respond slowly when compared with static electronic devices. Among the various power interruptions in power supply systems, voltage dips play a central role in causing disruption. The dynamic voltage restorer (DVR) is a voltage-control-based process that compensates for line transients in the distribution system. To overcome these issues and to achieve higher speed, a new methodology called the Parallel IGBT-Based Interline Dynamic Voltage Restorer (PIGBT-IDVR) method is proposed, which mainly focuses on the dynamic handling of energy reloads in common dc-linked energy storage with less adaptive transition. The interline power flow controller (IPFC) scheme has been employed to manage the power transmission between the lines and the restorer method for controlling the reactive power in the individual lines. By employing the proposed methodology, failures of the distribution system are avoided and better performance is achieved than with the existing methodologies. PMID:26613101

  19. Comparison between two methodologies for urban drainage decision aid.

    PubMed

    Moura, P M; Baptista, M B; Barraud, S

    2006-01-01

    The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology was developed in Brazil and is based on performance-cost analysis; the second one is ELECTRE III. Both methodologies were applied to a case study. Sensitivity and robustness analyses were then carried out. These analyses demonstrate that both methodologies give equivalent results and present low sensitivity and high robustness. These results show that the Brazilian methodology is consistent and can be used safely to select a good solution or a small set of good solutions that could be compared with more detailed methods afterwards.

  20. Going beyond a First Reader: A Machine Learning Methodology for Optimizing Cost and Performance in Breast Ultrasound Diagnosis.

    PubMed

    Venkatesh, Santosh S; Levenback, Benjamin J; Sultan, Laith R; Bouzghar, Ghizlane; Sehgal, Chandra M

    2015-12-01

    The goal of this study was to devise a machine learning methodology as a viable low-cost alternative to a second reader to help augment physicians' interpretations of breast ultrasound images in differentiating benign and malignant masses. Two independent feature sets, consisting of visual features based on a radiologist's interpretation of the images and computer-extracted features, when used as first and second readers and combined by adaptive boosting (AdaBoost) and a pruning classifier, resulted in a very high level of diagnostic performance (area under the receiver operating characteristic curve = 0.98) at the cost of pruning a fraction (20%) of the cases for further evaluation by independent methods. AdaBoost also improved the diagnostic performance of the individual human observers and increased the agreement between their analyses. Pairing AdaBoost with selective pruning is a principled methodology for achieving high diagnostic performance without the added cost of an additional reader for differentiating solid breast masses by ultrasound. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
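
    A minimal sketch of the "AdaBoost plus selective pruning" idea, using scikit-learn on synthetic data; the features, the 20% deferral rule applied to the boosting score, and all numbers are assumptions for illustration, not the study's ultrasound features or results:

    ```python
    # Hedged sketch: boost a classifier, then "prune" (defer) the most ambiguous
    # 20% of cases for independent review, as in the pairing described above.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=600, n_features=12, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    scores = clf.decision_function(X_te)            # signed confidence per case

    # defer the 20% of cases closest to the decision boundary
    cut = np.quantile(np.abs(scores), 0.20)
    kept = np.abs(scores) >= cut

    print(f"AUC on all cases: {roc_auc_score(y_te, scores):.3f}")
    print(f"AUC on kept 80%:  {roc_auc_score(y_te[kept], scores[kept]):.3f}")
    print(f"cases deferred:   {(~kept).sum()} of {len(y_te)}")
    ```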

  1. Nonlinear and adaptive control

    NASA Technical Reports Server (NTRS)

    Athans, Michael

    1989-01-01

    The primary thrust of the research was to conduct fundamental research in the theories and methodologies for designing complex high-performance multivariable feedback control systems, and to conduct feasibility studies in application areas of interest to NASA sponsors that point out advantages and shortcomings of available control system design methodologies.

  2. Characterization of Radiation Hardened Bipolar Linear Devices for High Total Dose Missions

    NASA Technical Reports Server (NTRS)

    McClure, Steven S.; Harris, Richard D.; Rax, Bernard G.; Thorbourn, Dennis O.

    2012-01-01

    Radiation hardened linear devices are characterized for performance in combined total dose and displacement damage environments for a mission scenario with a high radiation level. Performance at low and high dose rate for both biased and unbiased conditions is compared and the impact to hardness assurance methodology is discussed.

  3. Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.

    2000-01-01

    In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle which reduces recurring, non-recurring and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early on in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.

  4. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources was expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources was used to develop and demonstrate the performance and combustion stability of each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  5. How Methodology Decisions Affect the Variability of Schools Identified as Beating the Odds. REL 2015-071.rev

    ERIC Educational Resources Information Center

    Abe, Yasuyo; Weinstock, Phyllis; Chan, Vincent; Meyers, Coby; Gerdeman, R. Dean; Brandt, W. Christopher

    2015-01-01

    A number of states and school districts have identified schools that perform better than expected, given the populations they serve, in order to recognize school performance or to learn from local school practices and policies. These schools have been labeled "beating the odds," "high-performing/high-poverty,"…

  6. Organizing Performance Requirements For Dynamical Systems

    NASA Technical Reports Server (NTRS)

    Malchow, Harvey L.; Croopnick, Steven R.

    1990-01-01

    The paper describes a methodology for establishing performance requirements for complicated dynamical systems. It uses a top-down approach: in a series of steps, it makes connections between high-level mission requirements and lower-level functional performance requirements, and it provides a systematic delineation of the elements accommodating design compromises.

  7. Integrated Design Methodology for Highly Reliable Liquid Rocket Engine

    NASA Astrophysics Data System (ADS)

    Kuratani, Naoshi; Aoki, Hiroshi; Yasui, Masaaki; Kure, Hirotaka; Masuya, Goro

    An integrated design methodology is strongly required at the conceptual design phase to achieve highly reliable space transportation systems, especially propulsion systems, not only in Japan but all over the world. In the past, catastrophic failures have caused losses of mission and vehicle (LOM/LOV) in the operational phase and have also severely affected schedules and costs in the later development phases. A design methodology for a highly reliable liquid rocket engine is preliminarily established and investigated in this study. A sensitivity analysis is systematically performed to demonstrate the effectiveness of this methodology and to clarify, and especially to focus on, the correlation between the combustion chamber, turbopump, and main valve as main components. This study describes the essential issues in understanding the stated correlations, the need to apply this methodology to the remaining critical failure modes in the whole engine system, and the perspective on engine development in the future.

  8. A new hyperspectral image compression paradigm based on fusion

    NASA Astrophysics Data System (ADS)

    Guerra, Raúl; Melián, José; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    The on-board compression of remotely sensed hyperspectral images is an important task nowadays. One of the main difficulties is that the compression of these images must be performed in the satellite which carries the hyperspectral sensor. Hence, this process must be performed by space qualified hardware, having area, power and speed limitations. Moreover, it is important to achieve high compression ratios without compromising the quality of the decompressed image. In this manuscript we propose a new methodology for compressing hyperspectral images based on hyperspectral image fusion concepts. The proposed compression process has two independent steps. The first one is to spatially degrade the remotely sensed hyperspectral image to obtain a low resolution hyperspectral image. The second step is to spectrally degrade the remotely sensed hyperspectral image to obtain a high resolution multispectral image. These two degraded images are then sent to the Earth's surface, where they must be fused using a fusion algorithm for hyperspectral and multispectral images, in order to recover the remotely sensed hyperspectral image. The main advantage of the proposed methodology for compressing remotely sensed hyperspectral images is that the compression process, which must be performed on-board, becomes very simple, while the fusion process used to reconstruct the image is the more complex one. An extra advantage is that the compression ratio can be fixed in advance. Many simulations have been performed using different fusion algorithms and different methodologies for degrading the hyperspectral image. The results obtained in the simulations corroborate the benefits of the proposed methodology.

  9. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

    In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH's Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.

  10. IMPAC: An Integrated Methodology for Propulsion and Airframe Control

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Ouzts, Peter J.; Lorenzo, Carl F.; Mattern, Duane L.

    1991-01-01

    The National Aeronautics and Space Administration is actively involved in the development of enabling technologies that will lead towards aircraft with new/enhanced maneuver capabilities such as Short Take-Off Vertical Landing (STOVL) and high angle of attack performance. Because of the high degree of dynamic coupling between the airframe and propulsion systems of these types of aircraft, one key technology is the integration of the flight and propulsion control. The NASA Lewis Research Center approach to developing Integrated Flight Propulsion Control (IFPC) technologies is an in-house research program referred to as IMPAC (Integrated Methodology for Propulsion and Airframe Control). The goals of IMPAC are to develop a viable alternative to the existing integrated control design methodologies that will allow for improved system performance and simplicity of control law synthesis and implementation, and to demonstrate the applicability of the methodology to a supersonic STOVL fighter aircraft. Based on some preliminary control design studies that included evaluation of the existing methodologies, the IFPC design methodology that is emerging at the Lewis Research Center consists of considering the airframe and propulsion system as one integrated system for an initial centralized controller design and then partitioning the centralized controller into separate airframe and propulsion system subcontrollers to ease implementation and to set meaningful design requirements for detailed subsystem control design and evaluation. An overview of IMPAC is provided and detailed discussion of the various important design and evaluation steps in the methodology are included.

  11. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is demonstrated by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in truly representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while its accuracy is decreased, with an error of 24%.
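
    A toy illustration of the FORM-versus-MCS comparison mentioned above: for a linear limit state with independent normal variables, the first-order reliability index has a closed form and can be checked against crude Monte Carlo. The limit state and its statistics are assumptions for the sketch; in the study the performance function comes from a response surface fitted to numerical slope simulations.

    ```python
    # Hedged sketch: first-order reliability (FORM) estimate versus Monte Carlo
    # simulation for a toy linear limit state g = R - S (resistance minus load).
    import numpy as np
    from scipy.stats import norm

    mu_R, sd_R = 10.0, 1.5   # illustrative resistance statistics
    mu_S, sd_S = 6.0, 2.0    # illustrative load statistics

    # For a linear g with independent normals, the Hasofer-Lind reliability
    # index has a closed form and the FORM failure probability is exact.
    beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)
    pf_form = norm.cdf(-beta)

    # Crude Monte Carlo for comparison
    rng = np.random.default_rng(0)
    n = 1_000_000
    g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
    pf_mcs = np.mean(g < 0)

    print(f"beta = {beta:.2f}, Pf(FORM) = {pf_form:.4f}, Pf(MCS) = {pf_mcs:.4f}")
    ```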

  12. Inlet design for high-speed propfans

    NASA Technical Reports Server (NTRS)

    Little, B. H., Jr.; Hinson, B. L.

    1982-01-01

    A two-part study was performed to design inlets for high-speed propfan installations. The first part was a parametric study to select promising inlet concepts. A wide range of inlet geometries was examined and evaluated, primarily on the basis of cruise thrust and fuel burn performance. Two inlet concepts were then chosen for more detailed design studies, one appropriate to offset engine/gearbox arrangements and the other to in-line arrangements. In the second part of this study, inlet design points were chosen to optimize the net installed thrust, and detailed design of the two inlet configurations was performed. An analytical methodology was developed to account for propfan slipstream effects, transonic flow effects, and three-dimensional geometry effects. Using this methodology, low drag cowls were designed for the two inlets.

  13. High-Performance Cricket Coaches' Perceptions of an Educationally Informed Coach Education Programme

    ERIC Educational Resources Information Center

    Galvan, Hugh; Fyall, Glenn; Culpan, Ian

    2012-01-01

    This paper reports and discusses the findings of a research project that investigated the recently conceptualized and implemented New Zealand Cricket, Level 3, high-performance coach education programme (CEP). A qualitative methodology was employed to gather data from six coaches involved in the CEP. In particular the researchers sought the…

  14. How to Improve a School that Is Already High Performing: Innovation in the Field of Education

    ERIC Educational Resources Information Center

    Caridas, Evangeline; Hammer, Mark

    2006-01-01

    (Purpose) The case study's purpose was to examine Participative Management Style, high performance strategies, intangible and tangible indicators, trust, and its creation of superior achievement in a school district for elementary and middle school children (Illinois). (Methodology) A collaborative effort by the Superintendent, administrative staff,…

  15. Initial Teacher Education: Does Self-Efficacy Influence Candidate Teacher Academic Achievement and Future Career Performance?

    ERIC Educational Resources Information Center

    Shawer, Saad F.

    2013-01-01

    This quantitative investigation examined the influence of low and high self-efficacy on candidate teacher academic performance in a foreign language teaching methodology course through testing the speculation that high self-efficacy levels would improve pedagogical-content knowledge (PCK). Positivism guided the research design at the levels of…

  16. Rating of Dynamic Coefficient for Simple Beam Bridge Design on High-Speed Railways

    NASA Astrophysics Data System (ADS)

    Diachenko, Leonid; Benin, Andrey; Smirnov, Vladimir; Diachenko, Anastasia

    2018-06-01

    The aim of the work is to improve the methodology for the dynamic computation of simple beam spans under the impact of high-speed trains. Mathematical simulation utilizing numerical and analytical methods of structural mechanics is used in the research. The article analyses the parameters of the effect of high-speed trains on simple beam bridge spans and suggests a technique for determining the dynamic coefficient for the live load. The reliability of the proposed methodology is confirmed by results of numerical simulation of high-speed train passage over spans at different speeds. The proposed algorithm of dynamic computation is based on a connection between the maximum acceleration of the span in the resonance mode of vibrations and the main factors of the stress-strain state. The methodology allows determining maximum and also minimum values of the main internal forces in the structure, which makes it possible to perform endurance tests. It is noted that the dynamic additions for the components of the stress-strain state (bending moments, transverse force and vertical deflections) are different. This condition determines the necessity of a differentiated approach to the evaluation of dynamic coefficients when performing design verification for limit states of groups I and II. The practical importance: the methodology for determining the dynamic coefficients allows performing the dynamic calculation and determining the main internal forces in simple beam spans without numerical simulation and direct dynamic analysis, which significantly reduces the labour costs of design.

  17. Risk of bias in overviews of reviews: a scoping review of methodological guidance and four-item checklist.

    PubMed

    Ballard, Madeleine; Montgomery, Paul

    2017-03-01

    To assess the conditions under which employing an overview of systematic reviews is likely to lead to a high risk of bias, and to synthesise existing guidance concerning overview practice, a scoping review was conducted. Four electronic databases were searched with a pre-specified strategy (PROSPERO 2015:CRD42015027592) ending October 2015. Included studies needed to describe or develop overview methodology. Data were narratively synthesised to delineate areas highlighted as outstanding challenges or where methodological recommendations conflict. Twenty-four papers met the inclusion criteria. There is emerging debate regarding overlapping systematic reviews; systematic review scope; quality of included research; updating; and synthesising and reporting results. While three functions for overviews have been proposed (identify gaps, explore heterogeneity, summarise evidence), overviews cannot perform the first; are unlikely to achieve the second and third simultaneously; and can only perform the third under specific circumstances, namely when the identified systematic reviews meet the following four conditions: (1) include primary trials that do not substantially overlap, (2) match the overview scope, (3) are of high methodological quality, and (4) are up-to-date. Considering the intended function of proposed overviews together with the corresponding methodological conditions may improve the quality of this burgeoning publication type. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Opportunities for Improved Management Efficiency of the Head Start Program: Performance Evaluation and High Risk Determination.

    ERIC Educational Resources Information Center

    Gall, Mary Sheila

    This report provides results of a review of the methodology used by the Office of Human Development Services (HDS) to measure Head Start performance and to control high risk Head Start agencies. The review was performed at HDS headquarters and regional locations nationwide. The review was based on a sample of 200 Head Start agencies and focused on…

  19. Select Methodology for Validating Advanced Satellite Measurement Systems

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.

    2008-01-01

    Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.

  20. An evaluation of NASA's program in human factors research: Aircrew-vehicle system interaction

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Research in human factors in the aircraft cockpit and a proposed program augmentation were reviewed. The dramatic growth of microprocessor technology makes it entirely feasible to automate increasingly more functions in the aircraft cockpit; the promise of improved vehicle performance, efficiency, and safety through automation makes highly automated flight inevitable. However, an organized data base and a validated methodology for predicting the effects of automation on human performance, and thus on safety, are lacking; without such a data base and validated methodology for analyzing human performance, increased automation may introduce new risks. Efforts should be concentrated on developing methods and techniques for analyzing man-machine interactions, including human workload and prediction of performance.

  1. How to emerge from the conservatism in clinical research methodology?

    PubMed

    Kotecki, Nuria; Penel, Nicolas; Awada, Ahmad

    2017-09-01

    Despite recent changes in clinical research methodology, many challenges remain in drug development methodology. Advances in molecular biology and cancer treatments have changed the clinical research landscape. Thus, we have moved from empirical clinical oncology to molecular and immunological therapeutic approaches. Along with this move, adapted definitions of dose-limiting toxicities, endpoints, and dose escalation methods have been proposed. Moreover, the classical frontier between phase I, phase II, and phase III has become unclear, in particular for immunological approaches, so investigators are facing major challenges in drug development methodology. We propose to individualize clinical research using innovative approaches to significantly improve patient outcomes and to target what are considered unmet needs. Integrating a high level of translational research and performing well designed biomarker studies with great potential for clinical practice are of utmost importance. This could be done within new models of clinical research networks and by building strong collaboration between academic institutions, cooperative groups, on-site investigators, and pharma.

  2. Going the Distance: Are There Common Factors in High Performance Distance Learning? Research Report.

    ERIC Educational Resources Information Center

    Hawksley, Rosemary; Owen, Jane

    Common factors among high-performing distance learning (DL) programs were examined through case studies at 9 further education colleges and 2 nonsector organizations in the United Kingdom and a backup survey of a sample of 50 distance learners at 5 of the colleges. The study methodology incorporated numerous principles of process benchmarking. The…

  3. Hindering Factors of Beginning Teachers' High Performance in Higher Education Pakistan: Case Study of IUB

    ERIC Educational Resources Information Center

    Sarwar, Shakeel; Aslam, Hassan Danyal; Rasheed, Muhammad Imran

    2012-01-01

    Purpose: The aim of the researchers in this endeavor is to identify the challenges and obstacles faced by beginning teachers in higher education. This study also explores practical implications and what adaptation can be utilized in order to have high performance of the beginning teachers. Design/methodology/approach: Researchers have applied…

  4. Comparing Alternative Instruments to Measure Service Quality in Higher Education

    ERIC Educational Resources Information Center

    Brochado, Ana

    2009-01-01

    Purpose: The purpose of this paper is to examine the performance of five alternative measures of service quality in the higher education sector: service quality (SERVQUAL), importance-weighted SERVQUAL, service performance (SERVPERF), importance-weighted SERVPERF, and higher education performance (HEdPERF). Design/methodology/approach: Data were…

  5. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and will present sample results from its application to the Constellation Program lunar architecture.

  6. Application of Design Methodologies for Feedback Compensation Associated with Linear Systems

    NASA Technical Reports Server (NTRS)

    Smith, Monty J.

    1996-01-01

    The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well behaved closed loop system in terms of stability and robustness (internal signals remain bounded with a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H-infinity performance criterion and associated algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.

  7. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.

  8. Can cultural differences lead to accidents? Team cultural differences and sociotechnical system operations.

    PubMed

    Strauch, Barry

    2010-04-01

    I discuss cultural factors and how they may influence sociotechnical system operations. Investigations of several major transportation accidents suggest that cultural factors may have played a role in the causes of the accidents. However, research has not fully addressed how cultural factors can influence sociotechnical systems. I review literature on cultural differences in general and cultural factors in sociotechnical systems and discuss how these differences can affect team performance in sociotechnical systems. Cultural differences have been observed in social and interpersonal dimensions and in cognitive and perceptual styles; these differences can affect multioperator team performance. Cultural factors may account for team errors in sociotechnical systems, most likely during high-workload, high-stress operational phases. However, much of the research on cultural factors has methodological and interpretive shortcomings that limit their applicability to sociotechnical systems. Although some research has been conducted on the role of cultural differences on team performance in sociotechnical system operations, considerable work remains to be done before the effects of these differences can be fully understood. I propose a model that illustrates how culture can interact with sociotechnical system operations and suggest avenues of future research. Given methodological challenges in measuring cultural differences and team performance in sociotechnical system operations, research in these systems should use a variety of methodologies to better understand how culture can affect multioperator team performance in these systems.

  9. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer impeller design infrastructure. While both low and high pressure turbopump failure/anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time accurate and cavitating inducer impeller computational fluid dynamics (CFD) tools. These investments together have led to a better understanding of the complex internal flow fields within aggressive high performing inducers and impellers. New design tools and methodologies have evolved which intend to provide confident blade designs which strike an appropriate balance between performance and self induced load management.

  10. High-performance radial AMTEC cell design for ultra-high-power solar AMTEC systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1999-07-01

    Alkali Metal Thermal to Electric Conversion (AMTEC) technology is rapidly maturing for potential application in ultra-high-power solar AMTEC systems required by potential future US Air Force (USAF) spacecraft missions in medium-earth and geosynchronous orbits (MEO and GEO). Solar thermal AMTEC power systems potentially have several important advantages over current solar photovoltaic power systems in ultra-high-power spacecraft applications for USAF MEO and GEO missions. This work presents key aspects of radial AMTEC cell design to achieve high cell performance in solar AMTEC systems delivering larger than 50 kW(e) to support high power USAF missions. These missions typically require AMTEC cell conversion efficiency larger than 25%. A sophisticated design parameter methodology is described and demonstrated which establishes optimum design parameters in any radial cell design to satisfy high-power mission requirements. Specific relationships, which are distinct functions of cell temperatures and pressures, define critical dependencies between key cell design parameters, particularly the impact of parasitic thermal losses on Beta Alumina Solid Electrolyte (BASE) area requirements, voltage, number of BASE tubes, and system power production for both maximum power-per-BASE-area and optimum efficiency conditions. Finally, some high-level system tradeoffs are demonstrated using the design parameter methodology to establish high-power radial cell design requirements and philosophy. The discussion highlights how to incorporate this methodology with sophisticated SINDA/FLUINT AMTEC cell modeling capabilities to determine optimum radial AMTEC cell designs.

  11. Are normative sonographic values of kidney size in children valid and reliable? A systematic review of the methodological quality of ultrasound studies using the Anatomical Quality Assessment (AQUA) tool.

    PubMed

    Chhapola, Viswas; Tiwari, Soumya; Deepthi, Bobbity; Henry, Brandon Michael; Brar, Rekha; Kanwal, Sandeep Kumar

    2018-06-01

    A plethora of research is available on ultrasonographic kidney size standards. We performed a systematic review of the methodological quality of ultrasound studies aimed at developing normative renal parameters in healthy children, by evaluating the risk of bias (ROB) using the 'Anatomical Quality Assessment (AQUA)' tool. We searched Medline, Scopus, CINAHL, and Google Scholar on June 4, 2018, and observational studies measuring kidney size by ultrasonography in healthy children (0-18 years) were included. The ROB of each study was evaluated in five domains using a 20-item coding scheme based on the AQUA tool framework. Fifty-four studies were included. Domain 1 (subject characteristics) had a high ROB in 63% of studies due to the unclear description of age, sex, and ethnicity. The performance in Domain 2 (study design) was the best, with 85% of studies having a prospective design. Methodological characterization (Domain 3) was poor across the studies (< 10% compliance), with suboptimal performance in the description of patient positioning, operator experience, and assessment of intra/inter-observer reliability. About three-fourths of the studies had a low ROB in Domain 4 (descriptive anatomy). Domain 5 (reporting of results) had a high ROB in approximately half of the studies, the majority reporting results in the form of central tendency measures. Significant deficiencies and heterogeneity were observed in the methodological quality of USG studies performed to date for measurement of kidney size in children. We hereby provide a framework for conducting such studies in the future. PROSPERO (CRD42017071601).

  12. Credentials versus Performance: Review of the Teacher Performance Pay Research

    ERIC Educational Resources Information Center

    Podgursky, Michael; Springer, Matthew G.

    2007-01-01

    In this article we examine the economic case for merit or performance-based pay for K-12 teachers. We review several areas of germane research. The direct evaluation literature on these incentive plans is slender; highly diverse in terms of methodology, targeted populations, and programs evaluated; and primarily focused on short-run motivational…

  13. Failure detection system design methodology. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The designs of these two processes are examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.

  14. Validation of musculoskeletal ultrasound to assess and quantify muscle glycogen content. A novel approach.

    PubMed

    Hill, John C; Millán, Iñigo San

    2014-09-01

    Glycogen storage is essential for exercise performance. The ability to assess muscle glycogen levels should be an important advantage for performance. However, skeletal muscle glycogen assessment has only been available and validated through muscle biopsy. We have developed a new methodology using high-frequency ultrasound to assess skeletal muscle glycogen content in a rapid, portable, and noninvasive way using MuscleSound (MuscleSound, LCC, Denver, CO) technology. To validate the utilization of high-frequency musculoskeletal ultrasound for muscle glycogen assessment and correlate it with histochemical glycogen quantification through muscle biopsy. Twenty-two male competitive cyclists (categories: Pro, 1-4; average height, 183.7 ± 4.9 cm; average weight, 76.8 ± 7.8 kg) performed a steady-state test on a cycle ergometer for 90 minutes at a moderate to high exercise intensity, eliciting a carbohydrate oxidation of 2-3 g·min⁻¹ and a blood lactate concentration of 2 to 3 mM. Pre- and post-exercise glycogen content from the rectus femoris muscle was measured using histochemical analysis through muscle biopsy and through high-frequency ultrasound scans using MuscleSound technology. Correlations between muscle biopsy glycogen histochemical quantification (mmol·kg⁻¹) and high-frequency ultrasound methodology through MuscleSound technology were r = 0.93 (P < 0.0001) pre-exercise and r = 0.94 (P < 0.0001) post-exercise. The correlation between muscle biopsy glycogen quantification and high-frequency ultrasound methodology for the change in glycogen from pre- to post-exercise was r = 0.81 (P < 0.0001). These results demonstrate that skeletal muscle glycogen can be measured quickly and noninvasively through high-frequency ultrasound using MuscleSound technology.
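
    The validation statistic reported above is a Pearson correlation between paired biopsy and ultrasound measurements. A brief sketch of that computation (the paired values are hypothetical, not the study's data) could be:

    ```python
    # Hypothetical paired pre-exercise measurements for a few cyclists.
    import numpy as np
    from scipy.stats import pearsonr

    biopsy_mmol_kg = np.array([420.0, 380.0, 450.0, 310.0, 395.0])   # histochemical glycogen
    ultrasound_score = np.array([78.0, 71.0, 83.0, 57.0, 74.0])      # ultrasound-derived score

    r, p = pearsonr(biopsy_mmol_kg, ultrasound_score)
    print(f"Pearson r = {r:.2f}, p = {p:.4f}")
    ```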

  15. Robust, Decoupled, Flight Control Design with Rate Saturating Actuators

    NASA Technical Reports Server (NTRS)

    Snell, S. A.; Hess, R. A.

    1997-01-01

    Techniques for the design of control systems for manually controlled, high-performance aircraft must provide the following: (1) multi-input, multi-output (MIMO) solutions, (2) acceptable handling qualities including no tendencies for pilot-induced oscillations, (3) a tractable approach for compensator design, (4) performance and stability robustness in the presence of significant plant uncertainty, and (5) performance and stability robustness in the presence of actuator saturation (particularly rate saturation). A design technique built upon Quantitative Feedback Theory is offered as a candidate methodology which can provide flight control systems meeting these requirements, and do so over a considerable part of the flight envelope. An example utilizing a simplified model of a supermaneuverable fighter aircraft demonstrates the proposed design methodology.

  16. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long term reachability of the control objectives by the fuzzy logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting the long-term influent disturbances, and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested, and showed robustness, against measurement noise levels typical for wastewater sensors. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
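
    As an illustration of the kind of input membership functions such a procedure pins down, the sketch below evaluates triangular membership functions whose critical points are hypothetical placeholders, not the optimized values derived in the paper:

    ```python
    def triangular(x, a, b, c):
        """Triangular membership with feet at a and c and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # hypothetical critical points for an effluent-ammonium input (g N / m^3)
    sets = {"low": (0.0, 1.0, 2.0), "medium": (1.0, 3.0, 5.0), "high": (4.0, 6.0, 8.0)}

    x = 2.5  # measured input value
    print({name: round(triangular(x, *pts), 2) for name, pts in sets.items()})
    ```

    The linguistic rules then act on these membership degrees; shifting the critical points is what the constrained optimizations described above formalize.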

  17. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.; Anderson, M. R.

    1985-01-01

    Optimal-control-theoretic modeling and frequency-domain analysis is the methodology proposed to evaluate analytically the handling qualities of higher-order manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability is then evaluated using frequency-domain techniques. When these techniques were applied to the flight-test data for thirty-two highly-augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.

  18. A Novel Performance Evaluation Methodology for Single-Target Trackers.

    PubMed

    Kristan, Matej; Matas, Jiri; Leonardis, Ales; Vojir, Tomas; Pflugfelder, Roman; Fernandez, Gustavo; Nebehay, Georg; Porikli, Fatih; Cehovin, Luka

    2016-11-01

    This paper addresses the problem of single-target tracker performance evaluation. We consider the performance measures, the dataset and the evaluation system to be the most important components of tracker evaluation and propose requirements for each of them. The requirements are the basis of a new evaluation methodology that aims at a simple and easily interpretable tracker comparison. The ranking-based methodology addresses tracker equivalence in terms of statistical significance and practical differences. A fully-annotated dataset with per-frame annotation of several visual attributes is introduced. The diversity of its visual properties is maximized in a novel way by clustering a large number of videos according to their visual attributes. This makes it the most systematically constructed and annotated dataset to date. A multi-platform evaluation system allowing easy integration of third-party trackers is presented as well. The proposed evaluation methodology was tested on the VOT2014 challenge on the new dataset and 38 trackers, making it the largest benchmark to date. Most of the tested trackers are indeed state-of-the-art since they outperform the standard baselines, resulting in a highly-challenging benchmark. An exhaustive analysis of the dataset from the perspective of tracking difficulty is carried out. To facilitate tracker comparison, a new performance visualization technique is proposed.

  19. The Impact of Bundled High Performance Human Resource Practices on Intention to Leave: Mediating Role of Emotional Exhaustion

    ERIC Educational Resources Information Center

    Jyoti, Jeevan; Rani, Roomi; Gandotra, Rupali

    2015-01-01

    Purpose: The purpose of this paper is to examine the mediating effect of emotional exhaustion (EE) in between bundled high-performance human resource practices (HPHRPs) and intention to leave (ITL) in the education sector. Design/methodology/approach: A survey questionnaire method was used to collect data from a sample of 514 teachers working in…

  20. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.
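
    The characteristic velocity efficiency referenced above is conventionally the ratio of delivered to theoretical characteristic velocity, with c* = Pc·At/mdot. A short sketch with hypothetical numbers (not the reported test data) is:

    ```python
    # eta_c* = c*_delivered / c*_theoretical, with c* = Pc * At / mdot (SI units give m/s).
    def c_star(pc_pa, throat_area_m2, mdot_kg_s):
        return pc_pa * throat_area_m2 / mdot_kg_s

    c_star_delivered = c_star(6.0e6, 0.012, 40.0)   # hypothetical chamber pressure, throat area, flow rate
    c_star_theoretical = 1820.0                      # hypothetical ideal value (m/s)
    print(f"eta_c* = {c_star_delivered / c_star_theoretical:.3f}")
    ```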

  1. Dribble Files: Methodologies to Evaluate Learning and Performance in Complex Environments

    ERIC Educational Resources Information Center

    Schrader, P. G.; Lawless, Kimberly A.

    2007-01-01

    Research in the area of technology learning environments is tremendously complex. Tasks performed in these contexts are highly cognitive and mostly invisible to the observer. The nature of performance in these contexts is explained not only by the outcome but also by the process. However, evaluating the learning process with respect to tasks…

  2. Integrated structure/control design - Present methodology and future opportunities

    NASA Technical Reports Server (NTRS)

    Weisshaar, T. A.; Newsom, J. R.; Zeiler, T. A.; Gilbert, M. G.

    1986-01-01

    Attention is given to current methodology applied to the integration of the optimal design process for structures and controls. Multilevel linear decomposition techniques proved to be most effective in organizing the computational efforts necessary for ISCD (integrated structures and control design) tasks. With the development of large orbiting space structures and actively controlled, high performance aircraft, there will be more situations in which this concept can be applied.

  3. A theoretical-experimental methodology for assessing the sensitivity of biomedical spectral imaging platforms, assays, and analysis methods.

    PubMed

    Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C

    2018-01-01

    Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
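
    One spectral analysis step that such a comparison might exercise is linear unmixing of a measured spectrum into known endmember signatures. The sketch below uses non-negative least squares on hypothetical endmembers; it illustrates the general idea, not the authors' specific algorithm:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    wavelengths = np.linspace(450, 700, 26)
    # hypothetical Gaussian endmember spectra centered at 500, 560, and 620 nm
    endmembers = np.stack([np.exp(-((wavelengths - c) / 30.0) ** 2) for c in (500, 560, 620)], axis=1)

    true_abundances = np.array([0.2, 0.5, 0.3])
    measured = endmembers @ true_abundances + 0.01 * np.random.default_rng(2).normal(size=26)

    abundances, residual = nnls(endmembers, measured)   # non-negative abundance estimates
    print(abundances.round(3), round(residual, 4))
    ```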

  4. Lunar Exploration Architecture Level Key Drivers and Sensitivities

    NASA Technical Reports Server (NTRS)

    Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher

    2009-01-01

    Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.

  5. Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Brad Kenneth

    In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.

  6. Ultra-high-performance liquid chromatography/tandem high-resolution mass spectrometry analysis of sixteen red beverages containing carminic acid: identification of degradation products by using principal component analysis/discriminant analysis.

    PubMed

    Gosetti, Fabio; Chiuminatto, Ugo; Mazzucco, Eleonora; Mastroianni, Rita; Marengo, Emilio

    2015-01-15

    The study investigates the sunlight photodegradation process of carminic acid, a natural red colourant used in beverages. For this purpose, both carminic acid aqueous standard solutions and sixteen different commercial beverages, ten containing carminic acid and six containing E120 dye, were subjected to photoirradiation. The results show different patterns of degradation, not only between the standard solutions and the beverages, but also from beverage to beverage. Due to the different beverage recipes, unpredictable reactions take place between the dye and the other ingredients. To identify the dye degradation products in a very complex scenario, a methodology was used, based on the combined use of principal component analysis with discriminant analysis and ultra-high-performance liquid chromatography coupled with tandem high resolution mass spectrometry. The methodology is unaffected by beverage composition and allows the degradation products of carminic acid dye to be identified for each beverage. Copyright © 2014 Elsevier Ltd. All rights reserved.
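
    To show the general flavor of the chemometric step, the sketch below applies PCA to a hypothetical beverage-by-feature intensity matrix; the data, dimensions, and grouping are invented, and the discriminant-analysis stage is omitted:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(16, 50))     # hypothetical intensities: 16 beverages x 50 m/z features
    X[8:, :5] += 3.0                  # pretend degradation products appear in half the samples

    scores = PCA(n_components=2).fit_transform(X)
    print(scores[:3])                 # PC scores used to group beverages by degradation pattern
    ```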

  7. VERA Core Simulator Methodology for PWR Cycle Depletion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kochunas, Brendan; Collins, Benjamin S; Jabaay, Daniel

    2015-01-01

    This paper describes the methodology developed and implemented in MPACT for performing high-fidelity pressurized water reactor (PWR) multi-cycle core physics calculations. MPACT is being developed primarily for application within the Consortium for the Advanced Simulation of Light Water Reactors (CASL) as one of the main components of the VERA Core Simulator, the others being COBRA-TF and ORIGEN. The methods summarized in this paper include a methodology for performing resonance self-shielding and computing macroscopic cross sections, 2-D/1-D transport, nuclide depletion, thermal-hydraulic feedback, and other supporting methods. These methods represent a minimal set needed to simulate high-fidelity models of a realistic nuclear reactor. Results demonstrating this are presented from the simulation of a realistic model of the first cycle of Watts Bar Unit 1. The simulation, which approximates the cycle operation, is observed to be within 50 ppm boron (ppmB) reactivity for all simulated points in the cycle and approximately 15 ppmB for a consistent statepoint. The verification and validation of the PWR cycle depletion capability in MPACT is the focus of two companion papers.

  8. Cation-exchange high-performance liquid chromatography for variant hemoglobins and HbF/A2: What must hematopathologists know about methodology?

    PubMed

    Sharma, Prashant; Das, Reena

    2016-03-26

    Cation-exchange high-performance liquid chromatography (CE-HPLC) is a widely used laboratory test to detect variant hemoglobins as well as quantify hemoglobins F and A2 for the diagnosis of thalassemia syndromes. Its versatility, speed, reproducibility and convenience have made CE-HPLC the method of choice to initially screen for hemoglobin disorders. Despite its popularity, several methodological aspects of the technology remain obscure to pathologists and this may have consequences in specific situations. This paper discusses the basic principles of the technique, the initial quality control steps and the interpretation of various controls and variables that are available on the instrument output. Subsequent sections are devoted to methodological considerations that arise during reporting of cases. For instance, common problems of misidentified peaks, totals crossing 100%, causes of total area being above or below acceptable limits and the importance of pre-integration region peaks are dealt with. Ultimately, CE-HPLC remains an investigation, the reporting of which combines in-depth knowledge of the biological basics with more than a working knowledge of the technological aspects of the technique.

  9. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.

  10. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE PAGES

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    2016-07-26

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on the state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.
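
    As a toy analogue of the non-negative methodology (not the paper's PETSc/TAO implementation), the sketch below contrasts an unconstrained solve of a small diffusion-like system with a bound-constrained least-squares solve that enforces non-negativity:

    ```python
    import numpy as np
    from scipy.optimize import lsq_linear

    n = 20
    # 1-D finite-difference diffusion operator (tridiagonal, symmetric positive definite)
    A = np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
    b = np.sin(np.linspace(0, 3 * np.pi, n))            # hypothetical source term

    unconstrained = np.linalg.solve(A, b)                # may produce negative concentrations
    constrained = lsq_linear(A, b, bounds=(0, np.inf)).x # non-negativity enforced via bounds

    print(unconstrained.min(), constrained.min())
    ```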

  11. Methodology for estimating human perception to tremors in high-rise buildings

    NASA Astrophysics Data System (ADS)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception to tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception to tremors for occupants living in high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. The recorded ground motions in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level to tremors based on a proposed ground motion intensity parameter—the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive the tremors at a given ground motion intensity. Furthermore, the models are validated with two recent tremor events reportedly felt in Singapore. It is found that the estimated results match reasonably well with the reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
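
    The proposed intensity parameter is the response spectrum averaged over the 0.1 to 2.0 s period band. A short sketch of that averaging over a hypothetical spectrum (values are illustrative only) is:

    ```python
    import numpy as np

    periods = np.linspace(0.05, 4.0, 80)     # uniformly spaced spectral periods (s)
    sa = 0.3 * np.exp(-periods)              # hypothetical spectral ordinates (g)

    band = (periods >= 0.1) & (periods <= 2.0)
    avg_intensity = sa[band].mean()          # average over the 0.1-2.0 s band
    print(f"average response spectrum intensity (0.1-2.0 s): {avg_intensity:.3f} g")
    ```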

  12. Simulating Effects of High Angle of Attack on Turbofan Engine Performance

    NASA Technical Reports Server (NTRS)

    Liu, Yuan; Claus, Russell W.; Litt, Jonathan S.; Guo, Ten-Huei

    2013-01-01

    A method of investigating the effects of high angle of attack (AOA) flight on turbofan engine performance is presented. The methodology involves combining a suite of diverse simulation tools. Three-dimensional, steady-state computational fluid dynamics (CFD) software is used to model the change in performance of a commercial aircraft-type inlet and fan geometry due to various levels of AOA. Parallel compressor theory is then applied to assimilate the CFD data with a zero-dimensional, nonlinear, dynamic turbofan engine model. The combined model shows that high AOA operation degrades fan performance and, thus, negatively impacts compressor stability margins and engine thrust. In addition, the engine response to high AOA conditions is shown to be highly dependent upon the type of control system employed.

  13. Investigating Dynamics of Eccentricity in Turbomachines

    NASA Technical Reports Server (NTRS)

    Baun, Daniel

    2010-01-01

    A methodology (and hardware and software to implement the methodology) has been developed as a means of investigating coupling between certain rotordynamic and hydrodynamic phenomena in turbomachines. Originally, the methodology was intended for application in an investigation of coupled rotordynamic and hydrodynamic effects postulated to have caused high synchronous vibration in the space shuttle's high-pressure oxygen turbopump (HPOTP). The methodology can also be applied in investigating (for the purpose of developing means of suppressing) undesired hydrodynamic rotor/stator interactions in turbomachines in general. The methodology and the types of phenomena that can be investigated by use of the methodology are best summarized by citing the original application as an example. In that application, in consideration of the high synchronous vibration in the space-shuttle main engine (SSME) HPOTP, it was determined to be necessary to perform tests to investigate the influence of inducer eccentricity and/or synchronous whirl motion on inducer hydrodynamic forces under prescribed flow and cavitation conditions. It was believed that manufacturing tolerances of the turbopump resulted in some induced runout of the pump rotor. Such runout, if oriented with an inducer blade, would cause that blade to run with tip clearance smaller than the tip clearances of the other inducer blades. It was hypothesized that the resulting hydraulic asymmetry, coupled with alternating blade cavitation, could give rise to the observed high synchronous vibration. In tests performed to investigate this hypothesis, prescribed rotor whirl motions have been imposed on a 1/3-scale water-rig version of the SSME LPOTP inducer (which is also a 4-bladed inducer having similar cavitation dynamics to the HPOTP) in a magnetic-bearing test facility. The particular magnetic-bearing test facility, through active vibration control, affords a capability to impose, on the rotor, whirl orbits having shapes and whirl rates prescribed by the user, and to simultaneously measure the resulting hydrodynamic forces generated by the impeller. Active control also made it possible to modulate the inducer-blade running tip clearance and consequently effect alternating blade cavitation. The measured hydraulic forces have been compared and correlated with shroud dynamic-pressure measurements.

  14. Systematic review of the methodological quality of controlled trials evaluating Chinese herbal medicine in patients with rheumatoid arthritis

    PubMed Central

    Pan, Xin; Lopez-Olivo, Maria A; Song, Juhee; Pratt, Gregory; Suarez-Almazor, Maria E

    2017-01-01

    Objectives: We appraised the methodological and reporting quality of randomised controlled clinical trials (RCTs) evaluating the efficacy and safety of Chinese herbal medicine (CHM) in patients with rheumatoid arthritis (RA). Design: For this systematic review, electronic databases were searched from inception until June 2015. The search was limited to humans and non-case report studies, but was not limited by language, year of publication or type of publication. Two independent reviewers selected RCTs, evaluating CHM in RA (herbals and decoctions). Descriptive statistics were used to report on risk of bias and their adherence to reporting standards. Multivariable logistic regression analysis was performed to determine study characteristics associated with high or unclear risk of bias. Results: Out of 2342 unique citations, we selected 119 RCTs including 18 919 patients: 10 108 patients received CHM alone and 6550 received one of 11 treatment combinations. A high risk of bias was observed across all domains: 21% had a high risk for selection bias (11% from sequence generation and 30% from allocation concealment), 85% for performance bias, 89% for detection bias, 4% for attrition bias and 40% for reporting bias. In multivariable analysis, fewer authors were associated with selection bias (allocation concealment), performance bias and attrition bias, and earlier year of publication and funding source not reported or disclosed were associated with selection bias (sequence generation). Studies published in non-English language were associated with reporting bias. Poor adherence to recommended reporting standards (<60% of the studies not providing sufficient information) was observed in 11 of the 23 sections evaluated. Limitations: Study quality and data extraction were performed by one reviewer and cross-checked by a second reviewer. Translation to English was performed by one reviewer in 85% of the included studies. Conclusions: Studies evaluating CHM often fail to meet expected methodological criteria, and high-quality evidence is lacking. PMID:28249848

  15. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  16. Aerodynamic Parameters of High Performance Aircraft Estimated from Wind Tunnel and Flight Test Data

    NASA Technical Reports Server (NTRS)

    Klein, Vladislav; Murphy, Patrick C.

    1998-01-01

    A concept of system identification applied to high performance aircraft is introduced followed by a discussion on the identification methodology. Special emphasis is given to model postulation using time invariant and time dependent aerodynamic parameters, model structure determination and parameter estimation using ordinary least squares and mixed estimation methods. At the same time, problems of data collinearity detection and its assessment are discussed. These parts of the methodology are demonstrated in examples using flight data of the X-29A and X-31A aircraft. In the third example wind tunnel oscillatory data of the F-16XL model are used. A strong dependence of these data on frequency led to the development of models with unsteady aerodynamic terms in the form of indicial functions. The paper is completed by concluding remarks.

  17. Relating seed treatments to nursery performance: Experience with southern pines

    Treesearch

    James P. Barnett

    2008-01-01

    Producing good quality seeds that perform well in the nursery continues to be challenging. High quality conifer seeds are obtained by optimizing collecting, processing, storing, and treating methodologies, and such quality is needed to consistently produce uniform nursery crops. Although new technologies are becoming available to evaluate seed quality, they have not...

  18. 76 FR 21700 - Notice of Request for Extension and Revision of a Currently Approved Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ... analyses has its own methodology and time necessary to perform the analyses. (3) Aflatoxin in Pistachios Program (A High Performance Liquid Chromatography (HPLC) method for exporting pistachios to European Union requested by the California Pistachio Committee) and the domestic program using HPLC or a test kit analysis...

  19. Safeguards Technology Development Program 1st Quarter FY 2018 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, Manoj K.

    LLNL will evaluate the performance of a stilbene-based scintillation detector array for IAEA neutron multiplicity counting (NMC) applications. This effort will combine newly developed modeling methodologies and recently acquired high-efficiency stilbene detector units to quantitatively compare the prototype system performance with the conventional He-3 counters and liquid scintillator alternatives.

  20. High Performance Object-Oriented Scientific Programming in Fortran 90

    NASA Technical Reports Server (NTRS)

    Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.

    1997-01-01

    We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.

  1. Good-to-Great Superintendents: An Examination of Jim Collins' Good-to-Great Level Five Leadership Attributes as Demonstrated by the Leadership Behaviors of Superintendents of High-Performing California Public Single-School Districts

    ERIC Educational Resources Information Center

    Brown, James D.

    2010-01-01

    Purpose: The purpose of this study was to examine Collins' good-to-great Level Five leadership attributes, as demonstrated by the leadership behaviors of superintendents of high-performing California public single-school districts. Methodology: The researcher used a case study design to conduct this study. Personal interviews were conducted in…

  2. [The methodological assessment and qualitative evaluation of psychometric performance tests based on the example of modern tests that assess reading and spelling skills].

    PubMed

    Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd

    2015-09-01

    This article looks at a means of objectively evaluating the quality of psychometric tests. This approach enables users to evaluate psychometric tests based on their methodological characteristics, in order to decide which instrument should be used. Reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests of high methodological quality which can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.

  3. Novel optoelectronic methodology for testing of MOEMS

    NASA Astrophysics Data System (ADS)

    Pryputniewicz, Ryszard J.; Furlong, Cosme

    2003-01-01

    Continued demands for delivery of high performance micro-optoelectromechanical systems (MOEMS) place unprecedented requirements on methods used in their development and operation. Metrology is a major and inseparable part of these methods. Optoelectronic methodology is an essential field of metrology. Due to its scalability, optoelectronic methodology is particularly suitable for testing of MOEMS where measurements must be made with ever increasing accuracy and precision. This was particularly evident during the last few years, characterized by miniaturization of devices, when requirements for measurements have rapidly increased as the emerging technologies introduced new products, especially, optical MEMS. In this paper, a novel optoelectronic methodology for testing of MOEMS is described and its applications are illustrated with representative examples. These examples demonstrate capability to measure submicron deformations of various components of the micromirror device, under operating conditions, and show viability of the optoelectronic methodology for testing of MOEMS.

  4. Portable parallel stochastic optimization for the design of aeropropulsion components

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Rhodes, G. S.

    1994-01-01

    This report presents the results of Phase 1 research to develop a methodology for performing large-scale Multi-disciplinary Stochastic Optimization (MSO) for the design of aerospace systems ranging from aeropropulsion components to complete aircraft configurations. The current research recognizes that such design optimization problems are computationally expensive, and require the use of either massively parallel or multiple-processor computers. The methodology also recognizes that many operational and performance parameters are uncertain, and that uncertainty must be considered explicitly to achieve optimum performance and cost. The objective of this Phase 1 research was to initialize the development of an MSO methodology that is portable to a wide variety of hardware platforms, while achieving efficient, large-scale parallelism when multiple processors are available. The first effort in the project was a literature review of available computer hardware, as well as a review of portable, parallel programming environments. The second effort was to implement the MSO methodology for a problem using the portable parallel programming environment, Parallel Virtual Machine (PVM). The third and final effort was to demonstrate the example on a variety of computers, including a distributed-memory multiprocessor, a distributed-memory network of workstations, and a single-processor workstation. Results indicate the MSO methodology can be well-applied towards large-scale aerospace design problems. Nearly perfect linear speedup was demonstrated for computation of optimization sensitivity coefficients on both a 128-node distributed-memory multiprocessor (the Intel iPSC/860) and a network of workstations (speedups of almost 19 times achieved for 20 workstations). Very high parallel efficiencies (75 percent for 31 processors and 60 percent for 50 processors) were also achieved for computation of aerodynamic influence coefficients on the Intel. Finally, the multi-level parallelization strategy that will be needed for large-scale MSO problems was demonstrated to be highly efficient. The same parallel code instructions were used on both platforms, demonstrating portability. There are many applications for which MSO can be applied, including NASA's High-Speed-Civil Transport, and advanced propulsion systems. The use of MSO will reduce design and development time and testing costs dramatically.
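
    The quoted figures follow directly from the usual definition of parallel efficiency as speedup divided by processor count; a quick check:

    ```python
    # efficiency = speedup / processor count
    def efficiency(speedup, nprocs):
        return speedup / nprocs

    print(efficiency(19, 20))          # ~0.95 for the 20-workstation sensitivity-coefficient run
    print(0.75 * 31, 0.60 * 50)        # implied speedups (~23x and ~30x) on the Intel iPSC/860
    ```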

  5. Development and application of stir bar sorptive extraction with polyurethane foams for the determination of testosterone and methenolone in urine matrices.

    PubMed

    Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F

    2011-04-01

    This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)] followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linear dynamic ranges (>0.9963) and convenient detection limits (0.2-0.3 μg/L). When comparing the efficiency obtained by SBSE(PU) with the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained for the former, under the same experimental conditions. The application of the proposed methodology for the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.

  6. Adaptation of Mesoscale Weather Models to Local Forecasting

    NASA Technical Reports Server (NTRS)

    Manobianco, John T.; Taylor, Gregory E.; Case, Jonathan L.; Dianic, Allan V.; Wheeler, Mark W.; Zack, John W.; Nutter, Paul A.

    2003-01-01

    Methodologies have been developed for (1) configuring mesoscale numerical weather-prediction models for execution on high-performance computer workstations to make short-range weather forecasts for the vicinity of the Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) and (2) evaluating the performances of the models as configured. These methodologies have been implemented as part of a continuing effort to improve weather forecasting in support of operations of the U.S. space program. The models, methodologies, and results of the evaluations also have potential value for commercial users who could benefit from tailoring their operations and/or marketing strategies based on accurate predictions of local weather. More specifically, the purpose of developing the methodologies for configuring the models to run on computers at KSC and CCAFS is to provide accurate forecasts of winds, temperature, and such specific thunderstorm-related phenomena as lightning and precipitation. The purpose of developing the evaluation methodologies is to maximize the utility of the models by providing users with assessments of the capabilities and limitations of the models. The models used in this effort thus far include the Mesoscale Atmospheric Simulation System (MASS), the Regional Atmospheric Modeling System (RAMS), and the National Centers for Environmental Prediction Eta Model ( Eta for short). The configuration of the MASS and RAMS is designed to run the models at very high spatial resolution and incorporate local data to resolve fine-scale weather features. Model preprocessors were modified to incorporate surface, ship, buoy, and rawinsonde data as well as data from local wind towers, wind profilers, and conventional or Doppler radars. The overall evaluation of the MASS, Eta, and RAMS was designed to assess the utility of these mesoscale models for satisfying the weather-forecasting needs of the U.S. space program. The evaluation methodology includes objective and subjective verification methodologies. Objective (e.g., statistical) verification of point forecasts is a stringent measure of model performance, but when used alone, it is not usually sufficient for quantifying the value of the overall contribution of the model to the weather-forecasting process. This is especially true for mesoscale models with enhanced spatial and temporal resolution that may be capable of predicting meteorologically consistent, though not necessarily accurate, fine-scale weather phenomena. Therefore, subjective (phenomenological) evaluation, focusing on selected case studies and specific weather features, such as sea breezes and precipitation, has been performed to help quantify the added value that cannot be inferred solely from objective evaluation.

  7. Guidance on the Technology Performance Level (TPL) Assessment Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Jochem; Roberts, Jesse D.; Babarit, Aurelien

    This document presents the revised Technology Performance Level (TPL) assessment methodology. There are three parts to this revised methodology: 1) the Stakeholder Needs and Assessment Guidance (this document), 2) the Technical Submission form, 3) the TPL scoring spreadsheet. The TPL assessment is designed to give a technology neutral or agnostic assessment of any wave energy converter technology. The focus of the TPL is on the performance of the technology in meeting the customer’s needs. The original TPL is described in [1, 2] and those references also detail the critical differences in the nature of the TPL when compared to the more widely used technology readiness level (TRL). (Wave energy TRL is described in [3]). The revised TPL is particularly intended to be useful to investors and also to assist technology developers to conduct comprehensive assessments in a way that is meaningful and attractive to investors. The revised TPL assessment methodology has been derived through a structured Systems Engineering approach. This was a formal process which involved analyzing customer and stakeholder needs through the discipline of Systems Engineering. The results of the process confirmed the high level of completeness of the original methodology presented in [1] (as used in the Wave Energy Prize judging) and now add a significantly increased level of detail in the assessment and an improved more investment focused structure. The revised TPL also incorporates the feedback of the Wave Energy Prize judges.

  8. Nondestructive laboratory measurement of geotechnical and geoacoustic properties through intact core-liner

    USGS Publications Warehouse

    Kayen, R.E.; Edwards, B.D.; Lee, H.J.

    1999-01-01

    High-resolution automated measurement of the geotechnical and geoacoustic properties of soil at the U.S. Geological Survey (USGS) is performed with a state-of-the-art multi-sensor whole-core logging device. The device takes measurements, directly through the intact sample-tube wall, of p-wave acoustic velocity, soil wet bulk density, and magnetic susceptibility. This paper summarizes our methodology for determining soil sound speed and wet bulk density for material encased in an unsplit liner. Our methodology for nondestructive measurement allows for rapid, accurate, and high-resolution (1 cm-spaced) mapping of the mass physical properties of soil prior to sample extrusion.

  9. Determining radiated sound power of building structures by means of laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Roozen, N. B.; Labelle, L.; Rychtáriková, M.; Glorieux, C.

    2015-06-01

    This paper introduces a methodology that makes use of laser Doppler vibrometry to assess the acoustic insulation performance of a building element. The sound power radiated by the surface of the element is numerically determined from the vibrational pattern, offering an alternative for classical microphone measurements. Compared to the latter the proposed analysis is not sensitive to room acoustical effects. This allows the proposed methodology to be used at low frequencies, where the standardized microphone based approach suffers from a high uncertainty due to a low acoustic modal density. Standardized measurements as well as laser Doppler vibrometry measurements and computations have been performed on two test panels, a light-weight wall and a gypsum block wall and are compared and discussed in this paper. The proposed methodology offers an adequate solution for the assessment of the acoustic insulation of building elements at low frequencies. This is crucial in the framework of recent proposals of acoustic standards for measurement approaches and single number sound insulation performance ratings to take into account frequencies down to 50 Hz.
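
    A simplified way to see the vibration-to-sound-power step (the paper uses a numerical radiation computation; the expression below assumes a plate-averaged radiation efficiency, set to 1 here for illustration) is W = ρ c S σ ⟨v²⟩:

    ```python
    import numpy as np

    rho, c = 1.21, 343.0                   # air density (kg/m^3) and speed of sound (m/s)
    S = 1.25 * 1.50                        # panel area (m^2), hypothetical
    sigma = 1.0                            # assumed radiation efficiency
    rng = np.random.default_rng(1)
    v = 1e-3 * rng.rayleigh(1.0, size=500) # hypothetical velocity amplitudes over the scan grid (m/s)

    v_sq_mean = np.mean(v ** 2) / 2.0      # time-averaged mean-square velocity for harmonic amplitudes
    W = rho * c * S * sigma * v_sq_mean
    print(f"radiated power ~ {10 * np.log10(W / 1e-12):.1f} dB re 1 pW")
    ```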

  10. Performance and cost evaluation of health information systems using micro-costing and discrete-event simulation.

    PubMed

    Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent

    2018-06-01

    Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations considered the application of ICT as a crucial key to enhance health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated approach to HIS performance evaluation through the combination of formal modeling using the Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to conclude about the impact of HIS on the patient pathway. We demonstrated that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (through the amount of available information accessed during the consultation to formulate the diagnosis). The method also makes it possible to determine the most cost-effective ICT elements to improve the care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.
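
    A minimal discrete-event sketch of a consultation pathway is shown below; SimPy is an assumed tool (the paper does not prescribe one), and the durations, capacity, and HIS overhead are hypothetical:

    ```python
    import random
    import simpy

    def consultation(env, oncologist, with_his):
        with oncologist.request() as req:
            yield req
            base = random.uniform(15, 25)                      # minutes, hypothetical
            yield env.timeout(base + (5 if with_his else 0))   # HIS lengthens the consultation

    env = simpy.Environment()
    oncologist = simpy.Resource(env, capacity=1)
    for _ in range(20):
        env.process(consultation(env, oncologist, with_his=True))
    env.run()
    print(f"20 consultations finished at t = {env.now:.1f} min")
    ```

    Comparing scenarios (with_his True versus False) then mirrors the paper's use of simulation to weigh consultation length against oncologist occupation and cost.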

  11. High RAP mixes design methodology with balanced performance.

    DOT National Transportation Integrated Search

    2011-11-01

    "The use of reclaimed asphalt pavement (RAP) and recycled asphalt shingles (RAS) can significantly reduce the increasing cost of hot-mix asphalt paving, conserve energy, and protect the environment. This report presents a comprehensive study focusing...

  12. Methodology Developed for Modeling the Fatigue Crack Growth Behavior of Single-Crystal, Nickel-Base Superalloys

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Because of their superior high-temperature properties, gas generator turbine airfoils made of single-crystal, nickel-base superalloys are fast becoming the standard equipment on today's advanced, high-performance aerospace engines. The increased temperature capabilities of these airfoils has allowed for a significant increase in the operating temperatures in turbine sections, resulting in superior propulsion performance and greater efficiencies. However, the previously developed methodologies for life-prediction models are based on experience with polycrystalline alloys and may not be applicable to single-crystal alloys under certain operating conditions. One of the main areas where behavior differences between single-crystal and polycrystalline alloys are readily apparent is subcritical fatigue crack growth (FCG). The NASA Lewis Research Center's work in this area enables accurate prediction of the subcritical fatigue crack growth behavior in single-crystal, nickel-based superalloys at elevated temperatures.

  13. HPC Programming on Intel Many-Integrated-Core Hardware with MAGMA Port to Xeon Phi

    DOE PAGES

    Dongarra, Jack; Gates, Mark; Haidar, Azzam; ...

    2015-01-01

    This paper presents the design and implementation of several fundamental dense linear algebra (DLA) algorithms for multicore with Intel Xeon Phi coprocessors. In particular, we consider algorithms for solving linear systems. Further, we give an overview of the MAGMA MIC library, an open source, high performance library, that incorporates the developments presented here and, more broadly, provides the DLA functionality equivalent to that of the popular LAPACK library while targeting heterogeneous architectures that feature a mix of multicore CPUs and coprocessors. The LAPACK-compliance simplifies the use of the MAGMA MIC library in applications, while providing them with portably performant DLA. High performance is obtained through the use of the high-performance BLAS, hardware-specific tuning, and a hybridization methodology whereby we split the algorithm into computational tasks of various granularities. Execution of those tasks is properly scheduled over the heterogeneous hardware by minimizing data movements and mapping algorithmic requirements to the architectural strengths of the various heterogeneous hardware components. Our methodology and programming techniques are incorporated into the MAGMA MIC API, which abstracts the application developer from the specifics of the Xeon Phi architecture and is therefore applicable to algorithms beyond the scope of DLA.

  14. The Practice of Co-Creating Leadership in High- and Low-Performing High Schools

    ERIC Educational Resources Information Center

    Jarrett, Ehren; Wasonga, Teresa; Murphy, John

    2010-01-01

    Purpose: The purpose of this paper is to examine teacher perceptions of the practice of co-creating leadership and its potential impacts on student achievement. Design/methodology/approach: Using a quantitative approach, the study compared the levels of the practice of co-creating leadership dispositional values and institutional conditions that…

  15. Successful Principalship of High-Performance Schools in High-Poverty Communities

    ERIC Educational Resources Information Center

    Mulford, Bill; Kendall, Diana; Ewington, John; Edmunds, Bill; Kendall, Lawrie; Silins, Halia

    2008-01-01

    Purpose--The purpose of this article is to review literature in certain areas and report on related results from a study of successful school principalship in the Australian state of Tasmania. Design/methodology/approach--Surveys on successful school principalship were distributed to a population of 195 government schools (excluding colleges and…

  16. Onboard FPGA-based SAR processing for future spaceborne systems

    NASA Technical Reports Server (NTRS)

    Le, Charles; Chan, Samuel; Cheng, Frank; Fang, Winston; Fischman, Mark; Hensley, Scott; Johnson, Robert; Jourdan, Michael; Marina, Miguel; Parham, Bruce

    2004-01-01

    We present a real-time, high-performance, fault-tolerant FPGA-based hardware architecture for the processing of synthetic aperture radar (SAR) images in future spaceborne systems. In particular, we will discuss the integrated design approach, from top-level algorithm specifications and system requirements, design methodology, functional verification and performance validation, down to hardware design and implementation.
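    As a hedged illustration of one processing stage such an architecture must implement, the following sketch performs frequency-domain range compression (matched filtering against the transmitted chirp); the pulse parameters and synthetic echo are invented for the example.

```python
# Hedged sketch of one SAR processing stage only: frequency-domain range compression (matched
# filtering against the transmitted chirp). Pulse parameters and the synthetic echo are invented.
import numpy as np

fs, T, B = 100e6, 10e-6, 50e6                  # sample rate, pulse length, chirp bandwidth (assumed)
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t ** 2)  # reference linear-FM pulse

rx = np.zeros(4096, dtype=complex)             # one synthetic range line with two point targets
for delay in (500, 1500):
    rx[delay:delay + chirp.size] += chirp
rng = np.random.default_rng(0)
rx += 0.1 * (rng.standard_normal(rx.size) + 1j * rng.standard_normal(rx.size))

H = np.conj(np.fft.fft(chirp, rx.size))        # matched filter = conjugate spectrum of the chirp
compressed = np.fft.ifft(np.fft.fft(rx) * H)
print(sorted(np.argsort(np.abs(compressed))[-2:]))   # peaks near the injected delays (500, 1500)
```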

  17. Electropyroelectric technique: A methodology free of fitting procedures for thermal effusivity determination in liquids.

    PubMed

    Ivanov, R; Marin, E; Villa, J; Gonzalez, E; Rodríguez, C I; Olvera, J E

    2015-06-01

    This paper describes an alternative methodology to determine the thermal effusivity of a liquid sample using the recently proposed electropyroelectric technique, without fitting the experimental data with a theoretical model and without having to know the parameters related to the pyroelectric sensor, as in most previously reported approaches. The method is not absolute, because a reference liquid with known thermal properties is needed. Experiments have been performed that demonstrate the high reliability and accuracy of the method, with measurement uncertainties smaller than 3%.

  18. Focus control enhancement and on-product focus response analysis methodology

    NASA Astrophysics Data System (ADS)

    Kim, Young Ki; Chen, Yen-Jen; Hao, Xueli; Samudrala, Pavan; Gomez, Juan-Manuel; Mahoney, Mark O.; Kamalizadeh, Ferhad; Hanson, Justin K.; Lee, Shawn; Tian, Ye

    2016-03-01

    With decreasing CDOF (Critical Depth Of Focus) for 20/14nm technology and beyond, focus errors are becoming increasingly critical for on-product performance. Current on-product focus control techniques in high-volume manufacturing are limited; it is difficult to define measurable focus error and to optimize the on-product focus response with existing methods, owing to the lack of credible focus measurement methodologies. In addition to developments in scanner imaging and focus control capability and general tool stability maintenance, on-product focus control improvements are also required to meet on-product imaging specifications. In this paper, we discuss focus monitoring, wafer (edge) fingerprint correction and on-product focus budget analysis through a diffraction-based focus (DBF) measurement methodology. Several examples are presented showing better focus response and control on product wafers. Also, a method is discussed for an on-product focus interlock automation system in a high-volume manufacturing (HVM) environment.

  19. Possible Improvements of the ACE Diversity Interchange Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel V.; Zhou, Ning; Makarov, Yuri V.

    2010-07-26

    The North American Electric Reliability Corporation (NERC) grid is operated by about 131 balancing authorities (BAs). Within each BA, operators are responsible for managing the imbalance caused by both load and wind. As wind penetration levels increase, the challenge of managing power variation increases. Working independently, a balancing authority with limited regulating/load-following generation and high wind power penetration faces significant challenges. The benefits of BA cooperation and consolidation increase when there is significant wind energy penetration. To explore the benefits of BA cooperation, this paper investigates an ACE-sharing approach. A technology called ACE diversity interchange (ADI) is already in use in the Western Interconnection. A new methodology extending ADI is proposed in the paper. The proposed advanced ADI overcomes some limitations of conventional ADI. Simulations using real statistical data from CAISO and BPA have shown high performance of the proposed advanced ADI methodology.
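    A deliberately simplified sketch of the conventional ACE-sharing (ADI) idea, not of the advanced method proposed in the paper: momentary ACEs of opposite sign are netted and the relief is allocated pro-rata, so each BA's regulation burden shrinks while the summed interchange error is preserved.

```python
# Deliberately simplified sketch of conventional ACE sharing (not the paper's advanced ADI):
# ACEs of opposite sign are netted and the relief is allocated pro-rata, so each BA's
# regulation burden shrinks while the summed interchange error is preserved.
def adi_net(aces):
    pos = sum(a for a in aces if a > 0)
    neg = -sum(a for a in aces if a < 0)
    relief = min(pos, neg)                     # MW by which positive and negative ACEs cancel
    adjusted = []
    for a in aces:
        if a > 0:
            adjusted.append(a - relief * a / pos)
        elif a < 0:
            adjusted.append(a + relief * (-a) / neg)
        else:
            adjusted.append(a)
    return adjusted

print(adi_net([120.0, -80.0, 30.0, -10.0]))    # [48.0, 0.0, 12.0, 0.0]; the +60 MW total is unchanged
```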

  20. Analysis of torque transmitting behavior and wheel slip prevention control during regenerative braking for high speed EMU trains

    NASA Astrophysics Data System (ADS)

    Xu, Kun; Xu, Guo-Qing; Zheng, Chun-Hua

    2016-04-01

    The wheel-rail adhesion control for regenerative braking systems of high speed electric multiple unit trains is crucial to maintaining the stability, improving the adhesion utilization, and achieving deep energy recovery. There remain technical challenges mainly because of the nonlinear, uncertain, and varying features of wheel-rail contact conditions. This research analyzes the torque transmitting behavior during regenerative braking, and proposes a novel methodology to detect the wheel-rail adhesion stability. Then, applications to the wheel slip prevention during braking are investigated, and the optimal slip ratio control scheme is proposed, which is based on a novel optimal reference generation of the slip ratio and a robust sliding mode control. The proposed methodology achieves the optimal braking performance without the wheel-rail contact information. Numerical simulation results for uncertain slippery rails verify the effectiveness of the proposed methodology.
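    A minimal sketch, assuming an invented adhesion curve and axle parameters, of the kind of saturated sliding-mode slip regulation the paper builds on; it is not the authors' optimal reference generator or controller.

```python
# Minimal sketch, with an invented adhesion curve and axle parameters, of saturated
# sliding-mode slip regulation; not the paper's optimal reference generator or controller.
import numpy as np

def adhesion(slip):                            # assumed wheel-rail adhesion vs. slip (illustrative)
    return 0.35 * (1 - np.exp(-25 * slip)) - 0.1 * slip

def simulate(slip_ref=0.10, dt=1e-3, steps=4000):
    m, r, J = 4000.0, 0.46, 120.0              # axle mass share (kg), wheel radius (m), inertia (kg m^2)
    T0, K, phi = 5500.0, 3000.0, 0.02          # nominal brake torque, switching gain, boundary layer
    v, omega = 60.0, 60.0 / r                  # initial train speed (m/s) and wheel speed (rad/s)
    for _ in range(steps):
        slip = max(0.0, (v - omega * r) / max(v, 1e-3))
        s = slip - slip_ref                    # sliding surface
        T_brake = T0 - K * np.clip(s / phi, -1.0, 1.0)   # more torque when slip is below the target
        F = adhesion(slip) * m * 9.81          # braking force transmitted at the contact patch
        v += dt * (-F / m)
        omega = max(0.0, omega + dt * (F * r - T_brake) / J)
    return round(slip, 3), round(v, 1)

print(simulate())                              # slip settles near slip_ref while the train decelerates
```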

  1. Obtaining optic disc center and pixel region by automatic thresholding methods on morphologically processed fundus images.

    PubMed

    Marin, Diego; Gegundez-Arias, Manuel E; Suero, Angel; Bravo, Jose M

    2015-02-01

    Development of automatic retinal disease diagnosis systems based on retinal image computer analysis can provide remarkably quicker screening programs for early detection. Such systems are mainly focused on the detection of the earliest ophthalmic signs of illness and require previous identification of fundal landmark features such as optic disc (OD), fovea or blood vessels. A methodology for accurate center-position location and OD retinal region segmentation on digital fundus images is presented in this paper. The methodology performs a set of iterative opening-closing morphological operations on the original retinography intensity channel to produce a bright region-enhanced image. Taking blood vessel confluence at the OD into account, a 2-step automatic thresholding procedure is then applied to obtain a reduced region of interest, where the center and the OD pixel region are finally obtained by performing the circular Hough transform on a set of OD boundary candidates generated through the application of the Prewitt edge detector. The methodology was evaluated on 1200 and 1748 fundus images from the publicly available MESSIDOR and MESSIDOR-2 databases, acquired from diabetic patients and thus being clinical cases of interest within the framework of automated diagnosis of retinal diseases associated with diabetes mellitus. This methodology proved highly accurate in OD-center location: average Euclidean distance between the methodology-provided and actual OD-center position was 6.08, 9.22 and 9.72 pixels for retinas of 910, 1380 and 1455 pixels in size, respectively. On the other hand, OD segmentation evaluation was performed in terms of Jaccard and Dice coefficients, as well as the mean average distance between estimated and actual OD boundaries. Comparison with the results reported by other reviewed OD segmentation methodologies shows our proposal renders better overall performance. Its effectiveness and robustness make this proposed automated OD location and segmentation method a suitable tool to be integrated into a complete prescreening system for early diagnosis of retinal diseases.
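    A rough sketch of the pipeline stages named above using scikit-image; the structuring-element sizes, the Otsu stand-in for the paper's two-step thresholding, and the candidate radii are assumptions, not the published settings.

```python
# Rough sketch of the stages named above using scikit-image; structuring-element sizes, the
# Otsu stand-in for the paper's two-step thresholding, and the candidate radii are assumptions.
import numpy as np
from skimage import filters, morphology, transform

def locate_od(intensity, radii=np.arange(35, 60)):
    # 1) iterative opening/closing to enhance bright regions and suppress vessels
    enhanced = intensity
    for size in (5, 11, 17):
        selem = morphology.disk(size)
        enhanced = morphology.closing(morphology.opening(enhanced, selem), selem)
    # 2) two-step thresholding down to a reduced region of interest
    roi = enhanced > filters.threshold_otsu(enhanced)
    roi = enhanced > filters.threshold_otsu(enhanced[roi])
    # 3) Prewitt edges restricted to the ROI, then a circular Hough transform for the boundary
    edges = filters.prewitt(enhanced) * roi
    accumulators = transform.hough_circle(edges > 0.5 * edges.max(), radii)
    _, cx, cy, rad = transform.hough_circle_peaks(accumulators, radii, total_num_peaks=1)
    return (int(cx[0]), int(cy[0])), int(rad[0])   # OD centre (x, y) and estimated radius
```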

  2. Assessing the environmental performance of English arable and livestock holdings using data from the Farm Accountancy Data Network (FADN).

    PubMed

    Westbury, D B; Park, J R; Mauchline, A L; Crane, R T; Mortimer, S R

    2011-03-01

    Agri-environment schemes (AESs) have been implemented across EU member states in an attempt to reconcile agricultural production methods with protection of the environment and maintenance of the countryside. To determine the extent to which such policy objectives are being fulfilled, participating countries are obliged to monitor and evaluate the environmental, agricultural and socio-economic impacts of their AESs. However, few evaluations measure precise environmental outcomes and, critically, there are no agreed methodologies to evaluate the benefits of particular agri-environmental measures, or to track the environmental consequences of changing agricultural practices. In response to these issues, the Agri-Environmental Footprint project developed a common methodology for assessing the environmental impact of European AESs. The Agri-Environmental Footprint Index (AFI) is a farm-level, adaptable methodology that aggregates measurements of agri-environmental indicators based on Multi-Criteria Analysis (MCA) techniques. The method was developed specifically to allow assessment of differences in the environmental performance of farms according to participation in agri-environment schemes. The AFI methodology is constructed so that high values represent good environmental performance. This paper explores the use of the AFI methodology in combination with Farm Business Survey data collected in England for the Farm Accountancy Data Network (FADN), to test whether its use could be extended for the routine surveillance of environmental performance of farming systems using established data sources. Overall, the aim was to measure the environmental impact of three different types of agriculture (arable, lowland livestock and upland livestock) in England and to identify differences in AFI due to participation in agri-environment schemes. However, because farm size, farmer age, level of education and region are also likely to influence the environmental performance of a holding, these factors were also considered. Application of the methodology revealed that only arable holdings participating in agri-environment schemes showed greater environmental performance, although responses differed between regions. Of the other explanatory variables explored, the key factors determining the environmental performance for lowland livestock holdings were farm size, farmer age and level of education. In contrast, the AFI value of upland livestock holdings differed only between regions. The paper demonstrates that the AFI methodology can be used readily with English FADN data and therefore has the potential to be applied more widely to similar data sources routinely collected across the EU-27 in a standardised manner.
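    A hedged illustration of the aggregation step only: normalised indicator scores are combined with MCA-style weights so that higher AFI values represent better environmental performance. The indicator names, values and weights below are invented, not the project's.

```python
# Hedged illustration of the aggregation step only: normalised indicator scores are combined
# with MCA-style weights so that a higher AFI means better environmental performance.
# Indicator names, values and weights below are invented, not the project's.
def afi_score(indicators, weights):
    """indicators: values already normalised to [0, 1]; weights: relative importance by criterion."""
    total_w = sum(weights.values())
    return sum(weights[k] * indicators[k] for k in weights) / total_w

holding = {"nutrient_balance": 0.70, "pesticide_use": 0.55, "habitat_area": 0.40}
weights = {"nutrient_balance": 0.40, "pesticide_use": 0.35, "habitat_area": 0.25}
print(afi_score(holding, weights))             # approximately 0.57 on the 0-1 scale
```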

  3. A theoretical and experimental investigation of propeller performance methodologies

    NASA Technical Reports Server (NTRS)

    Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.

    1980-01-01

    This paper briefly covers aspects related to propeller performance by means of a review of propeller methodologies; presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.

  4. Verification of nonlinear dynamic structural test results by combined image processing and acoustic analysis

    NASA Astrophysics Data System (ADS)

    Tene, Yair; Tene, Noam; Tene, G.

    1993-08-01

    An interactive data fusion methodology of video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of the collapse of a heavy transportable bridge during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks and rupture of high performance structural materials. Videotape recording by a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.

  5. Integrated corridor management modeling results report : Dallas, Minneapolis, and San Diego.

    DOT National Transportation Integrated Search

    2012-02-01

    This executive summary documents the analysis methodologies, tools, and performance measures used to analyze Integrated Corridor Management (ICM) strategies; and presents high-level results for the successful implementation of ICM at three Stage 2 Pi...

  6. Sandia National Laboratories performance assessment methodology for long-term environmental programs : the history of nuclear waste management.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marietta, Melvin Gary; Anderson, D. Richard; Bonano, Evaristo J.

    2011-11-01

    Sandia National Laboratories (SNL) is the world leader in the development of the detailed science underpinning the application of a probabilistic risk assessment methodology, referred to in this report as performance assessment (PA), for (1) understanding and forecasting the long-term behavior of a radioactive waste disposal system, (2) estimating the ability of the disposal system and its various components to isolate the waste, (3) developing regulations, (4) implementing programs to estimate the safety that the system can afford to individuals and to the environment, and (5) demonstrating compliance with the attendant regulatory requirements. This report documents the evolution of the SNL PA methodology from inception in the mid-1970s, summarizing major SNL PA applications including: the Subseabed Disposal Project PAs for high-level radioactive waste; the Waste Isolation Pilot Plant PAs for disposal of defense transuranic waste; the Yucca Mountain Project total system PAs for deep geologic disposal of spent nuclear fuel and high-level radioactive waste; PAs for the Greater Confinement Borehole Disposal boreholes at the Nevada National Security Site; and PA evaluations for disposal of high-level wastes and Department of Energy spent nuclear fuels stored at Idaho National Laboratory. In addition, the report summarizes smaller PA programs for long-term cover systems implemented for the Monticello, Utah, mill-tailings repository; a PA for the SNL Mixed Waste Landfill in support of environmental restoration; PA support for radioactive waste management efforts in Egypt, Iraq, and Taiwan; and, most recently, PAs for analysis of alternative high-level radioactive waste disposal strategies, including deep borehole disposal and geologic repositories in shale and granite. Finally, this report summarizes the extension of the PA methodology for radioactive waste disposal toward development of an enhanced PA system for carbon sequestration and storage systems. These efforts have produced a generic PA methodology for the evaluation of waste management systems that has gained wide acceptance within the international community. This report documents how this methodology has been used as an effective management tool to evaluate different disposal designs and sites; inform development of regulatory requirements; identify, prioritize, and guide research aimed at reducing uncertainties for objective estimations of risk; and support safety assessments.

  7. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
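    A small sketch of the rank-ordering step described above, with invented concepts: candidates are sorted by benefit-to-cost ratio and the cumulative ratio is tracked in implementation order.

```python
# Small sketch of the rank-ordering step described above, using invented concepts: candidates
# are sorted by benefit-to-cost ratio and the cumulative ratio is tracked in implementation order.
concepts = [("datalink upgrade", 40.0, 10.0),  # (name, benefit, cost) in consistent units
            ("wingtip mod", 12.0, 6.0),
            ("new FMS", 90.0, 45.0)]

ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)
cum_benefit = cum_cost = 0.0
for name, benefit, cost in ranked:
    cum_benefit += benefit
    cum_cost += cost
    print(f"{name:16s} B/C={benefit / cost:4.2f}  cumulative B/C={cum_benefit / cum_cost:4.2f}")
```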

  8. Human region segmentation and description methods for domiciliary healthcare monitoring using chromatic methodology

    NASA Astrophysics Data System (ADS)

    Al-Temeemy, Ali A.

    2018-03-01

    A descriptor is proposed for use in domiciliary healthcare monitoring systems. The descriptor is produced from chromatic methodology to extract robust features from the monitoring system's images. It has superior discrimination capabilities, is robust to events that normally disturb monitoring systems, and requires less computational time and storage space to achieve recognition. A method of human region segmentation is also used with this descriptor. The performance of the proposed descriptor was evaluated using experimental data sets, obtained through a series of experiments performed in the Centre for Intelligent Monitoring Systems, University of Liverpool. The evaluation results show high recognition performance for the proposed descriptor in comparison to traditional descriptors, such as moment invariants. The results also show the effectiveness of the proposed segmentation method regarding distortion effects associated with domiciliary healthcare systems.

  9. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    DTIC Science & Technology

    2018-02-01

    29 during Soldier Equipment Configuration Impact on Performance: Establishing a Test Methodology for the...Performance of Medium Rucksack Prototypes An investigation: Comparison of live-fire and weapon simulator test methodologies and the of three extremity armor

  10. Performance Management in Healthcare Organizations: Concept and Practicum.

    PubMed

    Dimitropoulos, Panagiotis E

    2017-01-01

    Organizational performance can create and sustain competitive advantages for corporations and even improve their sustainability and future prospects. Health care organizations present a sector where performance management is structured by multiple dimensions. The scope of this study is to analyze the issue of performance management in healthcare organizations and specifically the implementation of the Balanced Scorecard (BSC) methodology in organizations providing health services. The study provides a discussion of the BSC development process, the steps that management has to take in order to prepare the implementation of the BSC, and finally a practical example of a scorecard with specific strategic goals and performance indicators. Managers of healthcare organizations, and specifically those providing services to the elderly and the general population, could use the propositions of the study as a roadmap for processing, analyzing, evaluating and implementing the balanced scorecard approach in their organizations' daily operations. BSC methodology can give an advantage in terms of enhanced stakeholder management and preservation within a highly volatile and competitive economic environment.

  11. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 2. Performance Tests.

    DOT National Transportation Integrated Search

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  12. One-to-One Computing and Student Achievement in Ohio High Schools

    ERIC Educational Resources Information Center

    Williams, Nancy L.; Larwin, Karen H.

    2016-01-01

    This study explores the impact of one-to-one computing on student achievement in Ohio high schools as measured by performance on the Ohio Graduation Test (OGT). The sample included 24 treatment schools that were individually paired with a similar control school. An interrupted time series methodology was deployed to examine OGT data over a period…

  13. Faculty Trust in the Principal: An Essential Ingredient in High-Performing Schools

    ERIC Educational Resources Information Center

    Tschannen-Moran, Megan; Gareis, Christopher R.

    2015-01-01

    Purpose: The purpose of this paper is to explore the relationships among faculty trust in the principal, principal leadership behaviors, school climate, and student achievement. Design/methodology/approach: Data from 64 elementary, middle, and high schools in two school districts formed the basis of the study (n = 3,215 teachers), allowing for…

  14. Journal impact factor and methodological quality of surgical randomized controlled trials: an empirical study.

    PubMed

    Ahmed Ali, Usama; Reiber, Beata M M; Ten Hove, Joren R; van der Sluis, Pieter C; Gooszen, Hein G; Boermeester, Marja A; Besselink, Marc G

    2017-11-01

    The journal impact factor (IF) is often used as a surrogate marker for methodological quality. The objective of this study is to evaluate the relation between the journal IF and methodological quality of surgical randomized controlled trials (RCTs). Surgical RCTs published in PubMed in 1999 and 2009 were identified. According to IF, RCTs were divided into groups of low (<2), median (2-3) and high IF (>3), as well as into top-10 vs all other journals. Methodological quality characteristics and factors concerning funding, ethical approval and statistical significance of outcomes were extracted and compared between the IF groups. Additionally, a multivariate regression was performed. The median IF was 2.2 (IQR 2.37). The percentage of 'low-risk of bias' RCTs was 13% for top-10 journals vs 4% for other journals in 1999 (P < 0.02), and 30 vs 12% in 2009 (P < 0.02). Similar results were observed for high vs low IF groups. The presence of sample-size calculation, adequate generation of allocation and intention-to-treat analysis were independently associated with publication in higher IF journals; as were multicentre trials and multiple authors. Publication of RCTs in high IF journals is associated with moderate improvement in methodological quality compared to RCTs published in lower IF journals. RCTs with adequate sample-size calculation, generation of allocation or intention-to-treat analysis were associated with publication in a high IF journal. On the other hand, reporting a statistically significant outcome and being industry funded were not independently associated with publication in a higher IF journal.

  15. Diagnostic methods for CW laser damage testing

    NASA Astrophysics Data System (ADS)

    Stewart, Alan F.; Shah, Rashmi S.

    2004-06-01

    High-performance optical coatings are an enabling technology for many applications - navigation systems, telecom, fusion, advanced measurement systems of many types as well as directed energy weapons. The results of recent testing of superior optical coatings conducted at high flux levels are presented. The diagnostics used in this type of nondestructive testing and the analysis of the data demonstrate the evolution of test methodology. Comparison of performance data under load to the predictions of thermal and optical models shows excellent agreement. These tests serve to anchor the models and validate the performance of the materials and coatings.

  16. Extended cooperative control synthesis

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Schmidt, David K.

    1994-01-01

    This paper reports on research for extending the Cooperative Control Synthesis methodology to include a more accurate modeling of the pilot's controller dynamics. Cooperative Control Synthesis (CCS) is a methodology that addresses the problem of how to design control laws for piloted, high-order, multivariate systems and/or non-conventional dynamic configurations in the absence of flying qualities specifications. This is accomplished by emphasizing the parallel structure inherent in any pilot-controlled, augmented vehicle. The original CCS methodology is extended to include the Modified Optimal Control Model (MOCM), which is based upon the optimal control model of the human operator developed by Kleinman, Baron, and Levison in 1970. This model provides a modeling of the pilot's compensation dynamics that is more accurate than the simplified pilot dynamic representation currently in the CCS methodology. Inclusion of the MOCM into the CCS also enables the modeling of pilot-observation perception thresholds and pilot-observation attention allocation effects. This Extended Cooperative Control Synthesis (ECCS) allows for the direct calculation of pilot and system open- and closed-loop transfer functions in pole/zero form and is readily implemented in current software capable of analysis and design for dynamic systems. Example results based upon synthesizing an augmentation control law for an acceleration command system in a compensatory tracking task using the ECCS are compared with a similar synthesis performed by using the original CCS methodology. The ECCS is shown to provide augmentation control laws that yield more favorable, predicted closed-loop flying qualities and tracking performance than those synthesized using the original CCS methodology.

  17. Building high-performance system for processing a daily large volume of Chinese satellites imagery

    NASA Astrophysics Data System (ADS)

    Deng, Huawu; Huang, Shicun; Wang, Qi; Pan, Zhiqiang; Xin, Yubin

    2014-10-01

    The number of Earth observation satellites from China has increased dramatically in recent years, and those satellites are acquiring a large volume of imagery daily. As the main portal for image processing and distribution from those Chinese satellites, the China Centre for Resources Satellite Data and Application (CRESDA) has been working with PCI Geomatics during the last three years to solve two issues in this regard: processing the large volume of data (about 1,500 scenes or 1 TB per day) in a timely manner and generating geometrically accurate orthorectified products. After three years of research and development, a high performance system has been built and successfully delivered. The high performance system has a service oriented architecture and can be deployed to a cluster of computers that may be configured with high end computing power. High performance is gained through, first, making image processing algorithms parallel by using high performance graphic processing unit (GPU) cards and multiple cores from multiple CPUs, and, second, distributing processing tasks to a cluster of computing nodes. While achieving up to thirty times (and even more) faster performance compared with traditional practice, a particular methodology was developed to improve the geometric accuracy of images acquired from Chinese satellites (including HJ-1 A/B, ZY-1-02C, ZY-3, GF-1, etc.). The methodology consists of fully automatic collection of dense ground control points (GCP) from various resources and then application of those points to improve the photogrammetric model of the images. The delivered system is up and running at CRESDA for pre-operational production and has been and is generating good return on investment by eliminating a great amount of manual labor and increasing daily data throughput more than tenfold with fewer operators. Future work, such as development of more performance-optimized algorithms, robust image matching methods and application workflows, is identified to improve the system in the coming years.

  18. Comparison of 250 MHz electron spin echo and continuous wave oxygen EPR imaging methods for in vivo applications

    PubMed Central

    Epel, Boris; Sundramoorthy, Subramanian V.; Barth, Eugene D.; Mailer, Colin; Halpern, Howard J.

    2011-01-01

    Purpose: The authors compare two electron paramagnetic resonance imaging modalities at 250 MHz to determine advantages and disadvantages of those modalities for in vivo oxygen imaging. Methods: Electron spin echo (ESE) and continuous wave (CW) methodologies were used to obtain three-dimensional images of a narrow linewidth, water soluble, nontoxic oxygen-sensitive trityl molecule OX063 in vitro and in vivo. The authors also examined sequential images obtained from the same animal injected intravenously with trityl spin probe to determine temporal stability of methodologies. Results: A study of phantoms with different oxygen concentrations revealed a threefold advantage of the ESE methodology in terms of reduced imaging time and more precise oxygen resolution for samples with less than 70 torr oxygen partial pressure. Above ~100 torr, CW performed better. The images produced by both methodologies showed pO2 distributions with similar mean values. However, ESE images demonstrated superior performance in low pO2 regions while missing voxels in high pO2 regions. Conclusions: ESE and CW have different areas of applicability. ESE is superior for hypoxia studies in tumors. PMID:21626937

  19. Systematic review of the methodological quality of controlled trials evaluating Chinese herbal medicine in patients with rheumatoid arthritis.

    PubMed

    Pan, Xin; Lopez-Olivo, Maria A; Song, Juhee; Pratt, Gregory; Suarez-Almazor, Maria E

    2017-03-01

    We appraised the methodological and reporting quality of randomised controlled clinical trials (RCTs) evaluating the efficacy and safety of Chinese herbal medicine (CHM) in patients with rheumatoid arthritis (RA). For this systematic review, electronic databases were searched from inception until June 2015. The search was limited to humans and non-case report studies, but was not limited by language, year of publication or type of publication. Two independent reviewers selected RCTs evaluating CHM in RA (herbals and decoctions). Descriptive statistics were used to report on risk of bias and adherence to reporting standards. Multivariable logistic regression analysis was performed to determine study characteristics associated with high or unclear risk of bias. Out of 2342 unique citations, we selected 119 RCTs including 18 919 patients: 10 108 patients received CHM alone and 6550 received one of 11 treatment combinations. A high risk of bias was observed across all domains: 21% had a high risk for selection bias (11% from sequence generation and 30% from allocation concealment), 85% for performance bias, 89% for detection bias, 4% for attrition bias and 40% for reporting bias. In multivariable analysis, fewer authors were associated with selection bias (allocation concealment), performance bias and attrition bias, and earlier year of publication and funding source not reported or disclosed were associated with selection bias (sequence generation). Studies published in a non-English language were associated with reporting bias. Poor adherence to recommended reporting standards (<60% of the studies not providing sufficient information) was observed in 11 of the 23 sections evaluated. Study quality and data extraction were performed by one reviewer and cross-checked by a second reviewer. Translation to English was performed by one reviewer in 85% of the included studies. Studies evaluating CHM often fail to meet expected methodological criteria, and high-quality evidence is lacking.

  20. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  1. Albendazole nanocrystals with improved pharmacokinetic performance in mice.

    PubMed

    Paredes, Alejandro J; Bruni, Sergio Sánchez; Allemandi, Daniel; Lanusse, Carlos; Palma, Santiago D

    2018-02-01

    Albendazole (ABZ) is a broad-spectrum antiparasitic agent with poor aqueous solubility, which leads to poor/erratic bioavailability and therapeutic failures. Here, we aimed to produce a novel formulation of ABZ nanocrystals (ABZNC) and assess its pharmacokinetic performance in mice. Results/methodology: ABZNC were prepared by high-pressure homogenization and spray-drying processes. Redispersion capacity and solid yield were measured in order to obtain an optimized product. The final particle size was 415.69±7.40 nm and the solid yield was 72.32%. The pharmacokinetic parameters obtained in a mice model for ABZNC were enhanced (p < 0.05) with respect to the control formulation. ABZNC with improved pharmacokinetic behavior were produced by a simple, inexpensive and potentially scalable methodology.

  2. The Pennsylvania Trauma Outcomes Study Risk-Adjusted Mortality Model: Results of a Statewide Benchmarking Program

    PubMed Central

    WIEBE, DOUGLAS J.; HOLENA, DANIEL N.; DELGADO, M. KIT; McWILLIAMS, NATHAN; ALTENBURG, JULIET; CARR, BRENDAN G.

    2018-01-01

    Trauma centers need objective feedback on performance to inform quality improvement efforts. The Trauma Quality Improvement Program recently published recommended methodology for case mix adjustment and benchmarking performance. We tested the feasibility of applying this methodology to develop risk-adjusted mortality models for a statewide trauma system. We performed a retrospective cohort study of patients ≥16 years old at Pennsylvania trauma centers from 2011 to 2013 (n = 100,278). Our main outcome measure was observed-to-expected mortality ratios (overall and within blunt, penetrating, multisystem, isolated head, and geriatric subgroups). Patient demographic variables, physiology, mechanism of injury, transfer status, injury severity, and pre-existing conditions were included as predictor variables. The statistical model had excellent discrimination (area under the curve = 0.94). Funnel plots of observed-to-expected ratios identified five centers with lower than expected mortality and two centers with higher than expected mortality. No centers were outliers for management of penetrating trauma, but five centers had lower and three had higher than expected mortality for blunt trauma. It is feasible to use Trauma Quality Improvement Program methodology to develop risk-adjusted models for statewide trauma systems. Even with smaller numbers of trauma centers than are available in national datasets, it is possible to identify high and low outliers in performance. PMID:28541852

  3. The Pennsylvania Trauma Outcomes Study Risk-Adjusted Mortality Model: Results of a Statewide Benchmarking Program.

    PubMed

    Wiebe, Douglas J; Holena, Daniel N; Delgado, M Kit; McWilliams, Nathan; Altenburg, Juliet; Carr, Brendan G

    2017-05-01

    Trauma centers need objective feedback on performance to inform quality improvement efforts. The Trauma Quality Improvement Program recently published recommended methodology for case mix adjustment and benchmarking performance. We tested the feasibility of applying this methodology to develop risk-adjusted mortality models for a statewide trauma system. We performed a retrospective cohort study of patients ≥16 years old at Pennsylvania trauma centers from 2011 to 2013 (n = 100,278). Our main outcome measure was observed-to-expected mortality ratios (overall and within blunt, penetrating, multisystem, isolated head, and geriatric subgroups). Patient demographic variables, physiology, mechanism of injury, transfer status, injury severity, and pre-existing conditions were included as predictor variables. The statistical model had excellent discrimination (area under the curve = 0.94). Funnel plots of observed-to-expected ratios identified five centers with lower than expected mortality and two centers with higher than expected mortality. No centers were outliers for management of penetrating trauma, but five centers had lower and three had higher than expected mortality for blunt trauma. It is feasible to use Trauma Quality Improvement Program methodology to develop risk-adjusted models for statewide trauma systems. Even with smaller numbers of trauma centers than are available in national datasets, it is possible to identify high and low outliers in performance.
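    A sketch of the benchmarking arithmetic only (the published model's covariates and coefficients are not reproduced): expected deaths are the sum of model-predicted risks, and a centre's observed-to-expected ratio is compared against approximate funnel-plot limits. The patient risks below are simulated.

```python
# Sketch of the benchmarking arithmetic only (the published model's covariates and coefficients
# are not reproduced): expected deaths are the sum of model-predicted risks and a centre's O/E
# ratio is compared against approximate funnel-plot limits. Patient risks here are simulated.
import numpy as np

def oe_ratio(observed_deaths, predicted_risks):
    expected = float(np.sum(predicted_risks))
    return observed_deaths / expected, expected

def funnel_limits(expected, z=1.96):
    # normal approximation to the Poisson variation of the observed count around 'expected'
    half_width = z / np.sqrt(expected)
    return 1.0 - half_width, 1.0 + half_width

risks = np.random.default_rng(0).uniform(0.01, 0.30, size=800)   # hypothetical per-patient risks
oe, expected = oe_ratio(observed_deaths=95, predicted_risks=risks)
low, high = funnel_limits(expected)
print(f"O/E = {oe:.2f}, 95% limits ({low:.2f}, {high:.2f})")     # below 'low' flags low mortality
```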

  4. An Approach for Performance Based Glove Mobility Requirements

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay; Benson, Elizabeth; England, Scott

    2016-01-01

    The Space Suit Assembly (SSA) Development Team at NASA Johnson Space Center has invested heavily in the advancement of rear-entry planetary exploration suit design but largely deferred development of extravehicular activity (EVA) glove designs, and accepted the risk of using the current flight gloves, Phase VI, for exploration missions. However, as design reference missions mature, the risks of using heritage hardware have highlighted the need for developing robust new glove technologies. To address the technology gap, the NASA Space Technology Mission Directorate's Game-Changing Development Program provided start-up funding for the High Performance EVA Glove (HPEG) Element as part of the Next Generation Life Support (NGLS) Project in the fall of 2013. The overarching goal of the HPEG Element is to develop a robust glove design that increases human performance during EVA and creates a pathway for implementation of emergent technologies, with specific aims of increasing pressurized mobility to 60% of barehanded capability, increasing durability in non-pristine environments, and decreasing the potential of gloves to cause injury during use. The HPEG Element focused initial efforts on developing quantifiable and repeatable methodologies for assessing glove performance with respect to mobility, injury potential, thermal conductivity, and abrasion resistance. The team used these methodologies to establish requirements against which emerging technologies and glove designs can be assessed at both the component and assembly levels. The mobility performance testing methodology was an early focus for the HPEG team as it stems from collaborations between the SSA Development team and the JSC Anthropometry and Biomechanics Facility (ABF) that began investigating new methods for suited mobility and fit early in the Constellation Program. The combined HPEG and ABF team used lessons learned from the previous efforts as well as additional reviews of methodologies in physical and occupational therapy arenas to develop a protocol that assesses gloved range of motion, strength, dexterity, tactility, and fit in comparative quantitative terms and also provides qualitative insight to direct hardware design iterations. The protocol was evaluated using five experienced test subjects wearing the EMU pressurized to 4.3 psid with three different glove configurations. The results of the testing are presented to illustrate where the protocol is and is not valid for benchmark comparisons. The process for requirements development based upon the results is also presented along with suggested performance values for the High Performance EVA Gloves currently in development.

  5. An Approach for Performance Based Glove Mobility Requirements

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay; Benson, Elizabeth; England, Scott

    2015-01-01

    The Space Suit Assembly (SSA) Development Team at NASA Johnson Space Center has invested heavily in the advancement of rear-entry planetary exploration suit design but largely deferred development of extravehicular activity (EVA) glove designs, and accepted the risk of using the current flight gloves, Phase VI, for exploration missions. However, as design reference missions mature, the risks of using heritage hardware have highlighted the need for developing robust new glove technologies. To address the technology gap, the NASA Space Technology Mission Directorate's Game-Changing Development Program provided start-up funding for the High Performance EVA Glove (HPEG) Element as part of the Next Generation Life Support (NGLS) Project in the fall of 2013. The overarching goal of the HPEG Element is to develop a robust glove design that increases human performance during EVA and creates a pathway for implementation of emergent technologies, with specific aims of increasing pressurized mobility to 60% of barehanded capability, increasing durability in non-pristine environments, and decreasing the potential of gloves to cause injury during use. The HPEG Element focused initial efforts on developing quantifiable and repeatable methodologies for assessing glove performance with respect to mobility, injury potential, thermal conductivity, and abrasion resistance. The team used these methodologies to establish requirements against which emerging technologies and glove designs can be assessed at both the component and assembly levels. The mobility performance testing methodology was an early focus for the HPEG team as it stems from collaborations between the SSA Development team and the JSC Anthropometry and Biomechanics Facility (ABF) that began investigating new methods for suited mobility and fit early in the Constellation Program. The combined HPEG and ABF team used lessons learned from the previous efforts as well as additional reviews of methodologies in physical and occupational therapy arenas to develop a protocol that assesses gloved range of motion, strength, dexterity, tactility, and fit in comparative quantitative terms and also provides qualitative insight to direct hardware design iterations. The protocol was evaluated using five experienced test subjects wearing the EMU pressurized to 4.3 psid with three different glove configurations. The results of the testing are presented to illustrate where the protocol is and is not valid for benchmark comparisons. The process for requirements development based upon the results is also presented along with suggested performance values for the High Performance EVA Gloves to be procured in fiscal year 2015.

  6. "Assessing the methodological quality of systematic reviews in radiation oncology: A systematic review".

    PubMed

    Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen

    2017-10-01

    The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria, and quality was assessed using the critical appraisal tool AMSTAR. Regression analyses were performed to determine factors associated with a higher quality score. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality, while systematic reviews were found to be of less than fair quality. Factors associated with higher quality scores in the multivariable analysis were including primary studies consisting of randomized control trials, performing a meta-analysis, and applying a recommended guideline related to establishing a systematic review protocol and/or reporting. Based on AMSTAR, systematic reviews and meta-analyses may carry a high risk of bias if used to inform decision-making. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine, and that researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review.

  7. A cost-effective methodology for the design of massively-parallel VLSI functional units

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Sriram, G.; Desouza, J.

    1993-01-01

    In this paper we propose a generalized methodology for the design of cost-effective massively-parallel VLSI Functional Units. This methodology is based on a technique of generating and reducing a massive bit-array on the mask-programmable PAcube VLSI array. This methodology unifies (maintains identical data flow and control) the execution of complex arithmetic functions on PAcube arrays. It is highly regular, expandable and uniform with respect to problem-size and wordlength, thereby reducing the communication complexity. The memory-functional unit interface is regular and expandable. Using this technique functional units of dedicated processors can be mask-programmed on the naked PAcube arrays, reducing the turn-around time. The production cost of such dedicated processors can be drastically reduced since the naked PAcube arrays can be mass-produced. Analysis of the the performance of functional units designed by our method yields promising results.

  8. Brain Dynamics: Methodological Issues and Applications in Psychiatric and Neurologic Diseases

    NASA Astrophysics Data System (ADS)

    Pezard, Laurent

    The human brain is a complex dynamical system generating the EEG signal. Numerical methods developed to study complex physical dynamics have been used to characterize the EEG since the mid-eighties. This endeavor raised several issues related to the specificity of the EEG. Firstly, theoretical and methodological studies should address the major differences between the dynamics of the human brain and physical systems. Secondly, this approach to the EEG signal should prove relevant for dealing with physiological or clinical problems. A set of studies performed in our group is presented here within the context of these two problematic aspects. After the discussion of methodological drawbacks, we review numerical simulations related to the high dimension and spatial extension of brain dynamics. Experimental studies in neurologic and psychiatric diseases are then presented. We conclude that although it is now clear that brain dynamics change in relation to clinical situations, methodological problems remain largely unsolved.

  9. Fuzzy logic modeling of high performance rechargeable batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, P.; Fennie, C. Jr.; Reisner, D.E.

    1998-07-01

    Accurate battery state-of-charge (SOC) measurements are critical in many portable electronic device applications. Yet conventional techniques for battery SOC estimation are limited in their accuracy, reliability, and flexibility. In this paper the authors present a powerful new approach to estimate battery SOC using a fuzzy logic-based methodology. This approach provides a universally applicable, accurate method for battery SOC estimation either integrated within, or as an external monitor to, an electronic device. The methodology is demonstrated in modeling impedance measurements on Ni-MH cells and discharge voltage curves of Li-ion cells.
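    A toy illustration of the fuzzy-logic idea with a single invented impedance-derived feature, triangular membership functions and centroid defuzzification; the paper's rule base is built from measured impedance and discharge-voltage data, not from these numbers.

```python
# Toy illustration of the fuzzy-logic idea: one invented impedance-derived feature, triangular
# membership functions and centroid defuzzification. The paper's rule base is built from
# measured impedance and voltage data, not from these numbers.
def tri(x, a, b, c):
    """Triangular membership function rising from a to b and falling from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def estimate_soc(feature):
    # rule strengths: how strongly the feature looks "high", "medium" or "low" impedance
    rules = {20.0: tri(feature, 0.6, 1.0, 1.4),    # consequent SOC 20% for high impedance
             55.0: tri(feature, 0.3, 0.7, 1.1),    # consequent SOC 55% for medium impedance
             90.0: tri(feature, 0.0, 0.4, 0.8)}    # consequent SOC 90% for low impedance
    weight_sum = sum(rules.values())
    return sum(soc * w for soc, w in rules.items()) / weight_sum if weight_sum else None

print(estimate_soc(0.5))                           # 76.0 (% SOC) for this mid-range feature value
```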

  10. From Fault-Diagnosis and Performance Recovery of a Controlled System to Chaotic Secure Communication

    NASA Astrophysics Data System (ADS)

    Hsu, Wen-Teng; Tsai, Jason Sheng-Hong; Guo, Fang-Cheng; Guo, Shu-Mei; Shieh, Leang-San

    Chaotic systems are often applied to encryption on secure communication, but they may not provide high-degree security. In order to improve the security of communication, chaotic systems may need to add other secure signals, but this may cause the system to diverge. In this paper, we redesign a communication scheme that could create secure communication with additional secure signals, and the proposed scheme could keep system convergence. First, we introduce the universal state-space adaptive observer-based fault diagnosis/estimator and the high-performance tracker for the sampled-data linear time-varying system with unanticipated decay factors in actuators/system states. Besides, robustness, convergence in the mean, and tracking ability are given in this paper. A residual generation scheme and a mechanism for auto-tuning switched gain is also presented, so that the introduced methodology is applicable for the fault detection and diagnosis (FDD) for actuator and state faults to yield a high tracking performance recovery. The evolutionary programming-based adaptive observer is then applied to the problem of secure communication. Whenever the tracker induces a large control input which might not conform to the input constraint of some physical systems, the proposed modified linear quadratic optimal tracker (LQT) can effectively restrict the control input within the specified constraint interval, under the acceptable tracking performance. The effectiveness of the proposed design methodology is illustrated through tracking control simulation examples.

  11. Passenger rail vehicle safety assessment methodology. Volume I, Summary of safe performance limits.

    DOT National Transportation Integrated Search

    2000-04-01

    This report presents a methodology based on computer simulation that assesses the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical design parameters and characteristic properties of bo...

  12. A Negative Selection Immune System Inspired Methodology for Fault Diagnosis of Wind Turbines.

    PubMed

    Alizadeh, Esmaeil; Meskin, Nader; Khorasani, Khashayar

    2017-11-01

    High operational and maintenance costs represent major economic constraints in the wind turbine (WT) industry. These concerns have made investigation into fault diagnosis of WT systems an extremely important and active area of research. In this paper, an immune system (IS) inspired methodology for performing fault detection and isolation (FDI) of a WT system is proposed and developed. The proposed scheme is based on the self/nonself discrimination paradigm of a biological IS. Specifically, the negative selection mechanism [negative selection algorithm (NSA)] of the human body is utilized. In this paper, a hierarchical bank of NSAs is designed to detect and isolate both individual and simultaneously occurring faults common to WTs. A smoothing moving window filter is then utilized to further improve the reliability and performance of the FDI scheme. Moreover, the performance of our proposed scheme is compared with another state-of-the-art data-driven technique, namely support vector machines (SVMs), to demonstrate and illustrate the superiority and advantages of our proposed NSA-based FDI scheme. Finally, a nonparametric statistical comparison test is implemented to evaluate our proposed methodology against the SVM under various fault severities.
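    A minimal real-valued negative-selection sketch: random detectors are kept only if they do not cover healthy ("self") operating data, and a new sample is flagged as faulty when it falls within a detector's radius. Feature ranges, radii and the data below are invented.

```python
# Minimal real-valued negative-selection sketch: random detectors are kept only if they do not
# cover healthy ("self") data, and a new sample is flagged faulty when it falls inside a
# detector's radius. Feature ranges, radii and the data are invented.
import numpy as np

rng = np.random.default_rng(2)

def train_detectors(self_data, n_detectors=300, self_radius=0.15):
    detectors = []
    while len(detectors) < n_detectors:
        candidate = rng.uniform(0.0, 1.0, size=self_data.shape[1])
        if np.min(np.linalg.norm(self_data - candidate, axis=1)) > self_radius:
            detectors.append(candidate)        # candidate lies outside the self region, keep it
    return np.array(detectors)

def is_faulty(sample, detectors, detect_radius=0.10):
    return bool(np.min(np.linalg.norm(detectors - sample, axis=1)) <= detect_radius)

healthy = rng.uniform(0.4, 0.6, size=(500, 2))          # normalised features from healthy operation
detectors = train_detectors(healthy)
print(is_faulty(np.array([0.50, 0.52]), detectors))     # expected False: nominal operating point
print(is_faulty(np.array([0.90, 0.10]), detectors))     # expected True: far from the self region
```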

  13. CFD-RANS prediction of individual exposure from continuous release of hazardous airborne materials in complex urban environments

    NASA Astrophysics Data System (ADS)

    Efthimiou, G. C.; Andronopoulos, S.; Bartzis, J. G.; Berbekar, E.; Harms, F.; Leitl, B.

    2017-02-01

    One of the key issues of recent research on the dispersion inside complex urban environments is the ability to predict individual exposure (maximum dosages) of an airborne material which is released continuously from a point source. The present work addresses the question whether the computational fluid dynamics (CFD)-Reynolds-averaged Navier-Stokes (RANS) methodology can be used to predict individual exposure for various exposure times. This is feasible by providing the two RANS concentration moments (mean and variance) and a turbulent time scale to a deterministic model. The whole effort is focused on the prediction of individual exposure inside a complex real urban area. The capabilities of the proposed methodology are validated against wind-tunnel data (CUTE experiment). The present simulations were performed 'blindly', i.e. the modeller had limited information for the inlet boundary conditions and the results were kept unknown until the end of the COST Action ES1006. Thus, a high uncertainty of the results was expected. The general performance of the methodology due to this 'blind' strategy is good. The validation metrics fulfil the acceptance criteria. The effect of the grid and the turbulence model on the model performance is examined.

  14. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties, obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated with results from two probabilistic studies of different types of concrete structures related to practical applications and made from various materials (with parameters obtained from real material tests).
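    A bare-bones illustration of the probabilistic format: sampled material properties are propagated through a stand-in closed-form resistance model to give a failure probability and reliability index; a real study would randomise the nonlinear finite element model instead, and all distributions below are assumed.

```python
# Bare-bones illustration of the probabilistic format: sampled material properties go through a
# stand-in closed-form resistance model, giving a failure probability and reliability index. A
# real study randomises the nonlinear finite element model; all distributions here are assumed.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 200_000

f_t = rng.lognormal(mean=np.log(4.0), sigma=0.10, size=n)   # fibre-concrete tensile strength (MPa)
fibre_factor = rng.normal(1.0, 0.05, size=n)                # fibre efficiency factor (assumed)
resistance = 120.0 * f_t * np.clip(fibre_factor, 0.5, None) # member resistance (kN), stand-in model
load = rng.gumbel(loc=300.0, scale=30.0, size=n)            # annual maximum load effect (kN), assumed

p_f = np.mean(resistance - load < 0.0)                      # failure probability from the limit state
beta = -norm.ppf(p_f)                                       # corresponding reliability index
print(f"P_f = {p_f:.2e}, beta = {beta:.2f}")
```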

  15. A frontier analysis approach for benchmarking hospital performance in the treatment of acute myocardial infarction.

    PubMed

    Stanford, Robert E

    2004-05-01

    This paper uses a non-parametric frontier model and adaptations of the concepts of cross-efficiency and peer-appraisal to develop a formal methodology for benchmarking provider performance in the treatment of Acute Myocardial Infarction (AMI). Parameters used in the benchmarking process are the rates of proper recognition of indications of six standard treatment processes for AMI; the decision making units (DMUs) to be compared are the Medicare eligible hospitals of a particular state; the analysis produces an ordinal ranking of individual hospital performance scores. The cross-efficiency/peer-appraisal calculation process is constructed to accommodate DMUs that experience no patients in some of the treatment categories. While continuing to rate highly the performances of DMUs which are efficient in the Pareto-optimal sense, our model produces individual DMU performance scores that correlate significantly with good overall performance, as determined by a comparison of the sums of the individual DMU recognition rates for the six standard treatment processes. The methodology is applied to data collected from 107 state Medicare hospitals.

  16. Sustaining School Improvement in a High-Need School: Longitudinal Analysis of Robbins Elementary School (USA) from 1993 to 2015

    ERIC Educational Resources Information Center

    Okilwa, Nathern; Barnett, Bruce

    2017-01-01

    Purpose: The purpose of this paper is to examine how Robbins ES has sustained high academic performance over almost 20 years despite several changes in principals. Design/methodology/approach: The paper analyzed longitudinal data based on: state-level academic and demographic data; two earlier studies of the school; and recent interviews with…

  17. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    NASA Astrophysics Data System (ADS)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

    Standalone photovoltaic (SAPV) systems are seen as a promising method of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple and reliable and exhibits good performance over the system's lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at different array sizes (areas), performance curves are obtained for the optimal design of the SAPV system with a high degree of reliability in terms of autonomy at a specified value of loss of load probability (LOLP). Based on the array to load ratio (ALR) and levelized energy cost (LEC) through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with conventional design using monthly average daily load and insolation.
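
    A toy sketch of the sizing logic described, under simple assumed inputs (daily insolation and load series, panel efficiency, battery size, cost figures): sweep the array area, simulate a daily energy balance to estimate LOLP, and compute a crude levelized energy cost from a life cycle cost. None of the coefficients come from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    days = 365
    insolation = np.clip(rng.normal(5.0, 1.5, days), 0.5, None)   # kWh/m2/day
    load = np.full(days, 4.0)                                      # kWh/day

    def simulate(area_m2, eff=0.15, batt_kwh=8.0):
        soc, unmet_days = batt_kwh, 0
        for g, d in zip(insolation * area_m2 * eff, load):
            soc = min(batt_kwh, soc + g) - d
            if soc < 0:
                unmet_days += 1
                soc = 0.0
        return unmet_days / days                                   # LOLP

    def lec(area_m2, capex_per_m2=250.0, batt_cost=2000.0, years=20, om=0.02):
        lcc = (area_m2 * capex_per_m2 + batt_cost) * (1 + om * years)  # crude LCC
        return lcc / (load.sum() * years)                              # $/kWh

    for area in (4, 6, 8, 10):
        print(area, round(simulate(area), 3), round(lec(area), 3))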

  18. Methodologic Guide for Evaluating Clinical Performance and Effect of Artificial Intelligence Technology for Medical Diagnosis and Prediction.

    PubMed

    Park, Seong Ho; Han, Kyunghwa

    2018-03-01

    The use of artificial intelligence in medicine is currently an issue of great interest, especially with regard to the diagnostic or predictive analysis of medical images. Adoption of an artificial intelligence tool in clinical practice requires careful confirmation of its clinical utility. Herein, the authors explain key methodology points involved in a clinical evaluation of artificial intelligence technology for use in medicine, especially high-dimensional or overparameterized diagnostic or predictive models in which artificial deep neural networks are used, mainly from the standpoints of clinical epidemiology and biostatistics. First, statistical methods for assessing the discrimination and calibration performances of a diagnostic or predictive model are summarized. Next, the effects of disease manifestation spectrum and disease prevalence on the performance results are explained, followed by a discussion of the difference between evaluating the performance with use of internal and external datasets, the importance of using an adequate external dataset obtained from a well-defined clinical cohort to avoid overestimating the clinical performance as a result of overfitting in high-dimensional or overparameterized classification model and spectrum bias, and the essentials for achieving a more robust clinical evaluation. Finally, the authors review the role of clinical trials and observational outcome studies for ultimate clinical verification of diagnostic or predictive artificial intelligence tools through patient outcomes, beyond performance metrics, and how to design such studies. © RSNA, 2018.
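
    For the discrimination and calibration checks summarized in the guide, a small illustration with synthetic data is sketched below: area under the ROC curve for discrimination, and the intercept and slope of a logistic recalibration fit as simple calibration summaries (one of several possible calibration assessments, not a prescription from the article).

    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    y_true = rng.binomial(1, 0.3, size=1000)
    # fake predicted probabilities: better than chance but somewhat miscalibrated
    p_hat = np.clip(0.3 + 0.25 * (y_true - 0.3) + rng.normal(0, 0.15, 1000),
                    0.01, 0.99)

    auc = roc_auc_score(y_true, p_hat)                    # discrimination

    # calibration intercept and slope: regress the outcome on logit(p_hat)
    logit = np.log(p_hat / (1 - p_hat)).reshape(-1, 1)
    cal = LogisticRegression().fit(logit, y_true)
    print(f"AUC={auc:.3f}, intercept={cal.intercept_[0]:.2f}, "
          f"slope={cal.coef_[0][0]:.2f}")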

  19. Massive parallelization of serial inference algorithms for a complex generalized linear model

    PubMed Central

    Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David

    2014-01-01

    Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units, relatively inexpensive highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363
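
    A toy, serial sketch of the cyclic coordinate descent core is shown below (here for an L2-penalized logistic regression with one Newton step per coordinate); the paper's contribution is the massive GPU parallelization of such updates for a conditioned GLM at far larger scale, which this sketch does not attempt.

    import numpy as np

    def ccd_logistic(X, y, lam=1.0, sweeps=50):
        n, p = X.shape
        beta = np.zeros(p)
        eta = np.zeros(n)                       # X @ beta, maintained incrementally
        for _ in range(sweeps):
            for j in range(p):
                mu = 1.0 / (1.0 + np.exp(-eta))
                g = X[:, j] @ (mu - y) + lam * beta[j]          # gradient wrt beta_j
                h = X[:, j]**2 @ (mu * (1 - mu)) + lam          # curvature wrt beta_j
                delta = -g / h                                   # one Newton step
                beta[j] += delta
                eta += delta * X[:, j]          # cheap update of the linear predictor
        return beta

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20))
    true_beta = np.concatenate([np.ones(5), np.zeros(15)])
    y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))
    print(ccd_logistic(X, y, lam=0.5).round(2))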

  20. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
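
    The block Gauss-Seidel fixed point mentioned here can be illustrated on a toy pair of coupled linear "fluid" and "structure" systems; this is a schematic only, while the paper applies the scheme to the full nonlinear aeroelastic and sensitivity equations.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 20
    K = np.eye(n) * 4 + rng.normal(0, 0.1, (n, n))   # "structure" operator
    A = np.eye(n) * 4 + rng.normal(0, 0.1, (n, n))   # "fluid" operator
    C = rng.normal(0, 0.2, (n, n))                   # load induced by fluid state
    D = rng.normal(0, 0.2, (n, n))                   # boundary change from displacement
    f, b = rng.normal(size=n), rng.normal(size=n)

    u = np.zeros(n)                                   # structural displacement
    w = np.zeros(n)                                   # fluid state
    for it in range(100):
        w_new = np.linalg.solve(A, b + D @ u)         # fluid solve with current u
        u_new = np.linalg.solve(K, f + C @ w_new)     # structure solve with new w
        res = max(np.max(np.abs(w_new - w)), np.max(np.abs(u_new - u)))
        u, w = u_new, w_new
        if res < 1e-10:
            break
    print("converged in", it + 1, "iterations")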

  1. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations

    USDA-ARS?s Scientific Manuscript database

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  2. 75 FR 53586 - Bifenazate; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... characterized and were seen at dose(s) that produce evidence of overt systemic toxicity. These effects included... system, and these findings may be due to secondary effect of overt systemic toxicity. Further, there is... Adequate enforcement methodology is available to enforce the tolerance expression. High-performance liquid...

  3. High-Order Moving Overlapping Grid Methodology in a Spectral Element Method

    NASA Astrophysics Data System (ADS)

    Merrill, Brandon E.

    A moving overlapping mesh methodology that achieves spectral accuracy in space and up to second-order accuracy in time is developed for solution of unsteady incompressible flow equations in three-dimensional domains. The targeted applications are in aerospace and mechanical engineering domains and involve problems in turbomachinery, rotary aircraft, wind turbines and others. The methodology is built within the dual-session communication framework initially developed for stationary overlapping meshes. The methodology employs semi-implicit spectral element discretization of equations in each subdomain and explicit treatment of subdomain interfaces with spectrally-accurate spatial interpolation and high-order accurate temporal extrapolation, and requires few, if any, iterations, yet maintains the global accuracy and stability of the underlying flow solver. Mesh movement is enabled through the Arbitrary Lagrangian-Eulerian formulation of the governing equations, which allows for prescription of arbitrary velocity values at discrete mesh points. The stationary and moving overlapping mesh methodologies are thoroughly validated using two- and three-dimensional benchmark problems in laminar and turbulent flows. The spatial and temporal global convergence, for both methods, is documented and is in agreement with the nominal order of accuracy of the underlying solver. The stationary overlapping mesh methodology was validated to assess the influence of long integration times and inflow-outflow global boundary conditions on the performance. In a benchmark of fully-developed turbulent pipe flow, the turbulent statistics are validated against the available data. Moving overlapping mesh simulations are validated on the problems of a two-dimensional oscillating cylinder and a three-dimensional rotating sphere. The aerodynamic forces acting on these moving rigid bodies are determined, and all results are compared with published data. Scaling tests, with both methodologies, show near linear strong scaling, even for moderately large processor counts. The moving overlapping mesh methodology is utilized to investigate the effect of an upstream turbulent wake on a three-dimensional oscillating NACA0012 extruded airfoil. A direct numerical simulation (DNS) at Reynolds Number 44,000 is performed for steady inflow incident upon the airfoil oscillating between angles of attack of 5.6° and 25° with reduced frequency k=0.16. Results are contrasted with subsequent DNS of the same oscillating airfoil in a turbulent wake generated by a stationary upstream cylinder.

  4. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2014-01-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939

  5. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    PubMed

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.

  6. Teaching clinical research methodology to the academic medical community: a fifteen-year retrospective of a comprehensive curriculum.

    PubMed

    Supino, Phyllis G; Borer, Jeffrey S

    2007-05-01

    Due to inadequate preparation, many medical professionals are unable to critically evaluate published research articles or properly design, execute and present their own research. The objective was to increase exposure among physicians, medical students, and allied health professionals to the diverse methodological issues involved in performing research. A comprehensive course on research methodology was newly designed for physicians and other members of an academic medical community, and has been successfully implemented beginning in 1991. The role of the study hypothesis is highlighted; interactive pedagogical techniques are employed to promote audience engagement. Participants complete an annual evaluation to assess course quality and perceived outcomes. Outcomes also are assessed qualitatively by faculty. More than 500 physicians/other professionals have participated. Ratings have been consistently high. Topics deemed most valuable are investigational planning, hypothesis construction and study designs. An enhancement of capacity to define hypotheses and apply methodological concepts in the criticism of scientific papers and development of protocols/manuscripts has been observed. Participants and faculty believe the course improves critical appraisal skills and ability to conduct research. Our experience shows it is feasible to accomplish these objectives, with a high level of satisfaction, through a didactic program targeted to the general academic community.

  7. Universal Approach to Estimate Perfluorocarbons Emissions During Individual High-Voltage Anode Effect for Prebaked Cell Technologies

    NASA Astrophysics Data System (ADS)

    Dion, Lukas; Gaboury, Simon; Picard, Frédéric; Kiss, Laszlo I.; Poncsak, Sandor; Morais, Nadia

    2018-04-01

    Recent investigations on aluminum electrolysis cell demonstrated limitations to the commonly used tier-3 slope methodology to estimate perfluorocarbon (PFC) emissions from high-voltage anode effects (HVAEs). These limitations are greater for smelters with a reduced HVAE frequency. A novel approach is proposed to estimate the specific emissions using a tier 2 model resulting from individual HVAE instead of estimating monthly emissions for pot lines with the slope methodology. This approach considers the nonlinear behavior of PFC emissions as a function of the polarized anode effect duration but also integrates the change in behavior attributed to cell productivity. Validation was performed by comparing the new approach and the slope methodology with measurement campaigns from different smelters. The results demonstrate a good agreement between measured and estimated emissions as well as more accurately reflect individual HVAE dynamics occurring over time. Finally, the possible impact of this approach for the aluminum industry is discussed.

  8. Automated combinatorial method for fast and robust prediction of lattice thermal conductivity

    NASA Astrophysics Data System (ADS)

    Plata, Jose J.; Nath, Pinku; Usanmaz, Demet; Toher, Cormac; Fornari, Marco; Buongiorno Nardelli, Marco; Curtarolo, Stefano

    The lack of computationally inexpensive and accurate ab-initio based methodologies to predict lattice thermal conductivity, κl, without computing the anharmonic force constants or performing time-consuming ab-initio molecular dynamics, is one of the obstacles preventing the accelerated discovery of new high or low thermal conductivity materials. The Slack equation is the best alternative to other more expensive methodologies but is highly dependent on two variables: the acoustic Debye temperature, θa, and the Grüneisen parameter, γ. Furthermore, different definitions can be used for these two quantities depending on the model or approximation. Here, we present a combinatorial approach based on the quasi-harmonic approximation to elucidate which definitions of both variables produce the best predictions of κl. A set of 42 compounds was used to test accuracy and robustness of all possible combinations. This approach is ideal for obtaining more accurate values than fast screening models based on the Debye model, while being significantly less expensive than methodologies that solve the Boltzmann transport equation.
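
    For orientation, one commonly quoted Slack-type expression is sketched below; the precise prefactor and the definitions of the acoustic Debye temperature and Grüneisen parameter vary between formulations (exactly the ambiguity the paper explores), so the constant and inputs here are placeholders rather than values from the study.

    def slack_kappa(theta_a, gruneisen, m_avg_amu, delta_ang, n_atoms, T=300.0,
                    A=3.1e-6):
        """One commonly quoted Slack-type form:
        kappa ~ A * M_avg * theta_a**3 * delta / (gamma**2 * T * n**(2/3)).
        The units of the result depend on the convention behind A."""
        return (A * m_avg_amu * theta_a**3 * delta_ang
                / (gruneisen**2 * T * n_atoms**(2.0 / 3.0)))

    # Illustrative call with rough, made-up silicon-like inputs.
    print(slack_kappa(theta_a=395.0, gruneisen=1.0, m_avg_amu=28.1,
                      delta_ang=2.7, n_atoms=2))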

  9. A simple and highly selective molecular imprinting polymer-based methodology for propylparaben monitoring in personal care products and industrial waste waters.

    PubMed

    Vicario, Ana; Aragón, Leslie; Wang, Chien C; Bertolino, Franco; Gomez, María R

    2018-02-05

    In this work, a novel molecularly imprinted polymer (MIP) proposed as a solid phase extraction sorbent was developed for the determination of propylparaben (PP) in diverse cosmetic samples. The use of parabens (PAs) as microbiological preservatives is authorized by regulatory agencies; however, several recent studies claim that large-scale use of these preservatives can be a potential health risk and harmful to the environment. Diverse factors that influence polymer synthesis were studied, including the template, functional monomer, porogen and crosslinker used. Morphological characterization of the MIP was performed using SEM and BET analysis. Parameters affecting the molecularly imprinted solid phase extraction (MISPE) and elution efficiency of PP were evaluated. After sample clean-up, the analyte was analyzed by high performance liquid chromatography (HPLC). The whole procedure was validated, showing satisfactory analytical parameters. After applying the MISPE methodology, the extraction recoveries were always better than 86.15%; the precision, expressed as RSD%, was always lower than 2.19 for the corrected peak areas. A good linear relationship was obtained within the range 8-500 ng mL-1 of PP, r2 = 0.99985. Limits of detection and quantification after the MISPE procedure of 2.4 and 8 ng mL-1, respectively, were reached, lower than those of previously reported methodologies. The developed MISPE-HPLC methodology provided a simple and economic way of accomplishing a clean-up/preconcentration step and the subsequent determination of PP in a complex matrix. The performance of the proposed method was compared against C-18 and silica solid phase extraction (SPE) cartridges. The recovery factors obtained after applying the extraction methods were 96.6, 64.8 and 0.79 for the MISPE, C18-SPE and silica-SPE procedures, respectively. The proposed methodology improves the retention capability of the SPE material plus robustness and the possibility of reutilization, enabling it to be used for routine PP monitoring in diverse personal care products (PCP) and environmental samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Use of Taguchi methodology to enhance the yield of caffeine removal with growing cultures of Pseudomonas pseudoalcaligenes.

    PubMed

    Ashengroph, Morahem; Ababaf, Sajad

    2014-12-01

    Microbial caffeine removal is a green solution for the treatment of caffeinated products and agro-industrial effluents. We directed this investigation to optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through Taguchi methodology, a structured statistical approach that can lower variation in a process through Design of Experiments (DOE). Five parameters, i.e. the initial fructose, tryptone, Zn(+2) ion and caffeine concentrations and the incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4^4 × 3^1). Data analysis was performed using the statistical analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and verified by a confirming experiment. Measurement of the residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of Taguchi methodology for optimization of the design parameters resulted in about 86.14% reduction of caffeine within 48 h of incubation when 5 g/l fructose, 3 mM Zn(+2) ion and 4.5 g/l of caffeine are present in the designed media. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, Taguchi methodology provides a powerful approach for identifying the parameters favorable for caffeine removal using strain TPS8, which suggests that the approach also has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
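
    The Taguchi analysis step can be sketched as follows with illustrative numbers (not the paper's L16 design or measured removals): compute a larger-the-better signal-to-noise ratio per run and average it by factor level to rank the main effects.

    import numpy as np

    # factor levels per run (columns: fructose, tryptone, Zn, caffeine, time)
    design = np.array([[1, 1, 1, 1, 1],
                       [1, 2, 2, 2, 2],
                       [2, 1, 2, 2, 1],
                       [2, 2, 1, 1, 2]])
    removal = np.array([[52, 55], [71, 69], [60, 63], [84, 86]])  # % in replicates

    # larger-the-better S/N ratio: -10 * log10( mean(1/y^2) )
    sn = -10 * np.log10(np.mean(1.0 / removal.astype(float)**2, axis=1))

    for factor in range(design.shape[1]):
        effects = [sn[design[:, factor] == lvl].mean()
                   for lvl in np.unique(design[:, factor])]
        print(f"factor {factor}: mean S/N by level = {np.round(effects, 2)}")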

  11. Stability Result For Dynamic Inversion Devised to Control Large Flexible Aircraft

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    2001-01-01

    High performance aircraft of the future will be designed lighter, more maneuverable, and operate over an ever expanding flight envelope. One of the largest differences from the flight control perspective between current and future advanced aircraft is elasticity. Over the last decade, dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper is an initial attempt to establish global stability results for dynamic inversion methodology as applied to a large, flexible aircraft. This work builds on a previous result for rigid fighter aircraft and adds a new level of complexity that is the flexible aircraft dynamics, which cannot be ignored even in the most basic flight control. The results arise from observations of the control laws designed for a new generation of the High-Speed Civil Transport aircraft.
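
    As background only, a textbook dynamic-inversion (feedback-linearization) law for a toy system x_dot = f(x) + g(x)u is sketched below; the flexible-aircraft formulation treated in the paper is substantially more involved, and nothing in this sketch comes from it.

    import numpy as np

    def f(x):                      # toy nonlinear plant dynamics
        return np.array([x[1], -np.sin(x[0]) - 0.2 * x[1]])

    def g(x):
        return np.array([0.0, 1.0])

    def dynamic_inversion(x, v):
        """Choose u so the controlled channel tracks the desired
        pseudo-acceleration v: u = (v - f_2(x)) / g_2(x)."""
        return (v - f(x)[1]) / g(x)[1]

    # simple simulation with a PD outer loop commanding the pseudo-control
    x, dt, x_ref = np.array([1.0, 0.0]), 0.01, 0.0
    for _ in range(500):
        v = -4.0 * (x[0] - x_ref) - 3.0 * x[1]      # desired acceleration of x1
        u = dynamic_inversion(x, v)
        x = x + dt * (f(x) + g(x) * u)              # explicit Euler step
    print(x.round(4))               # state driven toward the reference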

  12. Assessing the effect of increased managed care on hospitals.

    PubMed

    Mowll, C A

    1998-01-01

    This study uses a new relative risk methodology developed by the author to assess and compare certain performance indicators to determine a hospital's relative degree of financial vulnerability, based on its location, to the effects of increased managed care market penetration. The study also compares nine financial measures to determine whether hospitals in states with a high degree of managed-care market penetration experience lower levels of profitability, liquidity, debt service, and overall viability than hospitals in low managed care states. A Managed Care Relative Financial Risk Assessment methodology composed of nine measures of hospital financial and utilization performance is used to develop a high managed care state Composite Index and to determine the Relative Financial Risk and the Overall Risk Ratio for hospitals in a particular state. Additionally, the financial performance of hospitals in the five highest managed care states is compared to that of hospitals in the five lowest states. While data from Colorado and Massachusetts indicate that hospital profitability diminishes as the level of managed care market penetration increases, the overall study results indicate that hospitals in high managed care states demonstrate a better cash position and higher profitability than hospitals in low managed care states. Hospitals in high managed care states are, however, more heavily indebted in relation to equity and have a weaker debt service coverage capacity. Moreover, the overall financial health and viability of hospitals in high managed care states is superior to that of hospitals in low managed care states.

  13. Strategy Guideline: Quality Management in Existing Homes; Cantilever Floor Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taggart, J.; Sikora, J.; Wiehagen, J.

    2011-12-01

    This guideline is designed to highlight the QA process that can be applied to any residential building retrofit activity. The cantilevered floor retrofit detailed in this guideline is included only to provide an actual retrofit example to better illustrate the QA activities being presented. The goal of existing home high performing remodeling quality management systems (HPR-QMS) is to establish practices and processes that can be used throughout any remodeling project. The research presented in this document provides a comparison of a selected retrofit activity as typically done versus that same retrofit activity approached from an integrated high performance remodeling and quality management perspective. It highlights some key quality management tools and approaches that can be adopted incrementally by a high performance remodeler for this or any high performance retrofit. This example is intended as a template and establishes a methodology that can be used to develop a portfolio of high performance remodeling strategies.

  14. Asynchronous threat awareness by observer trials using crowd simulation

    NASA Astrophysics Data System (ADS)

    Dunau, Patrick; Huber, Samuel; Stein, Karin U.; Wellig, Peter

    2016-10-01

    The last few years have shown that a high risk of asynchronous threats exists in everyday life. Especially in large crowds, a high probability of asynchronous attacks is evident. High observational abilities to detect threats are therefore desirable, and highly trained security and observation personnel are needed. This paper evaluates the effectiveness of a training methodology to enhance the performance of observation personnel engaging in a specific target identification task. For this purpose a crowd simulation video is utilized. The study first provides a measurement of the base performance before the training sessions. A training procedure is then performed, and base performance is compared to the post-training performance in order to look for a training effect. A thorough evaluation of both the training sessions and the overall performance is presented in this paper. A specific hypothesis-based metric is used. Results are discussed in order to provide guidelines for the design of training for observational tasks.

  15. Contemporary Methodology for Protein Structure Determination.

    ERIC Educational Resources Information Center

    Hunkapiller, Michael W.; And Others

    1984-01-01

    Describes the nature and capabilities of methods used to characterize protein and peptide structure, indicating that they have undergone changes which have improved the speed, reliability, and applicability of the process. Also indicates that high-performance liquid chromatography and gel electrophoresis have made purifying proteins and peptides a…

  16. Effect of Accessory Power Take-off Variation on a Turbofan Engine Performance

    DTIC Science & Technology

    2012-09-26

    amount of energy from the low pressure spool shaft. A high bypass turbofan engine was modeled using the Numerical Propulsion System Simulation (NPSS).

  17. Flexible thermal protection materials for entry systems

    NASA Astrophysics Data System (ADS)

    Kourtides, Demetrius A.

    1993-02-01

    Current programs addressed in aeroassist flight experiment are: (1) evaluation of thermal performance of advanced rigid and flexible insulations and reflective coating; (2) investigation of lighter than baseline materials; (3) investigation of rigid insulations which perform well; (4) study of flexible insulations which require ceramic coating; and (5) study of reflective coating effective at greater than 15 percent. In National Aerospace Plane (NASP), the programs addressed are: (1) high and low temperature insulations; and (2) attachment/standoff methodology critical which affects thermal performance.

  18. Flexible thermal protection materials for entry systems

    NASA Technical Reports Server (NTRS)

    Kourtides, Demetrius A.

    1993-01-01

    Current programs addressed in aeroassist flight experiment are: (1) evaluation of thermal performance of advanced rigid and flexible insulations and reflective coating; (2) investigation of lighter than baseline materials; (3) investigation of rigid insulations which perform well; (4) study of flexible insulations which require ceramic coating; and (5) study of reflective coating effective at greater than 15 percent. In National Aerospace Plane (NASP), the programs addressed are: (1) high and low temperature insulations; and (2) attachment/standoff methodology critical which affects thermal performance.

  19. Aircraft flight test trajectory control

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Walker, R. A.

    1988-01-01

    Two control law design techniques are compared and the performance of the resulting controllers evaluated. The design requirement is for a flight test trajectory controller (FTTC) capable of closed-loop, outer-loop control of an F-15 aircraft performing high-quality research flight test maneuvers. The maneuver modeling, linearization, and design methodologies utilized in this research are detailed. The results of applying these FTTCs to a nonlinear F-15 simulation are presented.

  20. Numerical Solutions for a Cylindrical Laser Diffuser Flowfield

    DTIC Science & Technology

    1990-06-01

    exhaust conditions with minimum losses to optimize performance of the system. Thus, the handling of the system of shock waves to decelerate the flow...requirement for exhaustive experimental work will result in significant savings of both time and resources. As more advanced computers are developed, the...Mach number (ɚ.5) flows. Recent interest in hypersonic engine inlet performance has resulted in an extension of the methodology to high Mach number

  1. Model and Algorithm for Substantiating Solutions for Organization of High-Rise Construction Project

    NASA Astrophysics Data System (ADS)

    Anisimov, Vladimir; Anisimov, Evgeniy; Chernysh, Anatoliy

    2018-03-01

    In this paper, models and an algorithm are developed for forming an optimal plan for organizing the material and logistical processes of a high-rise construction project and their financial support. The model represents the optimization procedure as a nonlinear discrete programming problem that minimizes the execution time of a set of interrelated works performed by a limited number of partially interchangeable performers while limiting the total cost of performing the work. The proposed model and algorithm form the basis for creating specific organizational management methodologies for high-rise construction projects.

  2. Analysis of Flowfields over Four-Engine DC-X Rockets

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Cornelison, Joni

    1996-01-01

    The objective of this study is to validate a computational methodology for the aerodynamic performance of an advanced conical launch vehicle configuration. The computational methodology is based on a three-dimensional, viscous flow, pressure-based computational fluid dynamics formulation. Both wind-tunnel and ascent flight-test data are used for validation. Emphasis is placed on multiple-engine power-on effects. Computational characterization of the base drag in the critical subsonic regime is the focus of the validation effort; until recently, almost no multiple-engine data existed for a conical launch vehicle configuration. Parametric studies using high-order difference schemes are performed for the cold-flow tests, whereas grid studies are conducted for the flight tests. The computed vehicle axial force coefficients, forebody, aftbody, and base surface pressures compare favorably with those of tests. The results demonstrate that with adequate grid density and proper distribution, a high-order difference scheme, finite rate afterburning kinetics to model the plume chemistry, and a suitable turbulence model to describe separated flows, plume/air mixing, and boundary layers, computational fluid dynamics is a tool that can be used to predict the low-speed aerodynamic performance for rocket design and operations.

  3. Using the Malcolm Baldrige "are we making progress" survey for organizational self-assessment and performance improvement.

    PubMed

    Shields, Judith A; Jennings, Jerry L

    2013-01-01

    A national healthcare company applied the Malcolm Baldrige Criteria for Performance Excellence and its "Are We Making Progress?" survey as an annual organizational self-assessment to identify areas for improvement. For 6 years, Liberty Healthcare Corporation reviewed the survey results on an annual basis to analyze positive and negative trends, monitor company progress toward targeted goals and develop new initiatives to address emerging areas for improvement. As such, the survey provided a simple and inexpensive methodology to gain useful information from employees at all levels and from multiple service sites and business sectors. In particular, it provided a valuable framework for assessing and improving the employees' commitment to the company's mission and values, high standards and ethics, quality of work, and customer satisfaction. The methodology also helped the company to incorporate the philosophy and principles of continuous quality improvement in a unified fashion. Corporate and local leadership used the same measure to evaluate the performance of individual programs relative to each other, to the company as a whole, and to the "best practices" standard of highly successful companies that received the Malcolm Baldrige National Quality Award. © 2012 National Association for Healthcare Quality.

  4. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays

    PubMed Central

    Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into a community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
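
    A small illustration of the dissimilarity measure the methodology builds on: the Jensen-Shannon distance between the word-frequency distributions of two texts (toy counts only, not the 168-play corpus).

    import numpy as np
    from scipy.spatial.distance import jensenshannon

    counts_a = np.array([120, 30, 5, 45, 0, 12], dtype=float)   # word counts, text A
    counts_b = np.array([100, 10, 25, 40, 3, 30], dtype=float)  # word counts, text B

    p = counts_a / counts_a.sum()
    q = counts_b / counts_b.sum()

    # scipy returns the JS *distance* (square root of the JS divergence, base 2 here)
    d = jensenshannon(p, q, base=2)
    print(round(float(d), 4))

    A proximity graph for community detection could then, for instance, link texts whose pairwise distance falls below a chosen threshold; that thresholding choice is an assumption here, not taken from the paper.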

  5. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.

  6. Changes in clinical trials methodology over time: a systematic review of six decades of research in psychopharmacology.

    PubMed

    Brunoni, André R; Tadini, Laura; Fregni, Felipe

    2010-03-03

    There have been many changes in clinical trials methodology since the introduction of lithium and the beginning of the modern era of psychopharmacology in 1949. The nature and importance of these changes have not been fully addressed to date. As methodological flaws in trials can lead to false-negative or false-positive results, the objective of our study was to evaluate the impact of methodological changes in psychopharmacology clinical research over the past 60 years. We performed a systematic review from 1949 to 2009 on MEDLINE and Web of Science electronic databases, and a hand search of high impact journals on studies of seven major drugs (chlorpromazine, clozapine, risperidone, lithium, fluoxetine and lamotrigine). All controlled studies published 100 months after the first trial were included. Ninety-one studies met our inclusion criteria. We analyzed the major changes in abstract reporting, study design, participants' assessment and enrollment, methodology and statistical analysis. Our results showed that the methodology of psychiatric clinical trials changed substantially, with quality gains in abstract reporting, results reporting, and statistical methodology. Recent trials use more informed consent, periods of washout, intention-to-treat approach and parametric tests. Placebo use remains high and unchanged over time. Clinical trial quality of psychopharmacological studies has changed significantly in most of the aspects we analyzed. There was significant improvement in quality reporting and internal validity. These changes have increased study efficiency; however, there is room for improvement in some aspects such as rating scales, diagnostic criteria and better trial reporting. Therefore, despite the advancements observed, there are still several areas that can be improved in psychopharmacology clinical trials.

  7. A comprehensive evaluation of tyrosol and hydroxytyrosol derivatives in extra virgin olive oil by microwave-assisted hydrolysis and HPLC-MS/MS.

    PubMed

    Bartella, Lucia; Mazzotti, Fabio; Napoli, Anna; Sindona, Giovanni; Di Donna, Leonardo

    2018-03-01

    A rapid and reliable method to assay the total amount of tyrosol and hydroxytyrosol derivatives in extra virgin olive oil has been developed. The methodology intends to establish the nutritional quality of this edible oil addressing recent international health claim legislations (the European Commission Regulation No. 432/2012) and changing the classification of extra virgin olive oil to the status of nutraceutical. The method is based on the use of high-performance liquid chromatography coupled with tandem mass spectrometry and labeled internal standards preceded by a fast hydrolysis reaction step performed through the aid of microwaves under acid conditions. The overall process is particularly time saving, much shorter than any methodology previously reported. The developed approach represents a mix of rapidity and accuracy whose values have been found near 100% on different fortified vegetable oils, while the RSD% values, calculated from repeatability and reproducibility experiments, are in all cases under 7%. Graphical abstract Schematic of the methodology applied to the determination of tyrosol and hydroxytyrosol ester conjugates.

  8. Use of lean and six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center.

    PubMed

    Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude

    2011-07-01

    Operating rooms (ORs) are resource-intense and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, ie, personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferrable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  9. Experimental evaluation of the Continuous Risk Profile (CRP) approach to the current Caltrans methodology for high collision concentration location identification

    DOT National Transportation Integrated Search

    2012-03-31

    This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...

  10. Experimental evaluation of the Continuous Risk Profile (CRP) approach to the current Caltrans methodology for high collision concentration location identification.

    DOT National Transportation Integrated Search

    2012-03-01

    This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...

  11. OPERA: A QSAR tool for physicochemical properties and environmental fate predictions (ACS Spring meeting)

    EPA Science Inventory

    The collection of chemical structures and associated experimental data for QSAR modeling is facilitated by the increasing number and size of public databases. However, the performance of QSAR models highly depends on the quality of the data used and the modeling methodology. The ...

  12. In Search of Effective Methodology for Organizational Learning: A Japanese Experience

    ERIC Educational Resources Information Center

    Tsuchiya, Shigehisa

    2011-01-01

    The author's personal journey regarding simulation and gaming started about 25 years ago when he happened to realize how powerful computerized simulation could be for organizational change. The metaphors created by computerized simulation enabled him to transform a stagnant university into a high-performance organization. Through extensive…

  13. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  14. Applying operational research and data mining to performance based medical personnel motivation system.

    PubMed

    Niaksu, Olegas; Zaptorius, Jonas

    2014-01-01

    This paper presents a methodology suitable for the creation of a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased weighting of the performance criteria. Data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluation of the motivational system outcomes was proposed. The described methodology for calculating performance-related payment needs practical approbation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step would be approbation of the methodology in a healthcare facility.

  15. Full-field modal analysis during base motion excitation using high-speed 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2017-10-01

    In recent years, many efforts have been made to exploit full-field measurement optical techniques for modal identification. Three-dimensional digital image correlation using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves the base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically based on displacements, which are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped edge join. Full-field transmissibility functions were obtained through the beam and converted into FRF for proper identification, considering a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large amounts of data points involved in the DIC measurement during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparing them with those obtained by employing traditional accelerometers, analytical models and finite element method analyses. The comparison was performed by using the quantitative indicator modal assurance criterion. The results showed a high level of correspondence, consolidating the proposed experimental methodology.
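
    By way of illustration, a transmissibility function can be estimated from base-motion test signals with an H1-type spectral estimator (cross-spectrum of response and base over the auto-spectrum of the base). The signals below are synthetic; in the paper the responses come from full-field DIC displacement measurements, and the SDOF parameters here are invented.

    import numpy as np
    from scipy.signal import welch, csd

    fs, T = 2000.0, 20.0
    t = np.arange(0, T, 1 / fs)
    rng = np.random.default_rng(0)
    base = rng.normal(size=t.size)                     # random base excitation

    # toy SDOF response (natural frequency ~50 Hz, 2% damping) via convolution
    fn, zeta = 50.0, 0.02
    wn = 2 * np.pi * fn
    wd = wn * np.sqrt(1 - zeta**2)
    h = np.exp(-zeta * wn * t) * np.sin(wd * t) / wd   # impulse response
    resp = np.convolve(base, h)[: t.size] / fs

    f, S_xx = welch(base, fs=fs, nperseg=4096)         # base auto-spectrum
    _, S_xy = csd(base, resp, fs=fs, nperseg=4096)     # base/response cross-spectrum
    transmissibility = np.abs(S_xy / S_xx)
    print(f[np.argmax(transmissibility)])              # peak near the resonance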

  16. Computational Fluid Dynamics (CFD) Analysis for the Reduction of Impeller Discharge Flow Distortion

    NASA Technical Reports Server (NTRS)

    Garcia, R.; McConnaughey, P. K.; Eastland, A.

    1993-01-01

    The use of Computational Fluid Dynamics (CFD) in the design and analysis of high performance rocket engine pumps has increased in recent years. This increase has been aided by the activities of the Marshall Space Flight Center (MSFC) Pump Stage Technology Team (PSTT). The team's goals include assessing the accuracy and efficiency of several methodologies and then applying the appropriate methodology(s) to understand and improve the flow inside a pump. The PSTT's objectives, team membership, and past activities are discussed in Garcia1 and Garcia2. The PSTT is one of three teams that form the NASA/MSFC CFD Consortium for Applications in Propulsion Technology (McConnaughey3). The PSTT first applied CFD in the design of the baseline consortium impeller. This impeller was designed for the Space Transportation Main Engine's (STME) fuel turbopump. The STME fuel pump was designed with three impeller stages because a two-stage design was deemed to pose a high developmental risk. The PSTT used CFD to design an impeller whose performance allowed for a two-stage STME fuel pump design. The availability of this design would have led to a reduction in parts, weight, and cost had the STME reached production. One sample of the baseline consortium impeller was manufactured and tested in a water rig. The test data showed that the impeller performance was as predicted and that a two-stage design for the STME fuel pump was possible with minimal risk. The test data also verified another CFD predicted characteristic of the design that was not desirable. The classical 'jet-wake' pattern at the impeller discharge was strengthened by two aspects of the design: by the high head coefficient necessary for the required pressure rise and by the relatively few impeller exit blades, 12, necessary to reduce manufacturing cost. This 'jet-wake' pattern produces an unsteady loading on the diffuser vanes and has, in past rocket engine programs, led to diffuser structural failure. In industrial applications, this problem is typically avoided by increasing the space between the impeller and the diffuser to allow the dissipation of this pattern and, hence, the reduction of diffuser vane unsteady loading. This approach leads to small performance losses and, more importantly in rocket engine applications, to significant increases in the pump's size and weight. This latter consideration typically makes this approach unacceptable in high performance rocket engines.

  17. Methodological considerations for documenting the energy demand of dance activity: a review

    PubMed Central

    Beck, Sarah; Redding, Emma; Wyon, Matthew A.

    2015-01-01

    Previous research has explored the intensity of dance class, rehearsal, and performance and attempted to document the body's physiological adaptation to these activities. Dance activity is frequently described as: complex, diverse, non-steady state, intermittent, of moderate to high intensity, and with notable differences between training and performance intensities and durations. Many limitations are noted in the methodologies of previous studies creating barriers to consensual conclusion. The present study therefore aims to examine the previous body of literature and in doing so, seeks to highlight important methodological considerations for future research in this area to strengthen our knowledge base. Four recommendations are made for future research. Firstly, research should continue to be dance genre specific, with detailed accounts of technical and stylistic elements of the movement vocabulary examined given wherever possible. Secondly, a greater breadth of performance repertoire, within and between genres, needs to be closely examined. Thirdly, a greater focus on threshold measurements is recommended due to the documented complex interplay between aerobic and anaerobic energy systems. Lastly, it is important for research to begin to combine temporal data relating to work and rest periods with real-time measurement of metabolic data in work and rest, in order to be able to quantify demand more accurately. PMID:25999885

  18. Employment of High-Performance Thin-Layer Chromatography for the Quantification of Oleuropein in Olive Leaves and the Selection of a Suitable Solvent System for Its Isolation with Centrifugal Partition Chromatography.

    PubMed

    Boka, Vasiliki-Ioanna; Argyropoulou, Aikaterini; Gikas, Evangelos; Angelis, Apostolis; Aligiannis, Nektarios; Skaltsounis, Alexios-Leandros

    2015-11-01

    A high-performance thin-layer chromatographic methodology was developed and validated for the isolation and quantitative determination of oleuropein in two extracts of Olea europaea leaves. OLE_A was a crude acetone extract, while OLE_AA was its defatted residue. Initially, high-performance thin-layer chromatography was employed for the purification process of oleuropein with fast centrifugal partition chromatography, replacing high-performance liquid-chromatography, in the stage of the determination of the distribution coefficient and the retention volume. A densitometric method was developed for the determination of the distribution coefficients, KC = CS/CM. The total concentrations of the target compound in the stationary phase (CS) and in the mobile phase (CM) were calculated by the area measured in the high-performance thin-layer chromatogram. The estimated Kc was also used for the calculation of the retention volume, VR, with a chromatographic retention equation. The obtained data were successfully applied for the purification of oleuropein and the experimental results confirmed the theoretical predictions, indicating that high-performance thin-layer chromatography could be an important counterpart in the phytochemical study of natural products. The isolated oleuropein (purity > 95%) was subsequently used for the estimation of its content in each extract with a simple, sensitive and accurate high-performance thin-layer chromatography method. The best fit calibration curve from 1.0 µg/track to 6.0 µg/track of oleuropein was polynomial and the quantification was achieved by UV detection at λ 240 nm. The method was validated giving rise to an efficient and high-throughput procedure, with the relative standard deviation % of repeatability and intermediate precision not exceeding 4.9% and accuracy between 92% and 98% (recovery rates). Moreover, the method was validated for robustness, limit of quantitation, and limit of detection. The amount of oleuropein for OLE_A, OLE_AA, and an aqueous extract of olive leaves was estimated to be 35.5% ± 2.7, 51.5% ± 1.4, and 12.5% ± 0.12, respectively. Statistical analysis proved that the method is repeatable and selective, and can be effectively applied for the estimation of oleuropein in olive leaves' extracts, and could potentially replace high-performance liquid chromatography methodologies developed so far. Thus, the phytochemical investigation of oleuropein could be based on high-performance thin-layer chromatography coupled with separation processes, such as fast centrifugal partition chromatography, showing efficacy and credibility. Georg Thieme Verlag KG Stuttgart · New York.
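
    A small numeric illustration of the planning step described: a distribution constant estimated densitometrically from HPTLC peak areas feeding the classical counter-current/CPC retention relation V_R = V_M + K_D * V_S. All numbers below are invented, and the column volume and stationary-phase retention fraction are assumptions.

    area_stationary = 5200.0   # HPTLC peak area of the target in the stationary phase
    area_mobile = 2600.0       # HPTLC peak area of the target in the mobile phase

    K_D = area_stationary / area_mobile          # distribution constant ~ C_S / C_M

    V_column = 200.0           # mL, total CPC column volume (assumed)
    Sf = 0.6                   # stationary-phase retention fraction (assumed)
    V_S = Sf * V_column
    V_M = V_column - V_S

    V_R = V_M + K_D * V_S      # predicted elution volume of the target compound
    print(K_D, V_R)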

  19. Modified Dynamic Inversion to Control Large Flexible Aircraft: What's Going On?

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    1999-01-01

    High performance aircraft of the future will be designed lighter, more maneuverable, and operate over an ever expanding flight envelope. One of the largest differences from the flight control perspective between current and future advanced aircraft is elasticity. Over the last decade, dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper explores dynamic inversion application to an advanced highly flexible aircraft. An initial application has been made to a large flexible supersonic aircraft. In the course of controller design for this advanced vehicle, modifications were made to the standard dynamic inversion methodology. The results of this application were deemed rather promising. An analytical study has been undertaken to better understand the nature of the made modifications and to determine its general applicability. This paper presents the results of this initial analytical look at the modifications to dynamic inversion to control large flexible aircraft.

  20. Development and evaluation of a high-performance liquid chromatography/isotope ratio mass spectrometry methodology for delta13C analyses of amino sugars in soil.

    PubMed

    Bodé, Samuel; Denef, Karolien; Boeckx, Pascal

    2009-08-30

    Amino sugars have been used as biomarkers to assess the relative contribution of dead microbial biomass of different functional groups of microorganisms to soil carbon pools. However, little is known about the dynamics of these compounds in soil. The isotopic composition of individual amino sugars can be used as a tool to determine the turnover of these compounds. Methods to determine the δ13C of amino sugars using gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) have been proposed in the literature. However, due to derivatization, the uncertainty in the obtained δ13C values is too high for natural abundance studies. Therefore, a new high-performance liquid chromatography/isotope ratio mass spectrometry (HPLC/IRMS) methodology, with increased accuracy and precision, has been developed. The repeatability of the obtained δ13C values when pure amino sugars were analyzed was not significantly concentration-dependent as long as the injected amount was higher than 1.5 nmol. The δ13C value of the same amino sugar spiked to a soil deviated by only 0.3‰ from the theoretical value. 2009 John Wiley & Sons, Ltd.
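
    For reference, δ13C values of the kind measured here are conventionally expressed as the per-mil deviation of the sample's 13C/12C ratio from the VPDB standard. The short sketch below (Python) applies that definition; the isotope ratios are invented and the VPDB ratio is a commonly cited literature value, not a figure from the study.

        # Conventional delta notation; the isotope ratios below are invented for illustration.
        R_VPDB = 0.0111802  # 13C/12C of the VPDB reference (commonly cited value)

        def delta13C_permil(r_sample, r_standard=R_VPDB):
            """delta13C (per mil) = (R_sample / R_standard - 1) * 1000."""
            return (r_sample / r_standard - 1.0) * 1000.0

        print(round(delta13C_permil(0.0108900), 2))  # roughly -26 per mil for this invented ratio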

  1. Multi-class methodology to determine pesticides and mycotoxins in green tea and royal jelly supplements by liquid chromatography coupled to Orbitrap high resolution mass spectrometry.

    PubMed

    Martínez-Domínguez, Gerardo; Romero-González, Roberto; Garrido Frenich, Antonia

    2016-04-15

    A multi-class methodology was developed to determine pesticides and mycotoxins in food supplements. The extraction was performed using acetonitrile acidified with formic acid (1%, v/v). Different clean-up sorbents were tested, and the best results were obtained using C18 and zirconium oxide for green tea and royal jelly, respectively. The compounds were determined using ultra-high performance liquid chromatography (UHPLC) coupled to Exactive-Orbitrap high resolution mass spectrometry (HRMS). The recovery rates obtained were between 70% and 120% for most of the compounds studied, with a relative standard deviation <25% at three different concentration levels. The calculated limits of quantification (LOQ) were <10 μg/kg. The method was applied to green tea (10) and royal jelly (8) samples. Nine samples (eight of green tea and one of royal jelly) were found to be positive for pesticides at concentrations ranging from 10.6 (cinosulfuron) to 47.9 μg/kg (paclobutrazol). Aflatoxin B1 (5.4 μg/kg) was also found in one of the green tea samples. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Evaluating stakeholder management performance using a stakeholder report card: the next step in theory and practice.

    PubMed

    Malvey, Donna; Fottler, Myron D; Slovensky, Donna J

    2002-01-01

    In the highly competitive health care environment, the survival of an organization may depend on how well powerful stakeholders are managed. Yet, the existing strategic stakeholder management process does not include evaluation of stakeholder management performance. To address this critical gap, this paper proposes a systematic method for evaluation using a stakeholder report card. An example of a physician report card based on this methodology is presented.

  3. Incremental wind tunnel testing of high lift systems

    NASA Astrophysics Data System (ADS)

    Victor, Pricop Mihai; Mircea, Boscoianu; Daniel-Eugeniu, Crunteanu

    2016-06-01

    Efficiency of trailing edge high lift systems is essential for future long-range transport aircraft evolving in the direction of laminar wings, because they have to compensate for the low performance of the leading edge devices. Modern high lift systems are subject to high performance requirements and constrained to simple actuation, combined with a reduced number of aerodynamic elements. Passive or active flow control is thus required for performance enhancement. An experimental investigation of a reduced-kinematics flap combined with passive flow control took place in a low-speed wind tunnel. The most important features of the experimental setup are its relatively large size, corresponding to a Reynolds number of about 2 million, a sweep angle of 30 degrees corresponding to long-range airliners with highly swept wings, and the large number of flap settings and mechanical vortex generators. The model description, flap settings, methodology and results are presented.

  4. New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.

    PubMed

    Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María

    2017-08-01

    In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables and to validate the methodology by calculating the errors associated with the measurements. This methodology is based on polynomial regression equations, and has been validated using two different dental variables: cuspal enamel thickness and crown height of the protoconid. In order to perform the validation process, simulated worn modern human molars were employed. The associated errors of the measurements were also estimated applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables in comparison with their own real values is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. This error significantly improves on the results of other methodologies, both in interobserver error and in the accuracy of the measurements. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces the interobserver error. The present study shows that it is important to validate all methodologies in order to know the associated errors. This new methodology can be easily exported to other modern human populations, the human fossil record and forensic sciences. © 2017 Wiley Periodicals, Inc.
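
    A minimal sketch of the general idea (Python): fit a polynomial to points sampled along the preserved enamel profile, extrapolate over the worn region, and compare against known values from a simulated-wear test. The coordinates and polynomial degree below are invented for illustration; the study derives and validates its own regression equations.

        # Illustrative sketch only: fit a polynomial to the preserved enamel profile and
        # extrapolate over the worn region. Coordinates are invented, and the actual study
        # uses its own validated regression equations.
        import numpy as np

        # (x, y) points sampled along the preserved outer enamel surface (hypothetical, mm)
        x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
        y = np.array([0.0, 1.1, 2.0, 2.7, 3.2, 3.5])

        coeffs = np.polyfit(x, y, deg=2)          # second-order polynomial regression
        profile = np.poly1d(coeffs)

        x_worn = np.array([3.0, 3.5, 4.0])        # region lost to wear
        y_est = profile(x_worn)                   # reconstructed heights

        true_y = np.array([3.7, 3.75, 3.7])       # "real" values in a simulated-wear test
        pct_error = 100 * (y_est - true_y) / true_y
        print("reconstructed:", np.round(y_est, 2), "error %:", np.round(pct_error, 2))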

  5. ASTM and VAMAS activities in titanium matrix composites test methods development

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.; Harmon, D. M.; Bartolotta, P. A.; Russ, S. M.

    1994-01-01

    Titanium matrix composites (TMC's) are being considered for a number of aerospace applications ranging from high performance engine components to airframe structures in areas that require high stiffness to weight ratios at temperatures up to 400 C. TMC's exhibit unique mechanical behavior due to fiber-matrix interface failures, matrix cracks bridged by fibers, thermo-viscoplastic behavior of the matrix at elevated temperatures, and the development of significant thermal residual stresses in the composite due to fabrication. Standard testing methodology must be developed to reflect the uniqueness of this type of material systems. The purpose of this paper is to review the current activities in ASTM and Versailles Project on Advanced Materials and Standards (VAMAS) that are directed toward the development of standard test methodology for titanium matrix composites.

  6. Horizon Mission Methodology - A tool for the study of technology innovation and new paradigms

    NASA Technical Reports Server (NTRS)

    Anderson, John L.

    1993-01-01

    The Horizon Mission (HM) methodology was developed to provide a means of identifying and evaluating highly innovative, breakthrough technology concepts (BTCs) and for assessing their potential impact on advanced space missions. The methodology is based on identifying new capabilities needed by hypothetical 'horizon' space missions having performance requirements that cannot be met even by extrapolating known space technologies. Normal human evaluation of new ideas such as BTCs appears to be governed (and limited) by 'inner models of reality' defined as paradigms. Thus, new ideas are evaluated by old models. This paper describes the use of the HM Methodology to define possible future paradigms that would provide alternatives to evaluation by current paradigms. The approach is to represent a future paradigm by a set of new BTC-based capabilities - called a paradigm abstract. The paper describes methods of constructing and using the abstracts for evaluating BTCs for space applications and for exploring the concept of paradigms and paradigm shifts as a representation of technology innovation.

  7. Learning outcomes of "The Oncology Patient" study among nursing students: A comparison of teaching strategies.

    PubMed

    Roca, Judith; Reguant, Mercedes; Canet, Olga

    2016-11-01

    Teaching strategies are essential in order to facilitate meaningful learning and the development of high-level thinking skills in students. To compare three teaching methodologies (problem-based learning, case-based teaching and traditional methods) in terms of the learning outcomes achieved by nursing students. This quasi-experimental research was carried out in the Nursing Degree programme in a group of 74 students who explored the subject of The Oncology Patient through the aforementioned strategies. A performance test was applied based on Bloom's Revised Taxonomy. A significant correlation was found between the intragroup theoretical and theoretical-practical dimensions. Likewise, intergroup differences were related to each teaching methodology. Hence, significant differences were estimated between the traditional methodology (mean = 9.13), case-based teaching (mean = 12.96) and problem-based learning (mean = 14.84). Problem-based learning was shown to be the most successful learning method, followed by case-based teaching and the traditional methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As technology nodes shrink from 14 nm to 7 nm, the reliability of tool monitoring techniques used in advanced semiconductor fabs to achieve high yield and quality becomes more critical. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to detect particles and defects and to verify tool stability, in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner to ensure proper tool stability on a periodic basis. The focus measurement on YIELDSTAR by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA) has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provides a common reference of scanner setup and user process. In order to further improve the metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of set baseline reference. This baseline technique, with either conventional BaseLiner low numerical aperture (NA=1.20) mode or advanced illumination high NA mode (NA=1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus control and monitoring on multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposure for new product/layer best focus (BF) setup.

  9. Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dang, L. D.; Coats, D. E.

    1985-01-01

    The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.

  10. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  11. An optimization methodology for heterogeneous minor actinides transmutation

    NASA Astrophysics Data System (ADS)

    Kooyman, Timothée; Buiron, Laurent; Rimpault, Gérald

    2018-04-01

    In the case of a closed fuel cycle, minor actinides transmutation can lead to a strong reduction in spent fuel radiotoxicity and decay heat. In the heterogeneous approach, minor actinides are loaded in dedicated targets located at the core periphery so that long-lived minor actinides undergo fission and are turned into shorter-lived fission products. However, such targets require a specific design process due to high helium production in the fuel, high flux gradient at the core periphery and low power production. Additionally, the targets are generally manufactured with a high content of minor actinides in order to compensate for the low flux level at the core periphery. This leads to negative impacts on the fuel cycle in terms of neutron source and decay heat of the irradiated targets, which penalize their handling and reprocessing. In this paper, a simplified methodology for the design of targets is coupled with a method for the optimization of transmutation which takes into account both transmutation performance and fuel cycle impacts. The uncertainties and performance of this methodology are evaluated and shown to be sufficient to carry out scoping studies. An illustration is then made by considering the use of moderating material in the targets, which has a positive impact on minor actinides consumption but a negative impact both on fuel cycle constraints (higher decay heat and neutron source) and on assembly design (higher helium production and lower fuel volume fraction). It is shown that the use of moderating material is an optimal solution of the transmutation problem with regard to consumption and fuel cycle impacts, even when taking geometrical design considerations into account.

  12. Evaluation of operational, economic, and environmental performance of mixed and selective collection of municipal solid waste: Porto case study.

    PubMed

    Teixeira, Carlos A; Russo, Mário; Matos, Cristina; Bentes, Isabel

    2014-12-01

    This article describes an accurate methodology for an operational, economic, and environmental assessment of municipal solid waste collection. The proposed methodological tool uses key performance indicators to evaluate independent operational and economic efficiency and performance of municipal solid waste collection practices. These key performance indicators are then used in life cycle inventories and life cycle impact assessment. Finally, the life cycle assessment environmental profiles provide the environmental assessment. We also report a successful application of this tool through a case study in the Portuguese city of Porto. Preliminary results demonstrate the applicability of the methodological tool to real cases. Some of the findings highlight significant differences between average mixed and selective collection effective distance (2.14 km t⁻¹; 16.12 km t⁻¹), fuel consumption (3.96 L t⁻¹; 15.37 L t⁻¹), crew productivity (0.98 t h⁻¹ worker⁻¹; 0.23 t h⁻¹ worker⁻¹), cost (45.90 € t⁻¹; 241.20 € t⁻¹), and global warming impact (19.95 kg CO2eq t⁻¹; 57.47 kg CO2eq t⁻¹). Preliminary results consistently indicate: (a) higher global performance of mixed collection as compared with selective collection; (b) dependency of collection performance, even in urban areas, on the waste generation rate and density; (c) the decline of selective collection performance with decreasing source-separated material density and recycling collection rate; and (d) that the main threats to collection route efficiency are the extensive collection distances, high fuel consumption vehicles, and reduced crew productivity. © The Author(s) 2014.
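
    The per-tonne key performance indicators quoted above follow directly from route totals. The sketch below (Python) shows the arithmetic; the input figures are invented but chosen so that the outputs reproduce the mixed-collection values reported in the abstract.

        # Per-tonne KPIs of the kind reported above, computed from route totals.
        # All input figures are invented for illustration (chosen to echo the mixed-collection values).

        def collection_kpis(distance_km, fuel_l, crew_size, hours, cost_eur, co2eq_kg, tonnes):
            return {
                "effective distance (km/t)": distance_km / tonnes,
                "fuel consumption (L/t)": fuel_l / tonnes,
                "crew productivity (t/h/worker)": tonnes / (hours * crew_size),
                "cost (EUR/t)": cost_eur / tonnes,
                "global warming (kg CO2eq/t)": co2eq_kg / tonnes,
            }

        mixed = collection_kpis(distance_km=21.4, fuel_l=39.6, crew_size=3, hours=3.4,
                                cost_eur=459.0, co2eq_kg=199.5, tonnes=10.0)
        for name, value in mixed.items():
            print(f"{name}: {value:.2f}")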

  13. 75 FR 8649 - Request for Comments on Methodology for Conducting an Independent Study of the Burden of Patent...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-25

    ...] Request for Comments on Methodology for Conducting an Independent Study of the Burden of Patent-Related... methodologies for performing such a study (Methodology Report). ICF has now provided the USPTO with its Methodology Report, in which ICF recommends methodologies for addressing various topics about estimating the...

  14. Statistical Anomalies of Bitflips in SRAMs to Discriminate SBUs From MCUs

    NASA Astrophysics Data System (ADS)

    Clemente, Juan Antonio; Franco, Francisco J.; Villa, Francesca; Baylac, Maud; Rey, Solenne; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul

    2016-08-01

    Recently, the occurrence of multiple events in static tests has been investigated by checking the statistical distribution of the difference between the addresses of the words containing bitflips. That method has been successfully applied to Field Programmable Gate Arrays (FPGAs) and the original authors indicate that it is also valid for SRAMs. This paper presents a modified methodology that is based on checking the XORed addresses with bitflips, rather than on the difference. Irradiation tests on CMOS 130 & 90 nm SRAMs with 14-MeV neutrons have been performed to validate this methodology. Results in high-altitude environments are also presented and cross-checked with theoretical predictions. In addition, this methodology has also been used to detect modifications in the organization of said memories. Theoretical predictions have been validated with actual data provided by the manufacturer.
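
    A minimal sketch of the XOR-based check described above (Python): the addresses of all words containing bitflips are XORed pairwise, and XOR values that repeat more often than expected by chance are flagged as signatures of multiple-cell upsets or of the memory organization. The addresses and the repetition threshold are invented for illustration.

        # Sketch of the XOR-based check described above: addresses of upset words are
        # XORed pairwise and anomalously frequent XOR values point to multi-cell upsets.
        # Addresses and the significance threshold are invented for illustration.
        from collections import Counter
        from itertools import combinations

        upset_addresses = [0x0040, 0x0041, 0x1234, 0x2041, 0x2040, 0x7F10]

        xor_counts = Counter(a ^ b for a, b in combinations(upset_addresses, 2))

        threshold = 2  # repetitions considered anomalous in this toy example
        suspicious = {value: n for value, n in xor_counts.items() if n >= threshold}
        print("suspicious XOR signatures:", {hex(k): v for k, v in suspicious.items()})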

  15. Development of performance assessment methodology for nuclear waste isolation in geologic media

    NASA Astrophysics Data System (ADS)

    Bonano, E. J.; Chu, M. S. Y.; Cranwell, R. M.; Davis, P. A.

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes as opposed to conducting experiments because the time scales associated are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the U.S. Nuclear Regulatory Commission.

  16. Weapons and Tactics Instructor Course 2-16 Sleep and Performance Study

    DTIC Science & Technology

    2017-03-01

    Assessments showed a significant increase in self-reported fatigue as the course progressed. This thesis outlines a detailed methodology and lessons learned for follow-on studies of...

  17. Carboxyl-rich plasma polymer surfaces in surface plasmon resonance immunosensing

    NASA Astrophysics Data System (ADS)

    Makhneva, Ekaterina; Obrusník, Adam; Farka, Zdeněk; Skládal, Petr; Vandenbossche, Marianne; Hegemann, Dirk; Zajíčková, Lenka

    2018-01-01

    Stable carboxyl-rich plasma polymers (PPs) were deposited onto the gold surface of surface plasmon resonance (SPR) chips under conditions that were chosen based on lumped kinetic model results. Carboxyl-rich films are of high interest for bio-applications thanks to their high reactivity, allowing the formation of covalent linkages between biomolecules and a surface. Accordingly, a monoclonal antibody specific to human serum albumin (HSA) was immobilized and the performance of the SPR immunosensors was evaluated by an immunoassay flow test. The developed sensors exhibited a high level of stability and provided a selective and strong response to the HSA antigen solutions. The achieved results confirmed that the presented methodologies for the grafting of biomolecules on gold surfaces have great potential for biosensing applications.

  18. Developing an Index to Measure Health System Performance: Measurement for Districts of Nepal.

    PubMed

    Kandel, N; Fric, A; Lamichhane, J

    2014-01-01

    Various frameworks for measuring health system performance have been proposed and discussed. The scope of performance indicators is broad, ranging from examining national health systems to individual patients at various levels of the health system. The development of an innovative and easy-to-use index is essential to capture the multidimensionality of health systems. We used indicators that also serve as proxies for the set of activities whose primary goal is to maintain and improve health. We used eleven MDG indicators, which represent all dimensions of health, to develop the index. These indicators are combined with a methodology similar to that of the human development index. As an illustration, we used published data for Nepal to compute the index for the districts of Nepal. To validate our findings, we compared the indices of these districts with other development indices for Nepal. An index for each district was computed from the eleven indicators. The indices were then compared with the human development index and with socio-economic and infrastructure development indices, and the findings showed a similar distribution of districts. Districts categorized as low or high performing on health system performance also had low or high human development, socio-economic, and infrastructure indices, respectively. This methodology of computing an index from various indicators could assist policy makers and program managers in prioritizing activities based on performance. Validation of the findings against other development indicators shows that this can be one of the tools for assessing health system performance for policy makers, program managers and others.
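
    A minimal sketch of an HDI-style composite index of the kind described (Python): each indicator is rescaled against chosen goalposts and the rescaled values are averaged. The indicators, values, and goalposts below are invented and are not the eleven MDG indicators used in the study.

        # HDI-style composite index: each indicator is rescaled to [0, 1] against chosen
        # goalposts and the rescaled values are averaged. All figures are invented.

        def dimension_index(value, minimum, maximum, higher_is_better=True):
            idx = (value - minimum) / (maximum - minimum)
            return idx if higher_is_better else 1.0 - idx

        district = {
            # indicator: (value, min goalpost, max goalpost, higher is better?)
            "skilled birth attendance (%)": (55.0, 0.0, 100.0, True),
            "measles immunization (%)":     (88.0, 0.0, 100.0, True),
            "under-5 mortality (per 1000)": (48.0, 0.0, 150.0, False),
        }

        indices = [dimension_index(v, lo, hi, better) for v, lo, hi, better in district.values()]
        health_index = sum(indices) / len(indices)
        print(f"health system performance index = {health_index:.3f}")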

  19. Application-specific coarse-grained reconfigurable array: architecture and design methodology

    NASA Astrophysics Data System (ADS)

    Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu

    2015-06-01

    Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploring different levels of parallelism. However, a gap remains between the CGRA and the application-specific integrated circuit (ASIC). Some application domains, such as software-defined radios (SDRs), require flexibility even as performance demands increase. More effective CGRA architectures are therefore expected to be developed. Customisation of a CGRA according to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented. A mapping algorithm based on ant colony optimisation is provided. Experimental results on the SDR target domain show that, compared with other ordinary and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for given applications.

  20. Methodology for estimating helicopter performance and weights using limited data

    NASA Technical Reports Server (NTRS)

    Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard

    1990-01-01

    Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions which couple knowledge of the technology of the helicopter under study with detailed data from well documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.

  1. Methodological quality evaluation of systematic reviews or meta-analyses on ERCC1 in non-small cell lung cancer: a systematic review.

    PubMed

    Tao, Huan; Zhang, Yueyuan; Li, Qian; Chen, Jin

    2017-11-01

    To assess the methodological quality of systematic reviews (SRs) or meta-analyses concerning the predictive value of ERCC1 in platinum chemotherapy of non-small cell lung cancer. We searched the PubMed, EMbase, Cochrane Library, international prospective register of systematic reviews, Chinese BioMedical Literature Database, China National Knowledge Infrastructure, Wan Fang and VIP databases for SRs or meta-analyses. The methodological quality of the included literature was evaluated with the risk of bias in systematic reviews (ROBIS) scale. Nineteen eligible SRs/meta-analyses were included. The most frequently searched databases were EMbase (74%), PubMed, Medline and CNKI. Fifteen SRs did additional manual retrieval, but none of them searched a registration platform. 47% described a two-reviewer model for screening eligible original articles, and seven SRs described two reviewers extracting data. In the methodological quality assessment, the inter-rater reliability (kappa) between the two reviewers was 0.87. The research question was well addressed in all SRs in phase 1, the eligibility criteria were suitable for each SR, and these domains were rated as 'low' risk of bias. However, 'high' risk of bias existed in all SRs with respect to the methods used to identify and/or select studies and to data collection and study appraisal. More than two-thirds of the SRs or meta-analyses were rated as high risk of bias in the synthesis, findings and final phase. The study demonstrated poor methodological quality of SRs/meta-analyses assessing the predictive value of ERCC1 in chemotherapy among NSCLC patients, especially the high risk of bias in study identification, data collection and appraisal. Registration or publishing the protocol is recommended in future research.

  2. A study of high-temperature heat pipes with multiple heat sources and sinks. I - Experimental methodology and frozen startup profiles. II - Analysis of continuum transient and steady-state experimental data with numerical predictions

    NASA Technical Reports Server (NTRS)

    Faghri, A.; Cao, Y.; Buchko, M.

    1991-01-01

    Experimental profiles for heat pipe startup from the frozen state were obtained, using a high-temperature sodium/stainless steel heat pipe with multiple heat sources and sinks to investigate the startup behavior of the heat pipe for various heat loads and input locations, with both low and high heat rejection rates at the condenser. The experimental results of the performance characteristics for the continuum transient and steady-state operation of the heat pipe were analyzed, and the performance limits for operation with varying heat fluxes and locations are determined.

  3. Examining Middle School Science Student Self-Regulated Learning in a Hypermedia Learning Environment through Microanalysis

    NASA Astrophysics Data System (ADS)

    Mandell, Brian E.

    The purpose of the present embedded mixed method study was to examine the self-regulatory processes used by high, average, and low achieving seventh grade students as they learned about a complex science topic from a hypermedia learning environment. Thirty participants were sampled. Participants were administered a number of measures to assess their achievement and self-efficacy. In addition, a microanalytic methodology, grounded in Zimmerman's cyclical model of self-regulated learning, was used to assess student self-regulated learning. It was hypothesized that there would be modest positive correlations between Zimmerman's three phases of self-regulated learning, that high achieving science students would deploy more self-regulatory subprocesses than average and low achieving science students, that high achieving science students would have higher self-efficacy beliefs to engage in self-regulated learning than average and low achieving science students, and that low achieving science students would over-estimate their self-efficacy for performance beliefs, average achieving science students would slightly overestimate their self-efficacy for performance beliefs, and high achieving science students would under-estimate their self-efficacy for performance beliefs. All hypotheses were supported except for the high achieving science students who under-estimated their self-efficacy for performance beliefs on the Declarative Knowledge Measure and slightly overestimated their self-efficacy for performance beliefs on the Conceptual Knowledge Measure. Finally, all measures of self-regulated learning were combined and entered into a regression formula to predict the students' scores on the two science tests, and it was revealed that the combined measure predicted 91% of the variance on the Declarative Knowledge Measure and 92% of the variance on the Conceptual Knowledge Measure. This study adds hypermedia learning environments to the contexts that the microanalytic methodology has been successfully administered. Educational implications and limitations to the study are also discussed.

  4. E-Learning as an Emerging Technology in India

    ERIC Educational Resources Information Center

    Grover, Pooja; Gupta, Nehta

    2010-01-01

    E-learning is a combination of learning services and technology that allow us to provide high value integrated learning any time, any place. It is about a new blend of resources, interactivity, performance support and structured learning activities. This methodology makes use of various types of technologies in order to enhance or transform the…

  5. Students' Achievements and Misunderstandings When Solving Problems Using Electronics Models--A Case Study

    ERIC Educational Resources Information Center

    Trotskovsky, Elena; Sabag, Nissim; Waks, Shlomo

    2015-01-01

    This paper examines students' achievements in solving problems and their misunderstandings when using models. A mixed research methodology was applied. Quantitative research investigated how the performance of students with various levels of high school GPAs correlated with their rating of their lecturers' teaching proficiency. Four lecturers and…

  6. Running R Statistical Computing Environment Software on the Peregrine

    Science.gov Websites

    R is a collaborative project that supports the development of new statistical methodologies and enjoys a large user base. The page describes running the R statistical computing environment software on the Peregrine system, including programming paradigms to better leverage modern HPC systems, and points to the CRAN task view for High Performance Computing.

  7. Free online access to experimental and predicted chemical properties through the EPA’s CompTox Chemistry Dashboard (ACS Spring meeting)

    EPA Science Inventory

    The increasing number and size of public databases is facilitating the collection of chemical structures and associated experimental data for QSAR modeling. However, the performance of QSAR models is highly dependent not only on the modeling methodology, but also on the quality o...

  8. Methodological Issues in Curriculum-Based Reading Assessment.

    ERIC Educational Resources Information Center

    Fuchs, Lynn S.; And Others

    1984-01-01

    Three studies involving elementary students examined methodological issues in curriculum-based reading assessment. Results indicated that (1) whereas sample duration did not affect concurrent validity, increasing duration reduced performance instability and increased performance slopes and (2) domain size was related inversely to performance slope…

  9. The methodological quality of systematic reviews of animal studies in dentistry.

    PubMed

    Faggion, C M; Listl, S; Giannakopoulos, N N

    2012-05-01

    Systematic reviews and meta-analyses of animal studies are important for improving estimates of the effects of treatment and for guiding future clinical studies on humans. The purpose of this systematic review was to assess the methodological quality of systematic reviews and meta-analyses of animal studies in dentistry through using a validated checklist. A literature search was conducted independently and in duplicate in the PubMed and LILACS databases. References in selected systematic reviews were assessed to identify other studies not captured by the electronic searches. The methodological quality of studies was assessed independently and in duplicate by using the AMSTAR checklist; the quality was scored as low, moderate, or high. The reviewers were calibrated before the assessment and agreement between them was assessed using Cohen's Kappa statistic. Of 444 studies retrieved, 54 systematic reviews were selected after full-text assessment. Agreement between the reviewers was regarded as excellent. Only two studies were scored as high quality; 17 and 35 studies were scored as medium and low quality, respectively. There is room for improvement of the methodological quality of systematic reviews of animal studies in dentistry. Checklists, such as AMSTAR, can guide researchers in planning and executing systematic reviews and meta-analyses. For determining the need for additional investigations in animals and in order to provide good data for potential application in human, such reviews should be based on animal experiments performed according to sound methodological principles. Copyright © 2011 Elsevier Ltd. All rights reserved.
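
    For reference, Cohen's kappa used for inter-reviewer agreement here is kappa = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance. The sketch below (Python) applies that formula to invented quality ratings; it does not reproduce the study's data.

        # Cohen's kappa for two raters: kappa = (p_o - p_e) / (1 - p_e).
        # Ratings are invented; the study reports its own agreement statistics.
        from collections import Counter

        rater_a = ["low", "low", "medium", "high", "low", "medium", "low", "low"]
        rater_b = ["low", "medium", "medium", "high", "low", "medium", "low", "low"]

        n = len(rater_a)
        p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        categories = set(rater_a) | set(rater_b)
        p_expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)

        kappa = (p_observed - p_expected) / (1 - p_expected)
        print(f"kappa = {kappa:.2f}")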

  10. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review.

    PubMed

    Mathes, Tim; Klaßen, Pauline; Pieper, Dawid

    2017-11-28

    Our objective was to assess the frequency of data extraction errors and its potential impact on results in systematic reviews. Furthermore, we evaluated the effect of different extraction methods, reviewer characteristics and reviewer training on error rates and results. We performed a systematic review of methodological literature in PubMed, Cochrane methodological registry, and by manual searches (12/2016). Studies were selected by two reviewers independently. Data were extracted in standardized tables by one reviewer and verified by a second. The analysis included six studies; four studies on extraction error frequency, one study comparing different reviewer extraction methods and two studies comparing different reviewer characteristics. We did not find a study on reviewer training. There was a high rate of extraction errors (up to 50%). Errors often had an influence on effect estimates. Different data extraction methods and reviewer characteristics had moderate effect on extraction error rates and effect estimates. The evidence base for established standards of data extraction seems weak despite the high prevalence of extraction errors. More comparative studies are needed to get deeper insights into the influence of different extraction methods.

  11. High-Performance Buildings – Value, Messaging, Financial and Policy Mechanisms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCabe, Molly

    At the request of the Pacific Northwest National Laboratory, an in-depth analysis of the rapidly evolving state of real estate investments, high-performance building technology, and interest in efficiency was conducted by HaydenTanner, LLC, for the U.S. Department of Energy (DOE) Building Technologies Program. The analysis objectives were • to evaluate the link between high-performance buildings and their market value • to identify core messaging to motivate owners, investors, financiers, and others in the real estate sector to appropriately value and deploy high-performance strategies and technologies across new and existing buildings • to summarize financial mechanisms that facilitate increased investment in these buildings. To meet these objectives, work consisted of a literature review of relevant writings, examination of existing and emergent financial and policy mechanisms, interviews with industry stakeholders, and an evaluation of the value implications through financial modeling. This report documents the analysis methodology and findings, conclusions, and recommendations. Its intent is to support and inform the DOE Building Technologies Program on policy and program planning for the financing of high-performance new buildings and building retrofit projects.

  12. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.
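
    A minimal sketch of the inversion-level parameter central to this kind of methodology (Python), assuming the common EKV-style definition IC = I_D/I_spec with I_spec = 2·n·μC_ox·U_T²·(W/L) and the usual weak/moderate/strong boundaries at roughly IC = 0.1 and IC = 10. The device parameters and bias current are illustrative, not values from the paper.

        # Inversion coefficient in the EKV sense: IC = I_D / I_spec with
        # I_spec = 2 * n * mu_Cox * Ut^2 * (W/L). Device numbers are illustrative only.

        def inversion_coefficient(i_d, w, l, n=1.3, mu_cox=300e-6, ut=0.0258):
            """i_d in amps, mu_cox in A/V^2, ut = kT/q in volts, w and l in meters."""
            i_spec = 2.0 * n * mu_cox * ut * ut * (w / l)
            return i_d / i_spec

        def region(ic):
            if ic < 0.1:
                return "weak inversion"
            if ic < 10.0:
                return "moderate inversion"
            return "strong inversion"

        ic = inversion_coefficient(i_d=2e-6, w=10e-6, l=1e-6)  # 2 uA in a 10/1 device
        print(f"IC = {ic:.2f} ({region(ic)})")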

  13. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
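
    In the spirit of the approach described above, the toy sketch below (Python) synthesizes the address stream of a small matrix-matrix multiply from base addresses and loop bounds and runs it through a minimal direct-mapped cache model to estimate a miss rate. The cache geometry, matrix size, and base addresses are invented; the paper's own models and applications are far more detailed.

        # Toy illustration: synthesize the address stream of a small matrix-matrix multiply
        # from base addresses and loop bounds, then run it through a minimal direct-mapped
        # cache model. All sizes and addresses are invented.

        LINE_BYTES = 64
        NUM_LINES = 256          # 16 KiB direct-mapped cache
        ELEM_BYTES = 8           # double precision
        N = 32                   # matrix dimension

        BASE_A, BASE_B, BASE_C = 0x10000, 0x20000, 0x30000

        def addresses():
            """Address stream for C[i][j] += A[i][k] * B[k][j] (row-major layout)."""
            for i in range(N):
                for j in range(N):
                    for k in range(N):
                        yield BASE_A + (i * N + k) * ELEM_BYTES
                        yield BASE_B + (k * N + j) * ELEM_BYTES
                        yield BASE_C + (i * N + j) * ELEM_BYTES

        def miss_rate(addr_stream):
            tags = [None] * NUM_LINES
            accesses = misses = 0
            for addr in addr_stream:
                line = addr // LINE_BYTES
                index, tag = line % NUM_LINES, line // NUM_LINES
                accesses += 1
                if tags[index] != tag:
                    misses += 1
                    tags[index] = tag
            return misses / accesses

        print(f"estimated miss rate: {miss_rate(addresses()):.2%}")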

  14. COTS Ceramic Chip Capacitors: An Evaluation of the Parts and Assurance Methodologies

    NASA Technical Reports Server (NTRS)

    Brusse, Jay A.; Sampson, Michael J.

    2004-01-01

    Commercial-Off-The-Shelf (COTS) multilayer ceramic chip capacitors (MLCCs) are continually evolving to reduce physical size and increase volumetric efficiency. Designers of high reliability aerospace and military systems are attracted to these attributes of COTS MLCCs and would like to take advantage of them while maintaining the high standards for long-term reliable operation they are accustomed to when selecting military qualified established reliability (MIL-ER) MLCCs. However, MIL-ER MLCCs are not available in the full range of small chip sizes with high capacitance as found in today's COTS MLCCs. The objectives for this evaluation were to assess the long-term performance of small case size COTS MLCCs and to identify effective, lower-cost product assurance methodologies. Fifteen (15) lots of COTS X7R dielectric MLCCs from four (4) different manufacturers and two (2) MIL-ER BX dielectric MLCCs from two (2) of the same manufacturers were evaluated. Both 0805 and 0402 chip sizes were included. Several voltage ratings were tested, ranging from a high of 50 volts to a low of 6.3 volts. The evaluation consisted of a comprehensive screening and qualification test program based upon MIL-PRF-55681 (i.e., voltage conditioning, thermal shock, moisture resistance, 2000-hour life test, etc.). In addition, several lot characterization tests were performed, including Destructive Physical Analysis (DPA), Highly Accelerated Life Test (HALT) and Dielectric Voltage Breakdown Strength. The data analysis included a comparison of the 2000-hour life test results (used as a metric for long-term performance) relative to the screening and characterization test results. Results of this analysis indicate that the long-term life performance of COTS MLCCs is variable: some lots perform well, some lots perform poorly. DPA and HALT were found to be promising lot characterization tests to identify substandard COTS MLCC lots prior to conducting more expensive screening and qualification tests. The results indicate that lot-specific screening and qualification are still recommended for high reliability applications. One significant and concerning observation is that MIL-type voltage conditioning (100 hours at twice rated voltage, 125 °C) was not an effective screen in removing infant mortality parts for the particular lots of COTS MLCCs evaluated.

  15. A novel high-throughput method for supported liquid extraction of retinol and alpha-tocopherol from human serum and simultaneous quantitation by liquid chromatography tandem mass spectrometry.

    PubMed

    Hinchliffe, Edward; Rudge, James; Reed, Paul

    2016-07-01

    Measurement of vitamin A (retinol) and E (alpha-tocopherol) in UK clinical laboratories is currently performed exclusively by high-performance liquid chromatography with ultraviolet detection. We investigated whether retinol and alpha-tocopherol could be measured simultaneously by liquid chromatography tandem mass spectrometry. Serum samples (100 μL) were extracted using Isolute + Supported Liquid Extraction plates. Chromatography was performed on a Phenomenex Kinetex Biphenyl 2.6 μm, 50 × 2.1 mm column, and liquid chromatography tandem mass spectrometry on a Waters Acquity TQD. Injection-to-injection time was 4.3 min. The assay was validated according to published guidelines. Patient samples were used to compare liquid chromatography tandem mass spectrometry and high-performance liquid chromatography with ultraviolet detection methods. For retinol and alpha-tocopherol, respectively, the assay was linear up to 6.0 and 80.0 μmol/L, and lower limit of quantification was 0.07 and 0.26 μmol/L. Intra and interassay imprecision were within desirable analytical specifications. Analysis of quality control material aligned to NIST SRM 968e, and relative spiked recovery from human serum, both yielded results within 15% of target values. Method comparison with high-performance liquid chromatography with ultraviolet detection methodology demonstrated a negative bias for retinol and alpha-tocopherol by the liquid chromatography tandem mass spectrometry method. Analysis of United Kingdom National External Quality Assurance Scheme samples yielded mean bias from the target value of +3.0% for retinol and -11.2% for alpha-tocopherol. We have developed a novel, high-throughput method for extraction of retinol and alpha-tocopherol from human serum followed by simultaneous quantitation by liquid chromatography tandem mass spectrometry. The method offers a rapid, sensitive, specific and cost-effective alternative to high-performance liquid chromatography with ultraviolet detection methodology, and is suitable for routine clinical monitoring of patients predisposed to fat-soluble vitamin malabsorption. © The Author(s) 2015.

  16. Architecture, Design, and System; Performance Assessment and Development Methodology for Computer-Based Systems. Volume 1. Methodology Description, Discussion, and Assessment,

    DTIC Science & Technology

    1983-12-30

    NSWC TR 83-324, Naval Surface Weapons Center, Silver Spring, MD: Architecture, Design, and System; Performance Assessment and Development Methodology for Computer-Based Systems. Volume 1. Methodology Description, Discussion, and Assessment.

  17. From LCAs to simplified models: a generic methodology applied to wind power electricity.

    PubMed

    Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle

    2013-02-05

    This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performance has been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive life cycle assessment (LCA). This methodology should be useful for decision makers, providing them a robust but simple support tool for assessing the environmental performance of energy systems.
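
    Once lifecycle emissions are fixed, the GHG performance per kilowatt-hour follows directly from the two key parameters identified above. The sketch below (Python) shows such a simplified relation; the emission total, rated power, and parameter values are invented and are not results from the study.

        # GHG intensity as a function of the two key parameters identified above
        # (load factor and lifetime). Emission and capacity figures are invented.

        def ghg_intensity(lifecycle_emissions_kgco2, rated_power_kw, load_factor, lifetime_years):
            """g CO2-eq per kWh produced over the turbine's lifetime."""
            energy_kwh = rated_power_kw * 8760.0 * load_factor * lifetime_years
            return 1000.0 * lifecycle_emissions_kgco2 / energy_kwh

        for lf in (0.20, 0.26, 0.32):
            g = ghg_intensity(lifecycle_emissions_kgco2=1.5e6, rated_power_kw=2000.0,
                              load_factor=lf, lifetime_years=20)
            print(f"load factor {lf:.2f}: {g:.1f} g CO2-eq/kWh")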

  18. Dataflow computing approach in high-speed digital simulation

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Karplus, W. J.

    1984-01-01

    New computational tools and methodologies for the digital simulation of continuous systems were explored. Programmability and cost-effective performance in multiprocessor organizations for real-time simulation were investigated. The approach is based on functional style languages and data flow computing principles, which allow for the natural representation of parallelism in algorithms and provide a suitable basis for the design of cost-effective, high-performance distributed systems. The objectives of this research are to: (1) perform a comparative evaluation of several existing data flow languages and develop an experimental data flow language suitable for real-time simulation using multiprocessor systems; (2) investigate the main issues that arise in the architecture and organization of data flow multiprocessors for real-time simulation; and (3) develop and apply performance evaluation models in typical applications.

  19. A methodology for the validated design space exploration of fuel cell powered unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Moffitt, Blake Almy

    Unmanned Aerial Vehicles (UAVs) are the most dynamic growth sector of the aerospace industry today. The need to provide persistent intelligence, surveillance, and reconnaissance for military operations is driving the planned acquisition of over 5,000 UAVs over the next five years. The most pressing need is for quiet, small UAVs with endurance beyond what is capable with advanced batteries or small internal combustion propulsion systems. Fuel cell systems demonstrate high efficiency, high specific energy, low noise, low temperature operation, modularity, and rapid refuelability making them a promising enabler of the small, quiet, and persistent UAVs that military planners are seeking. Despite the perceived benefits, the actual near-term performance of fuel cell powered UAVs is unknown. Until the auto industry began spending billions of dollars in research, fuel cell systems were too heavy for useful flight applications. However, the last decade has seen rapid development with fuel cell gravimetric and volumetric power density nearly doubling every 2--3 years. As a result, a few design studies and demonstrator aircraft have appeared, but overall the design methodology and vehicles are still in their infancy. The design of fuel cell aircraft poses many challenges. Fuel cells differ fundamentally from combustion based propulsion in how they generate power and interact with other aircraft subsystems. As a result, traditional multidisciplinary analysis (MDA) codes are inappropriate. Building new MDAs is difficult since fuel cells are rapidly changing in design, and various competitive architectures exist for balance of plant, hydrogen storage, and all electric aircraft subsystems. In addition, fuel cell design and performance data is closely protected which makes validation difficult and uncertainty significant. Finally, low specific power and high volumes compared to traditional combustion based propulsion result in more highly constrained design spaces that are problematic for design space exploration. To begin addressing the current gaps in fuel cell aircraft development, a methodology has been developed to explore and characterize the near-term performance of fuel cell powered UAVs. The first step of the methodology is the development of a valid MDA. This is accomplished by using propagated uncertainty estimates to guide the decomposition of a MDA into key contributing analyses (CAs) that can be individually refined and validated to increase the overall accuracy of the MDA. To assist in MDA development, a flexible framework for simultaneously solving the CAs is specified. This enables the MDA to be easily adapted to changes in technology and the changes in data that occur throughout a design process. Various CAs that model a polymer electrolyte membrane fuel cell (PEMFC) UAV are developed, validated, and shown to be in agreement with hardware-in-the-loop simulations of a fully developed fuel cell propulsion system. After creating a valid MDA, the final step of the methodology is the synthesis of the MDA with an uncertainty propagation analysis, an optimization routine, and a chance constrained problem formulation. This synthesis allows an efficient calculation of the probabilistic constraint boundaries and Pareto frontiers that will govern the design space and influence design decisions relating to optimization and uncertainty mitigation. A key element of the methodology is uncertainty propagation. 
The methodology uses Systems Sensitivity Analysis (SSA) to estimate the uncertainty of key performance metrics due to uncertainties in design variables and uncertainties in the accuracy of the CAs. A summary of SSA is provided and key rules for properly decomposing a MDA for use with SSA are provided. Verification of SSA uncertainty estimates via Monte Carlo simulations is provided for both an example problem as well as a detailed MDA of a fuel cell UAV. Implementation of the methodology was performed on a small fuel cell UAV designed to carry a 2.2 kg payload with 24 hours of endurance. Uncertainty distributions for both design variables and the CAs were estimated based on experimental results and were found to dominate the design space. To reduce uncertainty and test the flexibility of the MDA framework, CAs were replaced with either empirical, or semi-empirical relationships during the optimization process. The final design was validated via a hardware-in-the loop simulation. Finally, the fuel cell UAV probabilistic design space was studied. A graphical representation of the design space was generated and the optima due to deterministic and probabilistic constraints were identified. The methodology was used to identify Pareto frontiers of the design space which were shown on contour plots of the design space. Unanticipated discontinuities of the Pareto fronts were observed as different constraints became active providing useful information on which to base design and development decisions.
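
    A minimal sketch of the kind of Monte Carlo uncertainty propagation used above to verify the analytical (SSA) estimates (Python): assumed input distributions are sampled and pushed through a simple endurance model. Both the model and the distributions are invented stand-ins, not the study's multidisciplinary analysis.

        # Monte Carlo propagation of input uncertainty through a simple endurance model.
        # The model and all distributions are invented stand-ins, not the study's MDA.
        import random
        import statistics

        def endurance_hours(h2_kg, specific_energy_wh_per_kg, system_eff, cruise_power_w):
            return h2_kg * specific_energy_wh_per_kg * system_eff / cruise_power_w

        samples = []
        for _ in range(20000):
            h2 = random.gauss(1.0, 0.05)             # kg of hydrogen stored
            e_spec = random.gauss(33300.0, 500.0)    # Wh/kg (lower heating value of hydrogen)
            eff = random.gauss(0.45, 0.03)           # net fuel-cell system efficiency
            p_cruise = random.gauss(550.0, 40.0)     # W required in cruise
            samples.append(endurance_hours(h2, e_spec, eff, p_cruise))

        mean = statistics.mean(samples)
        stdev = statistics.stdev(samples)
        print(f"endurance ~ {mean:.1f} h, 1-sigma {stdev:.1f} h")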

  20. Performance-Driven Hybrid Full-Body Character Control for Navigation and Interaction in Virtual Environments

    NASA Astrophysics Data System (ADS)

    Mousas, Christos; Anagnostopoulos, Christos-Nikolaos

    2017-06-01

    This paper presents a hybrid character control interface that provides the ability to synthesize in real time a variety of actions based on the user's performance capture. The proposed methodology enables three different performance interaction modules: the performance animation control, which directly maps the user's pose to the character; the motion controller, which synthesizes the desired motion of the character based on an activity recognition methodology; and the hybrid control, which lies between the performance animation and the motion controller. With the presented methodology, the user has the freedom to interact within the virtual environment, as well as the ability to manipulate the character and to synthesize a variety of actions that cannot be performed directly by him/her but which the system synthesizes. Therefore, the user is able to interact with the virtual environment in a more sophisticated fashion. This paper presents examples of different scenarios based on the three full-body character control methodologies.

  1. Kalman approach to accuracy management for interoperable heterogeneous model abstraction within an HLA-compliant simulation

    NASA Astrophysics Data System (ADS)

    Leskiw, Donald M.; Zhau, Junmei

    2000-06-01

    This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
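
    As a concrete illustration of the update step referred to above, the scalar sketch below (Python) fuses a coarse model's state estimate with a value reported by a finer-grained abstraction of known accuracy. The numbers are illustrative and the scalar form is a simplification of the paper's matrix equations.

        # Scalar Kalman-style update: fuse a coarse model's estimate (x_prior with
        # variance P_prior) with a finer abstraction's value z of variance R.
        def kalman_update(x_prior, P_prior, z, R):
            K = P_prior / (P_prior + R)            # gain: how much to trust z
            x_post = x_prior + K * (z - x_prior)   # updated estimate
            P_post = (1.0 - K) * P_prior           # reduced uncertainty
            return x_post, P_post

        # Coarse model says 10.0 with variance 4.0; detailed model reports 9.2 with variance 1.0.
        x, P = kalman_update(10.0, 4.0, 9.2, 1.0)
        print(x, P)   # the fused estimate moves toward the more accurate abstraction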

  2. Determination of organic acids in tissues and exudates of maize, lupin, and chickpea by high-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Erro, Javier; Zamarreño, Angel M; Yvin, Jean-Claude; Garcia-Mina, Jose M

    2009-05-27

    This article describes a fast and simple methodology for the extraction and determination of organic acids in tissues and root exudates of maize, lupin, and chickpea by LC/MS/MS. Its main advantage is that it does not require sample prepurification before HPLC analysis or sample derivatization to improve sensitivity. The results obtained showed good precision and accuracy, a recovery close to 100%, and no significant matrix effect. Moreover, the sensitivity of the method is in general better than that of previously described methodologies, with detection limits between 15 and 900 pg injected.

  3. Parallel computation with the force

    NASA Technical Reports Server (NTRS)

    Jordan, H. F.

    1985-01-01

    A methodology, called the force, supports the construction of programs to be executed in parallel by a force of processes. The number of processes in the force is unspecified, but potentially very large. The force idea is embodied in a set of macros which produce multiprocessor FORTRAN code and has been studied on two shared memory multiprocessors of fairly different character. The method has simplified the writing of highly parallel programs within a limited class of parallel algorithms and is being extended to cover a broader class. The individual parallel constructs which comprise the force methodology are discussed. Of central concern are their semantics, implementation on different architectures, and performance implications.
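
    The force itself is a set of FORTRAN macros, but its central idea, a program written for an unspecified number of cooperating processes that each take a static share of the work, can be illustrated with a short sketch in Python's multiprocessing module. This is an analogy only, not the original macro package.

        import multiprocessing as mp
        import os

        def worker(rank, nprocs, n, out):
            # Static strip-mining: process `rank` handles indices rank, rank+nprocs, ...
            out[rank] = sum(i * i for i in range(rank, n, nprocs))

        if __name__ == "__main__":
            nprocs = os.cpu_count() or 4    # the "force" size is not fixed by the program
            n = 1_000_000
            with mp.Manager() as mgr:
                out = mgr.dict()
                procs = [mp.Process(target=worker, args=(r, nprocs, n, out))
                         for r in range(nprocs)]
                for p in procs:
                    p.start()
                for p in procs:
                    p.join()
                print(sum(out.values()))    # same result regardless of force size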

  4. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
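
    A minimal numerical example of the probabilistic (non-failure) view described above is the classic stress-strength interference calculation: with normally distributed stress and strength, the probability of failure follows from the distribution of their difference. The values below are invented for illustration.

        from math import erf, sqrt

        # Illustrative normal distributions for strength and applied stress (MPa).
        mu_strength, sd_strength = 520.0, 30.0
        mu_stress, sd_stress = 400.0, 45.0

        # The margin M = strength - stress is normal; failure occurs when M < 0.
        mu_m = mu_strength - mu_stress
        sd_m = sqrt(sd_strength**2 + sd_stress**2)
        p_failure = 0.5 * (1.0 + erf((0.0 - mu_m) / (sd_m * sqrt(2.0))))
        print(f"probability of failure ~ {p_failure:.2e}")

    A deterministic factor of safety (here 520/400 = 1.3) hides the fact that the failure probability depends on the spread of both distributions, which is exactly the information the probabilistic method retains.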

  5. Nonlinear stability and control study of highly maneuverable high performance aircraft

    NASA Technical Reports Server (NTRS)

    Mohler, R. R.

    1993-01-01

    This project is intended to research and develop new nonlinear methodologies for the control and stability analysis of high-performance, high angle-of-attack aircraft such as HARV (F18). Past research (reported in our Phase 1, 2, and 3 progress reports) is summarized and more details of the final Phase 3 research are provided. While research emphasis is on nonlinear control, other tasks such as associated model development, system identification, stability analysis, and simulation are performed in some detail as well. An overview is provided of the various models investigated for different purposes, such as an approximate model reference for control adaptation, as well as another model for accurate rigid-body longitudinal motion. Only a very cursory analysis was made relative to type 8 (flexible body dynamics). Standard nonlinear longitudinal airframe dynamics (type 7) with the available modified F18 stability derivatives, thrust vectoring, actuator dynamics, and control constraints are utilized for simulated flight evaluation of derived controller performance in all cases studied.

  6. Suomi-NPP VIIRS Day-Night Band On-Orbit Calibration and Performance

    NASA Technical Reports Server (NTRS)

    Chen, Hongda; Xiong, Xiaoxiong; Sun, Chengbo; Chen, Xuexia; Chiang, Kwofu

    2017-01-01

    The Suomi National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite (VIIRS) instrument has operated successfully since its launch in October 2011. The VIIRS day-night band (DNB) is a panchromatic channel covering wavelengths from 0.5 to 0.9 microns that is capable of observing Earth scenes during both daytime and nighttime at a spatial resolution of 750 m. To cover the large dynamic range, the DNB operates at low-, middle-, and high-gain stages, and it uses an on-board solar diffuser (SD) for its low-gain stage calibration. The SD observations also provide a means to compute the gain ratios of the low-to-middle and middle-to-high gain stages. This paper describes the DNB on-orbit calibration methodology used by the VIIRS characterization support team in supporting the NASA Earth science community with consistent VIIRS sensor data records made available by the land science investigator-led processing systems. It provides an assessment and update of the DNB on-orbit performance, including the SD degradation in the DNB spectral range, detector gain and gain ratio trending, and stray-light contamination and its correction. Also presented in this paper are performance validations based on Earth scenes and lunar observations, and comparisons to the calibration methodology used by the operational interface data processing segment.

  7. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. To keep software on the current high performance hardware, a developer almost continually faces yet another expensive software transportation. The goal of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks is specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured the performance of a portion of this subsystem on the Intel iPSC/2 parallel computer. These results are provided in section four. Our future work is summarized in section five, our acknowledgements are stated in section six, and references for published papers associated with NAG-1-995 are provided in section seven.

  8. Identification and quantification of ethyl carbamate occurring in urea complexation processes commonly utilized for polyunsaturated fatty acid concentration.

    PubMed

    Vázquez, Luis; Prados, Isabel M; Reglero, Guillermo; Torres, Carlos F

    2017-08-15

    The concentration of polyunsaturated fatty acids by formation of urea adducts from three different sources was studied to elucidate the formation of ethyl carbamates in the course of these procedures. Two different methodologies were performed: with ethanol at high temperature and with hexane/ethanol mixtures at room temperature. It was shown that the amount of urethanes generated at high temperature was higher than at room temperature. Besides, subsequent washing steps of the PUFA fraction with water were efficient in removing the urethanes from the final products. The methodology at room temperature with 0.4 mL ethanol and 3 g urea provided a good balance between concentration and yield of the main bioactive PUFA, with the lowest formation of ethyl carbamates in the process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imager, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired, time-encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275
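
    As a rough illustration of a global operator of the kind described, the sketch below maps each pixel's most recent inter-event interval (the ATIS encodes gray level in the timing between exposure events) to an 8-bit display value through a log compression. The encoding details, percentile normalization, and constants are assumptions for illustration, not the operators proposed in the paper.

        import numpy as np

        def global_tone_map(intervals_us, out_bits=8):
            # intervals_us: per-pixel time between the last two exposure events
            # (microseconds); shorter intervals correspond to brighter pixels.
            lum = 1.0 / np.maximum(intervals_us, 1e-3)      # pseudo-luminance
            log_lum = np.log(lum)
            lo, hi = np.percentile(log_lum, [1, 99])        # robust global range
            norm = np.clip((log_lum - lo) / (hi - lo), 0.0, 1.0)
            return (norm * (2**out_bits - 1)).astype(np.uint8)

        # Synthetic high-dynamic-range interval data standing in for an ATIS frame.
        frame = global_tone_map(np.random.lognormal(mean=6.0, sigma=2.0, size=(240, 304)))

    A local operator would replace the single global (lo, hi) range with one computed per neighborhood, trading temporal stability for local contrast, which is the trade-off the paper evaluates.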

  10. The effects of expressivity and flight task on cockpit communication and resource management

    NASA Technical Reports Server (NTRS)

    Jensen, R. S.

    1986-01-01

    The results of an investigation to develop a methodology for evaluating crew communication behavior on the flight deck and a flight simulator experiment to test the effects of crew member expressivity, as measured by the Personal Attributes Questionnaire, and flight task on crew communication and flight performance are discussed. A methodology for coding and assessing flight crew communication behavior as well as a model for predicting that behavior is advanced. Although not enough crews were found to provide valid statistical tests, the results of the study tend to indicate that crews in which the captain has high expressivity perform better than those whose captain is low in expressivity. There appears to be a strong interaction between captains and first officers along the level-of-command dimension of communication. The PAQ appears to identify those pilots who offer disagreements and initiate new subjects for discussion.

  11. Space network scheduling benchmark: A proof-of-concept process for technology transfer

    NASA Technical Reports Server (NTRS)

    Moe, Karen; Happell, Nadine; Hayden, B. J.; Barclay, Cathy

    1993-01-01

    This paper describes a detailed proof-of-concept activity to evaluate flexible scheduling technology as implemented in the Request Oriented Scheduling Engine (ROSE) and applied to Space Network (SN) scheduling. The criteria developed for an operational evaluation of a reusable scheduling system are addressed, including a methodology to prove that the proposed system performs at least as well as the current system in function and performance. The improvement offered by the new technology must be demonstrated and evaluated against the cost of making changes. Finally, there is a need to show significant improvement in SN operational procedures. Successful completion of a proof-of-concept would eventually lead to an operational concept and implementation transition plan, which is outside the scope of this paper. However, a high-fidelity benchmark using actual SN scheduling requests has been designed to test the ROSE scheduling tool. The benchmark evaluation methodology, scheduling data, and preliminary results are described.

  12. Autonomous Aerobraking: Thermal Analysis and Response Surface Development

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Thornblom, Mark N.

    2011-01-01

    A high-fidelity thermal model of the Mars Reconnaissance Orbiter was developed for use in an autonomous aerobraking simulation study. Response surface equations were derived from the high-fidelity thermal model and integrated into the autonomous aerobraking simulation software. The high-fidelity thermal model was developed using the Thermal Desktop software and used in all phases of the analysis. The exclusive use of Thermal Desktop represented a change from previously developed aerobraking thermal analysis methodologies. Comparisons were made between the Thermal Desktop solutions and those developed for the previous aerobraking thermal analyses performed on the Mars Reconnaissance Orbiter during aerobraking operations. A variable sensitivity screening study was performed to reduce the number of variables carried in the response surface equations. Thermal analysis and response surface equation development were performed for autonomous aerobraking missions at Mars and Venus.
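
    The response-surface idea can be illustrated with a short sketch: sample a higher-fidelity model, then fit a low-order polynomial that the flight software can evaluate cheaply. The "high-fidelity" function, variables, and ranges below are stand-ins, not the MRO Thermal Desktop model.

        import numpy as np

        # Stand-in high-fidelity model: peak temperature vs. density and velocity.
        def peak_temp(rho, v):
            return 150.0 + 5.0e12 * rho * v + 1.0e-5 * v**2 + 5.0 * np.sin(v / 500.0)

        rng = np.random.default_rng(1)
        rho = rng.uniform(1e-11, 1e-10, 200)     # kg/m^3
        v = rng.uniform(3000.0, 4000.0, 200)     # m/s
        T = peak_temp(rho, v)

        # Quadratic response surface: T ~ b0 + b1*rho + b2*v + b3*rho*v + b4*rho^2 + b5*v^2
        X = np.column_stack([np.ones_like(rho), rho, v, rho * v, rho**2, v**2])
        coeffs, *_ = np.linalg.lstsq(X, T, rcond=None)
        print("max abs fit error:", np.max(np.abs(X @ coeffs - T)))

    The variable screening step mentioned in the abstract would simply drop columns of X whose coefficients contribute negligibly to the fit.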

  13. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  14. Multifunctional Mesoporous Ionic Gels and Scaffolds Derived from Polyhedral Oligomeric Silsesquioxanes.

    PubMed

    Lee, Jin Hong; Lee, Albert S; Lee, Jong-Chan; Hong, Soon Man; Hwang, Seung Sang; Koo, Chong Min

    2017-02-01

    A new methodology for fabrication of inorganic-organic hybrid ionogels and scaffolds is developed through facile cross-linking and solution extraction of a newly developed ionic polyhedral oligomeric silsesquioxane with an inorganic core. Through design of various cationic tertiary amines, as well as cross-linkable functional groups on each arm of the inorganic core, high-performance ionogels are fabricated with excellent electrochemical stability and unique ion conduction behavior, giving superior lithium ion battery performance. Moreover, through solvent extraction of the liquid components, hybrid scaffolds with well-defined, interconnected mesopores are utilized as heterogeneous catalysts for the CO2-catalyzed cycloaddition of epoxides. Excellent catalytic performance, as well as highly efficient recyclability, is observed when compared to previously reported materials.

  15. Quality evaluation of moluodan concentrated pill using high-performance liquid chromatography fingerprinting coupled with chemometrics.

    PubMed

    Tao, Lingyan; Zhang, Qing; Wu, Yongjiang; Liu, Xuesong

    2016-12-01

    In this study, a fast and effective high-performance liquid chromatography method was developed to obtain a fingerprint chromatogram and quantitative analysis simultaneously of four indexes including gallic acid, chlorogenic acid, albiflorin and paeoniflorin of the traditional Chinese medicine Moluodan Concentrated Pill. The method was performed by using a Waters X-bridge C18 reversed-phase column on an Agilent 1200S high-performance liquid chromatography system coupled with diode array detection. The mobile phase of the high-performance liquid chromatography method was composed of 20 mmol/L phosphate solution and acetonitrile with a 1 mL/min eluent velocity, under a detection temperature of 30°C and a UV detection wavelength of 254 nm. After the methodology validation, 16 batches of Moluodan Concentrated Pill were analyzed by this high-performance liquid chromatography method and both qualitative and quantitative evaluation results were achieved by similarity analysis, principal component analysis and hierarchical cluster analysis. The results of these three chemometrics were in good agreement and all indicated that batch 10 and batch 16 showed significant differences with the other 14 batches. This suggested that the developed high-performance liquid chromatography method could be applied in the quality evaluation of Moluodan Concentrated Pill. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The Development of A Performance Measurement Concept for the Royal Australian Air Force

    DTIC Science & Technology

    1992-09-01

    knowledge of management accounting and performance measurement techniques. As the RAAF currently has no cost management and performance measurement system...provides a review of the current literature on management accounting for performance measurement, specifically referring to the problems of using...traditional management accounting methodologies as a basis for formulating performance measures. Chapter III explains the methodology used to conduct the

  17. Design, Development and Analysis of Centrifugal Blower

    NASA Astrophysics Data System (ADS)

    Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath

    2018-06-01

    Centrifugal blowers are widely used turbomachines in all kinds of modern industrial and domestic applications. Manufacturing of blowers seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have developed into highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers. There are different methodologies used to design the impeller and other components of blowers. The objective of the present study is to examine explicit design methodologies and trace a unified design that achieves better design-point performance. This unified design methodology is based more on fundamental concepts and minimum assumptions. A parametric study is also carried out on the effect of design parameters on pressure ratio and their interdependency in the design. A code based on the unified design is developed in C. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, are developed with a standard OEM blower manufacturing unit. A comparison of both designs is made based on experimental performance analysis as per the IS standard. The results suggest better efficiency and a higher flow rate for the same pressure head in the case of the present design compared with the industrial one.
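
    One relation that any unified centrifugal-blower methodology builds on is the Euler turbomachine equation, which links impeller geometry and speed to the ideal pressure rise. The sketch below evaluates it for radial-tipped blades with no inlet swirl; the geometry, speed, and slip factor are illustrative values, not the blower designed in the paper.

        from math import pi

        rho = 1.2            # air density, kg/m^3
        rpm = 2900.0         # shaft speed
        d2 = 0.40            # impeller outlet diameter, m
        slip_factor = 0.85   # accounts for a finite number of blades

        U2 = pi * d2 * rpm / 60.0     # blade tip speed, m/s
        Cu2 = slip_factor * U2        # tangential outlet velocity (radial blades, no inlet swirl)
        dp_ideal = rho * U2 * Cu2     # ideal (Euler) pressure rise, Pa
        print(f"tip speed {U2:.1f} m/s, ideal pressure rise {dp_ideal:.0f} Pa")

    A parametric study of the kind described would sweep quantities such as outlet diameter, speed, and blade angle through such relations and track their combined effect on pressure ratio.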

  18. Design Methodology of an Equalizer for Unipolar Non Return to Zero Binary Signals in the Presence of Additive White Gaussian Noise Using a Time Delay Neural Network on a Field Programmable Gate Array

    PubMed Central

    Pérez Suárez, Santiago T.; Travieso González, Carlos M.; Alonso Hernández, Jesús B.

    2013-01-01

    This article presents a design methodology for an artificial neural network used as an equalizer for a binary signal. Firstly, the system is modelled in floating-point format using Matlab. Afterward, the design is described for a Field Programmable Gate Array (FPGA) using fixed-point format. The FPGA design is based on the System Generator from Xilinx, which is a design tool over Simulink of Matlab. System Generator allows one to design in a fast and flexible way. It uses low-level details of the circuits and the functionality of the system can be fully tested. System Generator can be used to check the architecture and to analyse the effect of the number of bits on the system performance. Finally the System Generator design is compiled for the Xilinx Integrated System Environment (ISE) and the system is described using a hardware description language. In ISE the circuits are managed with high-level details and physical performance figures are obtained. In the Conclusions section, some modifications are proposed to improve the methodology and to ensure portability across FPGA manufacturers.

  19. Facility Energy Performance Benchmarking in a Data-Scarce Environment

    DTIC Science & Technology

    2017-08-01

    environment, and analyze occupant-, system-, and component-level faults contributing to energy inefficiency. A methodology for developing DoD-specific...Research, Development, Test, and Evaluation (RDTE) Program to develop an intelligent framework, encompassing methodology and modeling, that...energy performers by installation, climate zone, and other criteria. A methodology for creating the DoD-specific EUIs would be an important part of a

  20. System Dynamics Aviation Readiness Modeling Demonstration

    DTIC Science & Technology

    2005-08-31

    requirements. It is recommended that the Naval Aviation Enterprise take a close look at the requirements i.e., performance measures, methodology ...unit’s capability to perform specific Joint Mission Essential Task List (JMETL) requirements now and in the future. This assessment methodology must...the time-associated costs. The new methodology must base decisions on currently available data and databases. A “useful” readiness model should be

  1. Fabrication and Testing of Microfluidic Optomechanical Oscillators

    PubMed Central

    Han, Kewen; Kim, Kyu Hyun; Kim, Junhwan; Lee, Wonsuk; Liu, Jing; Fan, Xudong; Carmon, Tal; Bahl, Gaurav

    2014-01-01

    Cavity optomechanics experiments that parametrically couple phonon modes and photon modes have been investigated in various optical systems including microresonators. However, because of the increased acoustic radiative losses during direct liquid immersion of optomechanical devices, almost all published optomechanical experiments have been performed in the solid phase. This paper discusses a recently introduced hollow microfluidic optomechanical resonator. Detailed methodology is provided to fabricate these ultra-high-Q microfluidic resonators, perform optomechanical testing, and measure radiation-pressure-driven breathing-mode and SBS-driven whispering-gallery-mode parametric vibrations. By confining liquids inside the capillary resonator, high mechanical and optical quality factors are simultaneously maintained. PMID:24962013

  2. A rotor technology assessment of the advancing blade concept

    NASA Technical Reports Server (NTRS)

    Pleasants, W. A.

    1983-01-01

    A rotor technology assessment of the Advancing Blade Concept (ABC) was conducted in support of a preliminary design study. The analytical methodology modifications and inputs, the correlation, and the results of the assessment are documented. The primary emphasis was on the high-speed forward flight performance of the rotor. The correlation data base included both the wind tunnel and the flight test results. An advanced ABC rotor design was examined; the suitability of the ABC for a particular mission was not considered. The objective of this technology assessment was to provide estimates of the performance potential of an advanced ABC rotor designed for high speed forward flight.

  3. Designing of a self-adaptive digital filter using genetic algorithm

    NASA Astrophysics Data System (ADS)

    Geng, Xuemei; Li, Hongguang; Xu, Chi

    2018-04-01

    This paper presents a novel methodology that applies a non-linear model to a closed-loop Sigma-Delta modulator based on a genetic algorithm, which offers the opportunity to simplify the process of tuning parameters and further improve noise performance. The proposed approach is able to quickly and efficiently design high-performance, high-order, closed-loop Sigma-Delta modulators that are robust to sensor fabrication tolerances. In simulations of the proposed Sigma-Delta modulator, an SNR > 122 dB and a noise floor under -170 dB are obtained in the frequency range of 5-150 Hz. In further simulation, the robustness of the proposed Sigma-Delta modulator is analyzed.
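
    The genetic-algorithm tuning loop the abstract relies on can be sketched in a few lines. The fitness function below is a placeholder surrogate with a known optimum, standing in for a Sigma-Delta loop-filter simulation that would return SNR; the population size, mutation scale, and parameter bounds are likewise illustrative.

        import random

        # Placeholder fitness: pretend SNR peaks at loop-filter gains (1.3, 0.7).
        def fitness(ind):
            a, b = ind
            return -((a - 1.3)**2 + (b - 0.7)**2)

        def evolve(pop_size=40, generations=60, bounds=((0.0, 2.0), (0.0, 2.0))):
            pop = [[random.uniform(*bounds[0]), random.uniform(*bounds[1])]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[: pop_size // 2]                        # selection
                children = []
                while len(children) < pop_size - len(parents):
                    p1, p2 = random.sample(parents, 2)
                    child = [(x + y) / 2 for x, y in zip(p1, p2)]     # crossover
                    child = [min(max(g + random.gauss(0, 0.05), lo), hi)
                             for g, (lo, hi) in zip(child, bounds)]   # mutation
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        print(evolve())   # converges near the surrogate's optimum (1.3, 0.7)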

  4. Determination of MDMA, MDEA and MDA in urine by high performance liquid chromatography with fluorescence detection.

    PubMed

    da Costa, José Luiz; da Matta Chasin, Alice Aparecida

    2004-11-05

    This paper describes the development and validation of an analytical methodology for the determination of the use of MDMA, MDEA and MDA in urine. After a simple liquid extraction, the analyses were carried out by high performance liquid chromatography (HPLC) on an octadecyl column, with fluorescence detection. The mobile phase, using a sodium dodecyl sulfate ion-pairing reagent, allows good separation and efficiency. The method showed good linearity and precision. Recovery was between 85 and 102% and detection limits were 10, 15 and 20 ng/ml for MDA, MDMA and MDEA, respectively. No interfering substances were detected with fluorescence detection.

  5. Comparing Laser Welding Technologies with Friction Stir Welding for Production of Aluminum Tailor-Welded Blanks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Carsley, John; Carlson, Blair

    2014-01-15

    A comparison of welding techniques was performed to determine the most effective method for producing aluminum tailor-welded blanks for high volume automotive applications. Aluminum sheet was joined with an emphasis on post weld formability, surface quality and weld speed. Comparative results from several laser based welding techniques along with friction stir welding are presented. The results of this study demonstrate a quantitative comparison of weld methodologies in preparing tailor-welded aluminum stampings for high volume production in the automotive industry. Evaluation of nearly a dozen welding variations ultimately led to down selecting a single process based on post-weld quality and performance.

  6. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational Grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  7. A methodology to assess performance of human-robotic systems in achievement of collective tasks

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna M.

    2005-01-01

    In this paper, we present a methodology to assess system performance of human-robotic systems in achievement of collective tasks such as habitat construction, geological sampling, and space exploration.

  8. Most systematic reviews of high methodological quality on psoriasis interventions are classified as high risk of bias using ROBIS tool.

    PubMed

    Gómez-García, Francisco; Ruano, Juan; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Sanz-Cabanillas, Juan Luis; Alcalde-Mellado, Patricia; Maestre-López, Beatriz; Carmona-Fernández, Pedro Jesús; González-Padilla, Marcelino; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-01

    No gold standard exists to assess methodological quality of systematic reviews (SRs). Although Assessing the Methodological Quality of Systematic Reviews (AMSTAR) is widely accepted for analyzing quality, the ROBIS instrument has recently been developed. This study aimed to compare the capacity of both instruments to capture the quality of SRs concerning psoriasis interventions. Systematic literature searches were undertaken on relevant databases. For each review, methodological quality and bias risk were evaluated using the AMSTAR and ROBIS tools. Descriptive and principal component analyses were conducted to describe similarities and discrepancies between both assessment tools. We classified 139 intervention SRs as displaying high/moderate/low methodological quality and as high/low risk of bias. A high risk of bias was detected for most SRs classified as displaying high or moderate methodological quality by AMSTAR. When comparing ROBIS result profiles, responses to domain 4 signaling questions showed the greatest differences between bias risk assessments, whereas domain 2 items showed the least. When considering SRs published about psoriasis, methodological quality remains suboptimal, and the risk of bias is elevated, even for SRs exhibiting high methodological quality. Furthermore, the AMSTAR and ROBIS tools may be considered as complementary when conducting quality assessment of SRs. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. The effect of instructional methodology on high school students natural sciences standardized tests scores

    NASA Astrophysics Data System (ADS)

    Powell, P. E.

    Educators have recently come to consider inquiry-based instruction a more effective method of instruction than didactic instruction. Experience-based learning theory suggests that student performance is linked to teaching method. However, research is limited on inquiry teaching and its effectiveness in preparing students to perform well on standardized tests. The purpose of the study was to investigate whether one of these two teaching methodologies was more effective in increasing student performance on standardized science tests. The quasi-experimental quantitative study was comprised of two stages. Stage 1 used a survey to identify the teaching methods of a convenience sample of 57 teacher participants and determined the level of inquiry used in instruction to place participants into instructional groups (the independent variable). Stage 2 used analysis of covariance (ANCOVA) to compare posttest scores on a standardized exam by teaching method. Additional analyses were conducted to examine the differences in science achievement by ethnicity, gender, and socioeconomic status by teaching methodology. Results demonstrated a statistically significant gain in test scores when students were taught using inquiry-based instruction. Subpopulation analyses indicated that all groups showed improved mean standardized test scores except African American students. The findings benefit teachers and students by presenting data supporting a method of content delivery that increases teacher efficacy and produces students with a greater cognition of science content that meets the school's mission and goals.

  10. The impact of domestic rainwater harvesting systems in storm water runoff mitigation at the urban block scale.

    PubMed

    Palla, A; Gnecco, I; La Barbera, P

    2017-04-15

    In the framework of storm water management, Domestic Rainwater Harvesting (DRWH) systems have recently been recognized as source control solutions according to LID principles. In order to assess the impact of these systems in storm water runoff control, a simple methodological approach is proposed. The hydrologic-hydraulic modelling is undertaken using EPA SWMM; the DRWH is implemented in the model by using a storage unit linked to the building water supply system and to the drainage network. The proposed methodology has been implemented for a residential urban block located in Genoa (Italy). Continuous simulations are performed by using the high-resolution rainfall data series for the 'do nothing' and DRWH scenarios. The latter includes the installation of a DRWH system for each building of the urban block. Referring to the test site, the peak and volume reduction rates evaluated for the 2125 rainfall events are equal to 33 and 26 percent on average, respectively (with maximum values of 65 percent for peak and 51 percent for volume). In general, the adopted methodology indicates that the hydrologic performance of the storm water drainage network equipped with DRWH systems is noticeable even for the design storm event (T = 10 years), and the rainfall depth seems to affect the hydrologic performance at least when the total depth exceeds 20 mm. Copyright © 2017 Elsevier Ltd. All rights reserved.
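
    The source-control effect of a DRWH system can be illustrated with a toy continuous water balance for a single building: roof runoff fills a tank that is drawn down by non-potable demand, and only tank overflow reaches the drainage network. The roof area, runoff coefficient, tank size, demand, and rainfall series below are invented and are not the SWMM configuration used in the study.

        # Daily water balance for one building's rainwater tank (illustrative values).
        roof_area_m2 = 120.0
        runoff_coeff = 0.9
        tank_cap_m3 = 5.0
        daily_demand_m3 = 0.3   # toilets/irrigation supplied from the tank

        def runoff_volume_reduction(rain_mm_series):
            storage, overflow_total, inflow_total = 0.0, 0.0, 0.0
            for rain_mm in rain_mm_series:
                inflow = runoff_coeff * roof_area_m2 * rain_mm / 1000.0   # m^3 of roof runoff
                inflow_total += inflow
                overflow_total += max(storage + inflow - tank_cap_m3, 0.0)
                storage = min(storage + inflow, tank_cap_m3)
                storage = max(storage - daily_demand_m3, 0.0)             # end-of-day use
            return 1.0 - overflow_total / inflow_total

        print(runoff_volume_reduction([0, 12, 0, 3, 25, 0, 0, 8, 40, 0]))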

  11. An analysis of IGBP global land-cover characterization process

    USGS Publications Warehouse

    Loveland, Thomas R.; Zhu, Zhiliang; Ohlen, Donald O.; Brown, Jesslyn F.; Reed, Bradley C.; Yang, Limin

    1999-01-01

    The International Geosphere-Biosphere Programme (IGBP) has called for the development of improved global land-cover data for use in increasingly sophisticated global environmental models. To meet this need, the staff of the U.S. Geological Survey and the University of Nebraska-Lincoln developed and applied a global land-cover characterization methodology using 1992-1993 1-km resolution Advanced Very High Resolution Radiometer (AVHRR) and other spatial data. The methodology, based on unsupervised classification with extensive postclassification refinement, yielded a multi-layer database consisting of eight land-cover data sets, descriptive attributes, and source data. An independent IGBP accuracy assessment reports a global accuracy of 73.5 percent, and continental results vary from 63 percent to 83 percent. Although data quality, methodology, interpreter performance, and logistics affected the results, significant problems were associated with the relationship between AVHRR data and fine-scale, spectrally similar land-cover patterns in complex natural or disturbed landscapes.

  12. Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Diskin, Boris

    2012-01-01

    A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.
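
    The discrete-adjoint idea can be shown on a toy steady problem: with a residual R(u, D) = 0 and objective J(u, D), one adjoint solve (dR/du)^T lambda = (dJ/du)^T gives the full gradient dJ/dD = dJ/dD|explicit - lambda^T dR/dD, at a cost independent of the number of design variables. The tiny linear "flow" model below is purely illustrative, not the unstructured-grid solver described in the paper.

        import numpy as np

        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # stand-in state operator

        def b(D):                                # design-dependent forcing
            return np.array([2.0 + D, 1.0 + 0.5 * D**2])

        def J(u, D):                             # objective: state energy + design penalty
            return u @ u + 0.1 * D**2

        D = 1.5
        u = np.linalg.solve(A, b(D))             # solve the "state equation" R = A u - b(D) = 0

        lam = np.linalg.solve(A.T, 2.0 * u)      # adjoint solve: A^T lambda = dJ/du
        dbdD = np.array([1.0, D])                # db/dD, so dR/dD = -db/dD
        dJdD = 0.2 * D - lam @ (-dbdD)           # total derivative via the adjoint

        # Finite-difference check (the paper's complex-variable check plays the same role).
        eps = 1e-6
        u_p = np.linalg.solve(A, b(D + eps))
        print(dJdD, (J(u_p, D + eps) - J(u, D)) / eps)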

  13. [Controversial issues in economic evaluation (I): perspective and costs of Health Care interventions].

    PubMed

    Oliva, Juan; Brosa, Max; Espín, Jaime; Figueras, Montserrat; Trapero, Marta

    2015-01-01

    Economic evaluation of health care interventions has experienced strong growth over the past decade and is increasingly present as a support tool in decision making on public funding of health services and pricing in European countries. A necessary element for their use is that the agents performing economic evaluations share minimum, agreed rules on methodological aspects. Although there are methodological issues on which there is a high degree of consensus, there are others on which no such agreement exists, either because they are closer to the normative field or because they have experienced significant methodological advances in recent years. In this first article of a series of three, we discuss the perspective of analysis and the assessment of costs in the economic evaluation of health interventions, using the Metaplan technique. Finally, research lines are proposed to overcome the identified discrepancies.

  14. A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events

    NASA Astrophysics Data System (ADS)

    Kholodovsky, V.

    2017-12-01

    Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of the climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification to benchmark extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.

  15. Save money by understanding variance and tolerancing.

    PubMed

    Stuart, K

    2007-01-01

    Manufacturing processes are inherently variable, which results in component and assembly variance. Unless process capability, variance and tolerancing are fully understood, incorrect design tolerances may be applied, which will lead to more expensive tooling, inflated production costs, high reject rates, product recalls and excessive warranty costs. A methodology is described for correctly allocating tolerances and performing appropriate analyses.
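
    A small worked example of the kind of analysis implied: a worst-case stack-up assumes every part sits at its tolerance limit, while a root-sum-square (RSS) stack-up treats independent variances statistically and yields a tighter, more realistic assembly tolerance. The dimensions below are invented.

        from math import sqrt

        # Three components stacked end to end: (nominal mm, symmetric tolerance mm).
        parts = [(10.0, 0.10), (25.0, 0.15), (5.0, 0.05)]

        nominal = sum(n for n, _ in parts)
        worst_case = sum(t for _, t in parts)           # every part at its limit
        rss = sqrt(sum(t**2 for _, t in parts))         # statistical stack-up of independent parts
        print(f"assembly {nominal} mm, worst case +/-{worst_case:.2f} mm, RSS +/-{rss:.2f} mm")

    Designing to the worst-case figure when the RSS figure is adequate is one way the unnecessary tooling and production costs mentioned above arise.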

  16. A reliable methodology for quantitative extraction of fruit and vegetable physiological amino acids and their subsequent analysis with commonly available HPLC systems

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiological amino acids in selected fruits and vegetables. This method was found to be particularly useful because the dabsyl derivatives of glutamine and citrulline were sufficiently se...

  17. Hardware accelerated high performance neutron transport computation based on AGENT methodology

    NASA Astrophysics Data System (ADS)

    Xiao, Shanjie

    The spatial heterogeneity of next generation Gen-IV nuclear reactor core designs brings challenges to neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled the 2D transport MOC solver and the 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, the radial 2D MOC solver and the axial 1D MOC solver, for better accuracy. The expansion was benchmarked with the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming and thus limits its application. Therefore, another part of this research focuses on designing specific hardware based on reconfigurable computing techniques in order to accelerate AGENT computations. This is the first time such an approach has been applied to reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified, and the architecture of the AGENT acceleration system was designed based on that analysis. Through parallel computation on the specially designed, highly efficient architecture, the FPGA-based acceleration design achieves high performance at a much lower working frequency than CPUs. Full design simulations show that the acceleration design would be able to speed up large-scale AGENT computations by about 20 times. The high performance AGENT acceleration system will drastically shorten the computation time for 3D full-core neutron transport analysis, making the AGENT methodology unique and advantageous, and thus opening the possibility of extending the application range of neutron transport analysis in both industrial engineering and academic research.

  18. Transferable and flexible label-like macromolecular memory on arbitrary substrates with high performance and a facile methodology.

    PubMed

    Lai, Ying-Chih; Hsu, Fang-Chi; Chen, Jian-Yu; He, Jr-Hau; Chang, Ting-Chang; Hsieh, Ya-Ping; Lin, Tai-Yuan; Yang, Ying-Jay; Chen, Yang-Fang

    2013-05-21

    A newly designed transferable and flexible label-like organic memory based on a graphene electrode behaves like a sticker, and can be readily placed on desired substrates or devices for diversified purposes. The memory label reveals excellent performance despite its physical presentation. This may greatly extend the memory applications in various advanced electronics and provide a simple scheme to integrate with other electronics. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Methodology for the preliminary design of high performance schools in hot and humid climates

    NASA Astrophysics Data System (ADS)

    Im, Piljae

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to easily and accurately evaluate energy efficient measures for K-5 schools, which would contribute to the accelerated dissemination of energy efficient design. For the development of the toolkit, first, a survey was performed to identify high performance measures available today that are being implemented in new K-5 school buildings. Then an existing case-study school building in a hot and humid climate was selected and analyzed to understand the energy use pattern in a school building and to be used in developing a calibrated simulation. Based on the information from the previous step, an as-built and calibrated simulation was then developed. To accomplish this, five calibration steps were performed to match the simulation results with the measured energy use. The five steps include: (1) Using an actual 2006 weather file with measured solar radiation, (2) Modifying the lighting & equipment schedule using ASHRAE's RP-1093 methods, (3) Using actual equipment performance curves (i.e., scroll chiller), (4) Using Winkelmann's method for the underground floor heat transfer, and (5) Modifying the HVAC and room setpoint temperature based on the measured field data. Next, the calibrated simulation of the case-study K-5 school was compared to an ASHRAE Standard 90.1-1999 code-compliant school. In the next step, the energy savings potentials from the application of several high performance measures to an equivalent ASHRAE Standard 90.1-1999 code-compliant school were evaluated. The high performance measures applied included the recommendations from the ASHRAE Advanced Energy Design Guides (AEDG) for K-12 and other high performance measures from the literature review, as well as a daylighting strategy and solar PV and thermal systems. The results show that the net energy consumption of the final high performance school with the solar thermal and solar PV systems would be 1,162.1 MMBtu, which corresponds to an EUI of 14.9 kBtu/sqft-yr. The calculated final energy and cost savings over the code-compliant school are 68.2% and 69.9%, respectively. As a final step of the research, specifications for a simplified easy-to-use toolkit were developed, and a prototype screenshot of the toolkit was created. The toolkit is expected to be used by non-technical decision-makers to select and evaluate high performance measures for a new school building in terms of energy and cost savings in a quick and easy way.
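
    The reported figures can be tied together with a quick unit conversion: 1,162.1 MMBtu per year at an EUI of 14.9 kBtu/sqft-yr implies a conditioned floor area of roughly 78,000 sqft. The snippet below simply restates that arithmetic.

        annual_mmbtu = 1162.1
        eui_kbtu_per_sqft = 14.9

        annual_kbtu = annual_mmbtu * 1000.0           # 1 MMBtu = 1,000 kBtu
        implied_area_sqft = annual_kbtu / eui_kbtu_per_sqft
        print(f"implied floor area ~ {implied_area_sqft:,.0f} sqft")   # about 78,000 sqft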

  20. Fabrication of 3-D nanodimensioned electric double layer capacitor structures using block copolymer templates.

    PubMed

    Rasappa, Sozaraj; Borah, Dipu; Senthamaraikannan, Ramsankar; Faulkner, Colm C; Holmes, Justin D; Morris, Michael A

    2014-07-01

    The need for materials for high energy storage has led to very significant research in supercapacitor systems. These can exhibit electrical double layer phenomena and capacitances up to hundreds of F/g. Here, we demonstrate a new supercapacitor fabrication methodology based around the microphase separation of PS-b-PMMA which has been used to prepare copper nanoelectrodes of dimension ~13 nm. These structures provide excellent capacitive performance with a maximum specific capacitance of ~836 F/g for a current density of 8.06 A/g at a discharge current as high as 75 mA. The excellent performance is due to a high surface-area-to-volume ratio. We suggest that this highly novel, easily fabricated structure might have a number of important applications.

  1. On Certain New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine

    2008-01-01

    NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the signal digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to the sensors. A few methodologies have been proposed for processing the information carried by reference pixels in the digital domain, as employed by the Hubble Space Telescope and the James Webb Space Telescope projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To take a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on board a spaceflight instrument or in post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to the processing of low-voltage differential signals and the subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the digital domain (such as statistical averaging of the reference pixels themselves) zero out the high-variance components, and the counterpart components in the active pixels remain uncorrected. This paper describes how the new methodology was demonstrated through analysis of fast-varying noise components using the Hilbert-Huang Transform Data Processing System tool (HHT-DPS) developed at NASA and the high-level programming language MATLAB (Trademark of MathWorks Inc.), as well as alternative methods for correcting for the high-variance noise component, using HgCdTe sensor data. NASA Hubble Space Telescope data post-processing, as well as on-board processing of data from all sensor channels on future deep-space cosmology instruments, would benefit from this effort.
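
    The difference between the heritage correction and one that tracks fast-varying components can be illustrated with a toy frame: a row-correlated drift is shared by active and reference pixels, and subtracting the reference estimate row by row removes it, whereas a single frame-wide reference average does not. The array geometry, drift model, and noise levels below are invented, and this is not the HHT-DPS processing itself.

        import numpy as np

        rng = np.random.default_rng(2)
        rows, cols, n_ref = 256, 256, 4                  # 4 reference columns on each edge

        signal = rng.poisson(40.0, size=(rows, cols)).astype(float)    # active-pixel scene
        row_drift = np.cumsum(rng.normal(0, 0.5, size=rows))           # slowly wandering bias
        frame = signal + row_drift[:, None] + rng.normal(0, 2.0, size=(rows, cols))
        ref = row_drift[:, None] + rng.normal(0, 2.0, size=(rows, 2 * n_ref))

        corrected_global = frame - ref.mean()                          # heritage: one frame-wide average
        corrected_rowwise = frame - ref.mean(axis=1, keepdims=True)    # track row-to-row variation

        print(np.std(corrected_global - signal), np.std(corrected_rowwise - signal))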

  2. Electron-beam lithography with character projection technique for high-throughput exposure with line-edge quality control

    NASA Astrophysics Data System (ADS)

    Ikeno, Rimon; Maruyama, Satoshi; Mita, Yoshio; Ikeda, Makoto; Asada, Kunihiro

    2016-07-01

    The high throughput of character projection (CP) electron-beam (EB) lithography makes it a promising technique for low-to-medium volume device fabrication with regularly arranged layouts, such as for standard-cell logics and memory arrays. However, non-VLSI applications such as MEMS and MOEMS may not be able to fully utilize the benefits of the CP method due to the wide variety of layout figures including curved and oblique edges. In addition, the stepwise shapes that appear because of the EB exposure process often result in intolerable edge roughness, which degrades device performances. In this study, we propose a general EB lithography methodology for such applications utilizing a combination of the CP and variable-shaped beam methods. In the process of layout data conversion with CP character instantiation, several control parameters were optimized to minimize the shot count, improve the edge quality, and enhance the overall device performance. We have demonstrated EB shot reduction and edge-quality improvement with our methodology by using a leading-edge EB exposure tool, ADVANTEST F7000S-VD02, and a high-resolution hydrogen silsesquioxane resist. Atomic force microscope observations were used to analyze the resist edge profiles' quality to determine the influence of the control parameters used in the data conversion process.

  3. OrbView-3 Technical Performance Evaluation 2005: Modulation Transfer Function

    NASA Technical Reports Server (NTRS)

    Cole, Aaron

    2007-01-01

    The Technical performance evaluation of OrbView-3 using the Modulation Transfer Function (MTF) is presented. The contents include: 1) MTF Results and Methodology; 2) Radiometric Calibration Methodology; and 3) Relative Radiometric Assessment Results

  4. Swiss Armed Forces Organizational Level Leader Development: A Qualitative Case Study

    DTIC Science & Technology

    2017-06-09

    chapter, divided in five distinct parts, describes the chosen research methodology, explains why the qualitative case study is appropriate to conduct...research study uses a qualitative methodology by performing a qualitative case study on the organizational level leader's development process within...develop an in-depth understanding of the phenomenon." Summary This research study uses a qualitative methodology by performing a case study on the

  5. UHF Signal Processing and Pattern Recognition of Partial Discharge in Gas-Insulated Switchgear Using Chromatic Methodology

    PubMed Central

    Wang, Xiaohua; Li, Xi; Rong, Mingzhe; Xie, Dingli; Ding, Dan; Wang, Zhixiang

    2017-01-01

    The ultra-high frequency (UHF) method is widely used in insulation condition assessment. However, UHF signal processing algorithms are complicated and the size of the result is large, which hinders extracting features and recognizing partial discharge (PD) patterns. This article investigates the chromatic methodology, which is novel in PD detection. The principles of chromatic methodologies in color science are introduced. The chromatic processing represents UHF signals sparsely. The UHF signals obtained from PD experiments were processed using the chromatic methodology and characterized by three parameters in chromatic space (H, L, and S, representing dominant wavelength, signal strength, and saturation, respectively). The features of the UHF signals were studied hierarchically. The results showed that the chromatic parameters were consistent with conventional frequency domain parameters. The global chromatic parameters can be used to distinguish UHF signals acquired by different sensors, and they reveal the propagation properties of the UHF signal in the L-shaped gas-insulated switchgear (GIS). Finally, typical PD defect patterns were recognized by using the novel chromatic parameters in an actual GIS tank and good recognition performance was achieved. PMID:28106806

  6. UHF Signal Processing and Pattern Recognition of Partial Discharge in Gas-Insulated Switchgear Using Chromatic Methodology.

    PubMed

    Wang, Xiaohua; Li, Xi; Rong, Mingzhe; Xie, Dingli; Ding, Dan; Wang, Zhixiang

    2017-01-18

    The ultra-high frequency (UHF) method is widely used in insulation condition assessment. However, UHF signal processing algorithms are complicated and the size of the result is large, which hinders extracting features and recognizing partial discharge (PD) patterns. This article investigated the chromatic methodology, which is novel in PD detection. The principles of chromatic methodology in color science are introduced. The chromatic processing represents UHF signals sparsely. The UHF signals obtained from PD experiments were processed using chromatic methodology and characterized by three parameters in chromatic space (H, L, and S, representing dominant wavelength, signal strength, and saturation, respectively). The features of the UHF signals were studied hierarchically. The results showed that the chromatic parameters were consistent with conventional frequency-domain parameters. The global chromatic parameters can be used to distinguish UHF signals acquired by different sensors, and they reveal the propagation properties of the UHF signal in the L-shaped gas-insulated switchgear (GIS). Finally, typical PD defect patterns were recognized by using the novel chromatic parameters in an actual GIS tank, and good recognition performance was achieved.
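
    As a rough illustration of the chromatic processing described above, a UHF spectrum can be reduced to three overlapping Gaussian detector responses and mapped to H, L and S with a standard HLS transform. The sketch below is a minimal Python example; the frequency axis, filter centres and widths, and the test spectrum are assumptions for illustration, not the authors' implementation or data.

      import colorsys
      import numpy as np

      # Assumed example: a synthetic UHF magnitude spectrum on a 0-2 GHz axis.
      f = np.linspace(0.0, 2.0e9, 2000)                  # frequency axis (Hz)
      spectrum = np.exp(-((f - 0.9e9) / 0.15e9) ** 2)    # synthetic PD-like band

      # Three overlapping Gaussian "detectors" spanning the band (illustrative values).
      centres = [0.5e9, 1.0e9, 1.5e9]
      width = 0.4e9
      responses = []
      for c in centres:
          g = np.exp(-((f - c) / width) ** 2)
          responses.append((spectrum * g).sum() / g.sum())   # weighted average response, in [0, 1] here
      R, G, B = responses

      # Map the three responses to the chromatic parameters H (dominant wavelength
      # analogue), L (strength) and S (saturation) via the HLS transform.
      h, l, s = colorsys.rgb_to_hls(R, G, B)
      print(f"H = {h * 360:.1f} deg, L = {l:.3f}, S = {s:.3f}")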

  7. A new methodology capable of characterizing most volatile and less volatile minor edible oils components in a single chromatographic run without solvents or reagents. Detection of new components.

    PubMed

    Alberdi-Cedeño, Jon; Ibargoitia, María L; Cristillo, Giovanna; Sopelana, Patricia; Guillén, María D

    2017-04-15

    The possibilities offered by a new methodology to determine minor components in edible oils are described. This is based on immersion of a solid-phase microextraction fiber of PDMS/DVB into the oil matrix, followed by Gas Chromatography/Mass Spectrometry. It enables characterization and differentiation of edible oils in a simple way, without either solvents or sample modification. This methodology allows simultaneous identification and quantification of sterols, tocols, hydrocarbons of different natures, fatty acids, esters, monoglycerides, fatty amides, aldehydes, ketones, alcohols, epoxides, furans, pyrans and terpenic oxygenated derivatives. The broad information provided by this methodology is useful for different areas of interest such as nutritional value, oxidative stability, technological performance, quality, processing, safety and even the prevention of fraudulent practices. Furthermore, for the first time, certain fatty amides, gamma- and delta-lactones of high molecular weight, and other aromatic compounds such as some esters derived from cinnamic acid have been detected in edible oils. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Coprecipitation-assisted coacervative extraction coupled to high-performance liquid chromatography: An approach for determining organophosphorus pesticides in water samples.

    PubMed

    Mammana, Sabrina B; Berton, Paula; Camargo, Alejandra B; Lascalea, Gustavo E; Altamirano, Jorgelina C

    2017-05-01

    An analytical methodology based on coprecipitation-assisted coacervative extraction coupled to HPLC-UV was developed for determination of five organophosphorus pesticides (OPPs), including fenitrothion, guthion, parathion, methidathion, and chlorpyrifos, in water samples. It involves a green technique leading to an efficient and simple analytical methodology suitable for high-throughput analysis. Relevant physicochemical variables were studied and optimized with respect to the analytical response of each OPP. Under optimized conditions, the resulting methodology was as follows: an aliquot of 9 mL of water sample was placed into a centrifuge tube and 0.5 mL sodium citrate 0.1 M, pH 4; 0.08 mL Al2(SO4)3 0.1 M; and 0.7 mL SDS 0.1 M were added and homogenized. After centrifugation, the supernatant was discarded. A 700 μL aliquot of the coacervate-rich phase obtained was dissolved with 300 μL of methanol and 20 μL of the resulting solution was analyzed by HPLC-UV. The resulting LODs ranged within 0.7-2.5 ng/mL and the achieved RSD and recovery values were <8% (n = 3) and >81%, respectively. The proposed analytical methodology was successfully applied for the analysis of five OPPs in water samples for human consumption from different locations in Mendoza. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. An Analysis of Performance Enhancement Techniques for Overset Grid Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, J. J.; Biswas, R.; Potsdam, M.; Strawn, R. C.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement techniques on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the roles of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.
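
    As a small illustration of the grid grouping and load-balancing idea discussed above (not the code or data from the study), a greedy largest-first heuristic can assign overset grids, or split sub-grids, to processors so that per-processor point counts stay roughly even:

      import heapq

      def group_grids(grid_sizes, n_procs):
          """Greedy largest-first assignment of grids to the least-loaded processor."""
          heap = [(0, p, []) for p in range(n_procs)]        # (load, processor id, grid ids)
          heapq.heapify(heap)
          for gid, size in sorted(enumerate(grid_sizes), key=lambda t: -t[1]):
              load, p, grids = heapq.heappop(heap)           # least-loaded processor so far
              heapq.heappush(heap, (load + size, p, grids + [gid]))
          return sorted(heap, key=lambda t: t[1])

      # Invented grid point counts (in thousands) for an overset configuration.
      sizes = [950, 720, 640, 400, 380, 210, 150, 90]
      for load, proc, grids in group_grids(sizes, n_procs=4):
          print(f"processor {proc}: grids {grids}, {load}k points")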

  10. How Six Sigma Methodology Improved Doctors' Performance

    ERIC Educational Resources Information Center

    Zafiropoulos, George

    2015-01-01

    Six Sigma methodology was used in a District General Hospital to assess the effect of the introduction of an educational programme to limit unnecessary admissions. The performance of the doctors involved in the programme was assessed. Ishikawa Fishbone and 5 S's were initially used and Pareto analysis of their findings was performed. The results…

  11. Additive Manufacturing of Functional Elements on Sheet Metal

    NASA Astrophysics Data System (ADS)

    Schaub, Adam; Ahuja, Bhrigu; Butzhammer, Lorenz; Osterziel, Johannes; Schmidt, Michael; Merklein, Marion

    The Laser Beam Melting (LBM) process, despite its advantages of high design flexibility and free-form manufacturing, is often applied only to a limited extent due to its low productivity and unsuitability for mass production compared with conventional manufacturing processes. In order to overcome these limitations, a hybrid manufacturing methodology is developed combining the additive manufacturing process of laser beam melting with sheet forming processes. With an interest in the aerospace and medical industries, the material in focus is Ti-6Al-4V. Although Ti-6Al-4V is a commercially established material and its application in the LBM process has been extensively investigated, the combination of LBM of Ti-6Al-4V with sheet metal still needs to be researched. Process dynamics such as high temperature gradients and thermally induced stresses lead to complex stress states at the interaction zone between the sheet and LBM structure. Within the presented paper, mechanical characterization of hybrid parts will be performed by shear testing. The association of shear strength with process parameters is further investigated by analyzing the internal structure of the hybrid geometry at varying energy inputs during the LBM process. In order to compare the hybrid manufacturing methodology with conventional fabrication, the conventional methodologies of subtractive machining and state-of-the-art laser beam melting are evaluated within this work. These processes will be analyzed for their mechanical characteristics and productivity by determining the build time and raw material consumption for each case. The paper is concluded by presenting the characteristics of the hybrid manufacturing methodology compared to alternative manufacturing technologies.

  12. Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair

    PubMed Central

    Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats

    2011-01-01

    Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. PMID:26069574

  13. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
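
    For orientation, a minimal Python sketch of the traditional nested (double-loop) formulation that the unilevel approach is compared against: an outer design optimization wrapping an inner reliability analysis. The limit state, standard deviations and target reliability index below are an invented toy problem, not from the study, and the unilevel and decoupled formulations themselves are not reproduced.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      # Assumed toy problem: the design variables d are the means of two normal random
      # variables X_i ~ N(d_i, sigma_i); failure occurs when g(X) = X1 + 2*X2 - 12 < 0.
      sigma = np.array([0.6, 0.3])
      beta_target = 3.0                      # required reliability index (P_f ~ 1.35e-3)

      def reliability_index(d):
          # Inner loop: for this linear limit state the first-order result is exact.
          mean_g = d[0] + 2.0 * d[1] - 12.0
          std_g = np.sqrt(sigma[0] ** 2 + (2.0 * sigma[1]) ** 2)
          return mean_g / std_g

      cost = lambda d: d[0] + d[1]           # outer loop: minimize a simple design cost
      cons = {"type": "ineq", "fun": lambda d: reliability_index(d) - beta_target}
      res = minimize(cost, x0=[8.0, 8.0], method="SLSQP",
                     bounds=[(0.0, 15.0)] * 2, constraints=[cons])
      beta = reliability_index(res.x)
      print("optimal design:", np.round(res.x, 3), " beta:", round(beta, 2),
            " P_f:", float(norm.cdf(-beta)))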

  14. Methodological quality of diagnostic accuracy studies on non-invasive coronary CT angiography: influence of QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) items on sensitivity and specificity.

    PubMed

    Schueler, Sabine; Walther, Stefan; Schuetz, Georg M; Schlattmann, Peter; Dewey, Marc

    2013-06-01

    To evaluate the methodological quality of diagnostic accuracy studies on coronary computed tomography (CT) angiography using the QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) tool. Each QUADAS item was individually defined to adapt it to the special requirements of studies on coronary CT angiography. Two independent investigators analysed 118 studies using 12 QUADAS items. Meta-regression and pooled analyses were performed to identify possible effects of methodological quality items on estimates of diagnostic accuracy. The overall methodological quality of coronary CT studies was merely moderate. They fulfilled a median of 7.5 out of 12 items. Only 9 of the 118 studies fulfilled more than 75 % of possible QUADAS items. One QUADAS item ("Uninterpretable Results") showed a significant influence (P = 0.02) on estimates of diagnostic accuracy with "no fulfilment" increasing specificity from 86 to 90 %. Furthermore, pooled analysis revealed that each QUADAS item that is not fulfilled has the potential to change estimates of diagnostic accuracy. The methodological quality of studies investigating the diagnostic accuracy of non-invasive coronary CT is only moderate and was found to affect the sensitivity and specificity. An improvement is highly desirable because good methodology is crucial for adequately assessing imaging technologies. • Good methodological quality is a basic requirement in diagnostic accuracy studies. • Most coronary CT angiography studies have only been of moderate design quality. • Weak methodological quality will affect the sensitivity and specificity. • No improvement in methodological quality was observed over time. • Authors should consider the QUADAS checklist when undertaking accuracy studies.

  15. A Highly Efficient Sensor Platform Using Simply Manufactured Nanodot Patterned Substrates

    PubMed Central

    Rasappa, Sozaraj; Ghoshal, Tandra; Borah, Dipu; Senthamaraikannan, Ramsankar; Holmes, Justin D.; Morris, Michael A.

    2015-01-01

    Block copolymer (BCP) self-assembly is a low-cost means to nanopattern surfaces. Here, we use these nanopatterns to directly print arrays of nanodots onto a conducting substrate (Indium Tin Oxide (ITO) coated glass) for application as an electrochemical sensor for ethanol (EtOH) and hydrogen peroxide (H2O2) detection. The work demonstrates that BCP systems can be used as a highly efficient, flexible methodology for creating functional surfaces of materials. Highly dense iron oxide nanodot arrays that mimicked the original BCP pattern were prepared by an ‘in situ’ BCP inclusion methodology using poly(styrene)-block-poly(ethylene oxide) (PS-b-PEO). The electrochemical behaviour of these densely packed arrays of iron oxide nanodots fabricated by two different molecular weight PS-b-PEO systems was studied. The dual detection of EtOH and H2O2 was clearly observed. The as-prepared nanodots have good long-term thermal and chemical stability at the substrate and demonstrate promising electrocatalytic performance. PMID:26290188

  16. Documentation of probabilistic fracture mechanics codes used for reactor pressure vessels subjected to pressurized thermal shock loading: Parts 1 and 2. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balkey, K.; Witt, F.J.; Bishop, B.A.

    1995-06-01

    Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980s, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology.

  17. Prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy: a systematic review and external validation study.

    PubMed

    Hilkens, N A; Algra, A; Greving, J P

    2016-01-01

    ESSENTIALS: Prediction models may help to identify patients at high risk of bleeding on antiplatelet therapy. We identified existing prediction models for bleeding and validated them in patients with cerebral ischemia. Five prediction models were identified, all of which had some methodological shortcomings. Performance in patients with cerebral ischemia was poor. Background Antiplatelet therapy is widely used in secondary prevention after a transient ischemic attack (TIA) or ischemic stroke. Bleeding is the main adverse effect of antiplatelet therapy and is potentially life threatening. Identification of patients at increased risk of bleeding may help target antiplatelet therapy. This study sought to identify existing prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy and evaluate their performance in patients with cerebral ischemia. We systematically searched PubMed and Embase for existing prediction models up to December 2014. The methodological quality of the included studies was assessed with the CHARMS checklist. Prediction models were externally validated in the European Stroke Prevention Study 2, comprising 6602 patients with a TIA or ischemic stroke. We assessed discrimination and calibration of included prediction models. Five prediction models were identified, of which two were developed in patients with previous cerebral ischemia. Three studies assessed major bleeding, one studied intracerebral hemorrhage and one gastrointestinal bleeding. None of the studies met all criteria of good quality. External validation showed poor discriminative performance, with c-statistics ranging from 0.53 to 0.64 and poor calibration. A limited number of prediction models is available that predict intracranial hemorrhage or major bleeding in patients on antiplatelet therapy. The methodological quality of the models varied, but was generally low. Predictive performance in patients with cerebral ischemia was poor. In order to reliably predict the risk of bleeding in patients with cerebral ischemia, development of a prediction model according to current methodological standards is needed. © 2015 International Society on Thrombosis and Haemostasis.
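
    The discrimination and calibration checks referred to above amount to a few lines of code; the predicted risks and outcomes below are synthetic placeholders, not data from the validation cohort:

      import numpy as np
      from scipy.stats import rankdata

      rng = np.random.default_rng(1)
      p = rng.uniform(0.01, 0.30, size=1000)        # synthetic predicted bleeding risks
      y = rng.binomial(1, 0.7 * p)                  # synthetic observed outcomes (deliberately miscalibrated)

      def c_statistic(p, y):
          # Probability that a random event has a higher predicted risk than a random
          # non-event, via the rank (Mann-Whitney) formulation of the AUC.
          ranks = rankdata(p)
          n1 = int(y.sum())
          n0 = len(y) - n1
          return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

      print("c-statistic:", round(c_statistic(p, y), 3))
      print("calibration-in-the-large (observed/expected events):", round(y.mean() / p.mean(), 3))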

  18. Investigation of High-alpha Lateral-directional Control Power Requirements for High-performance Aircraft

    NASA Technical Reports Server (NTRS)

    Foster, John V.; Ross, Holly M.; Ashley, Patrick A.

    1993-01-01

    Designers of the next-generation fighter and attack airplanes are faced with the requirements of good high-angle-of-attack maneuverability as well as efficient high speed cruise capability with low radar cross section (RCS) characteristics. As a result, they are challenged with the task of making critical design trades to achieve the desired levels of maneuverability and performance. This task has highlighted the need for comprehensive, flight-validated lateral-directional control power design guidelines for high angles of attack. A joint NASA/U.S. Navy study has been initiated to address this need and to investigate the complex flight dynamics characteristics and controls requirements for high-angle-of-attack lateral-directional maneuvering. A multi-year research program is underway which includes ground-based piloted simulation and flight validation. This paper will give a status update of this program that will include a program overview, description of test methodology and preliminary results.

  19. Investigation of high-alpha lateral-directional control power requirements for high-performance aircraft

    NASA Technical Reports Server (NTRS)

    Foster, John V.; Ross, Holly M.; Ashley, Patrick A.

    1993-01-01

    Designers of the next-generation fighter and attack airplanes are faced with the requirements of good high angle-of-attack maneuverability as well as efficient high speed cruise capability with low radar cross section (RCS) characteristics. As a result, they are challenged with the task of making critical design trades to achieve the desired levels of maneuverability and performance. This task has highlighted the need for comprehensive, flight-validated lateral-directional control power design guidelines for high angles of attack. A joint NASA/U.S. Navy study has been initiated to address this need and to investigate the complex flight dynamics characteristics and controls requirements for high angle-of-attack lateral-directional maneuvering. A multi-year research program is underway which includes ground-based piloted simulation and flight validation. This paper will give a status update of this program that will include a program overview, description of test methodology and preliminary results.

  20. Analytical methodologies for broad metabolite coverage of exhaled breath condensate.

    PubMed

    Aksenov, Alexander A; Zamuruyev, Konstantin O; Pasamontes, Alberto; Brown, Joshua F; Schivo, Michael; Foutouhi, Soraya; Weimer, Bart C; Kenyon, Nicholas J; Davis, Cristina E

    2017-09-01

    Breath analysis has been gaining popularity as a non-invasive technique that is amenable to a broad range of medical uses. One of the persistent problems hampering the wide application of the breath analysis method is measurement variability of metabolite abundances stemming from differences in both sampling and analysis methodologies used in various studies. Mass spectrometry has been a method of choice for comprehensive metabolomic analysis. For the first time in the present study, we juxtapose the most commonly employed mass spectrometry-based analysis methodologies and directly compare the resultant coverages of detected compounds in exhaled breath condensate in order to guide methodology choices for exhaled breath condensate analysis studies. Four methods were explored to broaden the range of measured compounds across both the volatile and non-volatile domain. Liquid phase sampling with polyacrylate Solid-Phase MicroExtraction fiber, liquid phase extraction with a polydimethylsiloxane patch, and headspace sampling using Carboxen/Polydimethylsiloxane Solid-Phase MicroExtraction (SPME) followed by gas chromatography mass spectrometry were tested for the analysis of volatile fraction. Hydrophilic interaction liquid chromatography and reversed-phase chromatography high performance liquid chromatography mass spectrometry were used for analysis of non-volatile fraction. We found that liquid phase breath condensate extraction was notably superior compared to headspace extraction and differences in employed sorbents manifested altered metabolite coverages. The most pronounced effect was substantially enhanced metabolite capture for larger, higher-boiling compounds using polyacrylate SPME liquid phase sampling. The analysis of the non-volatile fraction of breath condensate by hydrophilic and reverse phase high performance liquid chromatography mass spectrometry indicated orthogonal metabolite coverage by these chromatography modes. We found that the metabolite coverage could be enhanced significantly with the use of organic solvent as a device rinse after breath sampling to collect the non-aqueous fraction as opposed to neat breath condensate sample. Here, we show the detected ranges of compounds in each case and provide a practical guide for methodology selection for optimal detection of specific compounds. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Preparative Purification of Recombinant Proteins: Current Status and Future Trends

    PubMed Central

    Saraswat, Mayank; Ravidá, Alessandra; Holthofer, Harry

    2013-01-01

    Advances in fermentation technologies have resulted in the production of increased yields of proteins of economic, biopharmaceutical, and medicinal importance. Consequently, there is an absolute requirement for the development of rapid, cost-effective methodologies which facilitate the purification of such products in the absence of contaminants, such as superfluous proteins and endotoxins. Here, we provide a comprehensive overview of a selection of key purification methodologies currently being applied in both academic and industrial settings and discuss how innovative and effective protocols such as aqueous two-phase partitioning, membrane chromatography, and high-performance tangential flow filtration may be applied independently of or in conjunction with more traditional protocols for downstream processing applications. PMID:24455685

  2. Advances in indirect detector systems for ultra high-speed hard X-ray imaging with synchrotron light

    NASA Astrophysics Data System (ADS)

    Olbinado, M. P.; Grenzer, J.; Pradel, P.; De Resseguier, T.; Vagovic, P.; Zdora, M.-C.; Guzenko, V. A.; David, C.; Rack, A.

    2018-04-01

    We report on indirect X-ray detector systems for various full-field, ultra high-speed X-ray imaging methodologies, such as X-ray phase-contrast radiography, diffraction topography, grating interferometry and speckle-based imaging performed at the hard X-ray imaging beamline ID19 of the European Synchrotron (ESRF). Our work highlights the versatility of indirect X-ray detectors for multiple goals, such as single synchrotron pulse isolation, multiple-frame recording at up to millions of frames per second, high efficiency, and high spatial resolution. Besides the technical advancements, potential applications are briefly introduced and discussed.

  3. Analyzing Reliability and Performance Trade-Offs of HLS-Based Designs in SRAM-Based FPGAs Under Soft Errors

    NASA Astrophysics Data System (ADS)

    Tambara, Lucas Antunes; Tonfat, Jorge; Santos, André; Kastensmidt, Fernanda Lima; Medina, Nilberto H.; Added, Nemitala; Aguiar, Vitor A. P.; Aguirre, Fernando; Silveira, Marcilei A. G.

    2017-02-01

    The increasing system complexity of FPGA-based hardware designs and shortening of time-to-market have motivated the adoption of new design methodologies focused on addressing the current need for high-performance circuits. High-Level Synthesis (HLS) tools can generate Register Transfer Level (RTL) designs from high-level software programming languages. These tools have evolved significantly in recent years, providing optimized RTL designs, which can serve the needs of safety-critical applications that require both high performance and high reliability levels. However, a reliability evaluation of HLS-based designs under soft errors has not yet been presented. In this work, the trade-offs of different HLS-based designs in terms of reliability, resource utilization, and performance are investigated by analyzing their behavior under soft errors and comparing them to a standard processor-based implementation in an SRAM-based FPGA. Results obtained from fault injection campaigns and radiation experiments show that it is possible to increase the performance of a processor-based system up to 5,000 times by changing its architecture with a small impact on the cross section (an increase of up to 8 times), while still increasing the Mean Workload Between Failures (MWBF) of the system.

  4. Methodological considerations for economic modelling of latent tuberculous infection screening in migrants.

    PubMed

    Shedrawy, J; Siroka, A; Oxlade, O; Matteelli, A; Lönnroth, K

    2017-09-01

    Tuberculosis (TB) in migrants from endemic to low-incidence countries results mainly from the reactivation of latent tuberculous infection (LTBI). LTBI screening policies for migrants vary greatly between countries, and the evidence on the cost-effectiveness of the different approaches is weak and heterogeneous. The aim of this review was to assess the methodology used in published economic evaluations of LTBI screening among migrants to identify critical methodological options that must be considered when using modelling to determine value for money from different economic perspectives. Three electronic databases were searched and 10 articles were included. There was considerable variation across this small number of studies with regard to economic perspective, main outcomes, modelling technique, screening options and target populations considered, as well as in parameterisation of the epidemiological situation, test accuracy, efficacy, safety and programme performance. Only one study adopted a societal perspective; others adopted a health care or wider government perspective. Parameters representing the cascade of screening and treating LTBI varied widely, with some studies using highly aspirational scenarios. This review emphasises the need for a more harmonised approach for economic analysis, and better transparency in how policy options and economic perspectives influence methodological choices. Variability is justifiable for some parameters. However, sufficient data are available to standardise others. A societal perspective is ideal, but can be challenging due to limited data. Assumptions about programme performance should be based on empirical data or at least realistic assumptions. Results should be interpreted within specific contexts and policy options, with cautious generalisations.

  5. Risk Assessment Methodology for Hazardous Waste Management (1998)

    EPA Pesticide Factsheets

    A methodology is described for systematically assessing and comparing the risks to human health and the environment of hazardous waste management alternatives. The methodology selects and links appropriate models and techniques for performing the process.

  6. Development and testing of methodology for evaluating the performance of multi-input/multi-output digital control systems

    NASA Technical Reports Server (NTRS)

    Polotzky, Anthony S.; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1990-01-01

    The development of a controller performance evaluation (CPE) methodology for multi-input/multi-output digital control systems is described. The equations used to obtain the open-loop plant, controller transfer matrices, and return-difference matrices are given. Results of applying the CPE methodology to evaluate MIMO digital flutter suppression systems being tested on an active flexible wing wind-tunnel model are presented to demonstrate the CPE capability.

  7. High-Penetration Photovoltaic Planning Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to provide an overview of select U.S. utility methodologies for performing high-penetration photovoltaic (HPPV) system planning and impact studies. This report covers the Federal Energy Regulatory Commission's orders related to photovoltaic (PV) power system interconnection, particularly the interconnection processes for the Large Generation Interconnection Procedures and Small Generation Interconnection Procedures. In addition, it includes U.S. state interconnection standards and procedures. The procedures used by these regulatory bodies consider the impacts of HPPV power plants on the networks. Technical interconnection requirements for HPPV voltage regulation include aspects of power monitoring, grounding, synchronization, connection to the overall distribution system, back-feeds, disconnecting means, abnormal operating conditions, and power quality. This report provides a summary of mitigation strategies to minimize the impact of HPPV. Recommendations and revisions to the standards may take place as the penetration level of renewables on the grid increases and new technologies develop in future years.

  8. Modeling Long-term Creep Performance for Welded Nickel-base Superalloy Structures for Power Generation Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Chen; Gupta, Vipul; Huang, Shenyan

    The goal of this project is to model long-term creep performance for nickel-base superalloy weldments in high temperature power generation systems. The project uses physics-based modeling methodologies and algorithms for predicting alloy properties in heterogeneous material structures. The modeling methodology will be demonstrated on a gas turbine combustor liner weldment of Haynes 282 precipitate-strengthened nickel-base superalloy. The major developments are: (1) microstructure-property relationships under creep conditions and microstructure characterization, (2) modeling inhomogeneous microstructure in superalloy weld, (3) modeling mesoscale plastic deformation in superalloy weld, and (4) a constitutive creep model that accounts for weld and base metal microstructure and their long-term evolution. The developed modeling technology is aimed to provide a more efficient and accurate assessment of a material’s long-term performance compared with current testing and extrapolation methods. This modeling technology will also accelerate development and qualification of new materials in advanced power generation systems. This document is a final technical report for the project, covering efforts conducted from October 2014 to December 2016.

  9. Modeling of organic solar cell using response surface methodology

    NASA Astrophysics Data System (ADS)

    Suliman, Rajab; Mitul, Abu Farzan; Mohammad, Lal; Djira, Gemechis; Pan, Yunpeng; Qiao, Qiquan

    Polymer solar cells have drawn much attention during the past few decades due to their low manufacturing cost and compatibility with flexible substrates. In solution-processed organic solar cells, the optimal thickness, annealing temperature, and morphology are key components to achieving high efficiency. In this work, response surface methodology (RSM) is used to find optimal fabrication conditions for polymer solar cells. In order to optimize cell efficiency, a central composite design (CCD) with three independent variables (polymer concentration, polymer-fullerene ratio, and active-layer spinning speed) was used. Optimal device performance was achieved using 10.25 mg/ml polymer concentration, 0.42 polymer-fullerene ratio, and an active-layer spinning speed of 1624 rpm. The predicted response (the efficiency) at the optimum stationary point was found to be 5.23% for the Poly(diketopyrrolopyrrole-terthiophene) (PDPP3T)/PC60BM solar cells. Moreover, 97% of the variation in the device performance was explained by the best model. Finally, the experimental results are consistent with the CCD prediction, which proves that this is a promising and appropriate model for optimum device performance and fabrication conditions.
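
    A minimal Python sketch of the CCD/RSM workflow described above: build a rotatable three-factor central composite design in coded units, fit a second-order response surface, and locate its stationary point. The factor names mirror the study, but the response values are synthetic and the coefficients invented for illustration.

      import numpy as np
      from itertools import product

      rng = np.random.default_rng(2)
      alpha = 1.682                                   # rotatable axial distance for k = 3 factors

      # Central composite design in coded units: 2^3 factorial + 6 axial + 6 centre runs.
      # Columns: polymer concentration, polymer-fullerene ratio, spinning speed (coded).
      factorial = np.array(list(product([-1.0, 1.0], repeat=3)))
      axial = np.vstack([a * np.eye(3) for a in (-alpha, alpha)])
      X = np.vstack([factorial, axial, np.zeros((6, 3))])

      def quad_terms(X):
          x1, x2, x3 = X.T
          return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                  x1 * x1, x2 * x2, x3 * x3,
                                  x1 * x2, x1 * x3, x2 * x3])

      # Synthetic efficiency response (an invented quadratic surface plus noise),
      # standing in for the measured cell efficiencies at each CCD run.
      true_b = np.array([5.0, 0.3, -0.2, 0.1, -0.4, -0.5, -0.3, 0.05, 0.0, 0.1])
      y = quad_terms(X) @ true_b + rng.normal(0.0, 0.05, len(X))

      b, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

      # Stationary point of the fitted surface: solve grad(y) = 0 for the coded factors.
      lin = b[1:4]
      B = np.array([[b[4],     b[7] / 2, b[8] / 2],
                    [b[7] / 2, b[5],     b[9] / 2],
                    [b[8] / 2, b[9] / 2, b[6]]])
      x_s = -0.5 * np.linalg.solve(B, lin)
      print("stationary point (coded units):", np.round(x_s, 2))
      print("predicted efficiency there:", round((quad_terms(x_s[None, :]) @ b)[0], 2))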

  10. Data splitting for artificial neural networks using SOM-based stratified sampling.

    PubMed

    May, R J; Maier, H R; Dandy, G C

    2010-03-01

    Data splitting is an important consideration during artificial neural network (ANN) development where hold-out cross-validation is commonly employed to ensure generalization. Even for a moderate sample size, the sampling methodology used for data splitting can have a significant effect on the quality of the subsets used for training, testing and validating an ANN. Poor data splitting can result in inaccurate and highly variable model performance; however, the choice of sampling methodology is rarely given due consideration by ANN modellers. Increased confidence in the sampling is of paramount importance, since the hold-out sampling is generally performed only once during ANN development. This paper considers the variability in the quality of subsets that are obtained using different data splitting approaches. A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, with several guidelines identified for setting the SOM size and sample allocation in order to minimize the bias and variance in the datasets. Using an example ANN function approximation task, the SOM-based approach is evaluated in comparison to random sampling, DUPLEX, systematic stratified sampling, and trial-and-error sampling to minimize the statistical differences between data sets. Of these approaches, DUPLEX is found to provide benchmark performance with good model performance, with no variability. The results show that the SOM-based approach also reliably generates high-quality samples and can therefore be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability to perform data splitting on large datasets. Copyright 2009 Elsevier Ltd. All rights reserved.
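
    A compact sketch of the idea behind SOM-based stratified data splitting: train a small self-organising map, treat each map unit as a stratum, and allocate the hold-out sample across strata with Neyman allocation. The map size, learning schedule and synthetic data below are assumptions for illustration, not the settings or datasets from the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      X = rng.normal(size=(500, 4))                         # synthetic inputs
      y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(0.0, 0.3, 500)

      def train_som(X, rows=3, cols=3, n_iter=3000, lr0=0.5, sig0=1.5):
          # Minimal one-sample-at-a-time SOM with linearly decaying rate and radius.
          grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
          W = X[rng.choice(len(X), rows * cols, replace=False)].copy()
          for t in range(n_iter):
              x = X[rng.integers(len(X))]
              bmu = np.argmin(((W - x) ** 2).sum(axis=1))
              frac = 1.0 - t / n_iter
              h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * (sig0 * frac + 0.1) ** 2))
              W += lr0 * frac * h[:, None] * (x - W)
          return W

      W = train_som(X)
      stratum = np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2), axis=1)

      # Neyman allocation: each SOM unit contributes in proportion to N_h * S_h,
      # where S_h is the within-unit standard deviation of the target variable.
      n_test = 100
      N = np.array([(stratum == h).sum() for h in range(len(W))])
      S = np.array([y[stratum == h].std() if N[h] > 0 else 0.0 for h in range(len(W))])
      alloc = np.round(n_test * N * S / (N * S).sum()).astype(int)

      test_idx = []
      for h, n_h in enumerate(alloc):
          members = np.flatnonzero(stratum == h)
          test_idx.extend(rng.choice(members, min(n_h, len(members)), replace=False))
      train_idx = np.setdiff1d(np.arange(len(X)), test_idx)
      print(len(train_idx), "training samples,", len(test_idx), "test samples")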

  11. Diversity in livestock resources in pastoral systems in Africa.

    PubMed

    Kaufmann, B A; Lelea, M A; Hulsebusch, C G

    2016-11-01

    Pastoral systems are important producers and repositories of livestock diversity. Pastoralists use variability in their livestock resources to manage high levels of environmental variability in economically advantageous ways. In pastoral systems, human-animal-environment interactions are the basis of production and the key to higher productivity and efficiency. In other words, pastoralists manage a production system that exploits variability and keeps production costs low. When differentiating, characterising and evaluating pastoral breeds, this context-specific, functional dimension of diversity in livestock resources needs to be considered. The interaction of animals with their environment is determined not only by morphological and physiological traits but also by experience and socially learned behaviour. This high proportion of non-genetic components determining the performance of livestock means that current models for analysing livestock diversity and performance, which are based on genetic inheritance, have limited ability to describe pastoral performance. There is a need for methodological innovations to evaluate pastoral breeds and animals, since comparisons based on performance 'under optimal conditions' are irrelevant within this production system. Such innovations must acknowledge that livestock or breed performance is governed by complex human-animal-environment interactions, and varies through time and space due to the mobile and seasonal nature of the pastoral system. Pastoralists' breeding concepts and selection strategies seem to be geared towards improving their animals' capability to exploit variability, by - among other things - enhancing within-breed diversity. In-depth studies of these concepts and strategies could contribute considerably towards developing methodological innovations for the characterisation and evaluation of pastoral livestock resources.

  12. Predicting protein complexes from weighted protein-protein interaction graphs with a novel unsupervised methodology: Evolutionary enhanced Markov clustering.

    PubMed

    Theofilatos, Konstantinos; Pavlopoulou, Niki; Papasavvas, Christoforos; Likothanassis, Spiros; Dimitrakopoulos, Christos; Georgopoulos, Efstratios; Moschopoulos, Charalampos; Mavroudi, Seferina

    2015-03-01

    Proteins are considered to be the most important individual components of biological systems and they combine to form physical protein complexes which are responsible for certain molecular functions. Despite the large availability of protein-protein interaction (PPI) information, not much information is available about protein complexes. Experimental methods are limited in terms of time, efficiency, cost and performance constraints. Existing computational methods have provided encouraging preliminary results, but they face certain disadvantages: they require parameter tuning, some of them cannot handle weighted PPI data and others do not allow a protein to participate in more than one protein complex. In the present paper, we propose a new fully unsupervised methodology for predicting protein complexes from weighted PPI graphs. The proposed methodology is called evolutionary enhanced Markov clustering (EE-MC) and it is a hybrid combination of an adaptive evolutionary algorithm and a state-of-the-art clustering algorithm named enhanced Markov clustering. EE-MC was compared with state-of-the-art methodologies when applied to datasets from the human and the yeast Saccharomyces cerevisiae organisms. Using publicly available datasets, EE-MC outperformed existing methodologies (in some datasets the separation metric was increased by 10-20%). Moreover, when applied to new human datasets its performance was encouraging in the prediction of protein complexes which consist of proteins with high functional similarity. Specifically, 5737 protein complexes were predicted and 72.58% of them are enriched for at least one gene ontology (GO) function term. EE-MC is by design able to overcome intrinsic limitations of existing methodologies such as their inability to handle weighted PPI networks, their constraint of assigning every protein to exactly one cluster and the difficulties they face concerning the parameter tuning. This fact was experimentally validated and moreover, new potentially true human protein complexes were suggested as candidates for further validation using experimental techniques. Copyright © 2015 Elsevier B.V. All rights reserved.
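
    For orientation, the Markov clustering core that EE-MC builds on operates on the column-normalised weighted adjacency matrix by alternating expansion and inflation; the evolutionary layer that adapts the inflation and other parameters is not shown here, and the small weighted graph is an invented example rather than real PPI data.

      import numpy as np

      def markov_clustering(A, inflation=2.0, n_iter=100, tol=1e-6):
          """Plain MCL on a weighted adjacency matrix (self-loops added)."""
          M = A + np.eye(len(A))                       # self-loops stabilise the iteration
          M = M / M.sum(axis=0, keepdims=True)         # column-stochastic transition matrix
          for _ in range(n_iter):
              M_new = np.linalg.matrix_power(M, 2)     # expansion: spread flow along paths
              M_new = M_new ** inflation               # inflation: strengthen strong flows
              M_new /= M_new.sum(axis=0, keepdims=True)
              if np.abs(M_new - M).max() < tol:
                  M = M_new
                  break
              M = M_new
          # In the converged matrix, the support of each attractor row is one cluster.
          clusters = [frozenset(np.flatnonzero(row > 1e-6).tolist()) for row in M if row.max() > 1e-6]
          return sorted(set(clusters), key=min)

      # Invented toy weighted graph: two dense groups of nodes joined by one weak edge.
      A = np.zeros((6, 6))
      edges = [(0, 1, .9), (0, 2, .8), (1, 2, .7), (3, 4, .9), (3, 5, .8), (4, 5, .7), (2, 3, .1)]
      for i, j, w in edges:
          A[i, j] = A[j, i] = w
      print([sorted(c) for c in markov_clustering(A)])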

  13. An optimized methodology for whole genome sequencing of RNA respiratory viruses from nasopharyngeal aspirates.

    PubMed

    Goya, Stephanie; Valinotto, Laura E; Tittarelli, Estefania; Rojo, Gabriel L; Nabaes Jodar, Mercedes S; Greninger, Alexander L; Zaiat, Jonathan J; Marti, Marcelo A; Mistchenko, Alicia S; Viegas, Mariana

    2018-01-01

    Over the last decade, the number of viral genome sequences deposited in available databases has grown exponentially. However, sequencing methodology vary widely and many published works have relied on viral enrichment by viral culture or nucleic acid amplification with specific primers rather than through unbiased techniques such as metagenomics. The genome of RNA viruses is highly variable and these enrichment methodologies may be difficult to achieve or may bias the results. In order to obtain genomic sequences of human respiratory syncytial virus (HRSV) from positive nasopharyngeal aspirates diverse methodologies were evaluated and compared. A total of 29 nearly complete and complete viral genomes were obtained. The best performance was achieved with a DNase I treatment to the RNA directly extracted from the nasopharyngeal aspirate (NPA), sequence-independent single-primer amplification (SISPA) and library preparation performed with Nextera XT DNA Library Prep Kit with manual normalization. An average of 633,789 and 1,674,845 filtered reads per library were obtained with MiSeq and NextSeq 500 platforms, respectively. The higher output of NextSeq 500 was accompanied by the increasing of duplicated reads percentage generated during SISPA (from an average of 1.5% duplicated viral reads in MiSeq to an average of 74% in NextSeq 500). HRSV genome recovery was not affected by the presence or absence of duplicated reads but the computational demand during the analysis was increased. Considering that only samples with viral load ≥ E+06 copies/ml NPA were tested, no correlation between sample viral loads and number of total filtered reads was observed, nor with the mapped viral reads. The HRSV genomes showed a mean coverage of 98.46% with the best methodology. In addition, genomes of human metapneumovirus (HMPV), human rhinovirus (HRV) and human parainfluenza virus types 1-3 (HPIV1-3) were also obtained with the selected optimal methodology.

  14. Perceived organizational support and extra-role performance: which leads to which?

    PubMed

    Chen, Zhixia; Eisenberger, Robert; Johnson, Kelly M; Sucharski, Ivan L; Aselage, Justin

    2009-02-01

    L. Rhoades and R. Eisenberger (2002) reported the meta-analytic finding of a highly statistically significant relation between perceived organizational support (POS) and performance but concluded that the reviewed studies' methodology allowed no conclusion concerning the direction of the association. To investigate this issue, the authors assessed POS and extra-role performance 2 times, separated by a 3-year interval, among 199 employees of an electronic and appliance sales organization. Using a cross-lagged panel design, the authors found that POS was positively associated with a temporal change in extra-role performance. In contrast, the relation between extra-role performance and temporal change in POS was not statistically significant. These findings provide evidence that POS leads to extra-role performance.

  15. Using PICO Methodology to Answer Questions About Smoking in COPD Patients.

    PubMed

    Jiménez Ruiz, Carlos A; Buljubasich, Daniel; Riesco Miranda, Juan Antonio; Acuña Izcaray, Agustín; de Granda Orive, José Ignacio; Chatkin, José Miguel; Zabert, Gustavo; Guerreros Benavides, Alfredo; Paez Espinel, Nelson; Noé, Valeri; Sánchez-Angarita, Efraín; Núñez-Sánchez, Ingrid; Sansores, Raúl H; Casas, Alejandro; Palomar Lever, Andrés; Alfageme Michavila, Inmaculada

    2017-11-01

    The ALAT and SEPAR Treatment and Control of Smoking Groups have collaborated in the preparation of this document which attempts to answer, by way of PICO methodology, different questions on health interventions for helping COPD patients to stop smoking. The main recommendations are: (i) moderate-quality evidence and strong recommendation for performing spirometry in COPD patients and in smokers with a high risk of developing the disease, as a motivational tool (particularly for showing evidence of lung age), a diagnostic tool, and for active case-finding; (ii) high-quality evidence and strong recommendation for using intensive dedicated behavioral counselling and drug treatment for helping COPD patients to stop smoking; (iii) high-quality evidence and strong recommendation for initiating interventions for helping COPD patients to stop smoking during hospitalization, with improvement when the intervention is prolonged after discharge; and (iv) high-quality evidence and strong recommendation for funding treatment of smoking in COPD patients, in view of the impact on health and health economics. Copyright © 2017 SEPAR. Published by Elsevier España, S.L.U. All rights reserved.

  16. The Effect of Soft Skills and Training Methodology on Employee Performance

    ERIC Educational Resources Information Center

    Ibrahim, Rosli; Boerhannoeddin, Ali; Bakare, Kazeem Kayode

    2017-01-01

    Purpose: The purpose of this paper is to investigate the effect of soft skill acquisition and the training methodology adopted on employee work performance. In this study, the authors study the trends of research in training and work performance in organisations that focus on the acquisition of technical or "hard skills" for employee…

  17. Graphene Quantum Capacitors for High Frequency Tunable Analog Applications.

    PubMed

    Moldovan, Clara F; Vitale, Wolfgang A; Sharma, Pankaj; Tamagnone, Michele; Mosig, Juan R; Ionescu, Adrian M

    2016-08-10

    Graphene quantum capacitors (GQC) are demonstrated to be enablers of radio-frequency (RF) functions through voltage-tuning of their capacitance. We show that GQC complements MEMS and MOSFETs in terms of performance for high frequency analog applications and tunability. We propose a CMOS-compatible fabrication process and report the first experimental assessment of their performance at microwave frequencies (up to 10 GHz), demonstrating experimental GQCs in the pF range with a tuning ratio of 1.34:1 within 1.25 V, and Q-factors up to 12 at 1 GHz. The figures of merit of graphene variable capacitors are studied in detail from 150 to 350 K. Furthermore, we describe a systematic, graphene-specific approach to optimize their performance and predict the figures of merit achieved if such a methodology is applied.
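
    The two figures of merit quoted above follow directly from measured capacitance-voltage and series-resistance data; the arithmetic is sketched below with invented sample values (chosen to be of the same order as the reported results), not the device measurements themselves.

      import numpy as np

      C = np.array([1.00e-12, 1.12e-12, 1.25e-12, 1.34e-12])   # capacitance (F) vs. bias, invented
      R_s = 13.0                                               # assumed series resistance (ohm)
      f = 1.0e9                                                # evaluation frequency (Hz)

      tuning_ratio = C.max() / C.min()
      Q = 1.0 / (2 * np.pi * f * C * R_s)                      # Q of a simple series-RC varactor model
      print(f"tuning ratio {tuning_ratio:.2f}:1, Q at 1 GHz between {Q.min():.1f} and {Q.max():.1f}")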

  18. Commentary: Attitude Adjustment--Educating PhD Scientist for Business Careers

    ERIC Educational Resources Information Center

    Schuster, Sheldon M.

    2011-01-01

    The PhD graduate from a US research academic institution who has worked 5-7 years to solve a combination of laboratory and computational problems after an in-depth classroom experience is likely superbly trained in at least a subset of the life sciences and the underlying methodology and thought processes required to perform high level research.…

  19. Integrated Multidisciplinary Design of High Pressure Multistage Compressor Systems (la Conception integree des compresseurs multi-etage a haute performance)

    DTIC Science & Technology

    1998-09-01

    The design methodology was developed by ONERA and SNECMA and is described in [Nicoud, 91] and [Karadimas, 1997]; the method solves the Quasi-3D problem.

  20. The Effect on Pupils' Science Performance and Problem-Solving Ability through Lego: An Engineering Design-Based Modeling Approach

    ERIC Educational Resources Information Center

    Li, Yanyan; Huang, Zhinan; Jiang, Menglu; Chang, Ting-Wen

    2016-01-01

    Incorporating scientific fundamentals via engineering through a design-based methodology has proven to be highly effective for STEM education. Engineering design can be instantiated for learning, as it involves mental and physical stimulation and develops practical skills, especially in solving problems. Lego bricks, as a set of toys based on design…

  1. A Systematic Review of Economic Evaluation Methodologies Between Resource-Limited and Resource-Rich Countries: A Case of Rotavirus Vaccines.

    PubMed

    Thiboonboon, Kittiphong; Santatiwongchai, Benjarin; Chantarastapornchit, Varit; Rattanavipapong, Waranya; Teerawattananon, Yot

    2016-12-01

    For more than three decades, the number and influence of economic evaluations of healthcare interventions have been increasing and gaining attention at the policy level. However, concerns about the credibility of these studies exist, particularly in studies from low- and middle-income countries (LMICs). This analysis was performed to explore economic evaluations conducted in LMICs in terms of methodological variations, quality of reporting and evidence used for the analyses. These results were compared with those studies conducted in high-income countries (HICs). Rotavirus vaccine was selected as a case study, as it is one of the interventions that many studies in both settings have explored. The search to identify individual studies on rotavirus vaccines was performed in March 2014 using MEDLINE and the National Health Service Economic Evaluation Database. Only full economic evaluations, comparing cost and outcomes of at least two alternatives, were included for review. Selected criteria were applied to assess methodological variation, quality of reporting and quality of evidence used. Eighty-five studies were included, consisting of 45 studies in HICs and 40 studies in LMICs. Seventy-five percent of the studies in LMICs were published by researchers from HICs. Compared with studies in HICs, the LMIC studies showed less methodological variety. In terms of the quality of reporting, LMICs had a high adherence to technical criteria, but HICs ultimately proved to be better. The same trend applied for the quality of evidence used. Although the quality of economic evaluations in LMICs was not as high as those from HICs, it is of an acceptable level given several limitations that exist in these settings. However, the results of this study may not reflect the fact that LMICs have developed a better research capacity in the domain of health economics, given that most of the studies were in theory led by researchers from HICs. Putting more effort into fostering the development of both research infrastructure and capacity building as well as encouraging local engagement in LMICs is thus necessary.

  2. Systems cost/performance analysis (study 2.3). Volume 2: Systems cost/performance model. [unmanned automated payload programs and program planning

    NASA Technical Reports Server (NTRS)

    Campbell, B. H.

    1974-01-01

    A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.

  3. METHODOLOGICAL QUALITY OF ECONOMIC EVALUATIONS ALONGSIDE TRIALS OF KNEE PHYSIOTHERAPY.

    PubMed

    García-Pérez, Lidia; Linertová, Renata; Arvelo-Martín, Alejandro; Guerra-Marrero, Carolina; Martínez-Alberto, Carlos Enrique; Cuéllar-Pompa, Leticia; Escobar, Antonio; Serrano-Aguilar, Pedro

    2017-01-01

    The methodological quality of an economic evaluation performed alongside a clinical trial can be underestimated if the paper does not report key methodological features. This study discusses methodological assessment issues on the example of a systematic review on cost-effectiveness of physiotherapy for knee osteoarthritis. Six economic evaluation studies included in the systematic review and related clinical trials were assessed using the 10-question check-list by Drummond and the Physiotherapy Evidence Database (PEDro) scale. All economic evaluations were performed alongside a clinical trial but the studied interventions were too heterogeneous to be synthesized. Methodological quality of the economic evaluations reported in the papers was not free of drawbacks, and in some cases, it improved when information from the related clinical trial was taken into account. Economic evaluation papers dedicate little space to methodological features of related clinical trials; therefore, the methodological quality can be underestimated if evaluated separately from the trials. Future economic evaluations should follow more strictly the recommendations about methodology and the authors should pay special attention to the quality of reporting.

  4. REDItools: high-throughput RNA editing detection made easy.

    PubMed

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. REDItools are written in the Python programming language and are freely available at http://code.google.com/p/reditools/. Contact: ernesto.picardi@uniba.it or graziano.pesole@uniba.it. Supplementary data are available at Bioinformatics online.

  5. Induction annealing and subsequent quenching: effect on the thermoelectric properties of boron-doped nanographite ensembles.

    PubMed

    Xie, Ming; Lee, Chee Huei; Wang, Jiesheng; Yap, Yoke Khin; Bruno, Paola; Gruen, Dieter; Singh, Dileep; Routbort, Jules

    2010-04-01

    Boron-doped nanographite ensembles (NGEs) are interesting thermoelectric nanomaterials for high temperature applications. Rapid induction annealing and quenching has been applied to boron-doped NGEs using a relatively low-cost, highly reliable, laboratory built furnace to show that substantial improvements in thermoelectric power factors can be achieved using this methodology. Details of the design and performance of this compact induction furnace as well as results of the thermoelectric measurements will be reported here.

  6. PyMCT: A Very High Level Language Coupling Tool For Climate System Models

    NASA Astrophysics Data System (ADS)

    Tobis, M.; Pierrehumbert, R. T.; Steder, M.; Jacob, R. L.

    2006-12-01

    At the Climate Systems Center of the University of Chicago, we have been examining strategies for applying agile programming techniques to complex high-performance modeling experiments. While the "agile" development methodology differs from a conventional requirements process and its associated milestones, the process remains a formal one. It is distinguished by continuous improvement in functionality, large numbers of small releases, extensive and ongoing testing strategies, and a strong reliance on very high level languages (VHLL). Here we report on PyMCT, which we intend as a core element in a model ensemble control superstructure. PyMCT is a set of Python bindings for MCT, the Fortran-90 based Model Coupling Toolkit, which forms the infrastructure for the inter-component communication in the Community Climate System Model (CCSM). MCT provides a scalable model communication infrastructure. In order to take maximum advantage of agile software development methodologies, we exposed MCT functionality to Python, a prominent VHLL. We describe how the scalable architecture of MCT allows us to overcome the relatively weak runtime performance of Python, so that the performance of the combined system is not severely impacted. To demonstrate these advantages, we reimplemented the CCSM coupler in Python. While this alone offers no new functionality, it does provide a rigorous test of PyMCT functionality and performance. We reimplemented the CPL6 library, presenting an interesting case study of the comparison between conventional Fortran-90 programming and the higher abstraction level provided by a VHLL. The powerful abstractions provided by Python will allow much more complex experimental paradigms. In particular, we hope to build on the scriptability of our coupling strategy to enable systematic sensitivity tests. Our most ambitious objective is to combine our efforts with Bayesian inverse modeling techniques toward objective tuning at the highest level, across model architectures.

  7. "Found Performance": Towards a Musical Methodology for Exploring the Aesthetics of Care.

    PubMed

    Wood, Stuart

    2017-09-18

    Concepts of performance in fine art reflect key processes in music therapy. Music therapy enables practitioners to reframe patients as performers, producing new meanings around the clinical knowledge attached to medical histories and constructs. In this paper, music therapy practices are considered in the wider context of art history, with reference to allied theories from social research. Tracing a century in art that has revised the performativity of found objects (starting with Duchamp's "Fountain"), and of found sound (crystallised by Cage's 4'33''), this paper proposes that music therapy might be a pioneer methodology of "found performance". Examples from music therapy and contemporary socially engaged art practices are presented as potential links between artistic methodologies and medical humanities research, with specific reference to notions of Aesthetics of Care.

  8. The Application of MRI for Depiction of Subtle Blood Brain Barrier Disruption in Stroke

    PubMed Central

    Israeli, David; Tanne, David; Daniels, Dianne; Last, David; Shneor, Ran; Guez, David; Landau, Efrat; Roth, Yiftach; Ocherashvilli, Aharon; Bakon, Mati; Hoffman, Chen; Weinberg, Amit; Volk, Talila; Mardor, Yael

    2011-01-01

    The development of imaging methodologies for detecting blood-brain-barrier (BBB) disruption may help predict stroke patients' propensity to develop hemorrhagic complications following reperfusion. We have developed a delayed contrast extravasation MRI-based methodology enabling real-time depiction of subtle BBB abnormalities in humans with high sensitivity to BBB disruption and high spatial resolution. The increased sensitivity to subtle BBB disruption is obtained by acquiring T1-weighted MRI at relatively long delays (~15 minutes) after contrast injection and subtracting from them images acquired immediately after contrast administration. In addition, the relatively long delays allow for acquisition of high resolution images resulting in high resolution BBB disruption maps. The sensitivity is further increased by image preprocessing with corrections for intensity variations and with whole body (rigid+elastic) registration. Since only two separate time points are required, the time between the two acquisitions can be used for acquiring routine clinical data, keeping the total imaging time to a minimum. A proof of concept study was performed in 34 patients with ischemic stroke and 2 patients with brain metastases undergoing high resolution T1-weighted MRI acquired at 3 time points after contrast injection. The MR images were pre-processed and subtracted to produce BBB disruption maps. BBB maps of patients with brain metastases and ischemic stroke presented different patterns of BBB opening. The significant advantage of the long extravasation time was demonstrated by a dynamic-contrast-enhancement study performed continuously for 18 min. The high sensitivity of our methodology enabled depiction of clear BBB disruption in 27% of the stroke patients who did not have abnormalities on conventional contrast-enhanced MRI. In 36% of the patients, who had abnormalities detectable by conventional MRI, the BBB disruption volumes were significantly larger in the maps than in conventional MRI. These results demonstrate the advantages of delayed contrast extravasation in increasing the sensitivity to subtle BBB disruption in ischemic stroke patients. The calculated disruption maps provide clear depiction of significant volumes of BBB disruption unattainable by conventional contrast-enhanced MRI. PMID:21209786
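    The core operation behind the disruption maps, subtracting the immediate post-contrast volume from the roughly 15-minute delayed volume, can be sketched as follows; registration is assumed to have been done already, the global scaling is only a crude stand-in for the paper's intensity-variation correction, and all array values are synthetic:

```python
import numpy as np

def bbb_disruption_map(early_t1, delayed_t1, brain_mask):
    """Subtract the immediate post-contrast volume from the delayed volume;
    positive values indicate contrast accumulation consistent with slow
    extravasation across a disrupted blood-brain barrier."""
    early = early_t1.astype(float)
    delayed = delayed_t1.astype(float)
    # Crude global intensity normalization inside the brain mask
    # (stand-in for the full intensity-variation correction).
    scale = early[brain_mask].mean() / max(delayed[brain_mask].mean(), 1e-9)
    diff = delayed * scale - early
    diff[~brain_mask] = 0.0
    return diff

# Illustrative synthetic volumes with one simulated leaking region.
rng = np.random.default_rng(0)
early = rng.normal(100.0, 5.0, size=(64, 64, 32))
delayed = early.copy()
delayed[30:40, 30:40, 10:20] += 15.0
mask = np.ones(early.shape, dtype=bool)
print(round(bbb_disruption_map(early, delayed, mask)[35, 35, 15], 1))
```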

  9. The application of MRI for depiction of subtle blood brain barrier disruption in stroke.

    PubMed

    Israeli, David; Tanne, David; Daniels, Dianne; Last, David; Shneor, Ran; Guez, David; Landau, Efrat; Roth, Yiftach; Ocherashvilli, Aharon; Bakon, Mati; Hoffman, Chen; Weinberg, Amit; Volk, Talila; Mardor, Yael

    2010-12-26

    The development of imaging methodologies for detecting blood-brain-barrier (BBB) disruption may help predict stroke patients' propensity to develop hemorrhagic complications following reperfusion. We have developed a delayed contrast extravasation MRI-based methodology enabling real-time depiction of subtle BBB abnormalities in humans with high sensitivity to BBB disruption and high spatial resolution. The increased sensitivity to subtle BBB disruption is obtained by acquiring T1-weighted MRI at relatively long delays (~15 minutes) after contrast injection and subtracting from them images acquired immediately after contrast administration. In addition, the relatively long delays allow for acquisition of high resolution images resulting in high resolution BBB disruption maps. The sensitivity is further increased by image preprocessing with corrections for intensity variations and with whole body (rigid+elastic) registration. Since only two separate time points are required, the time between the two acquisitions can be used for acquiring routine clinical data, keeping the total imaging time to a minimum. A proof of concept study was performed in 34 patients with ischemic stroke and 2 patients with brain metastases undergoing high resolution T1-weighted MRI acquired at 3 time points after contrast injection. The MR images were pre-processed and subtracted to produce BBB disruption maps. BBB maps of patients with brain metastases and ischemic stroke presented different patterns of BBB opening. The significant advantage of the long extravasation time was demonstrated by a dynamic-contrast-enhancement study performed continuously for 18 min. The high sensitivity of our methodology enabled depiction of clear BBB disruption in 27% of the stroke patients who did not have abnormalities on conventional contrast-enhanced MRI. In 36% of the patients, who had abnormalities detectable by conventional MRI, the BBB disruption volumes were significantly larger in the maps than in conventional MRI. These results demonstrate the advantages of delayed contrast extravasation in increasing the sensitivity to subtle BBB disruption in ischemic stroke patients. The calculated disruption maps provide clear depiction of significant volumes of BBB disruption unattainable by conventional contrast-enhanced MRI.

  10. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.

  11. A methodology for evaluating detection performance of ultrasonic array imaging algorithms for coarse-grained materials.

    PubMed

    Van Pamel, Anton; Brett, Colin R; Lowe, Michael J S

    2014-12-01

    Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest and is expected to become increasingly important for next-generation electricity power plants. Conventional ultrasonic A-, B-, and C-scans have been found to suffer from strong background noise caused by grain scattering, which can severely limit the detection of defects. However, in recent years, array probes and full matrix capture (FMC) imaging algorithms have unlocked exciting possibilities for improvements. To improve and compare these algorithms, we must rely on robust methodologies to quantify their performance. This article proposes such a methodology to evaluate the detection performance of imaging algorithms. For illustration, the methodology is applied to some example data using three FMC imaging algorithms: the total focusing method (TFM), phase-coherent imaging (PCI), and decomposition of the time-reversal operator with multiple scattering filter (DORT MSF). However, it is important to note that this is solely to illustrate the methodology; this article does not attempt the broader investigation of different cases that would be needed to compare the performance of these algorithms in general. The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). A test sample of coarse-grained nickel superalloy, manufactured to represent materials used for future power plant components and containing some simple artificial defects, is used to illustrate the method on the candidate algorithms. The data are captured in pulse-echo mode using 64-element array probes at center frequencies of 1 and 5 MHz. In this particular case, it turns out that all three algorithms are shown to perform very similarly when comparing their flaw detection capabilities.
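    A minimal sketch of the detection statistics underlying such a methodology: image amplitudes sampled at known defect locations and at defect-free locations are compared against a swept threshold to produce POD and PFA values. The amplitude distributions below are illustrative, not the article's data:

```python
import numpy as np

def pod_pfa_curve(defect_amplitudes, noise_amplitudes, n_thresholds=200):
    """Return (threshold, POD, PFA) arrays for a swept amplitude threshold."""
    defect = np.asarray(defect_amplitudes, dtype=float)
    noise = np.asarray(noise_amplitudes, dtype=float)
    thresholds = np.linspace(min(noise.min(), defect.min()),
                             max(noise.max(), defect.max()), n_thresholds)
    pod = np.array([(defect >= t).mean() for t in thresholds])  # detections
    pfa = np.array([(noise >= t).mean() for t in thresholds])   # false alarms
    return thresholds, pod, pfa

# Illustrative amplitudes (dB): grain-noise background vs. responses at defects.
rng = np.random.default_rng(1)
noise = rng.normal(-30.0, 3.0, 5000)
defects = rng.normal(-18.0, 4.0, 50)

t, pod, pfa = pod_pfa_curve(defects, noise)
idx = np.argmin(np.abs(pfa - 0.01))  # operating point near 1% false-alarm rate
print(f"threshold={t[idx]:.1f} dB, POD={pod[idx]:.2f}, PFA={pfa[idx]:.3f}")
```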

  12. System Evaluation and Life-Cycle Cost Analysis of a Commercial-Scale High-Temperature Electrolysis Hydrogen Production Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2012-11-01

    Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.

  13. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  14. Investigation of metallurgical coatings for automotive applications

    NASA Astrophysics Data System (ADS)

    Su, Jun Feng

    Metallurgical coatings have been widely used in the automotive industry, from component machining and daily engine running to body decoration, due to their high hardness, wear resistance, corrosion resistance and low friction coefficient. With growing demands for energy saving, weight reduction and limited environmental impact, the use of new materials, such as lightweight aluminum/magnesium alloys with high strength-to-weight ratios for engine blocks and advanced high-strength steel (AHSS) with better performance in crash energy management for die stamping, is increasing. However, challenges emerge when these new materials are applied, such as the wear of the relatively soft light alloys and of the tools used to machine hard AHSS. Protective metallurgical coatings are the best option for exploiting these new materials' advantages without large alterations to mass-production equipment, machinery, tooling and labor. In this dissertation, a plasma electrolytic oxidation (PEO) coating process on aluminum alloys was introduced for engine cylinder bores to resist wear and corrosion. The tribological behavior of the PEO coatings under boundary and starved lubrication conditions was studied experimentally and numerically for the first time. Experimental results of the PEO coating demonstrated prominent wear resistance and low friction, taking into account the extreme working conditions. The numerical elastohydrodynamic lubrication (EHL) and asperity-contact-based tribological study also showed a promising approach to designing low-friction and highly wear-resistant PEO coatings. In addition to the fabrication of the new coatings, a novel coating evaluation methodology, namely an inclined impact-sliding tester, is presented in the second part of this dissertation. This methodology has been developed and applied in testing and analyzing physical vapor deposition (PVD), chemical vapor deposition (CVD) and PEO coatings. Failure mechanisms of these common metallurgical hard coatings were systematically studied and summarized via the new testing methodology. Field tests based on the new coating characterization technique proved that this methodology is reliable, effective and economical.

  15. Performance in physiology evaluation: possible improvement by active learning strategies.

    PubMed

    Montrezor, Luís H

    2016-12-01

    The evaluation process is complex and extremely important in the teaching/learning process. Evaluations are constantly employed in the classroom to assist students in the learning process and to help teachers improve the teaching process. The use of active methodologies encourages students to participate in the learning process, encourages interaction with their peers, and stimulates thinking about physiological mechanisms. This study examined the performance of medical students on physiology over four semesters with and without active engagement methodologies. Four activities were used: a puzzle, a board game, a debate, and a video. The results show that engaging in activities with active methodologies before a physiology cognitive monitoring test significantly improved student performance compared with not performing the activities. We integrate the use of these methodologies with classic lectures, and this integration appears to improve the teaching/learning process in the discipline of physiology and improves the integration of physiology with cardiology and neurology. In addition, students enjoy the activities and perform better on their evaluations when they use them. Copyright © 2016 The American Physiological Society.

  16. Coordinated crew performance in commercial aircraft operations

    NASA Technical Reports Server (NTRS)

    Murphy, M. R.

    1977-01-01

    A specific methodology is proposed for an improved system of coding and analyzing crew member interaction. The complexity and lack of precision of many crew and task variables suggest the usefulness of fuzzy linguistic techniques for modeling and computer simulation of the crew performance process. Other research methodologies and concepts that have promise for increasing the effectiveness of research on crew performance are identified.

  17. Three-Dimensional Electrodes for High-Performance Bioelectrochemical Systems

    PubMed Central

    Yu, Yang-Yang; Zhai, Dan-Dan; Si, Rong-Wei; Sun, Jian-Zhong; Liu, Xiang; Yong, Yang-Chun

    2017-01-01

    Bioelectrochemical systems (BES) are groups of bioelectrochemical technologies and platforms that could facilitate versatile environmental and biological applications. The performance of BES is mainly determined by the key process of electron transfer at the bacteria and electrode interface, which is known as extracellular electron transfer (EET). Thus, developing novel electrodes to encourage bacteria attachment and enhance EET efficiency is of great significance. Recently, three-dimensional (3D) electrodes, which provide large specific area for bacteria attachment and macroporous structures for substrate diffusion, have emerged as a promising electrode for high-performance BES. Herein, a comprehensive review of versatile methodology developed for 3D electrode fabrication is presented. This review article is organized based on the categorization of 3D electrode fabrication strategy and BES performance comparison. In particular, the advantages and shortcomings of these 3D electrodes are presented and their future development is discussed. PMID:28054970

  18. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  19. A methodology to ensure and improve accuracy of Ki67 labelling index estimation by automated digital image analysis in breast cancer tissue.

    PubMed

    Laurinavicius, Arvydas; Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Meskauskas, Raimundas; Baltrusaityte, Indra; Besusparis, Justinas; Dasevicius, Darius; Elie, Nicolas; Iqbal, Yasir; Bor, Catherine

    2014-01-01

    Immunohistochemical Ki67 labelling index (Ki67 LI) reflects proliferative activity and is a potential prognostic/predictive marker of breast cancer. However, its clinical utility is hindered by the lack of standardized measurement methodologies. Besides tissue heterogeneity aspects, the key element of methodology remains accurate estimation of Ki67-stained/counterstained tumour cell profiles. We aimed to develop a methodology to ensure and improve accuracy of the digital image analysis (DIA) approach. Tissue microarrays (one 1-mm spot per patient, n = 164) from invasive ductal breast carcinoma were stained for Ki67 and scanned. Criterion standard (Ki67-Count) was obtained by counting positive and negative tumour cell profiles using a stereology grid overlaid on a spot image. DIA was performed with Aperio Genie/Nuclear algorithms. A bias was estimated by ANOVA, correlation and regression analyses. Calibration steps of the DIA by adjusting the algorithm settings were performed: first, by subjective DIA quality assessment (DIA-1), and second, to compensate the bias established (DIA-2). Visual estimate (Ki67-VE) on the same images was performed by five pathologists independently. ANOVA revealed significant underestimation bias (P < 0.05) for DIA-0, DIA-1 and two pathologists' VE, while DIA-2, VE-median and three other VEs were within the same range. Regression analyses revealed best accuracy for the DIA-2 (R-square = 0.90) exceeding that of VE-median, individual VEs and other DIA settings. Bidirectional bias for the DIA-2 with overestimation at low, and underestimation at high ends of the scale was detected. Measurement error correction by inverse regression was applied to improve DIA-2-based prediction of the Ki67-Count, in particular for the clinically relevant interval of Ki67-Count < 40%. Potential clinical impact of the prediction was tested by dichotomising the cases at the cut-off values of 10, 15, and 20%. Misclassification rate of 5-7% was achieved, compared to that of 11-18% for the VE-median-based prediction. Our experiments provide methodology to achieve accurate Ki67-LI estimation by DIA, based on proper validation, calibration, and measurement error correction procedures, guided by quantified bias from reference values obtained by stereology grid count. This basic validation step is an important prerequisite for high-throughput automated DIA applications to investigate tissue heterogeneity and clinical utility aspects of Ki67 and other immunohistochemistry (IHC) biomarkers.
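    The calibration and inverse-regression correction step can be sketched as a linear fit of the DIA estimate against the stereology reference count, inverted to correct new DIA readings; the paired values below are illustrative, not the study's data:

```python
import numpy as np

# Illustrative paired data: reference Ki67 count (%) and raw DIA estimate (%).
ki67_count = np.array([5, 8, 12, 18, 22, 28, 35, 40, 55, 70], dtype=float)
dia_estimate = np.array([7, 9, 12, 16, 19, 24, 29, 33, 44, 58], dtype=float)

# Calibration: regress the DIA estimate on the reference count (DIA = a*count + b) ...
a, b = np.polyfit(ki67_count, dia_estimate, 1)

# ... then apply the inverse regression to correct a new raw DIA reading.
def corrected_ki67(dia_value):
    return (dia_value - b) / a

print(round(corrected_ki67(20.0), 1))  # corrected Ki67 LI for a raw DIA of 20%
```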

  20. A methodology to ensure and improve accuracy of Ki67 labelling index estimation by automated digital image analysis in breast cancer tissue

    PubMed Central

    2014-01-01

    Introduction Immunohistochemical Ki67 labelling index (Ki67 LI) reflects proliferative activity and is a potential prognostic/predictive marker of breast cancer. However, its clinical utility is hindered by the lack of standardized measurement methodologies. Besides tissue heterogeneity aspects, the key element of methodology remains accurate estimation of Ki67-stained/counterstained tumour cell profiles. We aimed to develop a methodology to ensure and improve accuracy of the digital image analysis (DIA) approach. Methods Tissue microarrays (one 1-mm spot per patient, n = 164) from invasive ductal breast carcinoma were stained for Ki67 and scanned. Criterion standard (Ki67-Count) was obtained by counting positive and negative tumour cell profiles using a stereology grid overlaid on a spot image. DIA was performed with Aperio Genie/Nuclear algorithms. A bias was estimated by ANOVA, correlation and regression analyses. Calibration steps of the DIA by adjusting the algorithm settings were performed: first, by subjective DIA quality assessment (DIA-1), and second, to compensate the bias established (DIA-2). Visual estimate (Ki67-VE) on the same images was performed by five pathologists independently. Results ANOVA revealed significant underestimation bias (P < 0.05) for DIA-0, DIA-1 and two pathologists’ VE, while DIA-2, VE-median and three other VEs were within the same range. Regression analyses revealed best accuracy for the DIA-2 (R-square = 0.90) exceeding that of VE-median, individual VEs and other DIA settings. Bidirectional bias for the DIA-2 with overestimation at low, and underestimation at high ends of the scale was detected. Measurement error correction by inverse regression was applied to improve DIA-2-based prediction of the Ki67-Count, in particular for the clinically relevant interval of Ki67-Count < 40%. Potential clinical impact of the prediction was tested by dichotomising the cases at the cut-off values of 10, 15, and 20%. Misclassification rate of 5-7% was achieved, compared to that of 11-18% for the VE-median-based prediction. Conclusions Our experiments provide methodology to achieve accurate Ki67-LI estimation by DIA, based on proper validation, calibration, and measurement error correction procedures, guided by quantified bias from reference values obtained by stereology grid count. This basic validation step is an important prerequisite for high-throughput automated DIA applications to investigate tissue heterogeneity and clinical utility aspects of Ki67 and other immunohistochemistry (IHC) biomarkers. PMID:24708745

  1. Methodological quality of systematic reviews on influenza vaccination.

    PubMed

    Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas

    2014-03-26

    There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on the AMSTAR score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review in addition to its content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
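    A minimal sketch of the kind of group comparison reported above (Cochrane versus non-Cochrane AMSTAR scores via a Mann-Whitney U test); the scores are illustrative, not the review's data:

```python
from scipy.stats import mannwhitneyu

# Illustrative AMSTAR scores (0-11) for Cochrane and non-Cochrane reviews.
cochrane = [9, 10, 8, 11, 9, 10, 9, 8]
non_cochrane = [6, 7, 5, 8, 4, 7, 6, 9, 5, 6]

# Two-sided Mann-Whitney U test comparing the two score distributions.
stat, p = mannwhitneyu(cochrane, non_cochrane, alternative="two-sided")
print(f"U={stat:.1f}, p={p:.4f}")
```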

  2. Uncertainty evaluation of EnPIs in industrial applications as a key factor in setting improvement actions

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2015-11-01

    A methodology is proposed that assumes the uncertainty of high-level Energy Performance Indicators (EnPIs) as a quantitative indicator of the evolution of an Energy Management System (EMS). Motivations leading to the selection of the EnPIs, uncertainty evaluation techniques and criteria supporting decision-making are discussed, in order to plan and pursue reliable measures for energy performance improvement. In this paper, problems, priorities, operative possibilities and reachable improvement limits are examined, starting from the measurement uncertainty assessment. Two different industrial cases are analysed with reference to the following aspects: absence/presence of energy management policy and action plans; responsibility level for the energy issues; employees’ training and motivation with respect to energy problems; absence/presence of adequate infrastructures for monitoring and sharing of energy information; level of standardization and integration of methods and procedures linked to the energy activities; economic and financial resources for the improvement of energy efficiency. A critical and comparative analysis of the obtained results is carried out. The methodology, experimentally validated, allows useful considerations to be developed for effective, realistic and economically feasible improvement plans, depending on the specific situation. Recursive application of the methodology allows a reliable and well-resolved assessment of the EMS status to be obtained, also in dynamic industrial contexts.

  3. A procedural method for the efficient implementation of full-custom VLSI designs

    NASA Technical Reports Server (NTRS)

    Belk, P.; Hickey, N.

    1987-01-01

    An imbedded language system for the layout of very large scale integration (VLSI) circuits is examined. It is shown that through the judicious use of this system, a large variety of circuits can be designed with circuit density and performance comparable to traditional full-custom design methods, but with design costs more comparable to semi-custom design methods. The high performance of this methodology is attributable to the flexibility of procedural descriptions of VLSI layouts and to a number of automatic and semi-automatic tools within the system.

  4. The relationship between return on investment and quality of study methodology in workplace health promotion programs.

    PubMed

    Baxter, Siyan; Sanderson, Kristy; Venn, Alison J; Blizzard, C Leigh; Palmer, Andrew J

    2014-01-01

    To determine the relationship between return on investment (ROI) and quality of study methodology in workplace health promotion programs. Data were obtained through a systematic literature search of National Health Service Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE), Health Technology Database (HTA), Cost Effectiveness Analysis (CEA) Registry, EconLit, PubMed, Embase, Wiley, and Scopus. Included were articles written in English or German reporting cost(s) and benefit(s) and single or multicomponent health promotion programs on working adults. Return-to-work and workplace injury prevention studies were excluded. Methodological quality was graded using British Medical Journal Economic Evaluation Working Party checklist. Economic outcomes were presented as ROI. ROI was calculated as ROI = (benefits - costs of program)/costs of program. Results were weighted by study size and combined using meta-analysis techniques. Sensitivity analysis was performed using two additional methodological quality checklists. The influences of quality score and important study characteristics on ROI were explored. Fifty-one studies (61 intervention arms) published between 1984 and 2012 included 261,901 participants and 122,242 controls from nine industry types across 12 countries. Methodological quality scores were highly correlated between checklists (r = .84-.93). Methodological quality improved over time. Overall weighted ROI [mean ± standard deviation (confidence interval)] was 1.38 ± 1.97 (1.38-1.39), which indicated a 138% return on investment. When accounting for methodological quality, an inverse relationship to ROI was found. High-quality studies (n = 18) had a smaller mean ROI, 0.26 ± 1.74 (.23-.30), compared to moderate (n = 16) 0.90 ± 1.25 (.90-.91) and low-quality (n = 27) 2.32 ± 2.14 (2.30-2.33) studies. Randomized control trials (RCTs) (n = 12) exhibited negative ROI, -0.22 ± 2.41(-.27 to -.16). Financial returns become increasingly positive across quasi-experimental, nonexperimental, and modeled studies: 1.12 ± 2.16 (1.11-1.14), 1.61 ± 0.91 (1.56-1.65), and 2.05 ± 0.88 (2.04-2.06), respectively. Overall, mean weighted ROI in workplace health promotion demonstrated a positive ROI. Higher methodological quality studies provided evidence of smaller financial returns. Methodological quality and study design are important determinants.
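    The ROI definition and the study-size weighting described above can be sketched directly; the study records below are illustrative:

```python
# ROI = (benefits - costs of program) / costs of program
def roi(benefits, costs):
    return (benefits - costs) / costs

# Illustrative studies: (benefits $, program costs $, number of participants).
studies = [
    (1_200_000, 500_000, 2_000),
    (  300_000, 400_000,   600),
    (  900_000, 450_000, 1_500),
]

rois = [roi(b, c) for b, c, _ in studies]
weights = [n for _, _, n in studies]            # weight each study by its size
weighted_mean_roi = sum(r * w for r, w in zip(rois, weights)) / sum(weights)
print([round(r, 2) for r in rois], round(weighted_mean_roi, 2))
```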

  5. Single-shot and single-sensor high/super-resolution microwave imaging based on metasurface.

    PubMed

    Wang, Libo; Li, Lianlin; Li, Yunbo; Zhang, Hao Chi; Cui, Tie Jun

    2016-06-01

    Real-time high-resolution (including super-resolution) imaging with low-cost hardware is a long sought-after goal in various imaging applications. Here, we propose broadband single-shot and single-sensor high-/super-resolution imaging by using a spatio-temporal dispersive metasurface and an imaging reconstruction algorithm. The metasurface with spatio-temporal dispersive property ensures the feasibility of the single-shot and single-sensor imager for super- and high-resolution imaging, since it can convert efficiently the detailed spatial information of the probed object into one-dimensional time- or frequency-dependent signal acquired by a single sensor fixed in the far-field region. The imaging quality can be improved by applying a feature-enhanced reconstruction algorithm in post-processing, and the desired imaging resolution is related to the distance between the object and metasurface. When the object is placed in the vicinity of the metasurface, the super-resolution imaging can be realized. The proposed imaging methodology provides a unique means to perform real-time data acquisition, high-/super-resolution images without employing expensive hardware (e.g. mechanical scanner, antenna array, etc.). We expect that this methodology could make potential breakthroughs in the areas of microwave, terahertz, optical, and even ultrasound imaging.
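    A rough sketch of the reconstruction idea: the metasurface maps the object's spatial reflectivity into a frequency-dependent single-sensor signal y = A x, and an image is recovered by regularized least squares. The random measurement matrix and plain Tikhonov regularizer below are illustrative stand-ins for the calibrated metasurface operator and the feature-enhanced algorithm of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_freqs = 16 * 16, 400

# Illustrative random measurement operator standing in for the calibrated
# spatio-temporal dispersive response of the metasurface.
A = rng.normal(size=(n_freqs, n_pixels))

# Simple test object and simulated noisy single-sensor measurements.
x_true = np.zeros(n_pixels)
x_true[[40, 41, 120]] = 1.0
y = A @ x_true + 0.01 * rng.normal(size=n_freqs)

# Tikhonov-regularized least-squares reconstruction of the image vector.
lam = 1.0
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_pixels), A.T @ y)
print(np.argsort(x_hat)[-3:])  # indices of the brightest reconstructed pixels
```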

  6. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data.

    PubMed

    Zeng, Ying-Xu; Mjøs, Svein Are; David, Fabrice P A; Schmid, Adrien W

    2016-03-31

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
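    The least-squares resolution step can be sketched as fitting an observed spectrum with a non-negative combination of theoretical isotope-distribution templates for candidate lipid species; the templates and intensities below are illustrative:

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative theoretical isotope patterns (one column per candidate species)
# evaluated on a common m/z grid.
templates = np.array([
    [1.00, 0.00],
    [0.55, 1.00],
    [0.17, 0.60],
    [0.03, 0.20],
    [0.00, 0.04],
])

# Observed intensities: a mixture of the two species plus measurement noise.
observed = templates @ np.array([800.0, 300.0]) + np.array([2.0, -3.0, 1.0, 0.5, 0.2])

# Non-negative least squares resolves the abundances of each species.
abundances, residual = nnls(templates, observed)
print(abundances.round(1), round(residual, 2))
```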

  7. Development of a weight/sizing design synthesis computer program. Volume 1: Program formulation

    NASA Technical Reports Server (NTRS)

    Garrison, J. M.

    1973-01-01

    The development of a weight/sizing design synthesis methodology for use in support of the main line space shuttle program is discussed. The methodology has a minimum number of data inputs and quick turn around capabilities. The methodology makes it possible to: (1) make weight comparisons between current shuttle configurations and proposed changes, (2) determine the effects of various subsystems trades on total systems weight, and (3) determine the effects of weight on performance and performance on weight.

  8. Sensory re-education after nerve injury of the upper limb: a systematic review.

    PubMed

    Oud, Tanja; Beelen, Anita; Eijffinger, Elianne; Nollet, Frans

    2007-06-01

    To systematically review the available evidence for the effectiveness of sensory re-education to improve the sensibility of the hand in patients with a peripheral nerve injury of the upper limb. Studies were identified by an electronic search in the databases MEDLINE, Cumulative Index to Nursing & Allied Health Literature (CINAHL), EMBASE, the Cochrane Library, the Physiotherapy Evidence Database (PEDro), and the database of the Dutch National Institute of Allied Health Professions (Doconline) and by screening the reference lists of relevant articles. Two reviewers selected studies that met the following inclusion criteria: all designs except case reports, adults with impaired sensibility of the hand due to a peripheral nerve injury of the upper limb, and sensibility and functional sensibility as outcome measures. The methodological quality of the included studies was independently assessed by two reviewers. A best-evidence synthesis was performed, based on design, methodological quality and significant findings on outcome measures. Seven studies, with sample sizes ranging from 11 to 49, were included in the systematic review and appraised for content. Five of these studies were of poor methodological quality. One uncontrolled study (N = 1 3 ) was considered to be of sufficient methodological quality, and one randomized controlled trial (N = 49) was of high methodological quality. Best-evidence synthesis showed that there is limited evidence for the effectiveness of sensory re-education, provided by a statistically significant improvement in sensibility found in one high-quality randomized controlled trial. There is a need for further well-defined clinical trials to assess the effectiveness of sensory re-education of patients with impaired sensibility of the hand due to a peripheral nerve injury.

  9. From SNOMED CT to Uberon: Transferability of evaluation methodology between similarly structured ontologies.

    PubMed

    Elhanan, Gai; Ochs, Christopher; Mejino, Jose L V; Liu, Hao; Mungall, Christopher J; Perl, Yehoshua

    2017-06-01

    To examine whether disjoint partial-area taxonomy, a semantically-based evaluation methodology that has been successfully tested in SNOMED CT, will perform with similar effectiveness on Uberon, an anatomical ontology that belongs to a structurally similar family of ontologies as SNOMED CT. A disjoint partial-area taxonomy was generated for Uberon. One hundred randomly selected test concepts that overlap between partial-areas were matched to a same size control sample of non-overlapping concepts. The samples were blindly inspected for non-critical issues and presumptive errors first by a general domain expert whose results were then confirmed or rejected by a highly experienced anatomical ontology domain expert. Reported issues were subsequently reviewed by Uberon's curators. Overlapping concepts in Uberon's disjoint partial-area taxonomy exhibited a significantly higher rate of all issues. Clear-cut presumptive errors trended similarly but did not reach statistical significance. A sub-analysis of overlapping concepts with three or more relationship types indicated a much higher rate of issues. Overlapping concepts from Uberon's disjoint abstraction network are quite likely (up to 28.9%) to exhibit issues. The results suggest that the methodology can transfer well between same family ontologies. Although Uberon exhibited relatively few overlapping concepts, the methodology can be combined with other semantic indicators to expand the process to other concepts within the ontology that will generate high yields of discovered issues. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Compositional evaluation of selected agro-industrial wastes as valuable sources for the recovery of complex carbohydrates.

    PubMed

    Vojvodić, Aleksandra; Komes, Draženka; Vovk, Irena; Belščak-Cvitanović, Ana; Bušić, Arijana

    2016-11-01

    Re-utilization of various agro-industrial wastes is of growing importance from many aspects. Considering the variety and complexity of such materials, compositional data and compliant methodology is still undergoing many updates and improvements. Present study evaluated sugar beet pulp (SBP), walnut shell (WS), cocoa bean husk (CBH), onion peel (OP) and pea pods (PP) as potentially valuable materials for carbohydrate recovery. Macrocomponent analyses revealed carbohydrate fraction as the most abundant, dominating in dietary fibres. Upon complete acid hydrolysis of sample alcohol insoluble residues, developed procedures of high performance thin-layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) coupled with 3-methyl-1-phenyl-2-pyrazolin-5-one pre-column derivatization (PMP-derivatization) were used for carbohydrate monomeric composition determination. HPTLC exhibited good qualitative features useful for multi-sample rapid analysis, while HPLC superior separation and quantification characteristics. Distinctive monomeric patterns were obtained among samples. OP, SBP and CBH, due to the high galacturonic acid content (20.81%, 13.96% and 6.90% dry matter basis, respectively), may be regarded as pectin sources, while WS and PP as materials abundant in xylan-rich hemicellulose (total xylan content 15.53%, 9.63% dry matter basis, respectively). Present study provides new and valuable compositional data for different plant residual materials and a reference for the application of established methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. A comparison of methods for DPLL loop filter design

    NASA Technical Reports Server (NTRS)

    Aguirre, S.; Hurd, W. J.; Kumar, R.; Statman, J.

    1986-01-01

    Four design methodologies for loop filters for a class of digital phase-locked loops (DPLLs) are presented. The first design maps an optimum analog filter into the digital domain; the second approach designs a filter that minimizes in discrete time a weighted combination of the variance of the phase error due to noise and the sum square of the deterministic phase error component; the third method uses Kalman filter estimation theory to design a filter composed of a least squares fading memory estimator and a predictor. The last design relies on classical theory, including rules for the design of compensators. Linear analysis is used throughout the article to compare different designs, and includes stability, steady-state performance and transient behavior of the loops. Design methodology is not critical when the loop update rate can be made high relative to loop bandwidth, as the performance approaches that of continuous time. For low update rates, however, the minimization method is significantly superior to the other methods.
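    The first design approach, mapping an analog loop filter into the digital domain, can be sketched with a bilinear (Tustin) transform of a standard proportional-plus-integral loop filter F(s) = (1 + s*tau2)/(s*tau1); the time constants and update rate below are illustrative, not the article's values:

```python
from scipy.signal import bilinear

tau1, tau2 = 0.05, 0.01      # illustrative loop-filter time constants (s)
fs = 1000.0                  # illustrative loop update rate (Hz)

# Analog filter F(s) = (tau2*s + 1) / (tau1*s): numerator and denominator in s.
b_analog = [tau2, 1.0]
a_analog = [tau1, 0.0]

# Map the analog filter into the z-domain with the bilinear (Tustin) transform.
b_digital, a_digital = bilinear(b_analog, a_analog, fs)
print(b_digital, a_digital)
```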

  12. Sustainability at the local scale: defining highly aggregated indices for assessing environmental performance. The province of Reggio Emilia (Italy) as a case study.

    PubMed

    Clerici, Nicola; Bodini, Antonio; Ferrarini, Alessandro

    2004-10-01

    In order to achieve improved sustainability, local authorities need to use tools that adequately describe and synthesize environmental information. This article illustrates a methodological approach that organizes a wide suite of environmental indicators into a few aggregated indices, making use of correlation, principal component analysis, and fuzzy sets. Furthermore, a weighting system, which includes stakeholders' priorities and ambitions, is applied. As a case study, the described methodology is applied to the Reggio Emilia Province in Italy, by considering environmental information from 45 municipalities. Principal component analysis is used to condense an initial set of 19 indicators into 6 fundamental dimensions that highlight patterns of environmental conditions at the provincial scale. These dimensions are further aggregated into two indices of environmental performance through fuzzy sets. The simple form of these indices makes them particularly suitable for public communication, as they condense a wide set of heterogeneous indicators. The main outcomes of the analysis and the potential applications of the method are discussed.
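    The condensation step can be sketched as follows: standardized municipal indicators are reduced with PCA and the leading dimensions are combined into a single index using stakeholder weights. The fuzzy-set membership stage is simplified here to a min-max rescaling, and all data and weights are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler, minmax_scale

rng = np.random.default_rng(0)
indicators = rng.normal(size=(45, 19))   # 45 municipalities x 19 indicators

# Condense the 19 indicators into a few principal dimensions.
scores = PCA(n_components=6).fit_transform(StandardScaler().fit_transform(indicators))

# Rescale each dimension to [0, 1] (a crude stand-in for fuzzy membership
# functions) and combine with stakeholder-derived weights into one index.
memberships = minmax_scale(scores, axis=0)
weights = np.array([0.30, 0.20, 0.15, 0.15, 0.10, 0.10])
index = memberships @ weights
print(index.shape, float(index.min()), float(index.max()))
```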

  13. Distributed intelligent control and management (DICAM) applications and support for semi-automated development

    NASA Technical Reports Server (NTRS)

    Hayes-Roth, Frederick; Erman, Lee D.; Terry, Allan; Hayes-Roth, Barbara

    1992-01-01

    We have recently begun a 4-year effort to develop a new technology foundation and associated methodology for the rapid development of high-performance intelligent controllers. Our objective in this work is to enable system developers to create effective real-time systems for control of multiple, coordinated entities in much less time than is currently required. Our technical strategy for achieving this objective is like that in other domain-specific software efforts: analyze the domain and task underlying effective performance, construct parametric or model-based generic components and overall solutions to the task, and provide excellent means for specifying, selecting, tailoring or automatically generating the solution elements particularly appropriate for the problem at hand. In this paper, we first present our specific domain focus, briefly describe the methodology and environment we are developing to provide a more regular approach to software development, and then later describe the issues this raises for the research community and this specific workshop.

  14. Modal Identification in an Automotive Multi-Component System Using HS 3D-DIC

    PubMed Central

    López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2018-01-01

    The modal characterization of automotive lighting systems is difficult to perform with contact sensors because of the low weight of the elements that make up the component and the intricate access required to place the sensors. In experimental modal analysis, high-speed 3D digital image correlation (HS 3D-DIC) is attracting attention, since its main advantage over other techniques is that it provides full-field, contactless measurements of 3D displacements. Different methodologies have been published that perform modal identification, i.e., extraction of natural frequencies, damping ratios, and mode shapes, using the full-field information. In this work, experimental modal analysis has been performed on a multi-component automotive lighting system using HS 3D-DIC. Base motion excitation was applied to simulate operating conditions. A recently validated methodology has been employed for modal identification using transmissibility functions, i.e., the transfer functions from base motion tests. The results make it possible to identify the local and global behavior of the different elements, made of injected polymeric and metallic materials. PMID:29401725
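    One common way to estimate a transmissibility function from a base-motion test is T(f) = S_xy(f)/S_xx(f), the ratio of the cross-spectrum between base input and measured response to the base auto-spectrum, with peaks indicating candidate natural frequencies. The synthetic single-mode structure below is purely illustrative and is not the paper's procedure in detail:

```python
import numpy as np
from scipy.signal import bilinear, csd, lfilter, welch

fs = 2000.0
rng = np.random.default_rng(0)
base = rng.normal(size=20000)                  # broadband base excitation

# Illustrative single-degree-of-freedom structure: 120 Hz mode, 2% damping,
# discretized with the bilinear transform and driven by the base signal.
fn, zeta = 120.0, 0.02
wn = 2 * np.pi * fn
b, a = bilinear([wn**2], [1.0, 2 * zeta * wn, wn**2], fs)
response = lfilter(b, a, base) + 0.01 * rng.normal(size=base.size)

# Transmissibility estimate T(f) = S_xy(f) / S_xx(f); its peak marks the mode.
f, s_xy = csd(base, response, fs=fs, nperseg=2048)
_, s_xx = welch(base, fs=fs, nperseg=2048)
transmissibility = np.abs(s_xy / s_xx)
print(round(f[np.argmax(transmissibility)], 1))  # ~ natural frequency in Hz
```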

  15. An easy, rapid and inexpensive method to monitor tributyltin (TBT) toxicity in the laboratory.

    PubMed

    Cruz, Andreia; Moreira, Rafael; Mendo, Sónia

    2014-05-01

    Tributyltin (TBT) contamination remains a major problem worldwide. Many laboratories are committed to the development of remediation methodologies that could help reduce the negative impact of this compound in the environment. Furthermore, it is important to have at hand simple methodologies for evaluating TBT toxicity in the laboratory, besides the use of complex and costly analytical instrumentation. With that purpose, a method was adapted that is based on the inhibition of growth of an indicator strain, Micrococcus luteus ATCC 9341, in the presence of TBT. Different types of matrices, TBT concentrations and sample treatments were tested. The results herein reported show that the bioassay method can be applied to both aqueous and soil samples and over a wide range of TBT concentrations (at least up to 500 μmol/L). Besides being cheap and simple, the bioassay can be performed in any laboratory. Additionally, one possible application of the method to monitor TBT degradation is presented as an example.

  16. A Tensor Product Formulation of Strassen's Matrix Multiplication Algorithm with Memory Reduction

    DOE PAGES

    Kumar, B.; Huang, C. -H.; Sadayappan, P.; ...

    1995-01-01

    In this article, we present a program generation strategy of Strassen's matrix multiplication algorithm using a programming methodology based on tensor product formulas. In this methodology, block recursive programs such as the fast Fourier Transforms and Strassen's matrix multiplication algorithm are expressed as algebraic formulas involving tensor products and other matrix operations. Such formulas can be systematically translated to high-performance parallel/vector codes for various architectures. In this article, we present a nonrecursive implementation of Strassen's algorithm for shared memory vector processors such as the Cray Y-MP. A previous implementation of Strassen's algorithm synthesized from tensor product formulas required working storage of size O(7^n) for multiplying 2^n × 2^n matrices. We present a modified formulation in which the working storage requirement is reduced to O(4^n). The modified formulation exhibits sufficient parallelism for efficient implementation on a shared memory multiprocessor. Performance results on a Cray Y-MP8/64 are presented.
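    For reference, the underlying algorithm itself can be written as a plain recursive Strassen multiply for 2^n × 2^n matrices (this is not the nonrecursive, tensor-product formulation the article develops):

```python
import numpy as np

def strassen(A, B, cutoff=64):
    """Multiply square matrices whose side is a power of two using Strassen's
    seven-multiplication recursion; fall back to NumPy below the cutoff."""
    n = A.shape[0]
    if n <= cutoff:
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

A = np.random.rand(128, 128)
B = np.random.rand(128, 128)
print(np.allclose(strassen(A, B), A @ B))  # sanity check against NumPy
```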

  17. The Application of the Microgenetic Method to Studies of Learning in Science Education: Characteristics of Published Studies, Methodological Issues and Recommendations for Future Research

    ERIC Educational Resources Information Center

    Brock, Richard; Taber, Keith S.

    2017-01-01

    This paper examines the role of the microgenetic method in science education. The microgenetic method is a technique for exploring the progression of learning in detail through repeated, high-frequency observations of a learner's "performance" in some activity. Existing microgenetic studies in science education are analysed. This leads…

  18. Spontaneous Analogy by Piggybacking on a Perceptual System

    DTIC Science & Technology

    2013-08-01

    [Fragmentary record: cites "High-level Perception, Representation, and Analogy: A Critique of Artificial Intelligence Methodology," J. Exp. Theor. Artif. Intell., 4(3), 1992; performing organization: Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory (Code 5510), Washington, DC 20375; contact: David W. Aha.]

  19. Exploitation of Unintentional Information Leakage from Integrated Circuits

    DTIC Science & Technology

    2011-12-01

    A U.S. Defense Science Board Task Force examined the effects and risks of outsourcing high-performance microchip production to foreign countries [Off05 ...]. A mapping methodology is developed and demonstrated to comprehensively assess the information leakage of arbitrary block cipher implementations. The ... engineering poses a serious threat since it can enable competitors or adversaries to bypass years of research and development through counterfeiting or ...

  20. The Effect of Communication Skills Training by Video Feedback Method on Clinical Skills of Interns of Isfahan University of Medical Sciences Compared to Didactic Methods

    ERIC Educational Resources Information Center

    Managheb, S. E.; Zamani, A.; Shams, B.; Farajzadegan, Z.

    2012-01-01

    Background: Effective communication is essential to the practice of high-quality medicine. There are methodological challenges in communication skills training. This study was performed in order to assess the educational benefits of communication skills training by video feedback method versus traditional formats such as lectures on clinical…

  1. Eastern Renewable Generation Integration Study: Redefining What’s Possible for Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bloom, Aaron

    NREL project manager Aaron Bloom introduces NREL’s Eastern Renewable Generation Integration Study (ERGIS) and high-performance computing capabilities and new methodologies that allowed NREL to model operations of the Eastern Interconnection at unprecedented fidelity. ERGIS shows that the Eastern Interconnection can balance the variability and uncertainty of wind and solar photovoltaics at a 5-minute level, for one simulated year.

  2. The Methodology of Magpies

    ERIC Educational Resources Information Center

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  3. Optical tweezers force measurements to study parasites chemotaxis

    NASA Astrophysics Data System (ADS)

    de Thomaz, A. A.; Pozzo, L. Y.; Fontes, A.; Almeida, D. B.; Stahl, C. V.; Santos-Mallet, J. R.; Gomes, S. A. O.; Feder, D.; Ayres, D. C.; Giorgio, S.; Cesar, C. L.

    2009-07-01

    In this work, we propose a methodology to study the chemotaxis of microorganisms in real time using an Optical Tweezers system. Optical Tweezers allow real-time measurement of the force vectors (strength and direction) acting on living parasites under chemical or other kinds of gradients. This appears to be an ideal tool for observing the taxis response of cells and microorganisms with high sensitivity, capturing instantaneous responses to a given stimulus. The forces involved in the movement of unicellular parasites are very small, in the femto- to pico-Newton range, about the same order of magnitude as the forces generated in an Optical Tweezers trap. We applied this methodology to investigate Leishmania amazonensis (L. amazonensis) and Trypanosoma cruzi (T. cruzi) under distinct situations.

  4. Global-Context Based Salient Region Detection in Nature Images

    NASA Astrophysics Data System (ADS)

    Bao, Hong; Xu, De; Tang, Yingjun

    Visual saliency detection provides an alternative methodology for image description in many applications, such as adaptive content delivery and image retrieval. One of the main aims of visual attention in computer vision is to detect and segment the salient regions in an image. In this paper, we employ matrix decomposition to detect salient objects in natural images. To efficiently eliminate high-contrast noise regions in the background, we integrate global context information into the saliency detection. The most salient region can then be selected as the one that is globally most isolated. The proposed approach intrinsically provides an alternative methodology to model attention with low implementation complexity. Experiments show that our approach achieves much better performance than existing state-of-the-art methods.
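    The abstract describes the approach only at a high level; as an illustration of the "globally most isolated" idea, the following minimal sketch scores each image patch by its mean feature distance to all other patches. The synthetic features and function name are assumptions; this is a stand-in for the global-context step, not the authors' matrix decomposition.

```python
# Hedged sketch: score each patch by how isolated its features are from the
# global context (mean distance to all other patches). The real method uses a
# matrix decomposition; this only illustrates "global isolation".
import numpy as np

def global_isolation(features):
    """features: (n_patches, n_dims) array of per-patch descriptors."""
    diffs = features[:, None, :] - features[None, :, :]
    dist = np.linalg.norm(diffs, axis=2)   # pairwise feature distances
    return dist.mean(axis=1)               # high value = globally isolated

rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 3))          # 100 patches, 3-dim colour features
feats[7] += 5.0                            # one patch far from the global context
saliency = global_isolation(feats)
print(int(saliency.argmax()))              # -> 7, the most salient patch
```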

  5. Testing Orr's document delivery test on biomedical journals in South Africa.

    PubMed Central

    Steynberg, S; Rossouw, S F

    1995-01-01

    This paper describes the use of a document delivery test (DDT) to measure the availability of biomedical research journals in South African health sciences libraries. The methodology employed was developed twenty years ago by a team of researchers from the Institute for the Advancement of Medical Communication under the direction of R. H. Orr. The testing of the methodology was in itself an objective of the present research. A citation pool consisting of 307 items was constructed from references to journal articles in papers published in 1989 by South African biomedical researchers. The availability of each article was determined at each of seven medical library sites; the performance was measured and presented as an arithmetical value or document delivery capability index (CI). The results of the tests show a high level of availability, ranging from CI = 81.68 to CI = 92.97 for the journals sampled. The DDT methodology was found to be practical, applicable to such studies, and flexible. Its use is recommended for similar studies. Images PMID:7703944
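    As a rough illustration of how a capability index of this kind can be computed, the sketch below scores each test citation by a delivery-speed category and rescales the mean score to 0-100. The category names and weights are illustrative assumptions, not Orr's original scoring.

```python
# Hedged sketch of a document delivery Capability Index (CI): each citation is
# scored by how quickly it could be supplied; CI is the mean score rescaled to
# 0-100. The category weights below are illustrative, not Orr's original values.
SPEED_SCORES = {"same_day": 5, "within_week": 4, "within_month": 3,
                "interlibrary_loan": 2, "not_available": 0}

def capability_index(outcomes):
    scores = [SPEED_SCORES[o] for o in outcomes]
    return 100.0 * sum(scores) / (len(scores) * max(SPEED_SCORES.values()))

# one hypothetical site's results for a 307-item citation pool
outcomes = ["same_day"] * 250 + ["within_week"] * 40 + ["not_available"] * 17
print(round(capability_index(outcomes), 2))
```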

  6. An Effective Modal Approach to the Dynamic Evaluation of Fracture Toughness of Quasi-Brittle Materials

    NASA Astrophysics Data System (ADS)

    Ferreira, L. E. T.; Vareda, L. V.; Hanai, J. B.; Sousa, J. L. A. O.; Silva, A. I.

    2017-05-01

    A modal dynamic analysis is used as the tool to evaluate the fracture toughness of concrete from the results of notched-through beam tests. The dimensionless functions describing the relation between the frequencies and the specimen geometry, used for identifying the variation in the natural frequency as a function of crack depth, are first determined for a 150 × 150 × 500-mm notched-through specimen. The frequency decrease resulting from the propagating crack is modeled through a modal/fracture mechanics approach, leading to the determination of an effective crack length. This length, obtained numerically, is used to evaluate the fracture toughness of concrete, the critical crack mouth opening displacements, and the proposed brittleness index. The methodology is applied to tests performed on high-strength concrete specimens. The frequency response for each specimen is evaluated before and after each crack propagation step. The methodology is then validated by comparison with results from the application of other methodologies described in the literature and suggested by RILEM.
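    A minimal sketch of the inversion step, assuming a monotone dimensionless frequency-reduction function g(alpha) relating the crack depth ratio to the natural-frequency ratio. The function g used here is purely hypothetical; the paper determines this relation numerically for the specimen geometry.

```python
# Hedged sketch: recover an effective crack depth from the measured drop in the
# first natural frequency, given g(alpha) = f(alpha)/f(0), alpha = a/d.
# g below is hypothetical; the paper computes it numerically for the specimen.
from scipy.optimize import brentq

def g(alpha):
    # hypothetical monotone frequency-reduction curve
    return 1.0 - 0.45 * alpha**1.5

def effective_crack_depth(f_cracked, f_uncracked, depth_mm=150.0):
    ratio = f_cracked / f_uncracked
    alpha = brentq(lambda a: g(a) - ratio, 0.0, 0.99)  # solve g(alpha) = ratio
    return alpha * depth_mm

print(effective_crack_depth(f_cracked=410.0, f_uncracked=450.0))  # mm, illustrative
```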

  7. Ergonomics program management in Tucuruí Hydropower Plant using TPM methodology.

    PubMed

    Santos, R M; Sassi, A C; Sá, B M; Miguez, S A; Pardauil, A A

    2012-01-01

    This paper aims to present the benefits achieved in ergonomics process management through the use of the TPM (Total Productive Maintenance) methodology at the Tucuruí Hydropower Plant. The methodology is aligned with the corporate guidelines and with the company's strategic planning; it is represented in the TPM Pillars, including the Health Pillar, under which the ergonomics process falls. The ergonomic actions produced a 12% reduction in the absenteeism rate due to musculoskeletal disorders and resolved 77.0% of ergonomic non-conformities, which contributed to a 44.8% rise in the organizational climate index, impacting the overall performance of the company. Awards confirmed the success of the work: the Award for TPM Excellence in 2001, the Award for Excellence in Consistent TPM Commitment in 2009 and, more recently, the Special Award for TPM Achievement in 2010. The determination of senior management and workers, allied with the involvement and dynamism of the Pillars, has assured the success of this management practice at the Tucuruí Hydropower Plant.

  8. Assessing the Fire Risk for a Historic Hangar

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Morrison, Richard S.

    2010-01-01

    NASA Ames Research Center (ARC) is evaluating options of reuse of its historic Hangar 1. As a part of this evaluation, a qualitative fire risk assessment study was performed to evaluate the potential threat of combustion of the historic hangar. The study focused on the fire risk trade-off of either installing or not installing a Special Hazard Fire Suppression System in the Hangar 1 deck areas. The assessment methodology was useful in discussing the important issues among various groups within the Center. Once the methodology was deemed acceptable, the results were assessed. The results showed that the risk remained in the same risk category, whether Hangar 1 does or does not have a Special Hazard Fire Suppression System. Note that the methodology assessed the risk to Hangar 1 and not the risk to an aircraft in the hangar. If one had a high value aircraft, the aircraft risk analysis could potentially show a different result. The assessed risk results were then communicated to management and other stakeholders.

  9. The SAMI Galaxy Survey: cubism and covariance, putting round pegs into square holes

    NASA Astrophysics Data System (ADS)

    Sharp, R.; Allen, J. T.; Fogarty, L. M. R.; Croom, S. M.; Cortese, L.; Green, A. W.; Nielsen, J.; Richards, S. N.; Scott, N.; Taylor, E. N.; Barnes, L. A.; Bauer, A. E.; Birchall, M.; Bland-Hawthorn, J.; Bloom, J. V.; Brough, S.; Bryant, J. J.; Cecil, G. N.; Colless, M.; Couch, W. J.; Drinkwater, M. J.; Driver, S.; Foster, C.; Goodwin, M.; Gunawardhana, M. L. P.; Ho, I.-T.; Hampton, E. J.; Hopkins, A. M.; Jones, H.; Konstantopoulos, I. S.; Lawrence, J. S.; Leslie, S. K.; Lewis, G. F.; Liske, J.; López-Sánchez, Á. R.; Lorente, N. P. F.; McElroy, R.; Medling, A. M.; Mahajan, S.; Mould, J.; Parker, Q.; Pracy, M. B.; Obreschkow, D.; Owers, M. S.; Schaefer, A. L.; Sweet, S. M.; Thomas, A. D.; Tonini, C.; Walcher, C. J.

    2015-01-01

    We present a methodology for the regularization and combination of sparsely sampled and irregularly gridded observations from fibre-optic multiobject integral field spectroscopy. The approach minimizes interpolation and retains image resolution on combining subpixel dithered data. We discuss the methodology in the context of the Sydney-AAO multiobject integral field spectrograph (SAMI) Galaxy Survey underway at the Anglo-Australian Telescope. The SAMI instrument uses 13 fibre bundles to perform high-multiplex integral field spectroscopy across a 1° diameter field of view. The SAMI Galaxy Survey is targeting ˜3000 galaxies drawn from the full range of galaxy environments. We demonstrate that the subcritical sampling of the seeing and the incomplete fill factor of the integral field bundles result in only a 10 per cent degradation in the final image resolution recovered. We also implement a new methodology for tracking covariance between elements of the resulting data cubes, which retains 90 per cent of the covariance information while incurring only a modest increase in the survey data volume.
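    A minimal sketch of the covariance-tracking idea, assuming each output spaxel is a weighted sum of input fibre fluxes: the output covariance then follows directly from the weight matrix. The weights below are placeholders for the fibre/pixel overlap fractions actually used by the survey pipeline.

```python
# Hedged sketch: regrid dithered fibre fluxes onto a regular grid while tracking
# covariance. Output spaxels are weighted sums of input fibres, so the output
# covariance follows from the (sparse) weight matrix. Weights are illustrative.
import numpy as np

W = np.array([[0.7, 0.3, 0.0, 0.0],      # rows: fibres, cols: output spaxels,
              [0.2, 0.8, 0.0, 0.0],      # entries: overlap weights (placeholders)
              [0.0, 0.5, 0.5, 0.0],
              [0.0, 0.0, 0.6, 0.4],
              [0.0, 0.0, 0.1, 0.9]])

flux = np.array([1.0, 1.2, 0.9, 1.1, 1.0])   # fibre fluxes
var = np.full(len(flux), 0.01)               # independent fibre variances

spaxel_flux = W.T @ flux
spaxel_cov = W.T @ np.diag(var) @ W          # off-diagonals: covariance from shared fibres
print(spaxel_flux, np.round(spaxel_cov, 4))
```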

  10. A methodology for thermodynamic simulation of high temperature, internal reforming fuel cell systems

    NASA Astrophysics Data System (ADS)

    Matelli, José Alexandre; Bazzo, Edson

    This work presents a methodology for the simulation of fuel cells to be used for power production in small on-site power/cogeneration plants fuelled by natural gas. The methodology contemplates thermodynamic and electrochemical aspects related to molten carbonate and solid oxide fuel cells (MCFC and SOFC, respectively). Internal steam reforming of the natural gas hydrocarbons is considered for hydrogen production. From inputs such as cell potential, cell power, number of cells in the stack, ancillary-systems power consumption, reformed natural gas composition and hydrogen utilization factor, the simulation gives the natural gas consumption, the anode and cathode gas stream temperatures and compositions, and the thermodynamic, electrochemical and practical efficiencies. Both energetic and exergetic methods are considered for the performance analysis. The results obtained from the simulation of natural gas reforming thermodynamics show that hydrogen production is maximum around 700 °C for a steam/carbon ratio equal to 3. Consistent with the literature, the results indicate that the SOFC is more efficient than the MCFC.
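    As a small illustration of the energy-balance side of such a simulation, the sketch below computes a first-law (practical) stack efficiency from cell potential, current, cell count, ancillary consumption and natural-gas input. The numbers and the LHV value are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: practical (first-law) efficiency of a fuel-cell stack from cell
# potential, current, cell count, ancillary power and natural-gas input.
# All values, including the LHV figure, are illustrative.
def stack_efficiency(v_cell, i_cell, n_cells, p_ancillary_kw, ng_flow_kg_s,
                     lhv_ng_mj_kg=47.1):
    p_stack_kw = v_cell * i_cell * n_cells / 1000.0       # DC electrical output
    p_net_kw = p_stack_kw - p_ancillary_kw                 # net after parasitic loads
    fuel_power_kw = ng_flow_kg_s * lhv_ng_mj_kg * 1000.0   # chemical power in the fuel
    return p_net_kw / fuel_power_kw

print(stack_efficiency(v_cell=0.75, i_cell=200.0, n_cells=400,
                       p_ancillary_kw=5.0, ng_flow_kg_s=0.0025))  # ~0.47
```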

  11. Transgenic bovine as bioreactors: Challenges and perspectives

    PubMed Central

    Monzani, Paulo S.; Adona, Paulo R.; Ohashi, Otávio M.; Meirelles, Flávio V.; Wheeler, Matthew B.

    2016-01-01

    ABSTRACT The use of recombinant proteins has increased in diverse commercial sectors. Various systems for protein production have been used for the optimization of production and functional protein expression. The mammary gland is considered to be a very interesting system for the production of recombinant proteins due to its high level of expression and its ability to perform post-translational modifications. Cows produce large quantities of milk over a long period of lactation, and therefore this species is an important candidate for recombinant protein expression in milk. However, transgenic cows are more difficult to generate due to the inefficiency of transgenic methodologies, the long periods for transgene detection, recombinant protein expression and the fact that only a single calf is obtained at the end of each pregnancy. An increase in efficiency for transgenic methodologies for cattle is a big challenge to overcome. Promising methodologies have been proposed that can help to overcome this obstacle, enabling the use of transgenic cattle as bioreactors for protein production in milk for industry. PMID:27166649

  12. Programming methodology for a general purpose automation controller

    NASA Technical Reports Server (NTRS)

    Sturzenbecker, M. C.; Korein, J. U.; Taylor, R. H.

    1987-01-01

    The General Purpose Automation Controller is a multi-processor architecture for automation programming. A methodology has been developed whose aim is to simplify the task of programming distributed real-time systems for users in research or manufacturing. Programs are built by configuring function blocks (low-level computations) into processes using data flow principles. These processes are activated through the verb mechanism. Verbs are divided into two classes: those which support devices, such as robot joint servos, and those which perform actions on devices, such as motion control. This programming methodology was developed in order to achieve the following goals: (1) specifications for real-time programs which are to a high degree independent of hardware considerations such as processor, bus, and interconnect technology; (2) a component approach to software, so that software required to support new devices and technologies can be integrated by reconfiguring existing building blocks; (3) resistance to error and ease of debugging; and (4) a powerful command language interface.
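    A minimal sketch of the function-block/data-flow idea under stated assumptions: low-level computations are wrapped as blocks, chained into a process, and activated as a unit. The class and block names are illustrative; the GPAC runtime and verb mechanism are not reproduced here.

```python
# Hedged sketch: function blocks composed into a data-flow process.
# Names and structure are illustrative, not the GPAC implementation.
class FunctionBlock:
    def __init__(self, fn):
        self.fn = fn
    def __call__(self, *inputs):
        return self.fn(*inputs)

class Process:
    """A data-flow chain of function blocks, executed in order."""
    def __init__(self, *blocks):
        self.blocks = blocks
    def run(self, value):
        for block in self.blocks:
            value = block(value)
        return value

# a device-support computation feeding an action computation
read_joint = FunctionBlock(lambda raw: raw * 0.001)            # counts -> radians
limit_motion = FunctionBlock(lambda q: max(-1.5, min(1.5, q))) # clamp command
servo_process = Process(read_joint, limit_motion)
print(servo_process.run(1800))   # -> 1.5 (clamped joint command)
```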

  13. a Prompt Methodology to Georeference Complex Hypogea Environments

    NASA Astrophysics Data System (ADS)

    Troisi, S.; Baiocchi, V.; Del Pizzo, S.; Giannone, F.

    2017-02-01

    Actually complex underground structures and facilities occupy a wide space in our cities, most of them are often unsurveyed; cable duct, drainage system are not exception. Furthermore, several inspection operations are performed in critical air condition, that do not allow or make more difficult a conventional survey. In this scenario a prompt methodology to survey and georeferencing such facilities is often indispensable. A visual based approach was proposed in this paper; such methodology provides a 3D model of the environment and the path followed by the camera using the conventional photogrammetric/Structure from motion software tools. The key-role is played by the lens camera; indeed, a fisheye system was employed to obtain a very wide field of view (FOV) and therefore high overlapping among the frames. The camera geometry is in according to a forward motion along the axis camera. Consequently, to avoid instability of bundle adjustment algorithm a preliminary calibration of camera was carried out. A specific case study was reported and the accuracy achieved.

  14. Rapid monitoring of glycerol in fermentation growth media: Facilitating crude glycerol bioprocess development.

    PubMed

    Abad, Sergi; Pérez, Xavier; Planas, Antoni; Turon, Xavier

    2014-04-01

    Recently, the need for crude glycerol valorisation from the biodiesel industry has generated many studies for practical and economic applications. Amongst them, fermentations based on glycerol media for the production of high value metabolites are prominent applications. This has generated a need to develop analytical techniques which allow fast and simple glycerol monitoring during fermentation. The methodology should be fast and inexpensive to be adopted in research, as well as in industrial applications. In this study three different methods were analysed and compared: two common methodologies based on liquid chromatography and enzymatic kits, and the new method based on a DotBlot assay coupled with image analysis. The new methodology is faster and cheaper than the other conventional methods, with comparable performance. Good linearity, precision and accuracy were achieved in the lower range (10 or 15 g/L to depletion), the most common range of glycerol concentrations to monitor fermentations in terms of growth kinetics. Copyright © 2014 Elsevier B.V. All rights reserved.
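    A minimal sketch of the calibration step common to such monitoring methods: fit a linear model of assay signal versus glycerol concentration over the low range and invert it to quantify unknowns. The standards and signal values below are synthetic.

```python
# Hedged sketch: linear calibration of assay signal vs. glycerol concentration
# over the low range (~0-15 g/L) and inversion to quantify unknowns.
# The data are synthetic, for illustration only.
import numpy as np

conc = np.array([0.0, 2.5, 5.0, 7.5, 10.0, 12.5, 15.0])        # g/L standards
signal = np.array([0.02, 0.21, 0.40, 0.61, 0.79, 1.01, 1.19])  # e.g. spot intensity

slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)

def to_concentration(sample_signal):
    return (sample_signal - intercept) / slope

print(f"R^2 = {r2:.4f}, 0.55 a.u. -> {to_concentration(0.55):.2f} g/L")
```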

  15. KSC management training system project

    NASA Technical Reports Server (NTRS)

    Sepulveda, Jose A.

    1993-01-01

    The stated objectives for the summer of 1993 were: to review the Individual Development Plan Surveys for 1994 in order to automate the analysis of the Needs Assessment effort; and to develop and implement evaluation methodologies to perform ongoing program-wide course-to-course assessment. This includes the following: to propose a methodology to develop and implement objective, performance-based assessment instruments for each training effort; to mechanize course evaluation forms and develop software to facilitate the data gathering, analysis, and reporting processes; and to implement the methodology, forms, and software in at least one training course or seminar selected among those normally offered in the summer at KSC. Section two of this report addresses the work done in regard to the Individual Development Plan Surveys for 1994. Section three presents the methodology proposed to develop and implement objective, performance-based assessment instruments for each training course offered at KSC.

  16. 76 FR 6795 - Statement of Organization, Functions, and Delegations of Authority; Office of the National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-08

    ... Coordinator; (2) applies research methodologies to perform evaluation studies of health information technology grant programs; and, (3) applies advanced mathematical or quantitative modeling to the U.S. health care... remaining items in the paragraph accordingly: ``(1) Applying research methodologies to perform evaluation...

  17. Versatile Methodology to Encapsulate Gold Nanoparticles in PLGA Nanoparticles Obtained by Nano-Emulsion Templating.

    PubMed

    Fornaguera, Cristina; Feiner-Gracia, Natàlia; Dols-Perez, Aurora; García-Celma, Maria José; Solans, Conxita

    2017-05-01

    Gold nanoparticles have proved useful for many biomedical applications, specifically as advanced imaging systems. However, they usually present problems related to stability and toxicity. In the present work, gold nanoparticles have been encapsulated in polymeric nanoparticles using a novel methodology based on nano-emulsion templating. First, gold nanoparticles were transferred from water to ethyl acetate, a solvent classified as class III by the NIH guidelines (low toxic potential). Next, nano-emulsions loaded with gold nanoparticles were formed using a low-energy emulsification method, phase inversion composition (PIC), followed by solvent evaporation to give rise to polymeric nanoparticles. Using this methodology, high concentrations of gold nanoparticles (>100 pM) have been encapsulated. As the gold nanoparticle concentration increases, nano-emulsion and nanoparticle sizes increase, resulting in a decrease in stability. It is noteworthy that the designed nanoparticles produced neither cytotoxicity nor hemolysis at the required concentration. Therefore, it can be concluded that a novel and very versatile methodology has been developed for the production of polymeric nanoparticles loaded with gold nanoparticles. Graphical Abstract: Schematic representation of AuNP-loaded polymeric nanoparticle preparation from nano-emulsion templating.

  18. Thermal signature identification system (TheSIS): a spread spectrum temperature cycling method

    NASA Astrophysics Data System (ADS)

    Merritt, Scott

    2015-03-01

    NASA GSFC's Thermal Signature Identification System (TheSIS) 1) measures the high-order dynamic responses of optoelectronic components to direct-sequence spread-spectrum temperature cycling, 2) estimates the parameters of multiple autoregressive moving average (ARMA) or other models of the responses, and 3) selects the most appropriate model using the Akaike Information Criterion (AIC). Using the AIC-tested model and parameter vectors from TheSIS, one can 1) select high-performing components on a multivariate basis, i.e., with multivariate Figures of Merit (FOMs), 2) detect subtle reversible shifts in performance, and 3) investigate irreversible changes in component or subsystem performance, e.g., aging. We show examples of the TheSIS methodology for passive and active components and systems, e.g., fiber Bragg gratings (FBGs) and DFB lasers with coupled temperature control loops, respectively.
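    A minimal sketch of the model-fitting and AIC-selection steps, assuming statsmodels' ARIMA estimator as a stand-in for the TheSIS parameter estimation; the candidate orders and the synthetic response are illustrative.

```python
# Hedged sketch: fit several candidate ARMA(p, q) models to a measured response
# and keep the one with the lowest AIC. The actual TheSIS estimator and orders
# are not reproduced here; statsmodels is assumed as the fitting library.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# synthetic stand-in for a component's dynamic thermal response
y = np.convolve(rng.normal(size=600), [1.0, 0.6, 0.3], mode="same")

best = None
for p in range(1, 4):
    for q in range(0, 3):
        res = ARIMA(y, order=(p, 0, q)).fit()
        if best is None or res.aic < best[0]:
            best = (res.aic, (p, q))

print("selected ARMA order:", best[1], "AIC:", round(best[0], 1))
```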

  19. Evidence and practice in spine registries

    PubMed Central

    van Hooff, Miranda L; Jacobs, Wilco C H; Willems, Paul C; Wouters, Michel W J M; de Kleuver, Marinus; Peul, Wilco C; Ostelo, Raymond W J G; Fritzell, Peter

    2015-01-01

    Background and purpose We performed a systematic review and a survey in order to (1) evaluate the evidence for the impact of spine registries on the quality of spine care, and with that, on patient-related outcomes, and (2) evaluate the methodology used to organize, analyze, and report the “quality of spine care” from spine registries. Methods To study the impact, the literature on all spinal disorders was searched. To study methodology, the search was restricted to degenerative spinal disorders. The risk of bias in the studies included was assessed with the Newcastle-Ottawa scale. Additionally, a survey among registry representatives was performed to acquire information about the methodology and practice of existing registries. Results 4,273 unique references up to May 2014 were identified, and 1,210 were eligible for screening and assessment. No studies on impact were identified, but 34 studies were identified to study the methodology. Half of these studies (17 of the 34) were judged to have a high risk of bias. The survey identified 25 spine registries, representing 14 countries. The organization of these registries, methods used, analytical approaches, and dissemination of results are presented. Interpretation We found a lack of evidence that registries have had an impact on the quality of spine care, regardless of whether intervention was non-surgical and/or surgical. To improve the quality of evidence published with registry data, we present several recommendations. Application of these recommendations could lead to registries showing trends, monitoring the quality of spine care given, and ultimately improving the value of the care given to patients with degenerative spinal disorders. PMID:25909475

  20. In Vivo Patellofemoral Contact Mechanics During Active Extension Using a Novel Dynamic MRI-based Methodology

    PubMed Central

    Borotikar, Bhushan S.; Sheehan, Frances T.

    2017-01-01

    Objectives To establish an in vivo, normative patellofemoral cartilage contact mechanics database acquired during voluntary muscle control using a novel dynamic magnetic resonance (MR) imaging-based computational methodology and validate the contact mechanics sensitivity to the known sub-millimeter methodological inaccuracies. Design Dynamic cine phase-contrast and multi-plane cine images were acquired while female subjects (n=20, sample of convenience) performed an open kinetic chain (knee flexion-extension) exercise inside a 3-Tesla MR scanner. Static cartilage models were created from high resolution three-dimensional static MR data and accurately placed in their dynamic pose at each time frame based on the cine-PC data. Cartilage contact parameters were calculated based on the surface overlap. Statistical analysis was performed using paired t-test and a one-sample repeated measures ANOVA. The sensitivity of the contact parameters to the known errors in the patellofemoral kinematics was determined. Results Peak mean patellofemoral contact area was 228.7±173.6mm2 at 40° knee angle. During extension, contact centroid and peak strain locations tracked medially on the femoral and patellar cartilage and were not significantly different from each other. At 30°, 35°, and 40° of knee extension, contact area was significantly different. Contact area and centroid locations were insensitive to rotational and translational perturbations. Conclusion This study is a first step towards unfolding the biomechanical pathways to anterior patellofemoral pain and OA using dynamic, in vivo, and accurate methodologies. The database provides crucial data for future studies and for validation of, or as an input to, computational models. PMID:24012620

  1. Evaluation Model for Pavement Surface Distress on 3d Point Clouds from Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Aoki, K.; Yamamoto, K.; Shimamura, H.

    2012-07-01

    This paper proposes a methodology to evaluate pavement surface distress for maintenance planning of road pavement using 3D point clouds from a Mobile Mapping System (MMS). Maintenance planning of road pavement requires scheduled rehabilitation activities for damaged pavement sections in order to keep a high level of service. The importance of this performance-based infrastructure asset management based on actual inspection data is globally recognized. As an inspection methodology for the road pavement surface, semi-automatic measurement systems that use inspection vehicles to measure surface deterioration indexes, such as cracking, rutting and IRI, have already been introduced and are capable of continuously archiving pavement performance data. However, scheduled inspection with an automatic measurement vehicle is costly, depending on the instruments' specifications and the inspection interval. Therefore, implementation of road maintenance work, especially for local governments, is difficult from a cost-effectiveness standpoint. Against this background, this research proposes methodologies for a simplified evaluation of the pavement surface and for the assessment of damaged pavement sections using the 3D point cloud data acquired to build urban 3D models. The simplified evaluation of the road surface provides useful information for the road administrator to identify pavement sections requiring a detailed examination or an immediate repair. In particular, the regularity of the 3D point clouds was evaluated using Chow-test and F-test models by extracting the sections where a structural change in the coordinate values was prominent. Finally, the validity of the methodology was investigated through a case study using actual inspection data from local roads.
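    As an illustration of the structural-change check mentioned above, the sketch below runs a Chow test on a simple linear model of surface elevation versus chainage, split at a candidate change point. The regression model and split choice are assumptions for illustration only.

```python
# Hedged sketch: Chow test for a structural change in a linear fit of surface
# elevation vs. chainage. The model and split point are illustrative only.
import numpy as np
from scipy.stats import f as f_dist

def rss(x, y):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

def chow_test(x, y, split, k=2):
    x1, y1, x2, y2 = x[:split], y[:split], x[split:], y[split:]
    rss_pooled, rss_split = rss(x, y), rss(x1, y1) + rss(x2, y2)
    dof2 = len(x) - 2 * k
    F = ((rss_pooled - rss_split) / k) / (rss_split / dof2)
    return F, 1 - f_dist.cdf(F, k, dof2)

x = np.linspace(0, 10, 200)                                 # chainage (m)
y = 0.02 * x + np.where(x > 6, 0.03, 0.0) \
    + np.random.default_rng(2).normal(0, 0.005, 200)        # elevation with a step
print(chow_test(x, y, split=120))   # large F / small p -> candidate distressed section
```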

  2. Advanced biosensing methodologies developed for evaluating performance quality and safety of emerging biophotonics technologies and medical devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin

    2016-03-01

    Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for the transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to expand rapidly, with advanced developments across the entire spectrum of biomedical applications ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial community and the FDA face the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated due to the lack of effective means and standard test methods to precisely analyze the safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate the fundamental characteristics, performance quality and safety of these technologies and devices. Here, we will overview our recent developments of novel test methodologies for the safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contamination of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.

  3. Author-paper affiliation network architecture influences the methodological quality of systematic reviews and meta-analyses of psoriasis.

    PubMed

    Sanz-Cabanillas, Juan Luis; Ruano, Juan; Gomez-Garcia, Francisco; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Maestre-Lopez, Beatriz; Gonzalez-Padilla, Marcelino; Carmona-Fernandez, Pedro J; Velez Garcia-Nieto, Antonio; Isla-Tejera, Beatriz

    2017-01-01

    Moderate-to-severe psoriasis is associated with significant comorbidity, an impaired quality of life, and increased medical costs, including those associated with treatments. Systematic reviews (SRs) and meta-analyses (MAs) of randomized clinical trials are considered two of the best approaches to the summarization of high-quality evidence. However, methodological bias can reduce the validity of conclusions from these types of studies and subsequently impair the quality of decision making. As co-authorship is among the most well-documented forms of research collaboration, the present study aimed to explore whether authors' collaboration methods might influence the methodological quality of SRs and MAs of psoriasis. Methodological quality was assessed by two raters who extracted information from full articles. After calculating total and per-item Assessment of Multiple Systematic Reviews (AMSTAR) scores, reviews were classified as low (0-4), medium (5-8), or high (9-11) quality. Article metadata and journal-related bibliometric indices were also obtained. A total of 741 authors from 520 different institutions and 32 countries published 220 reviews that were classified as high (17.2%), moderate (55%), or low (27.7%) methodological quality. The high methodological quality subnetwork was larger but had a lower connection density than the low and moderate methodological quality subnetworks; specifically, the former contained relatively fewer nodes (authors and reviews), reviews by authors, and collaborators per author. Furthermore, the high methodological quality subnetwork was highly compartmentalized, with several modules representing few poorly interconnected communities. In conclusion, structural differences in author-paper affiliation network may influence the methodological quality of SRs and MAs on psoriasis. As the author-paper affiliation network structure affects study quality in this research field, authors who maintain an appropriate balance between scientific quality and productivity are more likely to develop higher quality reviews.
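    A minimal sketch of the network construction under stated assumptions: a bipartite author-review graph per quality stratum, from which size, connection density and reviews per author can be compared. The edge list is a toy example and networkx is assumed as the graph library.

```python
# Hedged sketch: bipartite author-review network for one quality stratum, with
# the simple descriptors compared across strata. The edge list is a toy example.
import networkx as nx
from networkx.algorithms import bipartite

edges = [("author_A", "review_1"), ("author_B", "review_1"),
         ("author_B", "review_2"), ("author_C", "review_3")]

G = nx.Graph(edges)
authors = {a for a, _ in edges}

print("nodes:", G.number_of_nodes(),
      "bipartite density:", round(bipartite.density(G, authors), 3),
      "reviews/author:", round(sum(G.degree(a) for a in authors) / len(authors), 2))
```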

  4. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    PubMed

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  5. Macroergonomic analysis and design for improved safety and quality performance.

    PubMed

    Kleiner, B M

    1999-01-01

    Macroergonomics, which emerged historically after sociotechnical systems theory, quality management, and ergonomics, is presented as the basis for a needed integrative methodology. A macroergonomics methodology was presented in some detail to demonstrate how aspects of microergonomics, total quality management (TQM), and sociotechnical systems (STS) can be triangulated in a common approach. In the context of this methodology, quality and safety were presented as 2 of several important performance criteria. To demonstrate aspects of the methodology, 2 case studies were summarized with safety and quality performance results where available. The first case manipulated both personnel and technical factors to achieve a "safety culture" at a nuclear site. The concept of safety culture is defined in INSAG-4 (International Atomic Energy Agency, 1991). as "that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance." The second case described a tire manufacturing intervention to improve quality (as defined by Sink and Tuttle, 1989) through joint consideration of technical and social factors. It was suggested that macroergonomics can yield greater performance than can be achieved through ergonomic intervention alone. Whereas case studies help to make the case, more rigorous formative and summative research is needed to refine and validate the proposed methodology respectively.

  6. Impact of design-parameters on the optical performance of a high-power adaptive mirror

    NASA Astrophysics Data System (ADS)

    Koek, Wouter D.; Nijkerk, David; Smeltink, Jeroen A.; van den Dool, Teun C.; van Zwet, Erwin J.; van Baars, Gregor E.

    2017-02-01

    TNO is developing a High Power Adaptive Mirror (HPAM) to be used in the CO2 laser beam path of an Extreme Ultra-Violet (EUV) light source for next-generation lithography. In this paper we report on a developed methodology, and the necessary simulation tools, to assess the performance and associated sensitivities of this deformable mirror. Our analyses show that, given the current limited insight concerning the process window of EUV generation, the HPAM module should have an actuator pitch of <= 4 mm. Furthermore, we have modelled the sensitivity of performance with respect to dimpling and actuator noise. For example, for a deformable mirror with an actuator pitch of 4 mm, and if the associated performance impact is to be limited to smaller than 5%, the actuator noise should be smaller than 45 nm (rms). Our tools assist in the detailed design process by assessing the performance impact of various design choices, including for example those that affect the shape and spectral content of the influence function.

  7. Measure Guideline: Passive Vents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berger, David; Neri, Robin

    2016-02-05

    This document addresses the use of passive vents as a source of outdoor air in multifamily buildings. The challenges associated with implementing passive vents and the factors affecting performance are outlined. A comprehensive design methodology and quantified performance metrics are provided. Two hypothetical design examples are provided to illustrate the process. This document is intended to be useful to designers, decision-makers, and contractors implementing passive ventilation strategies. It is also intended to be a resource for those responsible for setting high-performance building program requirements, especially pertaining to ventilation and outdoor air. To ensure good indoor air quality, a dedicated source of outdoor air is an integral part of high-performance buildings. Presently, there is a lack of guidance pertaining to the design and installation of passive vents, resulting in poor system performance. This report details the criteria necessary for designing, constructing, and testing passive vent systems to enable them to provide consistent and reliable levels of ventilation air from outdoors.

  8. Polymer-Templated LiFePO4/C Nanonetworks as High-Performance Cathode Materials for Lithium-Ion Batteries.

    PubMed

    Fischer, Michael G; Hua, Xiao; Wilts, Bodo D; Castillo-Martínez, Elizabeth; Steiner, Ullrich

    2018-01-17

    Lithium iron phosphate (LFP) is currently one of the main cathode materials used in lithium-ion batteries due to its safety, relatively low cost, and exceptional cycle life. To overcome its poor ionic and electrical conductivities, LFP is often nanostructured, and its surface is coated with conductive carbon (LFP/C). Here, we demonstrate a sol-gel based synthesis procedure that utilizes a block copolymer (BCP) as a templating agent and a homopolymer as an additional carbon source. The high-molecular-weight BCP produces self-assembled aggregates with the precursor-sol on the 10 nm scale, stabilizing the LFP structure during crystallization at high temperatures. This results in a LFP nanonetwork consisting of interconnected ∼10 nm-sized particles covered by a uniform carbon coating that displays a high rate performance and an excellent cycle life. Our "one-pot" method is facile and scalable for use in established battery production methodologies.

  9. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    NASA Astrophysics Data System (ADS)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of the seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (S_d) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between S_d and peak ground acceleration (PGA) is established, and a new vulnerability function is then expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess the seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrates the relationship between the seismic capacity of buildings and the seismic action. The estimated results are compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
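    A minimal sketch of the SCev idea: the expected seismic capacity index as the probability-weighted sum of a per-damage-level capacity index. The damage-state probabilities and index values below are illustrative, not the paper's.

```python
# Hedged sketch: expected seismic capacity index (SCev) as the probability-weighted
# sum of a capacity index assigned to each damage level. Values are illustrative.
damage_levels = ["none", "slight", "moderate", "extensive", "complete"]
p_damage      = [0.30, 0.35, 0.20, 0.10, 0.05]   # from the vulnerability (PGA) model
capacity_idx  = [1.00, 0.80, 0.55, 0.30, 0.10]   # per-level seismic capacity index

sc_ev = sum(p * c for p, c in zip(p_damage, capacity_idx))
print(round(sc_ev, 3))   # expected seismic capacity index for this building class
```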

  10. MiRduplexSVM: A High-Performing MiRNA-Duplex Prediction and Evaluation Methodology

    PubMed Central

    Karathanasis, Nestoras; Tsamardinos, Ioannis; Poirazi, Panayiota

    2015-01-01

    We address the problem of predicting the position of a miRNA duplex on a microRNA hairpin via the development and application of a novel SVM-based methodology. Our method combines a unique problem representation and an unbiased optimization protocol to learn from mirBase19.0 an accurate predictive model, termed MiRduplexSVM. This is the first model that provides precise information about all four ends of the miRNA duplex. We show that (a) our method outperforms four state-of-the-art tools, namely MaturePred, MiRPara, MatureBayes, MiRdup as well as a Simple Geometric Locator when applied on the same training datasets employed for each tool and evaluated on a common blind test set. (b) In all comparisons, MiRduplexSVM shows superior performance, achieving up to a 60% increase in prediction accuracy for mammalian hairpins and can generalize very well on plant hairpins, without any special optimization. (c) The tool has a number of important applications such as the ability to accurately predict the miRNA or the miRNA*, given the opposite strand of a duplex. Its performance on this task is superior to the 2nts overhang rule commonly used in computational studies and similar to that of a comparative genomic approach, without the need for prior knowledge or the complexity of performing multiple alignments. Finally, it is able to evaluate novel, potential miRNAs found either computationally or experimentally. In relation with recent confidence evaluation methods used in miRBase, MiRduplexSVM was successful in identifying high confidence potential miRNAs. PMID:25961860

  11. Large scale nonlinear programming for the optimization of spacecraft trajectories

    NASA Astrophysics Data System (ADS)

    Arrieta-Camacho, Juan Jose

    Despite the availability of high-fidelity mathematical models, the computation of accurate optimal spacecraft trajectories has never been an easy task. While simplified models of spacecraft motion can provide useful estimates on energy requirements, sizing, and cost, the actual launch window and maneuver scheduling must rely on more accurate representations. We propose an alternative for the computation of optimal transfers that uses an accurate representation of the spacecraft dynamics. Like other methodologies for trajectory optimization, this alternative is able to consider all major disturbances. In contrast, it can handle equality and inequality constraints explicitly throughout the trajectory; it requires neither the derivation of costate equations nor the identification of the constrained arcs. The alternative consists of two steps: (1) discretizing the dynamic model using high-order collocation at Radau points, which displays numerical advantages, and (2) solving the resulting Nonlinear Programming (NLP) problem using an interior-point method, which does not suffer from the performance bottleneck associated with identifying the active set, as required by sequential quadratic programming methods. In this way the methodology exploits the availability of sound numerical methods and next-generation NLP solvers. In practice the methodology is versatile; it can be applied to a variety of aerospace problems such as homing, guidance, and aircraft collision avoidance, and it is particularly well suited for low-thrust spacecraft trajectory optimization. Examples are presented which consider the optimization of a low-thrust orbit transfer subject to the main disturbances due to Earth's gravity field together with lunar and solar attraction. Another example considers the optimization of a multiple-asteroid rendezvous problem. In both cases, the ability of the proposed methodology to consider non-standard objective functions and constraints is illustrated. Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such research pursuits. The collocation scheme and nonlinear programming algorithm presented in this work complement other existing methodologies by providing reliable and efficient numerical methods able to handle large-scale, nonlinear dynamic models.

  12. Application of hybrid methodology to rotors in steady and maneuvering flight

    NASA Astrophysics Data System (ADS)

    Rajmohan, Nischint

    Helicopters are versatile flying machines with capabilities that are unparalleled by fixed-wing aircraft, such as operating in hover and performing vertical takeoff and landing on unprepared sites. This makes their use especially desirable in military and search-and-rescue operations. However, modern helicopters still suffer from high levels of noise and vibration caused by the physical phenomena occurring in the vicinity of the rotor blades. Improvement in rotorcraft design to reduce noise and vibration levels therefore requires understanding of the underlying physical phenomena and accurate prediction of the resulting rotorcraft aeromechanics. The goal of this research is to study the aeromechanics of rotors in steady and maneuvering flight using a hybrid Computational Fluid Dynamics (CFD) methodology. The hybrid CFD methodology uses the Navier-Stokes equations to solve the flow near the blade surface, while the effect of the far wake is computed through a wake model. The hybrid CFD methodology is computationally efficient, and its wake modeling approach is nondissipative, making it an attractive tool to study rotorcraft aeromechanics. Several enhancements were made to the CFD methodology, and it was coupled to a Computational Structural Dynamics (CSD) methodology to perform a trimmed aeroelastic analysis of a rotor in forward flight. The coupling analyses, both loose and tight, were used to identify the key physical phenomena that affect rotors in different steady flight regimes. The modeling enhancements improved the airload predictions for a variety of flight conditions. It was found that the tightly coupled method did not impact the loads significantly for steady flight conditions compared to the loosely coupled method. The coupling methodology was extended to maneuvering flight analysis by enhancing the computational and structural models to handle non-periodic flight conditions and vehicle motions in time-accurate mode. The flight-test control angles were employed to enable the maneuvering flight analysis. The fully coupled model captured the presence of three dynamic stall cycles on the rotor in the maneuver. It is important to mention that analysis of maneuvering flight requires knowledge of the pilot's control pitch settings and the vehicle states. As a result, these computational tools cannot be used for analysis of loads in a maneuver that has not been duplicated in a real flight. This is a significant limitation if these tools are to be used during the design phase of a helicopter, where its handling qualities are evaluated in different trajectories. Therefore, a methodology was developed to couple the CFD/CSD simulation with an inverse flight mechanics simulation to perform the maneuver analysis without using the flight-test control input. The methodology showed reasonable convergence in the steady flight regime, and the predicted control angles compared fairly well with the test data. In the maneuvering flight regions, the convergence was slower due to the relaxation techniques used for numerical stability. The computed control angles for the maneuvering flight regions nevertheless compared well with the test data. Further, the enhancement of the rotor inflow computations in the inverse simulation through the implementation of a Lagrangian wake model improved the convergence of the coupling methodology.
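    A minimal sketch of the loose (per-revolution) coupling loop described above, with stub functions standing in for the trim, CSD and CFD solvers; the convergence criterion and data shapes are illustrative assumptions, not the actual codes.

```python
# Hedged sketch of a loose CFD/CSD coupling loop: trim the rotor, exchange
# airloads and blade deflections until the airloads stop changing.
# All functions are stubs standing in for the actual solvers.
def trim_controls(airloads):               # trim solution: returns control angles
    return {"theta0": 8.0, "theta1c": 1.5, "theta1s": -3.0}

def csd_deflections(airloads, controls):   # comprehensive-code structural response
    return [0.01 * i for i in range(24)]   # azimuthal blade deflections (stub)

def cfd_airloads(deflections, controls):   # Navier-Stokes + wake model (stub)
    return [100.0 + 0.5 * d for d in deflections]

airloads, tol = [100.0] * 24, 1e-3
for it in range(20):
    controls = trim_controls(airloads)
    defl = csd_deflections(airloads, controls)
    new_airloads = cfd_airloads(defl, controls)
    delta = max(abs(a - b) for a, b in zip(new_airloads, airloads))
    airloads = new_airloads
    if delta < tol:                        # converged: airloads consistent with structure
        break
print("converged after", it + 1, "coupling iterations")
```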

  13. A Simple Index for the High-Citation Tail of Citation Distribution to Quantify Research Performance in Countries and Institutions

    PubMed Central

    Rodríguez-Navarro, Alonso

    2011-01-01

    Background Conventional scientometric predictors of research performance such as the number of papers, citations, and papers in the top 1% of highly cited papers cannot be validated in terms of the number of Nobel Prize achievements across countries and institutions. The purpose of this paper is to find a bibliometric indicator that correlates with the number of Nobel Prize achievements. Methodology/Principal Findings This study assumes that the high-citation tail of citation distribution holds most of the information about high scientific performance. Here I propose the x-index, which is calculated from the number of national articles in the top 1% and 0.1% of highly cited papers and has a subtractive term to discount highly cited papers that are not scientific breakthroughs. The x-index, the number of Nobel Prize achievements, and the number of national articles in Nature or Science are highly correlated. The high correlations among these independent parameters demonstrate that they are good measures of high scientific performance because scientific excellence is their only common characteristic. However, the x-index has superior features as compared to the other two parameters. Nobel Prize achievements are low frequency events and their number is an imprecise indicator, which in addition is zero in most institutions; the evaluation of research making use of the number of publications in prestigious journals is not advised. Conclusion The x-index is a simple and precise indicator for high research performance. PMID:21647383

  14. Avoidable waste related to inadequate methods and incomplete reporting of interventions: a systematic review of randomized trials performed in Sub-Saharan Africa.

    PubMed

    Ndounga Diakou, Lee Aymar; Ntoumi, Francine; Ravaud, Philippe; Boutron, Isabelle

    2017-07-05

    Randomized controlled trials (RCTs) are needed to improve health care in Sub-Saharan Africa (SSA). However, inadequate methods and incomplete reporting of interventions can prevent the translation of research into practice, which leads to research waste. The aim of this systematic review was to assess the avoidable waste in research related to inadequate methods and incomplete reporting of interventions in RCTs performed in SSA. We performed a methodological systematic review of RCTs performed in SSA and published between 1 January 2014 and 31 March 2015. We searched PubMed, the Cochrane Library and the African Index Medicus to identify reports. We assessed the risk of bias using the Cochrane Risk of Bias tool and, for each risk-of-bias item, determined whether easy adjustments with no or minor cost could change the domain to low risk of bias. The reporting of interventions was assessed using standardized checklists based on the Consolidated Standards for Reporting Trials and core items of the Template for Intervention Description and Replication. Corresponding authors of reports with incomplete reporting of interventions were contacted to obtain additional information. Data were analyzed descriptively. Among the 121 RCTs selected, 74 (61%) evaluated pharmacological treatments (PTs), including drugs and nutritional supplements, and 47 (39%) evaluated nonpharmacological treatments (NPTs) (40 participative interventions, 1 surgical procedure, 3 medical devices and 3 therapeutic strategies). Overall, the randomization sequence was adequately generated in 76 reports (62%) and the intervention allocation concealed in 48 (39%). The primary outcome was described as blinded in 46 reports (38%), and incomplete outcome data were adequately addressed in 78 (64%). Applying easy methodological adjustments with no or minor additional cost to trials with at least one domain at high risk of bias could have reduced the number of domains at high risk for 24 RCTs (19%). Interventions were completely reported for 73/121 (60%) RCTs: 51/74 (68%) of PTs and 22/47 (46%) of NPTs. Additional information was obtained from corresponding authors for 11/48 reports (22%). Inadequate methods and incomplete reporting in published SSA RCTs could be improved by easy and inexpensive methodological adjustments and adherence to reporting guidelines.

  15. Application of CCG Sensors to a High-Temperature Structure Subjected to Thermo-Mechanical Load.

    PubMed

    Xie, Weihua; Meng, Songhe; Jin, Hua; Du, Chong; Wang, Libin; Peng, Tao; Scarpa, Fabrizio; Xu, Chenghai

    2016-10-13

    This paper presents a simple methodology for performing a high-temperature coupled thermo-mechanical test on ultra-high temperature ceramic (UHTC) specimens equipped with chemical composition grating (CCG) sensors. The methodology also considers the presence of coupled loading within the response provided by the CCG sensors. The theoretical strain of the UHTC specimens calculated with this technique shows a maximum relative error of 2.15% between the analytical and experimental data. To further verify the validity of the test results, a Finite Element (FE) model has been developed to simulate the temperature, stress and strain fields within the UHTC structure equipped with the CCG. The results show that the compressive stress exceeds the material strength at the bonding area, which causes failure by fracture of the supporting structure in the hot environment. The results for the strain fields show that the relative error with respect to the experimental data decreases with increasing temperature. The relative error is less than 15% when the temperature is higher than 200 °C, and only 6.71% at 695 °C.

  16. Methylation detection oligonucleotide microarray analysis: a high-resolution method for detection of CpG island methylation

    PubMed Central

    Kamalakaran, Sitharthan; Kendall, Jude; Zhao, Xiaoyue; Tang, Chunlao; Khan, Sohail; Ravi, Kandasamy; Auletta, Theresa; Riggs, Michael; Wang, Yun; Helland, Åslaug; Naume, Bjørn; Dimitrova, Nevenka; Børresen-Dale, Anne-Lise; Hicks, Jim; Lucito, Robert

    2009-01-01

    Methylation of CpG islands associated with genes can affect the expression of the proximal gene, and methylation of non-associated CpG islands correlates to genomic instability. This epigenetic modification has been shown to be important in many pathologies, from development and disease to cancer. We report the development of a novel high-resolution microarray that detects the methylation status of over 25 000 CpG islands in the human genome. Experiments were performed to demonstrate low system noise in the methodology and that the array probes have a high signal to noise ratio. Methylation measurements between different cell lines were validated demonstrating the accuracy of measurement. We then identified alterations in CpG islands, both those associated with gene promoters, as well as non-promoter-associated islands in a set of breast and ovarian tumors. We demonstrate that this methodology accurately identifies methylation profiles in cancer and in principle it can differentiate any CpG methylation alterations and can be adapted to analyze other species. PMID:19474344

  17. The Scaling of Performance and Losses in Miniature Internal Combustion Engines

    DTIC Science & Technology

    2010-01-01

    reliable measurements of engine performance and losses in these small engines. Methodologies are also developed for measuring volumetric, heat transfer... the most important challenge as it accounts for 60-70% of total energy losses. Combustion losses are followed in order of importance by heat transfer

  18. Automation Applications in an Advanced Air Traffic Management System : Volume 3. Methodology for Man-Machine Task Allocation

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...

  19. Improving Junior Infantry Officer Leader Development and Performance

    DTIC Science & Technology

    2017-06-09

    researcher used a qualitative literature review and semi-structured interview methodology to analyze Army leadership theories and leader development...

  20. Empiric determination of corrected visual acuity standards for train crews.

    PubMed

    Schwartz, Steven H; Swanson, William H

    2005-08-01

    Probably the most common visual standard for employment in the transportation industry is best-corrected, high-contrast visual acuity. Because such standards were often established absent empiric linkage to job performance, it is possible that a job applicant or employee who has visual acuity less than the standard may be able to satisfactorily perform the required job activities. For the transportation system that we examined, the train crew is required to inspect visually the length of the train before and during the time it leaves the station. The purpose of the inspection is to determine if an individual is in a hazardous position with respect to the train. In this article, we determine the extent to which high-contrast visual acuity can predict performance on a simulated task. Performance at discriminating hazardous from safe conditions, as depicted in projected photographic slides, was determined as a function of visual acuity. For different levels of visual acuity, which was varied through the use of optical defocus, a subject was required to label scenes as hazardous or safe. Task performance was highly correlated with visual acuity as measured under conditions normally used for vision screenings (high-illumination and high-contrast): as the acuity decreases, performance at discriminating hazardous from safe scenes worsens. This empirically based methodology can be used to establish a corrected high-contrast visual acuity standard for safety-sensitive work in transportation that is linked to the performance of a job-critical task.

  1. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Gumaste, U.; Ronaghi, M.

    1994-01-01

    Applications of high-performance parallel computation are described for the analysis of complete jet engines, treated as a multidiscipline coupled problem. The coupled problem involves interaction of structures with gas dynamics, heat conduction, and heat transfer in aircraft engines. The methodology issues addressed include: consistent discrete formulation of coupled problems with emphasis on coupling phenomena; effect of partitioning strategies, augmentation, and temporal solution procedures; sensitivity of response to problem parameters; and methods for interfacing multiscale discretizations in different single fields. The computer implementation issues addressed include: parallel treatment of coupled systems; domain decomposition and mesh partitioning strategies; data representation in object-oriented form and mapping to hardware-driven representation; and tradeoff studies between partitioning schemes and fully coupled treatment.

  2. Load Balancing Strategies for Multi-Block Overset Grid Applications

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Biswas, Rupak; Lopez-Benitez, Noe; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The multi-block overset grid method is a powerful technique for high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process uses a grid system that discretizes the problem domain by using separately generated but overlapping structured grids that periodically update and exchange boundary information through interpolation. For efficient high performance computations of large-scale realistic applications using this methodology, the individual grids must be properly partitioned among the parallel processors. Overall performance, therefore, largely depends on the quality of load balancing. In this paper, we present three different load balancing strategies for overset grids and analyze their effects on the parallel efficiency of a Navier-Stokes CFD application running on an SGI Origin2000 machine.
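
    As an illustration of the partitioning problem this abstract describes, the sketch below assigns overset grid blocks to processors with a simple greedy largest-first heuristic and reports the resulting load imbalance. The block weights, processor count, and the heuristic itself are assumptions for illustration; they are not the three strategies evaluated in the paper.

```python
# Hypothetical sketch: greedy load balancing of overset grid blocks across processors.
# Block "weights" (e.g., cell counts) and the processor count are illustrative only.

def greedy_balance(block_weights, n_procs):
    """Assign each block to the currently least-loaded processor (largest blocks first)."""
    loads = [0.0] * n_procs
    assignment = {}
    for block, weight in sorted(block_weights.items(), key=lambda kv: -kv[1]):
        proc = loads.index(min(loads))      # least-loaded processor
        assignment[block] = proc
        loads[proc] += weight
    return assignment, loads

if __name__ == "__main__":
    blocks = {"wing": 1.2e6, "fuselage": 0.9e6, "tail": 0.4e6, "nacelle": 0.6e6, "background": 2.0e6}
    assignment, loads = greedy_balance(blocks, n_procs=4)
    imbalance = max(loads) / (sum(loads) / len(loads))   # load-imbalance factor; 1.0 is ideal
    print(assignment, f"imbalance = {imbalance:.2f}")
```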

  3. A review and preliminary evaluation of methodological factors in performance assessments of time-varying aircraft noise effects

    NASA Technical Reports Server (NTRS)

    Coates, G. D.; Alluisi, E. A.

    1975-01-01

    The effects of aircraft noise on human performance are considered. Progress is reported in the following areas: (1) review of the literature to identify the methodological and stimulus parameters involved in the study of noise effects on human performance; (2) development of a theoretical framework to provide working hypotheses as to the effects of noise on complex human performance; and (3) data collection on the first of several experimental investigations designed to provide tests of the hypotheses.

  4. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  5. Knowledge based system and decision making methodologies in materials selection for aircraft cabin metallic structures

    NASA Astrophysics Data System (ADS)

    Adhikari, Pashupati Raj

    Materials selection processes are among the most important aspects of product design and development. A knowledge-based system (KBS) and some of the methodologies used in materials selection for the design of aircraft cabin metallic structures are discussed. Overall aircraft weight reduction means substantially less fuel consumption, and part of the solution is to reduce the overall weight of metallic structures inside the cabin. Among the various methodologies of materials selection using Multi Criterion Decision Making (MCDM) techniques, a few are demonstrated with examples and the results are compared with those obtained using Ashby's approach to materials selection. Pre-defined constraint values, mainly mechanical properties, are employed as relevant attributes in the process. Aluminum alloys with high strength-to-weight ratio have been second-to-none in most aircraft parts manufacturing. Magnesium alloys, which are much lighter, are tested with the same methodologies as alternatives to the Al-alloys currently used in the structures, and the ranked results are compared. Each material attribute considered in the design is categorized as either a benefit or a non-benefit attribute. Using Ashby's approach, material indices that must be maximized for optimum performance are determined, and materials are ranked based on the average of the consolidated indices rankings. Ranking results are compared for any disparity among the methodologies.
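
    To make the weighted-ranking idea concrete, here is a minimal weighted-sum MCDM sketch with normalised benefit and non-benefit attributes. The alloys, attribute values, and weights are invented for illustration and are not taken from the study; Ashby-style material indices are not reproduced here.

```python
# Illustrative weighted-sum MCDM ranking; alloy data and weights are hypothetical.
materials = {
    # (yield strength MPa [benefit], density g/cm3 [non-benefit], cost $/kg [non-benefit])
    "Al-2024": (324, 2.78, 4.0),
    "Al-7075": (503, 2.81, 5.5),
    "Mg-AZ31": (200, 1.77, 6.0),
}
weights = (0.5, 0.3, 0.2)         # relative importance of each attribute; sums to 1
benefit = (True, False, False)    # True: higher is better; False: lower is better

cols = list(zip(*materials.values()))   # attribute columns for min/max normalisation

def normalise(value, col, is_benefit):
    lo, hi = min(col), max(col)
    x = (value - lo) / (hi - lo)
    return x if is_benefit else 1.0 - x

scores = {
    name: sum(w * normalise(v, cols[j], benefit[j])
              for j, (v, w) in enumerate(zip(vals, weights)))
    for name, vals in materials.items()
}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.3f}")
```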

  6. Appraisal of systematic reviews on the management of peri-implant diseases with two methodological tools.

    PubMed

    Faggion, Clovis Mariano; Monje, Alberto; Wasiak, Jason

    2018-06-01

    This study aimed to evaluate and compare the performance of two methodological instruments to appraise systematic reviews and to identify potential disagreements of systematic review authors regarding risk of bias (RoB) evaluation of randomized controlled trials (RCTs) included in systematic reviews on peri-implant diseases. We searched Medline, Web of Science, Cochrane Library, PubMed Central, and Google Scholar for systematic reviews on peri-implant diseases published before July 11, 2017. Two authors independently evaluated the RoB and methodological quality of the systematic reviews by applying the Risk of Bias in Systematic Reviews (ROBIS) tool and Assessing the Methodological Quality of Systematic Reviews (AMSTAR) checklist, respectively. We assessed the RoB scores of the same RCTs published in different systematic reviews. Of the 32 systematic reviews identified, 23 reviews addressed the clinical topic of peri-implantitis. A high RoB was detected for most systematic reviews (n=25) using ROBIS, whilst five systematic reviews displayed low methodological quality by AMSTAR. Almost 30% of the RoB comparisons (for the same RCTs) had different RoB ratings across systematic reviews. The ROBIS tool appears to provide more conservative results than AMSTAR checklist. Considerable disagreement was found among systematic review authors rating the same RCT included in different systematic reviews. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically for the 12 highest-emitting sectors for greenhouse gas and air pollutant emissions. Examples of methodology application for two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and official figures for 2010, showing a very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
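
    The nonstatistical idea described here, perturbing the driving factors of each sector and taking the envelope of the resulting projections as an uncertainty band, can be sketched as follows. The baseline trend, sector list, and perturbation ranges are hypothetical.

```python
# Hypothetical sketch: envelope-style uncertainty bands from sensitivity runs.
import numpy as np

years = np.arange(2015, 2021)
baseline = 100.0 * 0.98 ** (years - years[0])        # illustrative emission trend (kt/yr)

# Each sector's driving factor is perturbed within an assumed +/- range (fraction of baseline).
sensitivity_ranges = {"power plants": 0.08, "agriculture": 0.05, "transport": 0.06}

runs = [baseline]
for sector, frac in sensitivity_ranges.items():
    runs.append(baseline * (1.0 + frac))   # high-activity variant for this sector
    runs.append(baseline * (1.0 - frac))   # low-activity variant

runs = np.array(runs)
lower, upper = runs.min(axis=0), runs.max(axis=0)     # envelope = nonstatistical uncertainty band
for y, b, lo, hi in zip(years, baseline, lower, upper):
    print(f"{y}: {b:6.1f} kt  [{lo:6.1f}, {hi:6.1f}]")
```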

  8. Three-dimensional interconnected network of graphene-wrapped porous silicon spheres: in situ magnesiothermic-reduction synthesis and enhanced lithium-storage capabilities.

    PubMed

    Wu, Ping; Wang, Hui; Tang, Yawen; Zhou, Yiming; Lu, Tianhong

    2014-03-12

    A novel type of 3D porous Si-G micro/nanostructure (i.e., 3D interconnected network of graphene-wrapped porous silicon spheres, Si@G network) was constructed through layer-by-layer assembly and subsequent in situ magnesiothermic-reduction methodology. Compared with bare Si spheres, the as-synthesized Si@G network exhibits markedly enhanced anodic performance in terms of specific capacity, cycling stability, and rate capability, making it an ideal anode candidate for high-energy, long-life, and high-power lithium-ion batteries.

  9. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

  10. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    ERIC Educational Resources Information Center

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  11. Using a False Biofeedback Methodology to Explore Relationships between Learners' Affect, Metacognition, and Performance

    ERIC Educational Resources Information Center

    Strain, Amber Chauncey; Azevedo, Roger; D'Mello, Sidney K.

    2013-01-01

    We used a false-biofeedback methodology to manipulate physiological arousal in order to induce affective states that would influence learners' metacognitive judgments and learning performance. False-biofeedback is a method used to induce physiological arousal (and resultant affective states) by presenting learners with audio stimuli of false heart…

  12. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  13. Identifying individual changes in performance with composite quality indicators while accounting for regression to the mean.

    PubMed

    Gajewski, Byron J; Dunton, Nancy

    2013-04-01

    Almost a decade ago Morton and Torgerson indicated that perceived medical benefits could be due to "regression to the mean." Despite this caution, the regression to the mean "effects on the identification of changes in institutional performance do not seem to have been considered previously in any depth" (Jones and Spiegelhalter). As a response, Jones and Spiegelhalter provide a methodology to adjust for regression to the mean when modeling recent changes in institutional performance for one-variable quality indicators. Therefore, in our view, Jones and Spiegelhalter provide a breakthrough methodology for performance measures. At the same time, in the interests of parsimony, it is useful to aggregate individual quality indicators into a composite score. Our question is, can we develop and demonstrate a methodology that extends the "regression to the mean" literature to allow for composite quality indicators? Using a latent variable modeling approach, we extend the methodology to the composite indicator case. We demonstrate the approach on 4 indicators collected by the National Database of Nursing Quality Indicators. A simulation study further demonstrates its "proof of concept."

  14. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is highly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computation is relevant. As is well known, arbitrary quantum computation in the circuit model can be done with only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum mechanics lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability, and more specific dynamic characteristics) parts. The experimental part was performed using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for research and testing of development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used to improve algorithms in quantum information science.
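
    The computational pattern analysed in the paper, applying a single-qubit gate to an exponentially large state vector, can be sketched as below. This is a generic state-vector fragment written for illustration, not the AlgoWiki or Lomonosov implementation.

```python
# Minimal state-vector sketch: applying a single-qubit gate to qubit k of an n-qubit register.
import numpy as np

def apply_single_qubit_gate(state, gate, k, n):
    """state: complex vector of length 2**n; gate: 2x2 unitary; k: target qubit index."""
    # Reshape so the target qubit becomes its own axis, contract with the gate, restore shape.
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [k]))   # gate acts on axis k
    psi = np.moveaxis(psi, 0, k)                     # move the new axis back to position k
    return psi.reshape(-1)

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                       # |000>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = apply_single_qubit_gate(state, H, k=0, n=n)  # Hadamard on qubit 0
print(np.round(state, 3))
```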

  15. An update on technical and methodological aspects for cardiac PET applications.

    PubMed

    Presotto, Luca; Busnardo, Elena; Gianolli, Luigi; Bettinardi, Valentino

    2016-12-01

    Positron emission tomography (PET) is indicated for a large number of cardiac diseases: perfusion and viability studies are commonly used to evaluate coronary artery disease; PET can also be used to assess sarcoidosis and endocarditis, as well as to investigate amyloidosis. Furthermore, a hot topic for research is plaque characterization. Most of these studies are technically very challenging. High count rates and short acquisition times characterize perfusion scans while very small targets have to be imaged in inflammation/infection and plaques examinations. Furthermore, cardiac PET suffers from respiratory and cardiac motion blur. Each type of studies has specific requirements from the technical and methodological point of view, thus PET systems with overall high performances are required. Furthermore, in the era of hybrid PET/computed tomography (CT) and PET/Magnetic Resonance Imaging (MRI) systems, the combination of complementary functional and anatomical information can be used to improve diagnosis and prognosis. Moreover, PET images can be qualitatively and quantitatively improved exploiting information from the other modality, using advanced algorithms. In this review we will report the latest technological and methodological innovations for PET cardiac applications, with particular reference to the state of the art of the hybrid PET/CT and PET/MRI. We will also report the most recent advancements in software, from reconstruction algorithms to image processing and analysis programs.

  16. Temperature - Emissivity Separation Assessment in a Sub-Urban Scenario

    NASA Astrophysics Data System (ADS)

    Moscadelli, M.; Diani, M.; Corsini, G.

    2017-10-01

    In this paper, a methodology that aims at evaluating the effectiveness of different TES strategies is presented. The methodology takes into account the specific material of interest in the monitored scenario, sensor characteristics, and errors in the atmospheric compensation step. It is proposed in order to predict and analyse algorithm performance during the planning of a remote sensing mission aimed at discovering specific materials of interest in the monitored scenario. As a case study, the proposed methodology is applied to a real airborne data set of a suburban scenario. To solve the TES problem, three state-of-the-art algorithms and a recently proposed one are investigated: the Temperature-Emissivity Separation '98 (TES-98) algorithm, the Stepwise Refining TES (SRTES) algorithm, the Linear Piecewise TES (LTES) algorithm, and the Optimized Smoothing TES (OSTES) algorithm. Finally, the accuracies obtained with real data and those predicted by means of the proposed methodology are compared and discussed.

  17. Work-based physiological assessment of physically-demanding trades: a methodological overview.

    PubMed

    Taylor, Nigel A S; Groeller, Herb

    2003-03-01

    Technological advances, modified work practices, altered employment strategies, work-related injuries, and the rise in work-related litigation and compensation claims necessitate ongoing trade analysis research. Such research enables the identification and development of gender- and age-neutral skills, physiological attributes and employment standards required to satisfactorily perform critical trade tasks. This paper overviews a methodological approach which may be adopted when seeking to establish trade-specific physiological competencies for physically-demanding trades (occupations). A general template is presented for conducting a trade analysis within physically-demanding trades, such as those encountered within military or emergency service occupations. Two streams of analysis are recommended: the trade analysis and the task analysis. The former involves a progressive dissection of activities and skills into a series of specific tasks (elements), and results in a broad approximation of the types of trade duties, and the links between trade tasks. The latter will lead to the determination of how a task is performed within a trade, and the physiological attributes required to satisfactorily perform that task. The approach described within this paper is designed to provide research outcomes which have high content, criterion-related and construct validities.

  18. Multivariate analyses of individual variation in soccer skill as a tool for talent identification and development: utilising evolutionary theory in sports science.

    PubMed

    Wilson, Robbie S; James, Rob S; David, Gwendolyn; Hermann, Ecki; Morgan, Oliver J; Niehaus, Amanda C; Hunter, Andrew; Thake, Doug; Smith, Michelle D

    2016-11-01

    The development of a comprehensive protocol for quantifying soccer-specific skill could markedly improve both talent identification and development. Surprisingly, most protocols for talent identification in soccer still focus on the more generic athletic attributes of team sports, such as speed, strength, agility and endurance, rather than on a player's technical skills. We used a multivariate methodology borrowed from evolutionary analyses of adaptation to develop our quantitative assessment of individual soccer-specific skill. We tested the performance of 40 individual academy-level players in eight different soccer-specific tasks across an age range of 13-18 years old. We first quantified the repeatability of each skill performance then explored the effects of age on soccer-specific skill, correlations between each of the pairs of skill tasks independent of age, and finally developed an individual metric of overall skill performance that could be easily used by coaches. All of our measured traits were highly repeatable when assessed over a short period and we found that an individual's overall skill - as well as their performance in their best task - was strongly positively correlated with age. Most importantly, our study established a simple but comprehensive methodology for assessing skill performance in soccer players, thus allowing coaches to rapidly assess the relative abilities of their players, identify promising youths and work on eliminating skill deficits in players.

  19. Structured Innovation of High-Performance Wave Energy Converter Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Jochem W.; Laird, Daniel

    Wave energy converter (WEC) technology development has not yet delivered the desired commercial maturity nor, more importantly, the desired techno-economic performance. The reasons for this have been recognized and fundamental requirements for successful WEC technology development have been identified. This paper describes a multi-year project pursued in collaboration by the National Renewable Energy Laboratory and Sandia National Laboratories to innovate and develop new WEC technology. It specifies the project strategy, shows how this differs from the state-of-the-art approach, and presents some early project results. Based on the specification of fundamental functional requirements of WEC technology, structured innovation and systemic problem-solving methodologies are applied to invent and identify new WEC technology concepts. Using Technology Performance Levels (TPL) as an assessment metric of the techno-economic performance potential, high performance technology concepts are identified and selected for further development. System performance is numerically modelled and optimized and key performance aspects are empirically validated. The project deliverables are WEC technology specifications of high techno-economic performance technologies of TPL 7 or higher at TRL 3, with some key technology challenges investigated at higher TRL. These wave energy converter technology specifications will be made available to industry for further, full development and commercialisation (TRL 4 - TRL 9).

  20. Congenital Heart Surgery Case Mix Across North American Centers and Impact on Performance Assessment.

    PubMed

    Pasquali, Sara K; Wallace, Amelia S; Gaynor, J William; Jacobs, Marshall L; O'Brien, Sean M; Hill, Kevin D; Gaies, Michael G; Romano, Jennifer C; Shahian, David M; Mayer, John E; Jacobs, Jeffrey P

    2016-11-01

    Performance assessment in congenital heart surgery is challenging due to the wide heterogeneity of disease. We describe current case mix across centers, evaluate methodology inclusive of all cardiac operations versus the more homogeneous subset of Society of Thoracic Surgeons benchmark operations, and describe implications regarding performance assessment. Centers (n = 119) participating in the Society of Thoracic Surgeons Congenital Heart Surgery Database (2010 through 2014) were included. Index operation type and frequency across centers were described. Center performance (risk-adjusted operative mortality) was evaluated and classified when including the benchmark versus all eligible operations. Overall, 207 types of operations were performed during the study period (112,140 total cases). Few operations were performed across all centers; only 25% were performed at least once by 75% or more of centers. There was 7.9-fold variation across centers in the proportion of total cases comprising high-complexity cases (STAT 5). In contrast, the benchmark operations made up 36% of cases, and all but 2 were performed by at least 90% of centers. When evaluating performance based on benchmark versus all operations, 15% of centers changed performance classification; 85% remained unchanged. Benchmark versus all operation methodology was associated with lower power, with 35% versus 78% of centers meeting sample size thresholds. There is wide variation in congenital heart surgery case mix across centers. Metrics based on benchmark versus all operations are associated with strengths (less heterogeneity) and weaknesses (lower power), and lead to differing performance classification for some centers. These findings have implications for ongoing efforts to optimize performance assessment, including choice of target population and appropriate interpretation of reported metrics. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Eastern Renewable Generation Integration Study: Redefining What’s Possible for Renewable Energy

    ScienceCinema

    Bloom, Aaron

    2018-01-16

    NREL project manager Aaron Bloom introduces NREL’s Eastern Renewable Generation Integration Study (ERGIS) and high-performance computing capabilities and new methodologies that allowed NREL to model operations of the Eastern Interconnection at unprecedented fidelity. ERGIS shows that the Eastern Interconnection can balance the variability and uncertainty of wind and solar photovoltaics at a 5-minute level, for one simulated year.

  2. Programming Methodology for High Performance Applications on Tiled Architectures

    DTIC Science & Technology

    2009-06-01

    members, both voting and non-voting; • A page of links, including links to all available PCA home pages, the home pages of other DARPA programs of...HPEC-SI); and • Organizational information on the Morphware Forum, such as membership requirements and voting procedures. The web site was the...The following organizations are voting members of the Morphware Forum at this writing: o Defense Advanced Research Projects Agency o Georgia

  3. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  4. Improving Mathematics Performance among Secondary Students with EBD: A Methodological Review

    ERIC Educational Resources Information Center

    Mulcahy, Candace A.; Krezmien, Michael P.; Travers, Jason

    2016-01-01

    In this methodological review, the authors apply special education research quality indicators and standards for single case design to analyze mathematics intervention studies for secondary students with emotional and behavioral disorders (EBD). A systematic methodological review of literature from 1975 to December 2012 yielded 19 articles that…

  5. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability and maintainability analysis, and present findings and observations based on analysis leading to the Ground Systems Preliminary Design Review milestone.

  6. Updated methodology for nuclear magnetic resonance characterization of shales

    NASA Astrophysics Data System (ADS)

    Washburn, Kathryn E.; Birdwell, Justin E.

    2013-08-01

    Unconventional petroleum resources, particularly in shales, are expected to play an increasingly important role in the world's energy portfolio in the coming years. Nuclear magnetic resonance (NMR), particularly at low-field, provides important information in the evaluation of shale resources. Most of the low-field NMR analyses performed on shale samples rely heavily on standard T1 and T2 measurements. We present a new approach using solid echoes in the measurement of T1 and T1-T2 correlations that addresses some of the challenges encountered when making NMR measurements on shale samples compared to conventional reservoir rocks. Combining these techniques with standard T1 and T2 measurements provides a more complete assessment of the hydrogen-bearing constituents (e.g., bitumen, kerogen, clay-bound water) in shale samples. These methods are applied to immature and pyrolyzed oil shale samples to examine the solid and highly viscous organic phases present during the petroleum generation process. The solid echo measurements produce additional signal in the oil shale samples compared to the standard methodologies, indicating the presence of components undergoing homonuclear dipolar coupling. The results presented here include the first low-field NMR measurements performed on kerogen as well as detailed NMR analysis of highly viscous thermally generated bitumen present in pyrolyzed oil shale.

  7. Compact sieve-tray distillation column for ammonia-water absorption heat pump: Part 1 -- Design methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anand, G.; Erickson, D.C.

    1999-07-01

    The distillation column is a key component of ammonia-water absorption units including advanced generator-absorber heat exchange (GAX) cycle heat pumps. The design of the distillation column is critical to unit performance, size, and cost. The distillation column can be designed with random packing, structured packing, or various tray configurations. A sieve-tray distillation column is the least complicated tray design and is less costly than high-efficiency packing. Substantial literature is available on sieve tray design and performance. However, most of the correlations and design recommendations were developed for large industrial hydrocarbon systems and are generally not directly applicable to the compact ammonia-water column discussed here. The correlations were reviewed and modified as appropriate for this application, and a sieve-tray design model was developed. This paper presents the sieve-tray design methodology for highly compact ammonia-water columns. A conceptual design of the distillation column for an 8 ton vapor exchange (VX) GAX heat pump is presented, illustrating relevant design parameters and trends. The design process revealed several issues that have to be investigated experimentally to design the final optimized rectifier. Validation of flooding and weeping limits and tray/point efficiencies are of primary importance.

  8. Physical examination tests for the diagnosis of femoroacetabular impingement. A systematic review.

    PubMed

    Pacheco-Carrillo, Aitana; Medina-Porqueres, Ivan

    2016-09-01

    Numerous clinical tests have been proposed to diagnose FAI, but little is known about their diagnostic accuracy. To summarize and evaluate research on the accuracy of physical examination tests for diagnosis of FAI. A search of the PubMed, SPORTDiscus and CINAHL databases was performed. Studies were considered eligible if they compared the results of physical examination tests to those of a reference standard. Methodological quality and internal validity assessment was performed by two independent reviewers using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool. The systematic search strategy revealed 298 potential articles, five of which met the inclusion criteria. After assessment using the QUADAS score, four of the five articles were of high quality. Clinical tests included were Impingement sign, IROP test (Internal Rotation Over Pressure), FABER test (Flexion-Abduction-External Rotation), Stinchfield/RSRL (Resisted Straight Leg Raise) test, Scour test, Maximal squat test, and the Anterior Impingement test. The IROP test, impingement sign, and FABER test showed the highest sensitivity for identifying FAI. The diagnostic accuracy of physical examination tests to assess FAI is limited due to their heterogeneity. There is a strong need for sound research of high methodological quality in this area. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Application of self-organizing maps to the study of U-Zr-Ti-Nb distribution in sandstone-hosted uranium ores

    NASA Astrophysics Data System (ADS)

    Klus, Jakub; Pořízka, Pavel; Prochazka, David; Mikysek, Petr; Novotný, Jan; Novotný, Karel; Slobodník, Marek; Kaiser, Jozef

    2017-05-01

    This paper presents a novel approach for processing the spectral information obtained from high-resolution elemental mapping performed by means of Laser-Induced Breakdown Spectroscopy. The proposed methodology is aimed at the description of possible elemental associations within a heterogeneous sample. High-resolution elemental mapping provides a large number of measurements. Moreover, a typical laser-induced plasma spectrum consists of several thousands of spectral variables. Analysis of heterogeneous samples, where valuable information is hidden in a limited fraction of sample mass, requires special treatment. The sample under study is a sandstone-hosted uranium ore that shows irregular distribution of ore elements such as zirconium, titanium, uranium and niobium. The presented processing methodology shows how to reduce the dimensionality of the data and retain the spectral information by utilizing self-organizing maps (SOM). The spectral information from SOM is processed further to detect either simultaneous or isolated presence of elements. Conclusions suggested by SOM are in good agreement with geological studies of mineralization phases performed at the deposit. Deeper investigation of the SOM results enables discrimination of interesting measurements and reveals new possibilities in the visualization of chemical mapping information. The suggested approach improves the description of elemental associations in mineral phases, which is crucial for the mining industry.
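
    A minimal self-organizing map trained on toy "spectra" is sketched below to illustrate the dimensionality-reduction step described in the abstract. The map size, learning schedule, and input data are arbitrary assumptions and are not the parameters used by the authors.

```python
# Toy self-organizing map (SOM) sketch; data, map size, and schedule are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
spectra = rng.random((200, 16))          # 200 fake "spectra", 16 spectral variables each

rows, cols, dim = 5, 5, spectra.shape[1]
weights = rng.random((rows, cols, dim))  # one prototype "spectrum" per map node
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 2000, 0.5, 2.0
for t in range(n_iter):
    x = spectra[rng.integers(len(spectra))]
    # Best-matching unit: node whose prototype is closest to the sample.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Decay the learning rate and neighbourhood width, then pull neighbours toward the sample.
    frac = t / n_iter
    lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
    neigh = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=-1) / (2 * sigma ** 2))
    weights += lr * neigh[..., None] * (x - weights)

# Each measurement point can now be summarised by its best-matching node.
first_bmu = np.unravel_index(np.argmin(np.linalg.norm(weights - spectra[0], axis=-1)), (rows, cols))
print("BMU of first spectrum:", first_bmu)
```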

  10. Updated methodology for nuclear magnetic resonance characterization of shales

    USGS Publications Warehouse

    Washburn, Kathryn E.; Birdwell, Justin E.

    2013-01-01

    Unconventional petroleum resources, particularly in shales, are expected to play an increasingly important role in the world’s energy portfolio in the coming years. Nuclear magnetic resonance (NMR), particularly at low-field, provides important information in the evaluation of shale resources. Most of the low-field NMR analyses performed on shale samples rely heavily on standard T1 and T2 measurements. We present a new approach using solid echoes in the measurement of T1 and T1–T2 correlations that addresses some of the challenges encountered when making NMR measurements on shale samples compared to conventional reservoir rocks. Combining these techniques with standard T1 and T2 measurements provides a more complete assessment of the hydrogen-bearing constituents (e.g., bitumen, kerogen, clay-bound water) in shale samples. These methods are applied to immature and pyrolyzed oil shale samples to examine the solid and highly viscous organic phases present during the petroleum generation process. The solid echo measurements produce additional signal in the oil shale samples compared to the standard methodologies, indicating the presence of components undergoing homonuclear dipolar coupling. The results presented here include the first low-field NMR measurements performed on kerogen as well as detailed NMR analysis of highly viscous thermally generated bitumen present in pyrolyzed oil shale.

  11. 77 FR 30411 - Connect America Fund; High-Cost Universal Service Support

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-23

    ... ``benchmarks'' for high cost loop support (HCLS). The methodology the Bureau adopts builds on the analysis... to support continued broadband investment. The methodology the Bureau adopts today is described in... methodology, HCLS will be recalculated to account for the additional support available under the overall cap...

  12. Integrated model-based retargeting and optical proximity correction

    NASA Astrophysics Data System (ADS)

    Agarwal, Kanak B.; Banerjee, Shayak

    2011-04-01

    Conventional resolution enhancement techniques (RET) are becoming increasingly inadequate at addressing the challenges of subwavelength lithography. In particular, features show high sensitivity to process variation in low-k1 lithography. Process variation aware RETs such as process-window OPC are becoming increasingly important to guarantee high lithographic yield, but such techniques suffer from high runtime impact. An alternative to PWOPC is to perform retargeting, which is a rule-assisted modification of target layout shapes to improve their process window. However, rule-based retargeting is not a scalable technique since rules cannot cover the entire search space of two-dimensional shape configurations, especially with technology scaling. In this paper, we propose to integrate the processes of retargeting and optical proximity correction (OPC). We utilize the normalized image log slope (NILS) metric, which is available at no extra computational cost during OPC. We use NILS to guide dynamic target modification between iterations of OPC. We utilize the NILS tagging capabilities of Calibre TCL scripting to identify fragments with low NILS. We then perform NILS binning to assign different magnitudes of retargeting to different NILS bins. NILS is determined both for width, to identify regions of pinching, and space, to locate regions of potential bridging. We develop an integrated flow for 1x metal lines (M1) which exhibits fewer lithographic hotspots than a flow with just OPC and no retargeting. We also observe cases where hotspots that existed in the rule-based retargeting flow are fixed using our methodology. We finally also demonstrate that such a retargeting methodology does not significantly alter design properties by electrically simulating a latch layout before and after retargeting. We observe less than 1% impact on latch Clk-Q and D-Q delays post-retargeting, which makes this methodology an attractive one for use in improving shape process windows without perturbing designed values.
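
    The NILS-binned retargeting step can be illustrated with the toy sketch below, in which fragments with a low image log slope receive a larger target bias. The thresholds, bias magnitudes, and fragment data structure are assumptions for illustration, not Calibre specifics or foundry values.

```python
# Hypothetical sketch of NILS-binned retargeting: low-NILS fragments get a larger target bias.
# Thresholds and bias magnitudes (in nm) are illustrative, not foundry or tool values.

NILS_BINS = [            # (upper NILS bound, bias in nm)
    (1.2, 4.0),          # very low slope -> aggressive retargeting
    (1.6, 2.0),
    (2.0, 1.0),
    (float("inf"), 0.0)  # healthy fragments are left alone
]

def retarget_bias(nils, is_space):
    """Return a signed edge bias: widen lines at risk of pinching, widen spaces at risk of bridging."""
    for upper, bias in NILS_BINS:
        if nils < upper:
            return -bias if is_space else bias
    return 0.0

fragments = [
    {"id": "f1", "nils": 1.1, "is_space": False},   # likely pinching site
    {"id": "f2", "nils": 1.5, "is_space": True},    # likely bridging site
    {"id": "f3", "nils": 2.4, "is_space": False},
]
for frag in fragments:
    print(frag["id"], retarget_bias(frag["nils"], frag["is_space"]))
```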

  13. “The 3/3 Strategy”: A Successful Multifaceted Hospital Wide Hand Hygiene Intervention Based on WHO and Continuous Quality Improvement Methodology

    PubMed Central

    Mestre, Gabriel; Berbel, Cristina; Tortajada, Purificación; Alarcia, Margarita; Coca, Roser; Gallemi, Gema; Garcia, Irene; Fernández, Mari Mar; Aguilar, Mari Carmen; Martínez, José Antonio; Rodríguez-Baño, Jesús

    2012-01-01

    Background Only multifaceted hospital wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. Methodology/Principal Findings Pre-post intervention study of HH performance at baseline (October 2007–December 2009) and during intervention, which included two phases. Phase 1 (2010) included the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) Increase of alcohol hand rub (AHR) solution placement (from 0.57 dispensers/bed to 1.56); b) Increase in frequency of audits (three days every three weeks: "3/3 strategy"); c) Implementation of a standardized register form of HH corrective actions; d) Statistical Process Control (SPC) as time series analysis methodology through appropriate control charts. During the intervention period we performed 819 scheduled direct observation audits which provided data from 11,714 HH opportunities. The most remarkable findings were: a) significant improvements in HH compliance with respect to baseline (25% mean increase); b) sustained high level (82%) of HH compliance during intervention; c) significant increase in AHR consumption over time; d) significant decrease in the rate of healthcare-acquired MRSA; e) small but significant improvements in HH compliance when comparing phase 2 to phase 1 [79.5% (95% CI: 78.2–80.7) vs 84.6% (95% CI: 83.8–85.4), p<0.05]; f) successful use of control charts to identify significant negative and positive deviations (special causes) related to the HH compliance process over time ("positive": 90.1% as highest HH compliance coinciding with the "World hygiene day"; and "negative": 73.7% as lowest HH compliance coinciding with a statutory lay-off proceeding). Conclusions/Significance CQI tools may be a key addition to the WHO strategy to maintain a good HH performance over time. In addition, SPC has been shown to be a powerful methodology to detect special causes in HH performance (positive and negative) and to help establish adequate feedback to healthcare workers. PMID:23110061
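
    The statistical process control component can be illustrated with a standard p-chart for periodic hand hygiene compliance. The observation counts below are invented, and the three-sigma control-limit formula is the textbook one rather than necessarily the exact chart used by the authors.

```python
# Illustrative p-chart for hand-hygiene compliance; per-period counts are hypothetical.
import math

# (observed HH opportunities, compliant actions) per audit period
periods = [(380, 300), (410, 345), (395, 330), (402, 296), (388, 352)]

total_n = sum(n for n, _ in periods)
p_bar = sum(c for _, c in periods) / total_n         # overall compliance proportion

for i, (n, c) in enumerate(periods, start=1):
    p = c / n
    se = math.sqrt(p_bar * (1 - p_bar) / n)           # binomial standard error for this period
    ucl, lcl = p_bar + 3 * se, max(0.0, p_bar - 3 * se)
    flag = "special cause" if (p > ucl or p < lcl) else "in control"
    print(f"period {i}: p={p:.3f}  limits=[{lcl:.3f}, {ucl:.3f}]  {flag}")
```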

  14. Multi-application controls: Robust nonlinear multivariable aerospace controls applications

    NASA Technical Reports Server (NTRS)

    Enns, Dale F.; Bugajski, Daniel J.; Carter, John; Antoniewicz, Bob

    1994-01-01

    This viewgraph presentation describes the general methodology used to apply Honeywell's Multi-Application Control (MACH) and the specific application to the F-18 High Angle-of-Attack Research Vehicle (HARV) including piloted simulation handling qualities evaluation. The general steps include insertion of modeling data for geometry and mass properties, aerodynamics, propulsion data and assumptions, requirements and specifications, e.g. definition of control variables, handling qualities, stability margins and statements for bandwidth, control power, priorities, position and rate limits. The specific steps include choice of independent variables for least squares fits to aerodynamic and propulsion data, modifications to the management of the controls with regard to integrator windup and actuation limiting and priorities, e.g. pitch priority over roll, and command limiting to prevent departures and/or undesirable inertial coupling or inability to recover to a stable trim condition. The HARV control problem is characterized by significant nonlinearities and multivariable interactions in the low speed, high angle-of-attack, high angular rate flight regime. Systematic approaches to the control of vehicle motions modeled with coupled nonlinear equations of motion have been developed. This paper will discuss the dynamic inversion approach which explicitly accounts for nonlinearities in the control design. Multiple control effectors (including aerodynamic control surfaces and thrust vectoring control) and sensors are used to control the motions of the vehicles in several degrees-of-freedom. Several maneuvers will be used to illustrate performance of MACH in the high angle-of-attack flight regime. Analytical methods for assessing the robust performance of the multivariable control system in the presence of math modeling uncertainty, disturbances, and commands have reached a high level of maturity. The structured singular value (mu) frequency response methodology is presented as a method for analyzing robust performance and the mu-synthesis method will be presented as a method for synthesizing a robust control system. The paper concludes with the author's expectations regarding future applications of robust nonlinear multivariable controls.
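
    Dynamic inversion itself can be sketched in a few lines: for a control-affine model xdot = f(x) + B(x)u, the control solves for the u that produces a desired rate. The two-state model, gains, and command below are toy assumptions, not the HARV/MACH design.

```python
# Toy dynamic inversion sketch for a control-affine system xdot = f(x) + B(x) u.
# The two-state "aircraft" model and gains are illustrative only.
import numpy as np

def f(x):                        # nonlinear plant dynamics (made up)
    return np.array([-0.5 * x[0] * abs(x[0]) + x[1], -0.2 * x[1]])

def B(x):                        # control effectiveness matrix (made up, invertible)
    return np.array([[1.0, 0.2], [0.1, 0.8]])

def dynamic_inversion(x, x_cmd, bandwidth=2.0):
    nu = bandwidth * (x_cmd - x)                 # desired state rates from a simple first-order law
    return np.linalg.solve(B(x), nu - f(x))      # u = B(x)^{-1} (nu - f(x))

x, x_cmd, dt = np.array([0.0, 0.0]), np.array([1.0, 0.2]), 0.01
for _ in range(500):                             # 5 s simulation
    u = dynamic_inversion(x, x_cmd)
    x = x + dt * (f(x) + B(x) @ u)
print(np.round(x, 3))                            # should approach x_cmd
```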

  15. High-Resolution Melting Analysis for Rapid Detection of Sequence Type 131 Escherichia coli.

    PubMed

    Harrison, Lucas B; Hanson, Nancy D

    2017-06-01

    Escherichia coli isolates belonging to the sequence type 131 (ST131) clonal complex have been associated with the global distribution of fluoroquinolone and β-lactam resistance. Whole-genome sequencing and multilocus sequence typing identify sequence type but are expensive when evaluating large numbers of samples. This study was designed to develop a cost-effective screening tool using high-resolution melting (HRM) analysis to differentiate ST131 from non-ST131 E. coli in large sample populations in the absence of sequence analysis. The method was optimized using DNA from 12 E. coli isolates. Singleplex PCR was performed using 10 ng of DNA, Type-it HRM buffer, and multilocus sequence typing primers and was followed by multiplex PCR. The amplicon sizes ranged from 630 to 737 bp. Melt temperature peaks were determined by performing HRM analysis at 0.1°C resolution from 50 to 95°C on a Rotor-Gene Q 5-plex HRM system. Derivative melt curves were compared between sequence types and analyzed by principal component analysis. A blinded study of 191 E. coli isolates of ST131 and unknown sequence types validated this methodology. This methodology returned 99.2% specificity (124 true negatives and 1 false positive) and 100% sensitivity (66 true positives and 0 false negatives). This HRM methodology distinguishes ST131 from non-ST131 E. coli without sequence analysis. The analysis can be accomplished in about 3 h in any laboratory with an HRM-capable instrument and principal component analysis software. Therefore, this assay is a fast and cost-effective alternative to sequencing-based ST131 identification. Copyright © 2017 Harrison and Hanson.
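
    The principal component step of this screening workflow can be sketched as follows: negative-derivative melt curves are computed from fluorescence-versus-temperature data and projected onto two principal components. The synthetic curves, group melt temperatures, and the use of scikit-learn are assumptions for illustration.

```python
# Illustrative sketch: derivative melt curves + PCA separation of two synthetic amplicon groups.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
temps = np.linspace(75.0, 95.0, 200)

def melt_curve(tm):
    """Synthetic fluorescence vs temperature: a sigmoid that drops around the melt temperature tm."""
    return 1.0 / (1.0 + np.exp((temps - tm) / 0.4)) + 0.01 * rng.standard_normal(temps.size)

# Two hypothetical groups (e.g., "ST131-like" vs "non-ST131-like") with slightly different Tm.
curves = np.array([melt_curve(86.0) for _ in range(20)] + [melt_curve(87.2) for _ in range(20)])
deriv = -np.gradient(curves, temps, axis=1)        # -dF/dT derivative melt curves

scores = PCA(n_components=2).fit_transform(deriv)  # project onto two principal components
print("group means on PC1:", scores[:20, 0].mean(), scores[20:, 0].mean())
```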

  16. Lewis Acid-Base Adduct Approach for High Efficiency Perovskite Solar Cells.

    PubMed

    Lee, Jin-Wook; Kim, Hui-Seon; Park, Nam-Gyu

    2016-02-16

    Since the first report on the long-term durable 9.7% solid-state perovskite solar cell employing methylammonium lead iodide (CH3NH3PbI3), mesoporous TiO2, and 2,2',7,7'-tetrakis[N,N-di(4-methoxyphenyl)amino]-9,9'-spirobifluorene (spiro-MeOTAD) in 2012, following the seed technologies on perovskite-sensitized liquid junction solar cells in 2009 and 2011, a surge of interest has been focused on perovskite solar cells due to superb photovoltaic performance and extremely facile fabrication processes. The power conversion efficiency (PCE) of perovskite solar cells reached 21% in a very short period of time. Such an unprecedentedly high photovoltaic performance is due to the intrinsic optoelectronic property of organolead iodide perovskite material. Moreover, a high dielectric constant, sub-millimeter scale carrier diffusion length, an underlying ferroelectric property, and ion migration behavior can make organolead halide perovskites suitable for multifunctionality. Thus, besides solar cell applications, perovskite material has recently been applied to a variety of fields of materials science such as photodetectors, light emitting diodes, lasing, X-ray imaging, resistive memory, and water splitting. Regardless of application areas, the growth of a well-defined perovskite layer with high crystallinity is essential for effective utilization of its excellent physicochemical properties. Therefore, an effective methodology for preparation of high quality perovskite layers is required. In this Account, an effective methodology for production of high quality perovskite layers is described, which is the Lewis acid-base adduct approach. In the solution process to form the perovskite layer, the key chemicals of CH3NH3I (or HC(NH2)2I) and PbI2 are used by dissolving them in polar aprotic solvents. Since polar aprotic solvents bear oxygen, sulfur, or nitrogen, they can act as a Lewis base. In addition, the main group compound PbI2 is known to be a Lewis acid. Thus, PbI2 has a chance to form an adduct by reacting with the Lewis base. Crystal growth and morphology of perovskite can be controlled by taking advantage of the weak chemical interaction in the adduct. We have successfully fabricated highly reproducible CH3NH3PbI3 perovskite solar cells with PCE as high as 19.7% via adducts of PbI2 with oxygen-donor dimethyl sulfoxide. This adduct approach has been found to be generally applicable, where formamidinium lead iodide perovskite, HC(NH2)2PbI3 (FAPbI3), with large grain, high crystallinity, and long-lived carrier lifetime was successfully fabricated via an adduct of PbI2 with sulfur-donor thiourea as Lewis base. The adduct approach proposed in this Account is a very promising methodology to achieve high quality perovskite films with high photovoltaic performance. Furthermore, single crystal growth on the conductive substrate is expected to be possible if we kinetically control the elimination of Lewis base in the adduct.

  17. Quasi-finite-time control for high-order nonlinear systems with mismatched disturbances via mapping filtered forwarding technique

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Huang, X. L.; Lu, H. Q.

    2017-02-01

    In this study, a quasi-finite-time control method for designing stabilising control laws is developed for high-order strict-feedback nonlinear systems with mismatched disturbances. By using the mapping filtered forwarding technique, a virtual control is designed to force the off-the-manifold coordinate to converge to zero in quasi-finite time at each step of the design; at the same time, the manifold is rendered insensitive to time-varying, bounded, and unknown disturbances. Compared with the standard forwarding methodology, the algorithm proposed here not only does not require a Lyapunov function for controller design, but also avoids calculating the derivative of the sign function. As far as the dynamic performance of closed-loop systems is concerned, we essentially obtain finite-time performance, reflected in fast and accurate responses, high tracking precision, and robust disturbance rejection. A spring-mass-damper system and a flexible-joint robot are tested to demonstrate the performance of the proposed controller.

  18. Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoessel, Chris

    2013-11-13

    This project developed a new high-performance R-10/high SHGC window design, reviewed market positioning and evaluated manufacturing solutions required for broad market adoption. The project objectives were accomplished by: identifying viable technical solutions based on modeling of modern and potential coating stacks and IGU designs; development of new coating material sets for HM thin film stacks, as well as improved HM IGU designs to accept multiple layers of HM films; matching promising new coating designs with new HM IGU designs to demonstrate performance gains; and, in cooperation with a window manufacturer, assess the potential for high-volume manufacturing and cost efficiency of a HM-based R-10 window with improved solar heat gain characteristics. A broad view of available materials and design options was applied to achieve the desired improvements. Gated engineering methodologies were employed to guide the development process from concept generation to a window demonstration. The project determined that a slightly de-rated window performance allows formulation of a path to achieve the desired cost reductions to support end consumer adoption.

  19. A clustering approach for the analysis of solar energy yields: A case study for concentrating solar thermal power plants

    NASA Astrophysics Data System (ADS)

    Peruchena, Carlos M. Fernández; García-Barberena, Javier; Guisado, María Vicenta; Gastón, Martín

    2016-05-01

    The design of Concentrating Solar Thermal Power (CSTP) systems requires a detailed knowledge of the dynamic behavior of the meteorology at the site of interest. Meteorological series are often condensed into one representative year, known as a Typical Meteorological Year (TMY), with the aim of reducing data volume and speeding up energy system simulations. This approach seems to be appropriate for rather detailed simulations of a specific plant; however, in earlier stages of the design of a power plant, especially during the optimization of the large number of plant parameters before a final design is reached, a huge number of simulations are needed. Even with today's technology, the computational effort to simulate solar energy system performance with one year of data at high frequency (such as 1-min) may become colossal if a multivariable optimization has to be performed. This work presents a simple and efficient methodology for selecting a small number of individual days able to represent the electrical production of the plant throughout the complete year. To achieve this objective, a new procedure for determining a reduced set of typical weather data in order to evaluate the long-term performance of a solar energy system is proposed. The proposed methodology is based on cluster analysis and permits a drastic reduction in the computational effort related to the calculation of a CSTP plant energy yield by simulating a reduced number of days from a high frequency TMY.
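
    The day-clustering idea can be sketched with k-means on daily irradiance profiles, picking the day nearest each cluster centre as its representative and weighting it by cluster size. The synthetic DNI data, the number of clusters, and the use of scikit-learn k-means are assumptions for illustration, not the procedure parameters used by the authors.

```python
# Illustrative sketch: select representative days from a year of synthetic DNI profiles via k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
hours = np.arange(24)
# 365 synthetic daily direct-normal-irradiance profiles (W/m2), seasonally modulated.
season = 0.6 + 0.4 * np.sin(2 * np.pi * (np.arange(365) - 80) / 365)
clear = np.clip(np.sin(np.pi * (hours - 6) / 12), 0, None) * 900.0
days = season[:, None] * clear[None, :] * (0.6 + 0.4 * rng.random((365, 24)))

k = 6
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(days)

for c in range(k):
    members = np.where(km.labels_ == c)[0]
    # Representative day: the real day closest to the cluster centroid.
    rep = members[np.argmin(np.linalg.norm(days[members] - km.cluster_centers_[c], axis=1))]
    weight = len(members) / 365.0            # annual weight used when summing plant output
    print(f"cluster {c}: representative day {rep}, weight {weight:.2f}")
```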

  20. Performance comparison between the mycobacteria growth indicator tube system and Löwenstein-Jensen medium in the routine detection of Mycobacterium tuberculosis at public health care facilities in Rio de Janeiro, Brazil: preliminary results of a pragmatic clinical trial.

    PubMed

    Moreira, Adriana da Silva Rezende; Huf, Gisele; Vieira, Maria Armanda; Fonseca, Leila; Ricks, Monica; Kritski, Afrânio Lineu

    2013-01-01

    In view of the fact that the World Health Organization has recommended the use of the mycobacteria growth indicator tube (MGIT) 960 system for the diagnosis of tuberculosis and that there is as yet no evidence regarding the clinical impact of its use in health care systems, we conducted a pragmatic clinical trial to evaluate the clinical performance and cost-effectiveness of the use of MGIT 960 at two health care facilities in the city of Rio de Janeiro, Brazil, where the incidence of tuberculosis is high. Here, we summarize the methodology and preliminary results of the trial. (ISRCTN.org Identifier: ISRCTN79888843 [http://isrctn.org/]).

  1. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
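
    The contrast between the two dispersion strategies can be sketched in a few lines; the simulate_abort function and its performance metric below are hypothetical placeholders, not the LAS simulation itself.

        import random

        def simulate_abort(initiation_time_s, thrust_scale):
            # Hypothetical stand-in for a LAS abort simulation returning a notional performance metric
            return thrust_scale / (1.0 + abs(initiation_time_s - 60.0))

        random.seed(1)

        # Standard approach: hold the abort initiation condition fixed (e.g., the max-q time)
        fixed = [simulate_abort(60.0, random.gauss(1.0, 0.05)) for _ in range(1000)]

        # Full-envelope approach: disperse the initiation time along with the other parameters
        dispersed = [simulate_abort(random.uniform(0.0, 120.0), random.gauss(1.0, 0.05))
                     for _ in range(1000)]

        print(min(fixed), min(dispersed))   # the dispersed set exposes performance "pinch-points"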

  2. Nonlinear Performance Seeking Control using Fuzzy Model Reference Learning Control and the Method of Steepest Descent

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    1997-01-01

    Performance Seeking Control (PSC) attempts to find and control the process at the operating condition that will generate maximum performance. In this paper a nonlinear multivariable PSC methodology will be developed, utilizing the Fuzzy Model Reference Learning Control (FMRLC) and the method of Steepest Descent or Gradient (SDG). This PSC control methodology employs the SDG method to find the operating condition that will generate maximum performance. This operating condition is in turn passed to the FMRLC controller as a set point for the control of the process. The conventional SDG algorithm is modified in this paper in order for convergence to occur monotonically. For the FMRLC control, the conventional fuzzy model reference learning control methodology is utilized, with guidelines generated here for effective tuning of the FMRLC controller.
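
    A minimal sketch of the steepest-descent (here, ascent) search for the maximum-performance operating condition is shown below; the notional performance map, the finite-difference gradient, and the step-halving rule used to force monotone improvement are assumptions of this illustration, not the paper's exact algorithm. The resulting operating point would be handed to the FMRLC controller as a set point.

        import numpy as np

        def performance(x):
            # Notional performance map with a single maximum (stand-in for the real process)
            return -(x[0] - 2.0) ** 2 - 0.5 * (x[1] + 1.0) ** 2

        def gradient(f, x, h=1e-4):
            # Central finite-difference gradient estimate
            g = np.zeros_like(x)
            for i in range(len(x)):
                e = np.zeros_like(x)
                e[i] = h
                g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
            return g

        x = np.array([0.0, 0.0])
        step = 0.5
        best = performance(x)
        for _ in range(200):
            candidate = x + step * gradient(performance, x)   # ascend toward maximum performance
            if performance(candidate) < best:                 # keep convergence monotone
                step *= 0.5                                   # by shrinking the step instead
                continue
            x, best = candidate, performance(candidate)

        print(x)   # operating condition passed to the FMRLC controller as a set point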

  3. Laboratory test methodology for evaluating the effects of electromagnetic disturbances on fault-tolerant control systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Celeste M.

    1989-01-01

    Control systems for advanced aircraft, especially those with relaxed static stability, will be critical to flight and will, therefore, have very high reliability specifications which must be met for adverse as well as nominal operating conditions. Adverse conditions can result from electromagnetic disturbances caused by lightning, high-energy radio frequency transmitters, and nuclear electromagnetic pulses. Tools and techniques must be developed to verify the integrity of the control system in adverse operating conditions. The most difficult and elusive perturbations to computer-based control systems caused by an electromagnetic environment (EME) are functional error modes that involve no component damage. These error modes are collectively known as upset, can occur simultaneously in all of the channels of a redundant control system, and are software dependent. A methodology is presented for performing upset tests on a multichannel control system, and considerations are discussed for the design of upset tests to be conducted in the lab on fault-tolerant control systems operating in a closed loop with a simulated plant.

  4. Rain/No-Rain Identification from Bispectral Satellite Information using Deep Neural Networks

    NASA Astrophysics Data System (ADS)

    Tao, Y.

    2016-12-01

    Satellite-based precipitation estimation products have the advantage of high resolution and global coverage. However, they still suffer from insufficient accuracy. To accurately estimate precipitation from satellite data, two aspects are most important: sufficient precipitation information in the satellite observations and proper methodologies to extract such information effectively. This study applies state-of-the-art machine learning methodologies to bispectral satellite information for rain/no-rain detection. Specifically, we use deep neural networks to extract features from the infrared and water vapor channels and connect them to precipitation identification. To evaluate the effectiveness of the methodology, we first apply it to the infrared data only (Model DL-IR only), the most commonly used input for satellite-based precipitation estimation. Then we incorporate water vapor data (Model DL-IR + WV) to further improve the prediction performance. The radar Stage IV dataset is used as the ground measurement for parameter calibration. The operational product, Precipitation Estimation from Remotely Sensed Information Using Artificial Neural Networks Cloud Classification System (PERSIANN-CCS), is used as a reference to compare the performance of both models in both winter and summer seasons. The experiments show significant improvement for both models in precipitation identification. The overall performance gains in the Critical Success Index (CSI) are 21.60% and 43.66% over the verification periods for Model DL-IR only and Model DL-IR + WV, respectively, compared to PERSIANN-CCS. Moreover, specific case studies show that the water vapor channel information and the deep neural networks effectively help recover a large number of missing precipitation pixels under warm clouds while reducing false alarms under cold clouds.
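
    For reference, the Critical Success Index used above reduces to a simple ratio of hits to the sum of hits, misses, and false alarms; the toy rain/no-rain fields below are purely illustrative.

        import numpy as np

        def critical_success_index(predicted, observed):
            # CSI = hits / (hits + misses + false alarms) for binary rain/no-rain fields
            predicted = np.asarray(predicted, dtype=bool)
            observed = np.asarray(observed, dtype=bool)
            hits = np.sum(predicted & observed)
            misses = np.sum(~predicted & observed)
            false_alarms = np.sum(predicted & ~observed)
            return hits / (hits + misses + false_alarms)

        # Toy example: per-pixel rain flags from a model versus radar ground truth
        model_flags = [1, 1, 0, 0, 1, 0, 1]
        radar_flags = [1, 0, 0, 1, 1, 0, 1]
        print(critical_success_index(model_flags, radar_flags))   # 0.6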

  5. Risk assessment tools to identify women with increased risk of osteoporotic fracture: complexity or simplicity? A systematic review.

    PubMed

    Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim

    2013-08-01

    A huge number of risk assessment tools have been developed. Far from all have been validated in external studies, many lack transparent methodological evidence, and few are integrated in national guidelines. Therefore, we performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use and, last, to examine whether the complexity of the tools influenced their discriminative power. We searched the PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated; however, only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies in randomized design with population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.

  6. Multidisciplinary Design Optimization of a Full Vehicle with High Performance Computing

    NASA Technical Reports Server (NTRS)

    Yang, R. J.; Gu, L.; Tho, C. H.; Sobieszczanski-Sobieski, Jaroslaw

    2001-01-01

    Multidisciplinary design optimization (MDO) of a full vehicle under the constraints of crashworthiness, NVH (Noise, Vibration and Harshness), durability, and other performance attributes is one of the imperative goals for the automotive industry. However, it is often infeasible due to the lack of computational resources, robust simulation capabilities, and efficient optimization methodologies. This paper intends to move closer towards that goal by using parallel computers for the intensive computation and combining different approximations for dissimilar analyses in the MDO process. The MDO process presented in this paper is an extension of the previous work reported by Sobieski et al. In addition to the roof crush, two full vehicle crash modes are added: full frontal impact and 50% frontal offset crash. Instead of using an adaptive polynomial response surface method, this paper employs a DOE/RSM method for exploring the design space and constructing highly nonlinear crash functions. Two MDO strategies are used and their results are compared. This paper demonstrates that with high performance computing, a conventionally intractable real-world full vehicle multidisciplinary optimization problem considering all performance attributes with a large number of design variables becomes feasible.
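
    The DOE/RSM idea of replacing expensive crash simulations with a cheap fitted surrogate can be illustrated as below; the synthetic design-of-experiments data and the quadratic surface are assumptions of this sketch, not the paper's models.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        # Synthetic DOE: design variables sampled over the design space with a noisy "crash" response
        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(60, 3))                       # 3 design variables, 60 DOE runs
        y = 5 - 2 * X[:, 0] ** 2 + X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 60)

        # Quadratic response surface fitted to the DOE results
        rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

        # The cheap surrogate stands in for the expensive simulation inside the optimization loop
        print(rsm.predict([[0.2, -0.3, 0.5]]))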

  7. A Novel Instrument and Methodology for the In-Situ Measurement of the Stress in Thin Films

    NASA Technical Reports Server (NTRS)

    Broadway, David M.; Omokanwaye, Mayowa O.; Ramsey, Brian D.

    2014-01-01

    We introduce a novel methodology for the in-situ measurement of mechanical stress during thin film growth utilizing a highly sensitive non-contact variation of the classic spherometer. By exploiting the known spherical deformation of the substrate, the value of the stress-induced curvature is inferred by measurement of only one point on the substrate's surface: the sagitta. From the known curvature the stress can be calculated using the well-known Stoney equation. Based on this methodology, a stress sensor has been designed which is simple, highly sensitive, compact, and low cost. As a result of its compact nature, the sensor can be mounted in any orientation to accommodate a given deposition geometry without the need for extensive modification to an already existing deposition system. The technique employs the use of a double-side polished substrate that offers good specular reflectivity and is isotropic in its mechanical properties, such as <111>-oriented crystalline silicon or amorphous soda lime glass, for example. The measurement of the displacement of the uncoated side during deposition is performed with a high-resolution (i.e., 5 nm), commercially available, inexpensive fiber optic sensor which can be used in both high-vacuum and high-temperature environments (i.e., 10^-7 Torr and 480 C, respectively). A key attribute of this instrument lies in its potential to achieve sensitivity that rivals other measurement techniques such as the micro-cantilever method but, due to the comparatively larger substrate area, offers a more robust and practical alternative for subsequent measurement of additional characteristics of the film that might be correlated to film stress. We present measurement results of nickel films deposited by magnetron sputtering which show good qualitative agreement with the known behavior of polycrystalline films previously reported by Hoffman.
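
    The chain from a single sag (sagitta) measurement to a film stress estimate via the Stoney equation can be written compactly; the substrate properties and geometry below are illustrative assumptions, not values from the paper.

        def film_stress_from_sag(sag_m, half_aperture_m, t_sub_m, t_film_m, e_sub_pa, nu_sub):
            # Spherical deformation: curvature radius from the sag over a known half-aperture
            radius_m = (half_aperture_m ** 2 + sag_m ** 2) / (2.0 * sag_m)
            # Stoney equation: film stress from substrate curvature
            return e_sub_pa * t_sub_m ** 2 / (6.0 * (1.0 - nu_sub) * radius_m * t_film_m)

        # Illustrative numbers: 100 nm nickel film on a 0.5 mm thick silicon substrate,
        # 25 mm half-aperture, 1 micron measured sag, nominal silicon elastic constants
        stress_pa = film_stress_from_sag(sag_m=1e-6, half_aperture_m=25e-3,
                                         t_sub_m=0.5e-3, t_film_m=100e-9,
                                         e_sub_pa=170e9, nu_sub=0.26)
        print(f"{stress_pa / 1e6:.0f} MPa")   # roughly 300 MPa for these assumed values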

  8. Design of invisibility cloaks with an open tunnel.

    PubMed

    Ako, Thomas; Yan, Min; Qiu, Min

    2010-12-20

    In this paper we apply the methodology of transformation optics for design of a novel invisibility cloak which can possess an open tunnel. Such a cloak facilitates the insertion (retrieval) of matter into (from) the cloak's interior without significantly affecting the cloak's performance, overcoming the matter exchange bottleneck inherent to most previously proposed cloak designs. We achieve this by applying a transformation which expands a point at the origin in electromagnetic space to a finite area in physical space in a highly anisotropic manner. The invisibility performance of the proposed cloak is verified by using full-wave finite-element simulations.

  9. A case study of the carbon footprint of milk from high-performing confinement and grass-based dairy farms.

    PubMed

    O'Brien, D; Capper, J L; Garnsworthy, P C; Grainger, C; Shalloo, L

    2014-03-01

    Life-cycle assessment (LCA) is the preferred methodology to assess carbon footprint per unit of milk. The objective of this case study was to apply an LCA method to compare carbon footprints of high-performance confinement and grass-based dairy farms. Physical performance data from research herds were used to quantify carbon footprints of a high-performance Irish grass-based dairy system and a top-performing United Kingdom (UK) confinement dairy system. For the US confinement dairy system, data from the top 5% of herds of a national database were used. Life-cycle assessment was applied using the same dairy farm greenhouse gas (GHG) model for all dairy systems. The model estimated all on- and off-farm GHG sources associated with dairy production until milk is sold from the farm in kilograms of carbon dioxide equivalents (CO2-eq) and allocated emissions between milk and meat. The carbon footprint of milk was calculated by expressing GHG emissions attributed to milk per tonne of energy-corrected milk (ECM). The comparison showed that when GHG emissions were only attributed to milk, the carbon footprint of milk from the Irish grass-based system (837 kg of CO2-eq/t of ECM) was 5% lower than the UK confinement system (884 kg of CO2-eq/t of ECM) and 7% lower than the US confinement system (898 kg of CO2-eq/t of ECM). However, without grassland carbon sequestration, the grass-based and confinement dairy systems had similar carbon footprints per tonne of ECM. Emission algorithms and allocation of GHG emissions between milk and meat also affected the relative difference and order of dairy system carbon footprints. For instance, depending on the method chosen to allocate emissions between milk and meat, the relative difference between the carbon footprints of grass-based and confinement dairy systems varied by 3 to 22%. This indicates that further harmonization of several aspects of the LCA methodology is required to compare carbon footprints of contrasting dairy systems. In comparison to recent reports that assess the carbon footprint of milk from average Irish, UK, and US dairy systems, this case study indicates that top-performing herds of the respective nations have carbon footprints 27 to 32% lower than average dairy systems. Although differences between studies are partly explained by methodological inconsistency, the comparison suggests that potential exists to reduce the carbon footprint of milk in each of the nations by implementing practices that improve productivity. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
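
    The final footprint figure is simply the allocated emissions divided by the energy-corrected milk output; the sketch below uses purely illustrative totals and an assumed allocation factor, not data from the study.

        def milk_carbon_footprint(total_ghg_kg_co2eq, ecm_tonnes, milk_allocation_fraction):
            # kg CO2-eq per tonne of energy-corrected milk after allocating farm emissions
            # between milk and meat (the allocation factor is a modelling choice)
            return total_ghg_kg_co2eq * milk_allocation_fraction / ecm_tonnes

        # Illustrative farm totals: 900 t CO2-eq, 950 t ECM, 88% of emissions allocated to milk
        print(round(milk_carbon_footprint(900_000, 950, 0.88)))   # ~834 kg CO2-eq/t ECM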

  10. [What is the methodological quality of articles on therapeutic procedures published in Cirugía Española?].

    PubMed

    Manterola, Carlos; Busquets, Juli; Pascual, Marta; Grande, Luis

    2006-02-01

    The aim of this study was to determine the methodological quality of articles on therapeutic procedures published in Cirugía Española and to study its association with the publication year, center, and subject-matter. A bibliometric study that included all articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 was performed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor, and experimental studies. The variables analyzed were: year of publication, center, design, and methodological quality. Methodological quality was determined by a valid and reliable scale. Descriptive statistics (calculation of means, standard deviation and medians) and analytical statistics (Pearson's chi2, nonparametric, ANOVA and Bonferroni tests) were used. A total of 244 articles were studied (197 case series [81%], 28 cohort studies [12%], 17 clinical trials [7%], 1 cross sectional study and 1 case-control study [0.8%]). The studies were performed mainly in Catalonia and Murcia (22% and 16%, respectively). The most frequent subject areas were soft tissue and hepatobiliopancreatic surgery (23% and 19%, respectively). The mean and median of the methodological quality score calculated for the entire series were 10.2 +/- 3.9 points and 9.5 points, respectively. Methodological quality significantly increased with publication year (p < 0.001). An association between methodological quality and subject area was observed but no association was detected with the center performing the study. The methodological quality of articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 is low. However, a statistically significant trend toward improvement was observed.

  11. ISIS and META projects

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth; Cooper, Robert; Marzullo, Keith

    1990-01-01

    The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. High performance multicast, large scale applications, and wide area networks are the focus of interest. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project addresses distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor, and performing load-balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are reported.

  12. Research opportunities in human behavior and performances

    NASA Technical Reports Server (NTRS)

    Christensen, J. M.; Talbot, J. M.

    1985-01-01

    The NASA research program in the biological and medical aspects of space flight includes investigations of human behavior and performance. The research focuses on psychological and psychophysiological responses to operational and environmental stresses and demands of spaceflight, and encompasses problems in perception, cognition, motivation, psychological stability, small group dynamics, and performance. The primary objective is to acquire the knowledge and methodology to aid in achieving high productivity and essential psychological support of space and ground crews in the Space Shuttle and space station programs. The Life Sciences Research Office (LSRO) of the Federation of American Societies for Experimental Biology reviewed the program in psychology and identified research needs for future program planning in line with NASA's goals.

  13. Predicted Performance of a Thrust-Enhanced SR-71 Aircraft with an External Payload

    NASA Technical Reports Server (NTRS)

    Conners, Timothy R.

    1997-01-01

    NASA Dryden Flight Research Center has completed a preliminary performance analysis of the SR-71 aircraft for use as a launch platform for high-speed research vehicles and for carrying captive experimental packages to high altitude and Mach number conditions. Externally mounted research platforms can significantly increase drag, limiting test time and, in extreme cases, prohibiting penetration through the high-drag, transonic flight regime. To provide supplemental SR-71 acceleration, methods have been developed that could increase the thrust of the J58 turbojet engines. These methods include temperature and speed increases and augmentor nitrous oxide injection. The thrust-enhanced engines would allow the SR-71 aircraft to carry higher drag research platforms than it could without enhancement. This paper presents predicted SR-71 performance with and without enhanced engines. A modified climb-dive technique is shown to reduce fuel consumption when flying through the transonic flight regime with a large external payload. Estimates are included of the maximum platform drag profiles with which the aircraft could still complete a high-speed research mission. In this case, enhancement was found to increase the SR-71 payload drag capability by 25 percent. The thrust enhancement techniques and performance prediction methodology are described.

  14. Test Methodology Development for Experimental Structural Assessment of ASC Planar Spring Material for Long-Term Durability

    NASA Technical Reports Server (NTRS)

    Yun, Gunjin; Abdullah, A. B. M.; Binienda, Wieslaw; Krause, David L.; Kalluri, Sreeramesh

    2014-01-01

    A vibration-based testing methodology has been developed that will assess fatigue behavior of the metallic material of construction for the Advanced Stirling Convertor displacer (planar) spring component. To minimize the testing duration, the test setup is designed for base-excitation of a multiple-specimen arrangement, driven in a high-frequency resonant mode; this allows completion of fatigue testing in an accelerated period. A high-performance electro-dynamic exciter (shaker) is used to generate harmonic oscillation of cantilever beam specimens, which are clasped on the shaker armature with specially-designed clamp fixtures. The shaker operates in closed-loop control with dynamic specimen response feedback provided by a scanning laser vibrometer. A test coordinator function synchronizes the shaker controller and the laser vibrometer to complete the closed-loop scheme. The test coordinator also monitors structural health of the test specimens throughout the test period, recognizing any change in specimen dynamic behavior. As this may be due to fatigue crack initiation, the test coordinator terminates test progression and then acquires test data in an orderly manner. Design of the specimen and fixture geometry was completed by finite element analysis such that peak stress does not occur at the clamping fixture attachment points. Experimental stress evaluation was conducted to verify the specimen stress predictions. A successful application of the experimental methodology was demonstrated by validation tests with carbon steel specimens subjected to fully-reversed bending stress; high-cycle fatigue failures were induced in such specimens using higher-than-prototypical stresses.

  15. PRA (Probabilistic Risk Assessments) Participation versus Validation

    NASA Technical Reports Server (NTRS)

    DeMott, Diana; Banke, Richard

    2013-01-01

    Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs pose during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems, and failure data. This methodology basically validates the risk parameters of the project or system design. For high-risk or high-dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate, or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process in which the PRA model is used to determine whether a mitigation technique is effective in reducing risk, which can result in more efficient and cost-effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. By providing risk information that can positively influence final system and equipment design, the PRA tool can also participate in design development, contributing to a safe and cost-effective product.

  16. Using Six Sigma for Performance Improvement in Business Curriculum: A Case Study

    ERIC Educational Resources Information Center

    Kukreja, Anil; Ricks, Joe M., Jr.; Meyer, Jean A.

    2009-01-01

    During the last few decades, a number of quality improvement methodologies have been used by organizations. This article provides a brief review of the quality improvement literature related to academia and a case study using Six Sigma methodology to analyze students' performance in a standardized examination. We found Six Sigma to be an effective…

  17. Bodily Writing and Performative Inquiry: Inviting an Arts-Based Research Methodology into Collaborative Doctoral Research Vocabularies

    ERIC Educational Resources Information Center

    Buono, Alexia; Gonzalez, Charles H.

    2017-01-01

    In this article, the authors (then two doctoral students) describe their methodology of engaging in an interdisciplinary, collaborative doctoral arts-based research (ABR) project. Education and the arts were integrated utilizing dance methods of bodily writing and performative inquiry to strengthen the analysis of dissertation findings in the…

  18. [Robotic systems for gait re-education in cases of spinal cord injury: a systematic review].

    PubMed

    Gandara-Sambade, T; Fernandez-Pereira, M; Rodriguez-Sotillo, A

    2017-03-01

    The evidence underlying robotic body weight supported treadmill training in patients with spinal cord injury remains poorly characterized. To perform a qualitative systematic review on the efficacy of this therapy. A search on PubMed, CINAHL, Cochrane Library and PEDro was performed from January 2005 to April 2016. The references in these articles were also reviewed to find papers not identified with the initial search strategy. The methodological level of the articles was evaluated with PEDro and Downs and Black scales. A total of 129 potentially interesting articles were found, of which 10 fulfilled the inclusion criteria. Those studies included 286 patients, who were predominantly young and male. Most of them had an incomplete spinal cord injury and were classified as C or D in ASIA scale. Robotic devices employed in these studies were Lokomat, Gait Trainer and LOPES. Improvement in walking parameters evaluated was more evident in young patients, those with subacute spinal cord injury, and those with high ASIA or LEMS scores. Conversely, factors such as etiology, level of injury or sex were less predictive of improvement. The methodological level of these studies was fair according to PEDro and Downs and Black scales. The evidence of gait training with robotic devices in patients with spinal cord injury is positive, although limited and with fair methodological quality.

  19. Optical modeling based on mean free path calculations for quantum dot phosphors applied to optoelectronic devices.

    PubMed

    Shin, Min-Ho; Kim, Hyo-Jun; Kim, Young-Joo

    2017-02-20

    We proposed an optical simulation model for the quantum dot (QD) nanophosphor based on the mean free path concept to understand precisely the optical performance of optoelectronic devices. A measurement methodology was also developed to obtain the desired optical characteristics, such as the mean free path and absorption spectra, for QD nanophosphors which are to be incorporated into the simulation. The simulation results for QD-based white LED and OLED displays show good agreement with the experimental values from the fabricated devices in terms of spectral power distribution, chromaticity coordinate, CCT, and CRI. The proposed simulation model and measurement methodology can be applied easily to the design of many optoelectronic devices using QD nanophosphors to obtain high efficiency and the desired color characteristics.
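
    Read numerically, a mean free path implies an exponential free-path distribution, so the fraction of pump light absorbed over a given path follows a Beer-Lambert form; the film thickness and mean free path below are illustrative assumptions, not the paper's measured values.

        import math

        def absorption_probability(path_length_um, mean_free_path_um):
            # Probability that a pump photon is absorbed within the QD layer,
            # assuming exponentially distributed free paths (Beer-Lambert form)
            return 1.0 - math.exp(-path_length_um / mean_free_path_um)

        # Illustrative: a 100 um thick QD film with a 60 um mean free path at the pump wavelength
        print(f"{absorption_probability(100, 60):.2f}")   # ~0.81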

  20. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  1. Stacked Autoencoders for Outlier Detection in Over-the-Horizon Radar Signals

    PubMed Central

    Protopapadakis, Eftychios; Doulamis, Anastasios; Doulamis, Nikolaos; Dres, Dimitrios; Bimpas, Matthaios

    2017-01-01

    Detection of outliers in radar signals is a considerable challenge in maritime surveillance applications. High-Frequency Surface-Wave (HFSW) radars have attracted significant interest as potential tools for long-range target identification and outlier detection at over-the-horizon (OTH) distances. However, a number of disadvantages, such as their low spatial resolution and presence of clutter, have a negative impact on their accuracy. In this paper, we explore the applicability of deep learning techniques for detecting deviations from the norm in behavioral patterns of vessels (outliers) as they are tracked from an OTH radar. The proposed methodology exploits the nonlinear mapping capabilities of deep stacked autoencoders in combination with density-based clustering. A comparative experimental evaluation of the approach shows promising results in terms of the proposed methodology's performance. PMID:29312449
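
    A compact sketch of the general idea, combining a small stacked autoencoder with density-based clustering in the learned latent space, is given below; the synthetic track features, layer sizes, and DBSCAN parameters are assumptions for illustration and do not reproduce the authors' architecture.

        import numpy as np
        import torch
        import torch.nn as nn
        from sklearn.cluster import DBSCAN

        # Synthetic stand-in for per-track feature vectors extracted from OTH radar data
        rng = np.random.default_rng(0)
        tracks = rng.normal(0.0, 1.0, size=(500, 16)).astype("float32")

        # A small stacked autoencoder: two encoding layers and a mirrored decoder
        encoder = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 3))
        decoder = nn.Sequential(nn.Linear(3, 8), nn.ReLU(), nn.Linear(8, 16))
        model = nn.Sequential(encoder, decoder)
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()

        x = torch.from_numpy(tracks)
        for _ in range(200):                     # train by reconstruction
            optimizer.zero_grad()
            loss = loss_fn(model(x), x)
            loss.backward()
            optimizer.step()

        # Density-based clustering in the latent space; label -1 marks outlier tracks
        latent = encoder(x).detach().numpy()
        labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(latent)
        print("outlier tracks:", int(np.sum(labels == -1)))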

  2. Multi-objective models of waste load allocation toward a sustainable reuse of drainage water in irrigation.

    PubMed

    Allam, Ayman; Tawfik, Ahmed; Yoshimura, Chihiro; Fleifle, Amr

    2016-06-01

    The present study proposes a waste load allocation (WLA) framework for the sustainable quality management of agricultural drainage water (ADW). Two multi-objective models, namely abatement-performance and abatement-equity-performance, were developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm, by considering (1) the total waste load abatement and (2) the inequity among waste dischargers. To successfully accomplish the modeling tasks, we developed a comprehensive overall performance measure (E_wla) reflecting possible violations of Egyptian standards for ADW reuse in irrigation. This methodology was applied to the Gharbia drain in the Nile Delta, Egypt, during both the summer and winter seasons of 2012. Abatement-performance modeling results for a target of E_wla = 100% corresponded to abatement ratios of the dischargers ranging from 20.7 to 75.6% in summer and 29.5 to 78.5% in winter, alongside highly shifting inequity values. Abatement-equity-performance modeling results for a target of E_wla = 90% revealed the necessity of increasing treatment efforts in three out of five dischargers during summer, and four out of five in winter. The trade-off curves obtained from the WLA models proved their reliability in selecting appropriate WLA procedures as a function of budget constraints, principles of social equity, and the desired overall performance level. Hence, the proposed framework of methodologies is of great importance to decision makers working toward a sustainable reuse of ADW in irrigation.

  3. The Effect of Job Performance Aids on Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fosshage, Erik

    Job performance aids (JPAs) have been studied for many decades in a variety of disciplines and for many different types of tasks, yet this is the first known research experiment using JPAs in a quality assurance (QA) context. The objective of this thesis was to assess whether a JPA has an effect on the performance of a QA observer performing the concurrent dual verification technique for a basic assembly task. The JPA used in this study was a simple checklist, and the design borrows heavily from prior research on task analysis and other human factors principles. The assembly task and QA construct of concurrent dual verification are consistent with those of a high consequence manufacturing environment. Results showed that the JPA had only a limited effect on QA performance in the context of this experiment. However, there were three important and unexpected findings that may draw interest from a variety of practitioners. First, a novel testing methodology sensitive enough to measure the effects of a JPA on performance was created. Second, the discovery that there are different probabilities of detection for different types of error in a QA context may be the most far-reaching result. Third, these results highlight the limitations of concurrent dual verification as a control against defects. It is hoped that both the methodology and results of this study are an effective baseline from which to launch future research activities.

  4. Mechanical Property Allowables Generated for the Solid Rocket Booster Composite Nose Cap

    NASA Technical Reports Server (NTRS)

    Hodge, A. J.

    2000-01-01

    Mechanical property characterization was performed on AS4/3501-6 graphite/epoxy and SC350G syntactic foam for the SRB Composite Nose Cap Shuttle Upgrades Project. Lamina level properties for the graphite/epoxy were determined at room temperature, 240 F, 350 F, 480 F, 600 F, and 350 F after a cycle to 600 F. Graphite/epoxy samples were moisture conditioned prior to testing. The syntactic foam material was tested at room temperature, 350 F, and 480 F. A high-temperature test facility was developed at MSFC. Testing was performed with quartz lamp heaters and high resistance heater strips. The thermal history profile of the nose cap was simulated in order to test materials at various times during launch. A correlation study was performed with Southern Research Institute to confirm the test methodology and validity of test results. A-basis allowables were generated from the results of testing on three lots of material.

  5. Recent Progresses and Development of Advanced Atomic Layer Deposition towards High-Performance Li-Ion Batteries

    PubMed Central

    Lu, Wei; Liang, Longwei; Sun, Xuan; Sun, Xiaofei; Wu, Chen; Hou, Linrui; Sun, Jinfeng

    2017-01-01

    Electrode materials and electrolytes play a vital role in the device-level performance of rechargeable Li-ion batteries (LIBs). However, electrode structure/component degeneration and electrode-electrolyte sur-/interface evolution are identified as the most crucial obstacles in practical applications. Thanks to its inherent advantages, atomic layer deposition (ALD) methodology has attracted enormous attention in advanced LIBs. This review mainly focuses upon the up-to-date progress and development of the ALD in high-performance LIBs. The significant roles of the ALD in the rational design and fabrication of multi-dimensional nanostructured electrode materials, and in finely tailoring electrode-electrolyte sur-/interfaces, are comprehensively highlighted. Furthermore, we clearly envision that this contribution will motivate more extensive and insightful studies in the ALD to considerably improve Li-storage behaviors. Future trends and prospects to further develop advanced ALD nanotechnology in next-generation LIBs are also presented. PMID:29036916

  6. On a methodology for robust segmentation of nonideal iris images.

    PubMed

    Schmid, Natalia A; Zuo, Jinyu

    2010-06-01

    Iris biometric is one of the most reliable biometrics with respect to performance. However, this reliability is a function of the ideality of the data. One of the most important steps in processing nonideal data is reliable and precise segmentation of the iris pattern from remaining background. In this paper, a segmentation methodology that aims at compensating various nonidealities contained in iris images during segmentation is proposed. The virtue of this methodology lies in its capability to reliably segment nonideal imagery that is simultaneously affected with such factors as specular reflection, blur, lighting variation, occlusion, and off-angle images. We demonstrate the robustness of our segmentation methodology by evaluating ideal and nonideal data sets, namely, the Chinese Academy of Sciences iris data version 3 interval subdirectory, the iris challenge evaluation data, the West Virginia University (WVU) data, and the WVU off-angle data. Furthermore, we compare our performance to that of our implementation of Camus and Wildes's algorithm and Masek's algorithm. We demonstrate considerable improvement in segmentation performance over the formerly mentioned algorithms.

  7. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil.

    PubMed

    Silva, Gustavo A; Petter, Carlos O; Albuquerque, Nelson R

    2018-03-01

    Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, permanent high-performance magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives were taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but several factors of uncertainty, including current low prices, increased the challenge of transforming the current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work introduces a new methodology for competitiveness analysis in which selected variables are treated as the main factors that can strongly contribute to making a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts, such as the fact that the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, while the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in this country. Thus, we contribute to the existing literature, providing a new methodology for competitiveness analysis in rare earth mining.

  8. Steps towards the international regulatory acceptance of non-animal methodology in safety assessment.

    PubMed

    Sewell, Fiona; Doe, John; Gellatly, Nichola; Ragan, Ian; Burden, Natalie

    2017-10-01

    The current animal-based paradigm for safety assessment must change. In September 2016, the UK National Centre for Replacement, Refinement and Reduction of Animals in Research (NC3Rs) brought together scientists from regulatory authorities, academia and industry to review progress in bringing new methodology into regulatory use, and to identify ways to expedite progress. Progress has been slow. Science is advancing to make this possible but changes are necessary. The new paradigm should allow new methodology to be adopted once it is developed rather than being based on a fixed set of studies. Regulatory authorities can help by developing Performance-Based Standards. The most pressing need is in repeat dose toxicology, although setting standards will be more complex than in areas such as sensitization. Performance standards should be aimed directly at human safety, not at reproducing the results of animal studies. Regulatory authorities can also aid progress towards the acceptance of non-animal based methodology by promoting "safe-haven" trials where traditional and new methodology data can be submitted in parallel to build up experience in the new methods. Industry can play its part in the acceptance of new methodology, by contributing to the setting of performance standards and by actively contributing to "safe-haven" trials. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Nonlinear stability and control study of highly maneuverable high performance aircraft, phase 2

    NASA Technical Reports Server (NTRS)

    Mohler, R. R.

    1992-01-01

    Research leading to the development of new nonlinear methodologies for the adaptive control and stability analysis of high angle of attack aircraft such as the F-18 is discussed. The emphasis has been on nonlinear adaptive control, but associated model development, system identification, stability analysis, and simulation were studied in some detail as well. Studies indicated that nonlinear adaptive control can outperform linear adaptive control for rapid maneuvers with large changes in angle of attack. Included here are studies on nonlinear model algorithmic controller design and an analysis of nonlinear system stability using robust stability analysis for linear systems.

  10. High performance MPEG-audio decoder IC

    NASA Technical Reports Server (NTRS)

    Thorn, M.; Benbassat, G.; Cyr, K.; Li, S.; Gill, M.; Kam, D.; Walker, K.; Look, P.; Eldridge, C.; Ng, P.

    1993-01-01

    The emerging digital audio and video compression technology brings both an opportunity and a new challenge to IC design. The pervasive application of compression technology to consumer electronics will require high volume, low cost IC's and fast time to market of the prototypes and production units. At the same time, the algorithms used in the compression technology result in complex VLSI IC's. The conflicting challenges of algorithm complexity, low cost, and fast time to market have an impact on device architecture and design methodology. The work presented in this paper is about the design of a dedicated, high precision, Motion Picture Expert Group (MPEG) audio decoder.

  11. AXAOTHER XL -- A spreadsheet for determining doses for incidents caused by tornadoes or high-velocity straight winds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpkins, A.A.

    1996-09-01

    AXAOTHER XL is an Excel spreadsheet used to determine the dose to the maximally exposed offsite individual during high-velocity straight winds or tornado conditions. Both individual and population doses may be considered. Potential exposure pathways are inhalation and plume shine. For high-velocity straight winds the spreadsheet has the capability to determine the downwind relative air concentration; however, for tornado conditions, the user must enter the relative air concentration. Theoretical models are discussed and hand calculations are performed to ensure proper application of the methodologies. A section has also been included that contains user instructions for the spreadsheet.

  12. Organizational Change Efforts: Methodologies for Assessing Organizational Effectiveness and Program Costs versus Benefits.

    ERIC Educational Resources Information Center

    Macy, Barry A.; Mirvis, Philip H.

    1982-01-01

    A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…

  13. The Literature Review of Analytical Support to Defence Transformation: Lessons Learned from Turkish Air Force Transformation Activities

    DTIC Science & Technology

    2010-04-01

    available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, including 29 different methods [12]. Table 3: DMAIC methodology (5-phase methodology) - Define, Measure, Analyze, Improve, Control - with example methods such as Project Charter, Prioritization Matrix, and 5 Whys Analysis. Methodology scope [13]: DMAIC, PDCA. Developing performance priorities is a preliminary stage that precedes specific improvement projects.

  14. Multiplex cytokine profiling with highly pathogenic material: use of formalin solution in luminex analysis.

    PubMed

    Dowall, Stuart D; Graham, Victoria A; Tipton, Thomas R W; Hewson, Roger

    2009-08-31

    Work with highly pathogenic material mandates the use of biological containment facilities, involving microbiological safety cabinets and specialist laboratory engineering structures typified by containment level 3 (CL3) and CL4 laboratories. A consequence of working in high containment is the practical difficulty of accommodating the specialist assays and equipment often essential for experimental analyses. In an era of increased interest in biodefence pathogens and emerging diseases, immunological analysis has developed rapidly alongside traditional techniques in virology and molecular biology. For example, in order to maximise the use of small sample volumes, multiplexing has become a more popular and widespread approach to quantify multiple analytes simultaneously, such as cytokines and chemokines. The luminex microsphere system allows for the detection of many cytokines and chemokines in a single sample, but the detection method of using aligned lasers and fluidics means that samples often have to be analysed in low containment facilities. In order to perform cytokine analysis on materials from high containment (CL3 and CL4 laboratories), we have developed an appropriate inactivation methodology applied after the staining steps, which, although it results in a reduction of median fluorescent intensity, produces statistically comparable outcomes when judged against non-inactivated samples. This methodology thus extends the use of luminex technology to material that contains highly pathogenic biological agents.

  15. Single-shot and single-sensor high/super-resolution microwave imaging based on metasurface

    PubMed Central

    Wang, Libo; Li, Lianlin; Li, Yunbo; Zhang, Hao Chi; Cui, Tie Jun

    2016-01-01

    Real-time high-resolution (including super-resolution) imaging with low-cost hardware is a long sought-after goal in various imaging applications. Here, we propose broadband single-shot and single-sensor high-/super-resolution imaging by using a spatio-temporal dispersive metasurface and an imaging reconstruction algorithm. The metasurface with spatio-temporal dispersive property ensures the feasibility of the single-shot and single-sensor imager for super- and high-resolution imaging, since it can convert efficiently the detailed spatial information of the probed object into one-dimensional time- or frequency-dependent signal acquired by a single sensor fixed in the far-field region. The imaging quality can be improved by applying a feature-enhanced reconstruction algorithm in post-processing, and the desired imaging resolution is related to the distance between the object and metasurface. When the object is placed in the vicinity of the metasurface, the super-resolution imaging can be realized. The proposed imaging methodology provides a unique means to perform real-time data acquisition, high-/super-resolution images without employing expensive hardware (e.g. mechanical scanner, antenna array, etc.). We expect that this methodology could make potential breakthroughs in the areas of microwave, terahertz, optical, and even ultrasound imaging. PMID:27246668

  16. Experimental Equipment Design and Fabrication Study for Delta-G Experiment

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Research Machine Shop at UAH did not develop any new technology in the performance of the following tasks. All tasks were performed as specified. UAH RMS shall design and fabricate a "poor" model of a silicon-carbide high-temperature crucible, with dimensions of 8 inches in diameter and 4 inches high, for pouring liquid ceramic materials at 1200 C from heating ovens into molds. The crucible shall also be designed with a manipulation fixture to facilitate holding and pouring of the heated liquid material. UAH RMS shall investigate the availability of 400 Hz, high-current (65 volts @ 100 amperes) power systems for use in high-speed rotating disk experiments. UAH RMS shall investigate, develop a methodology for, and experiment on the application of filament-wound carbon fibers to the periphery of ceramic superconductors to withstand high levels of rotational g-forces. UAH RMS shall provide analytical data to verify the resulting improved disc with carbon composite fibers.

  17. WAIS-III index score profiles in the Canadian standardization sample.

    PubMed

    Lange, Rael T

    2007-01-01

    Representative index score profiles were examined in the Canadian standardization sample of the Wechsler Adult Intelligence Scale-Third Edition (WAIS-III). The identification of profile patterns was based on the methodology proposed by Lange, Iverson, Senior, and Chelune (2002) that aims to maximize the influence of profile shape and minimize the influence of profile magnitude on the cluster solution. A two-step cluster analysis procedure was used (i.e., hierarchical and k-means analyses). Cluster analysis of the four index scores (i.e., Verbal Comprehension [VCI], Perceptual Organization [POI], Working Memory [WMI], Processing Speed [PSI]) identified six profiles in this sample. Profiles were differentiated by pattern of performance and were primarily characterized as (a) high VCI/POI, low WMI/PSI, (b) low VCI/POI, high WMI/PSI, (c) high PSI, (d) low PSI, (e) high VCI/WMI, low POI/PSI, and (f) low VCI, high POI. These profiles are potentially useful for determining whether a patient's WAIS-III performance is unusual in a normal population.
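
    The shape-over-magnitude, two-step clustering logic can be sketched as follows; centering each case on its own mean is one common way to emphasise profile shape over magnitude and is offered here as an assumption about the cited procedure, with synthetic scores standing in for the standardization data.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        scores = rng.normal(100, 15, size=(400, 4))      # synthetic VCI, POI, WMI, PSI scores

        # Emphasise profile shape over magnitude: remove each person's own mean level
        shape = scores - scores.mean(axis=1, keepdims=True)

        # Step 1: hierarchical (Ward) clustering to obtain initial groupings and seed centroids
        tree = linkage(shape, method="ward")
        initial = fcluster(tree, t=6, criterion="maxclust")
        seeds = np.vstack([shape[initial == c].mean(axis=0) for c in range(1, 7)])

        # Step 2: k-means refinement starting from the hierarchical centroids
        profiles = KMeans(n_clusters=6, init=seeds, n_init=1).fit(shape)
        print(np.round(profiles.cluster_centers_, 1))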

  18. Author-paper affiliation network architecture influences the methodological quality of systematic reviews and meta-analyses of psoriasis

    PubMed Central

    Gomez-Garcia, Francisco; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Maestre-Lopez, Beatriz; Gonzalez-Padilla, Marcelino; Carmona-Fernandez, Pedro J.; Velez Garcia-Nieto, Antonio; Isla-Tejera, Beatriz

    2017-01-01

    Moderate-to-severe psoriasis is associated with significant comorbidity, an impaired quality of life, and increased medical costs, including those associated with treatments. Systematic reviews (SRs) and meta-analyses (MAs) of randomized clinical trials are considered two of the best approaches to the summarization of high-quality evidence. However, methodological bias can reduce the validity of conclusions from these types of studies and subsequently impair the quality of decision making. As co-authorship is among the most well-documented forms of research collaboration, the present study aimed to explore whether authors’ collaboration methods might influence the methodological quality of SRs and MAs of psoriasis. Methodological quality was assessed by two raters who extracted information from full articles. After calculating total and per-item Assessment of Multiple Systematic Reviews (AMSTAR) scores, reviews were classified as low (0-4), medium (5-8), or high (9-11) quality. Article metadata and journal-related bibliometric indices were also obtained. A total of 741 authors from 520 different institutions and 32 countries published 220 reviews that were classified as high (17.2%), moderate (55%), or low (27.7%) methodological quality. The high methodological quality subnetwork was larger but had a lower connection density than the low and moderate methodological quality subnetworks; specifically, the former contained relatively fewer nodes (authors and reviews), reviews by authors, and collaborators per author. Furthermore, the high methodological quality subnetwork was highly compartmentalized, with several modules representing few poorly interconnected communities. In conclusion, structural differences in author-paper affiliation network may influence the methodological quality of SRs and MAs on psoriasis. As the author-paper affiliation network structure affects study quality in this research field, authors who maintain an appropriate balance between scientific quality and productivity are more likely to develop higher quality reviews. PMID:28403245

  19. The space station assembly phase: Flight telerobotic servicer feasibility. Volume 2: Methodology and case study

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Gyamfi, Max A.; Volkmer, Kent; Zimmerman, Wayne F.

    1987-01-01

    A methodology is described for examining the feasibility of a Flight Telerobotic Servicer (FTS) using two assembly scenarios, defined at the EVA task level, for the 30 shuttle flights (beginning with MB-1) over a four-year period. Performing all EVA tasks by crew only is compared to a scenario in which crew EVA is augmented by FTS. A reference FTS concept is used as a technology baseline and life-cycle cost analysis is performed to highlight cost tradeoffs. The methodology, procedure, and data used to complete the analysis are documented in detail.

  20. Assembly line performance and modeling

    NASA Astrophysics Data System (ADS)

    Rane, Arun B.; Sunnapwar, Vivek K.

    2017-09-01

    The automobile sector forms the backbone of the manufacturing sector. The vehicle assembly line is an important section of an automobile plant where repetitive tasks are performed one after another at different workstations. In this thesis, a methodology is proposed to reduce cycle time and the time lost due to important factors like equipment failure, shortage of inventory, absenteeism, set-up, material handling, rejection, and fatigue, in order to improve output within given cost constraints. Various relationships between these factors, the corresponding cost, and output are established by a scientific approach. This methodology is validated in three different vehicle assembly plants. The proposed methodology may help practitioners to optimize the assembly line using lean techniques.

  1. Human perception testing methodology for evaluating EO/IR imaging systems

    NASA Astrophysics Data System (ADS)

    Graybeal, John J.; Monfort, Samuel S.; Du Bosq, Todd W.; Familoni, Babajide O.

    2018-04-01

    The U.S. Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) Perception Lab is tasked with supporting the development of sensor systems for the U.S. Army by evaluating human performance of emerging technologies. Typical research questions involve detection, recognition and identification as a function of range, blur, noise, spectral band, image processing techniques, image characteristics, and human factors. NVESD's Perception Lab provides an essential bridge between the physics of the imaging systems and the performance of the human operator. In addition to quantifying sensor performance, perception test results can also be used to generate models of human performance and to drive future sensor requirements. The Perception Lab seeks to develop and employ scientifically valid and efficient perception testing procedures within the practical constraints of Army research, including rapid development timelines for critical technologies, unique guidelines for ethical testing of Army personnel, and limited resources. The purpose of this paper is to describe NVESD Perception Lab capabilities, recent methodological improvements designed to align our methodology more closely with scientific best practice, and to discuss goals for future improvements and expanded capabilities. Specifically, we discuss modifying our methodology to improve training, to account for human fatigue, to improve assessments of human performance, and to increase experimental design consultation provided by research psychologists. Ultimately, this paper outlines a template for assessing human perception and overall system performance related to EO/IR imaging systems.

  2. Evidence for current recommendations concerning the management of foot health for people with chronic long-term conditions: a systematic review.

    PubMed

    Edwards, Katherine; Borthwick, Alan; McCulloch, Louise; Redmond, Anthony; Pinedo-Villanueva, Rafael; Prieto-Alhambra, Daniel; Judge, Andrew; Arden, Nigel; Bowen, Catherine

    2017-01-01

    Research focusing on management of foot health has become more evident over the past decade, especially related to chronic conditions such as diabetes. The level of methodological rigour across this body of work, however, is varied, and outputs do not appear to have been developed or translated into clinical practice. The aim of this systematic review was to assess the latest guidelines, standards of care and current recommendations relative to people with chronic conditions to ascertain the level of supporting evidence concerning the management of foot health. A systematic search of electronic databases (Medline, Embase, Cinahl, Web of Science, SCOPUS and The Cochrane Library) for literature on recommendations for foot health management for people with chronic conditions was performed between 2000 and 2016 using predefined criteria. Data from the included publications were synthesised via template analysis, employing a thematic organisation and structure. The methodological quality of all included publications was appraised using the Appraisal of Guidelines for Research and Evaluation (AGREE II) instrument. A more in-depth analysis was carried out that specifically considered the levels of evidence that underpinned the strength of their recommendations concerning management of foot health. The data collected revealed 166 publications, of which the majority (102) were guidelines, standards of care or recommendations related to the treatment and management of diabetes. We noted a trend towards a systematic year-on-year increase in guidelines, standards of care or recommendations related to the treatment and management of long-term conditions other than diabetes over the past decade. The most common recommendation is for preventive care or assessments (e.g. vascular tests), followed by clinical interventions such as foot orthoses, foot ulcer care and foot health education. Methodological quality was spread across the range of AGREE II scores, with 62 publications falling into the category of high quality (scores 6-7). The number of publications providing a recommendation in the context of a narrative but without an indication of the strength or quality of the underlying evidence was high (79 out of 166). It is clear that this evidence base needs to be developed rapidly and put in place to support the future of the podiatry workforce. Whilst high-level evidence for podiatry is currently low in quantity, the methodological quality is growing. Where levels of evidence have been given in high-quality guidelines, standards of care or recommendations, they also tend to be of strong to moderate quality, such that further strategically prioritised research, if performed, is likely to have an important impact on the field.

  3. Ultra-high-performance liquid chromatography-Time-of-flight high resolution mass spectrometry to quantify acidic drugs in wastewater.

    PubMed

    Becerra-Herrera, Mercedes; Honda, Luis; Richter, Pablo

    2015-12-04

    A novel analytical approach involving an improved rotating-disk sorptive extraction (RDSE) procedure and ultra-high-performance liquid chromatography (UHPLC) coupled to an ultraspray electrospray ionization source (UESI) and time-of-flight mass spectrometry (TOF/MS), in trap mode, was developed to identify and quantify four non-steroidal anti-inflammatory drugs (NSAIDs) (naproxen, ibuprofen, ketoprofen and diclofenac) and two anti-cholesterol drugs (ACDs) (clofibric acid and gemfibrozil) that are widely used and typically found in water samples. The method reduced the amount of both sample and reagents used and also the time required for the whole analysis, resulting in a reliable and green analytical strategy. The analytical eco-scale was calculated, showing that this methodology is an excellent green analysis, increasing its ecological worth. The detection limits (LOD) and precision (%RSD) were lower than 90 ng/L and 10%, respectively. Matrix effects and recoveries were studied using samples from the influent of a wastewater treatment plant (WWTP). All the compounds exhibited suppression of their signals due to matrix effects, and the recoveries were approximately 100%. The applicability and reliability of this methodology were confirmed through the analysis of influent and effluent samples from a WWTP in Santiago, Chile, obtaining concentrations ranging from 1.1 to 20.5 μg/L and from 0.5 to 8.6 μg/L, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. When Ultrasonic Sensors and Computer Vision Join Forces for Efficient Obstacle Detection and Recognition

    PubMed Central

    Mocanu, Bogdan; Tapu, Ruxandra; Zaharia, Titus

    2016-01-01

    In the most recent report published by the World Health Organization concerning people with visual disabilities it is highlighted that by the year 2020, worldwide, the number of completely blind people will reach 75 million, while the number of visually impaired (VI) people will rise to 250 million. Within this context, the development of dedicated electronic travel aid (ETA) systems, able to increase the safe displacement of VI people in indoor/outdoor spaces while providing additional cognition of the environment, becomes of utmost importance. This paper introduces a novel wearable assistive device designed to facilitate the autonomous navigation of blind and VI people in highly dynamic urban scenes. The system exploits two independent sources of information: ultrasonic sensors and the video camera embedded in a regular smartphone. The underlying methodology exploits computer vision and machine learning techniques and makes it possible to identify accurately both static and highly dynamic objects existent in a scene, regardless of their location, size or shape. In addition, the proposed system is able to acquire information about the environment, semantically interpret it and alert users about possible dangerous situations through acoustic feedback. To determine the performance of the proposed methodology we have performed an extensive objective and subjective experimental evaluation with the help of 21 VI subjects from two blind associations. The users pointed out that our prototype is highly helpful in increasing mobility, while being user-friendly and easy to learn. PMID:27801834

  5. When Ultrasonic Sensors and Computer Vision Join Forces for Efficient Obstacle Detection and Recognition.

    PubMed

    Mocanu, Bogdan; Tapu, Ruxandra; Zaharia, Titus

    2016-10-28

    In the most recent report published by the World Health Organization concerning people with visual disabilities it is highlighted that by the year 2020, worldwide, the number of completely blind people will reach 75 million, while the number of visually impaired (VI) people will rise to 250 million. Within this context, the development of dedicated electronic travel aid (ETA) systems, able to increase the safe displacement of VI people in indoor/outdoor spaces while providing additional cognition of the environment, becomes of utmost importance. This paper introduces a novel wearable assistive device designed to facilitate the autonomous navigation of blind and VI people in highly dynamic urban scenes. The system exploits two independent sources of information: ultrasonic sensors and the video camera embedded in a regular smartphone. The underlying methodology exploits computer vision and machine learning techniques and makes it possible to identify accurately both static and highly dynamic objects existent in a scene, regardless of their location, size or shape. In addition, the proposed system is able to acquire information about the environment, semantically interpret it and alert users about possible dangerous situations through acoustic feedback. To determine the performance of the proposed methodology we have performed an extensive objective and subjective experimental evaluation with the help of 21 VI subjects from two blind associations. The users pointed out that our prototype is highly helpful in increasing mobility, while being user-friendly and easy to learn.

  6. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically co-located. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partitioning the design software across different machines allows each constituent tool to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  7. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
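
    The authors' consensus-based PSO variant and the Trust-Tech stage are not reproduced here; the sketch below is only a minimal standard (global-best) PSO on a common benchmark function, whose best particle would then be handed to a local or Trust-Tech refinement stage. All parameter values are assumptions.

        import numpy as np

        def rastrigin(x):
            # Common multimodal benchmark; global optimum 0 at the origin.
            return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

        def pso(f, dim=10, n_particles=40, iters=500, w=0.72, c1=1.5, c2=1.5, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.uniform(-5.12, 5.12, (n_particles, dim))   # positions
            v = np.zeros_like(x)                               # velocities
            pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
            g = pbest[np.argmin(pbest_val)].copy()             # global best
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = x + v
                vals = np.array([f(p) for p in x])
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = x[improved], vals[improved]
                g = pbest[np.argmin(pbest_val)].copy()
            return g, f(g)

        best_x, best_val = pso(rastrigin)
        # best_x would be the candidate basin handed to a local / Trust-Tech stage.
        print("best value found:", best_val)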

  8. Geodiametris: an integrated geoinformatic approach for monitoring land pollution from the disposal of olive oil mill wastes

    NASA Astrophysics Data System (ADS)

    Alexakis, Dimitrios D.; Sarris, Apostolos; Papadopoulos, Nikos; Soupios, Pantelis; Doula, Maria; Cavvadias, Victor

    2014-08-01

    The olive-oil industry is one of the most important sectors of agricultural production in Greece, which is the third-largest olive-oil-producing country worldwide. Olive oil mill wastes (OOMW) constitute a major source of pollution in olive-growing regions and an important problem for the agricultural industry to solve. Olive-oil mill wastes are normally deposited in tanks, directly in the soil, or even in adjacent torrents, rivers and lakes, posing a high risk of environmental pollution and harm to community health. The GEODIAMETRIS project aims to develop integrated geoinformatic methodologies for monitoring land pollution from the disposal of OOMW on the island of Crete, Greece. These methodologies integrate GPS surveys, satellite remote sensing and risk assessment analysis in a GIS environment, application of in situ and laboratory geophysical methodologies, as well as soil and water physicochemical analysis. As for the project's preliminary results, all operating OOMW areas located in Crete have already been registered through extensive GPS field campaigns. Their spatial and attribute information has been stored in an integrated GIS database, and an overall OOMW spectral signature database has been constructed through the analysis of multi-temporal Landsat-8 OLI satellite images. In addition, a specific OOMW area located in Alikianos village (Chania, Crete) has been selected as one of the main case study areas. Various geophysical methodologies, such as electrical resistivity tomography, induced polarization, multifrequency electromagnetics, self-potential measurements and ground-penetrating radar, have already been implemented. Soil and liquid samples have been collected for physicochemical analysis. The preliminary results have already contributed to the gradual development of an integrated environmental monitoring tool for studying and understanding environmental degradation from the disposal of OOMW.

  9. Evaluation of a handheld point-of-care analyser for measurement of creatinine in cats.

    PubMed

    Reeve, Jenny; Warman, Sheena; Lewis, Daniel; Watson, Natalie; Papasouliotis, Kostas

    2017-02-01

    Objectives The aim of the study was to evaluate whether a handheld creatinine analyser (StatSensor Xpress; SSXp), available for human patients, can be used to measure creatinine reliably in cats. Methods Analytical performance was evaluated by determining within- and between-run coefficient of variation (CV, %), total error observed (TEobs, %) and sigma metrics. Fifty client-owned cats presenting for investigation of clinical disease had creatinine measured simultaneously, using SSXp (whole blood and plasma) and a reference instrument (Konelab, serum); 48 paired samples were included in the study. Creatinine correlation between methodologies (SSXp vs Konelab) and sample types (SSXp whole blood vs SSXp plasma) was assessed by Spearman's correlation coefficient, and agreement was determined using Bland-Altman difference plots. Each creatinine value was assigned an IRIS stage (1-4); correlation and agreement between Konelab and SSXp IRIS stages were evaluated. Results Within-run CV (4.23-8.85%), between-run CV (8.95-11.72%), TEobs (22.15-34.92%) and sigma metrics (⩽3) did not meet desired analytical requirements. Correlation between sample types was high (SSXp whole blood vs SSXp plasma; r = 0.89), and between instruments was high (SSXp whole blood vs Konelab serum; r = 0.85) to very high (SSXp plasma vs Konelab serum; r = 0.91). Konelab and SSXp whole blood IRIS scores exhibited high correlation (r = 0.76). Packed cell volume did not significantly affect SSXp determination of creatinine. Bland-Altman difference plots identified a positive bias for the SSXp (7.13 μmol/l for SSXp whole blood; 20.23 μmol/l for SSXp plasma) compared with the Konelab. Outliers (1/48 whole blood; 2/48 plasma) occurred exclusively at very high creatinine concentrations. The SSXp failed to identify 2/21 azotaemic cats. Conclusions and relevance Analytical performance of the SSXp in feline patients is not considered acceptable. The SSXp exhibited a high to very high correlation compared with the reference methodology, but the two instruments cannot be used interchangeably. Improvements in the SSXp analytical performance are needed before its use can be recommended in feline clinical practice.

  10. INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING

    PubMed Central

    Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong

    2017-01-01

    Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected is beyond simulation analysis alone. Simulation models are run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
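
    As one plausible reading of the "extract the most influential parameters from the data, then use them to define simulation scenarios" step, the sketch below ranks synthetic shop-floor parameters by random-forest feature importance; the parameter names, data and selection logic are invented for illustration and are not from the case study.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)
        n = 2000
        # Synthetic "sensor" data: hypothetical shop-floor parameters.
        X = np.column_stack([
            rng.normal(60, 5, n),     # conveyor_speed
            rng.normal(0.4, 0.1, n),  # tool_wear
            rng.normal(22, 2, n),     # ambient_temp
            rng.normal(5, 1, n),      # operator_count
        ])
        names = ["conveyor_speed", "tool_wear", "ambient_temp", "operator_count"]
        # Assumed ground truth: throughput is mainly driven by speed and tool wear.
        y = 2.0 * X[:, 0] - 150 * X[:, 1] + rng.normal(0, 5, n)

        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
        ranked = sorted(zip(names, model.feature_importances_), key=lambda t: -t[1])
        for name, imp in ranked:
            print(f"{name:15s} importance = {imp:.2f}")

        # The top-ranked parameters would then define the factors varied in the
        # simulation scenarios, with the remaining ones held at nominal values.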

  11. Current target acquisition methodology in force on force simulations

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Miller, Brian; Mazz, John P.

    2017-05-01

    The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military community in force on force simulations for training, testing, and analysis. There have been significant improvements to these models over the past few years, most notably the transition to the ACQUIRE TTP-TAS (ACQUIRE Targeting Task Performance Target Angular Size) methodology for all imaging sensors and the development of new discrimination criteria for urban environments and humans. This paper is intended to provide an overview of the current target acquisition modeling approach and provide data for the new discrimination tasks. This paper discusses advances and changes to the models and methodologies used to: (1) design and compare sensors' performance, (2) predict expected target acquisition performance in the field, (3) predict target acquisition performance for combat simulations, and (4) conduct model data validation for combat simulations.

  12. Robust decentralized controller for minimizing coupling effect in single inductor multiple output DC-DC converter operating in continuous conduction mode.

    PubMed

    Medeiros, Renan Landau Paiva de; Barra, Walter; Bessa, Iury Valente de; Chaves Filho, João Edgar; Ayres, Florindo Antonio de Cavalho; Neves, Cleonor Crescêncio das

    2018-02-01

    This paper describes a novel robust decentralized control design methodology for a single inductor multiple output (SIMO) DC-DC converter. Based on a nominal multiple input multiple output (MIMO) plant model and performance requirements, a pairing input-output analysis is performed to select the suitable input to control each output aiming to attenuate the loop coupling. Thus, the plant uncertainty limits are selected and expressed in interval form with parameter values of the plant model. A single inductor dual output (SIDO) DC-DC buck converter board is developed for experimental tests. The experimental results show that the proposed methodology can maintain a desirable performance even in the presence of parametric uncertainties. Furthermore, the performance indexes calculated from experimental data show that the proposed methodology outperforms classical MIMO control techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
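
    The abstract does not say which pairing analysis the authors apply; the relative gain array (RGA) is one common tool for choosing input-output pairs that minimise loop coupling, and the sketch below applies it to an assumed 2x2 steady-state gain matrix (illustrative numbers, not the SIDO board's measured gains).

        import numpy as np

        # Assumed steady-state gain matrix of a 2x2 (SIDO) plant: entry (i, j) is the
        # gain from duty-cycle input j to output voltage i. Values are illustrative only.
        G = np.array([[4.0, 0.8],
                      [0.5, 3.0]])

        # Relative gain array: element-wise product of G with the transpose of its inverse.
        RGA = G * np.linalg.inv(G).T
        print(RGA)

        # Pair each output with the input whose relative gain is closest to 1.
        for i in range(G.shape[0]):
            j = int(np.argmin(np.abs(RGA[i] - 1.0)))
            print(f"output {i} <- input {j} (relative gain {RGA[i, j]:.2f})")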

  13. Biomimetic Dissolution: A Tool to Predict Amorphous Solid Dispersion Performance.

    PubMed

    Puppolo, Michael M; Hughey, Justin R; Dillon, Traciann; Storey, David; Jansen-Varnum, Susan

    2017-11-01

    The presented study describes the development of a membrane permeation non-sink dissolution method that can provide analysis of complete drug speciation and emulate the in vivo performance of poorly water-soluble Biopharmaceutical Classification System class II compounds. The designed membrane permeation methodology permits evaluation of free/dissolved/unbound drug from amorphous solid dispersion formulations with the use of a two-cell apparatus, biorelevant dissolution media, and a biomimetic polymer membrane. It offers insight into oral drug dissolution, permeation, and absorption. Amorphous solid dispersions of felodipine were prepared by hot melt extrusion and spray drying techniques and evaluated for in vitro performance. Prior to ranking performance of extruded and spray-dried felodipine solid dispersions, optimization of the dissolution methodology was performed for parameters such as agitation rate, membrane type, and membrane pore size. The particle size and zeta potential were analyzed during dissolution experiments to understand drug/polymer speciation and supersaturation sustainment of felodipine solid dispersions. Bland-Altman analysis was performed to measure the agreement or equivalence between dissolution profiles acquired using polymer membranes and porcine intestines and to establish the biomimetic nature of the treated polymer membranes. The utility of the membrane permeation dissolution methodology is seen during the evaluation of felodipine solid dispersions produced by spray drying and hot melt extrusion. The membrane permeation dissolution methodology can suggest formulation performance and be employed as a screening tool for selection of candidates to move forward to pharmacokinetic studies. Furthermore, the presented model is a cost-effective technique.

  14. Deep Throttle Turbopump Technology Design Concepts

    NASA Technical Reports Server (NTRS)

    Guinzburg, Adiel; Williams, Morgan; Ferguson, Tom; Garcia, Roberto (Technical Monitor)

    2002-01-01

    The objective of this project is to increase the throttling range of turbopumps from 30 to 120% of the design value, while maintaining high performance levels. Details are given on wide flow range issues, H-Q characteristics, stall characteristics, energy levels, pressure fluctuations at impeller exit, WFR impeller characteristics, commercial diffuser pumps, slotted or tandem vanes, leading edge characteristics, leading edge models, throat models, diffusion passage models, computational fluid dynamics (CFD) methodologies, and CFD flow cases.

  15. A controlled genetic algorithm by fuzzy logic and belief functions for job-shop scheduling.

    PubMed

    Hajri, S; Liouane, N; Hammadi, S; Borne, P

    2000-01-01

    Most scheduling problems are highly complex combinatorial problems. However, stochastic methods such as genetic algorithms yield good solutions. In this paper, we present a controlled genetic algorithm (CGA) based on fuzzy logic and belief functions to solve job-shop scheduling problems. For better performance, we propose an efficient representational scheme, heuristic rules for creating the initial population, and a new methodology for mixing and computing genetic operator probabilities.
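
    The paper's fuzzy-logic/belief-function controller is not reproduced here; the sketch below only illustrates the general idea of a genetic algorithm whose operator probabilities are adjusted during the run, using a toy single-machine tardiness problem and a crude diversity-based rule as stand-ins.

        import random

        # Toy single-machine scheduling: minimise total tardiness over a job sequence.
        # Illustrative stand-in for the job-shop case; the adaptive rule below is a
        # plain heuristic, not the paper's fuzzy/belief-function controller.
        jobs = [(4, 9), (2, 6), (6, 20), (3, 7), (5, 16), (1, 4)]  # (processing, due)

        def tardiness(seq):
            t = total = 0
            for j in seq:
                p, d = jobs[j]
                t += p
                total += max(0, t - d)
            return total

        def order_crossover(a, b):
            i, j = sorted(random.sample(range(len(a)), 2))
            child = [None] * len(a)
            child[i:j] = a[i:j]
            fill = [g for g in b if g not in child]
            k = 0
            for idx in range(len(a)):
                if child[idx] is None:
                    child[idx] = fill[k]
                    k += 1
            return child

        random.seed(0)
        pop = [random.sample(range(len(jobs)), len(jobs)) for _ in range(30)]
        p_mut = 0.2
        for gen in range(100):
            pop.sort(key=tardiness)
            # Crude "control" rule: raise mutation when the population has converged.
            diversity = len({tuple(ind) for ind in pop}) / len(pop)
            p_mut = 0.4 if diversity < 0.3 else 0.2
            survivors = pop[:10]
            children = []
            while len(children) < len(pop) - len(survivors):
                a, b = random.sample(survivors, 2)
                c = order_crossover(a, b)
                if random.random() < p_mut:
                    i, j = random.sample(range(len(c)), 2)
                    c[i], c[j] = c[j], c[i]
                children.append(c)
            pop = survivors + children
        best = min(pop, key=tardiness)
        print("best sequence:", best, "total tardiness:", tardiness(best))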

  16. Evaluation methodologies for an advanced information processing system

    NASA Technical Reports Server (NTRS)

    Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.

    1984-01-01

    The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
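
    A minimal sketch of the evaluation flow described above, under assumed failure/repair rates and performance levels: a small continuous-time Markov reliability model yields state probabilities, and weighting each state by its delivered performance gives a simple performability figure.

        import numpy as np
        from scipy.linalg import expm

        # Assumed 3-state CTMC for a duplex processing site:
        # state 0 = both channels up, 1 = one channel up (degraded), 2 = system failed.
        lam, mu = 1e-4, 1e-2     # failure and repair rates per hour (illustrative)
        Q = np.array([
            [-2 * lam,      2 * lam,  0.0],
            [      mu, -(mu + lam),   lam],
            [     0.0,          0.0,  0.0],   # failed state is absorbing here
        ])

        p0 = np.array([1.0, 0.0, 0.0])
        t = 1000.0                               # mission time, hours
        p_t = p0 @ expm(Q * t)                   # state probabilities at time t

        # Performance level delivered in each state (fraction of full throughput).
        perf = np.array([1.0, 0.5, 0.0])
        print("state probabilities   :", np.round(p_t, 6))
        print("reliability R(t)      :", p_t[0] + p_t[1])
        print("performability E[perf]:", p_t @ perf)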

  17. A methodology for reduced order modeling and calibration of the upper atmosphere

    NASA Astrophysics Data System (ADS)

    Mehta, Piyush M.; Linares, Richard

    2017-10-01

    Atmospheric drag is the largest source of uncertainty in accurately predicting the orbit of satellites in low Earth orbit (LEO). Accurately predicting drag for objects that traverse LEO is critical to space situational awareness. Atmospheric models used for orbital drag calculations can be characterized either as empirical or physics-based (first principles based). Empirical models are fast to evaluate but offer limited real-time predictive/forecasting ability, while physics based models offer greater predictive/forecasting ability but require dedicated parallel computational resources. Also, calibration with accurate data is required for either type of models. This paper presents a new methodology based on proper orthogonal decomposition toward development of a quasi-physical, predictive, reduced order model that combines the speed of empirical and the predictive/forecasting capabilities of physics-based models. The methodology is developed to reduce the high dimensionality of physics-based models while maintaining its capabilities. We develop the methodology using the Naval Research Lab's Mass Spectrometer Incoherent Scatter model and show that the diurnal and seasonal variations can be captured using a small number of modes and parameters. We also present calibration of the reduced order model using the CHAMP and GRACE accelerometer-derived densities. Results show that the method performs well for modeling and calibration of the upper atmosphere.
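
    A minimal sketch of the proper orthogonal decomposition step on synthetic snapshot data: the snapshot matrix is mean-centred, decomposed with an SVD, and a handful of leading modes is retained as the reduced-order basis. The synthetic fields and mode count are assumptions, not output of the NRLMSIS model or the CHAMP/GRACE calibration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic "density" snapshots standing in for model output on a grid:
        # rows are grid points, columns are snapshots (e.g., different times).
        n_grid, n_snap = 500, 200
        t = np.linspace(0, 2 * np.pi, n_snap)
        x = np.linspace(0, 1, n_grid)[:, None]
        snapshots = (np.sin(2 * np.pi * x) * np.cos(t)            # diurnal-like mode
                     + 0.3 * np.cos(4 * np.pi * x) * np.sin(2 * t) # weaker mode
                     + 0.01 * rng.standard_normal((n_grid, n_snap)))

        mean = snapshots.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

        r = 2                                   # number of retained POD modes
        energy = np.cumsum(s**2) / np.sum(s**2)
        print(f"{r} modes capture {100 * energy[r - 1]:.2f}% of the variance")

        # Reduced-order reconstruction: mean + modes * modal coefficients.
        coeffs = U[:, :r].T @ (snapshots - mean)
        recon = mean + U[:, :r] @ coeffs
        print("max reconstruction error:", np.max(np.abs(recon - snapshots)))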

  18. Benefit-cost methodology study with example application of the use of wind generators

    NASA Technical Reports Server (NTRS)

    Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.

    1975-01-01

    An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
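
    Using the report's definition of plant factor (average output power over rated power), the sketch below estimates it from an assumed Rayleigh wind-speed distribution and a simplified cut-in/rated/cut-out power curve; the site mean speed and turbine parameters are illustrative assumptions, not values from the study.

        import numpy as np

        # Plant factor = average output power / rated power, estimated from an
        # assumed Rayleigh wind-speed distribution and a simplified power curve.
        rng = np.random.default_rng(0)
        mean_speed = 6.5                       # m/s, assumed site average
        v = rng.rayleigh(scale=mean_speed * np.sqrt(2 / np.pi), size=100_000)

        V_CUT_IN, V_RATED, V_CUT_OUT = 3.0, 12.0, 25.0
        P_RATED = 100.0                        # kW, illustrative

        def power(v):
            p = np.zeros_like(v)
            ramp = (v >= V_CUT_IN) & (v < V_RATED)
            p[ramp] = P_RATED * (v[ramp]**3 - V_CUT_IN**3) / (V_RATED**3 - V_CUT_IN**3)
            p[(v >= V_RATED) & (v < V_CUT_OUT)] = P_RATED
            return p

        plant_factor = power(v).mean() / P_RATED
        print(f"estimated plant factor: {plant_factor:.2f}")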

  19. Rational Variety Mapping for Contrast-Enhanced Nonlinear Unsupervised Segmentation of Multispectral Images of Unstained Specimen

    PubMed Central

    Kopriva, Ivica; Hadžija, Mirko; Popović Hadžija, Marijana; Korolija, Marina; Cichocki, Andrzej

    2011-01-01

    A methodology is proposed for nonlinear contrast-enhanced unsupervised segmentation of multispectral (color) microscopy images of principally unstained specimens. The methodology exploits spectral diversity and spatial sparseness to find anatomical differences between materials (cells, nuclei, and background) present in the image. It consists of rth-order rational variety mapping (RVM) followed by matrix/tensor factorization. Sparseness constraint implies duality between nonlinear unsupervised segmentation and multiclass pattern assignment problems. Classes not linearly separable in the original input space become separable with high probability in the higher-dimensional mapped space. Hence, RVM mapping has two advantages: it takes implicitly into account nonlinearities present in the image (ie, they are not required to be known) and it increases spectral diversity (ie, contrast) between materials, due to increased dimensionality of the mapped space. This is expected to improve performance of systems for automated classification and analysis of microscopic histopathological images. The methodology was validated using RVM of the second and third orders of the experimental multispectral microscopy images of unstained sciatic nerve fibers (nervus ischiadicus) and of unstained white pulp in the spleen tissue, compared with a manually defined ground truth labeled by two trained pathophysiologists. The methodology can also be useful for additional contrast enhancement of images of stained specimens. PMID:21708116
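
    A sketch of the two steps named above on synthetic data: pixel spectra are passed through a second-order monomial (rational variety) expansion, and the mapped data are then factorized, here with nonnegative matrix factorization as a stand-in for the authors' matrix/tensor factorization; the spectra and mixing model are invented for illustration.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)

        # Synthetic 3-channel "multispectral" pixels: two materials plus background,
        # mixed with a mild nonlinearity, standing in for an unstained-specimen image.
        n_pixels = 4000
        abund = rng.dirichlet([1.0, 1.0, 1.0], size=n_pixels)     # per-pixel mixing
        spectra = np.array([[0.9, 0.2, 0.1],
                            [0.2, 0.8, 0.3],
                            [0.1, 0.1, 0.9]])
        X = abund @ spectra
        X = X + 0.2 * X**2 + 0.01 * rng.random((n_pixels, 3))      # nonlinearity + noise

        # Second-order rational-variety (monomial) mapping: raises dimensionality so
        # that nonlinearly mixed classes become (approximately) linearly separable.
        X_mapped = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)

        # Factorize the mapped data; each pixel's dominant component is its segment.
        W = NMF(n_components=3, init="nndsvda", max_iter=500,
                random_state=0).fit_transform(X_mapped)
        labels = W.argmax(axis=1)
        print("pixels per segment:", np.bincount(labels))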

  20. Development of combinatorial chemistry methods for coatings: high-throughput adhesion evaluation and scale-up of combinatorial leads.

    PubMed

    Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia

    2003-01-01

    Coupling of combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for evaluation of adhesion of 8 x 6 arrays of coating elements that are discretely deposited on a single 9 x 12 cm plastic substrate. Performance of coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities of high speed and reproducibility of testing by using a robotic automation, an expanded range of types of tested coatings by using the coating tagging strategy, and an improved quantitation by using high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Using our HT methodology, we have developed several coatings leads. These HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing. These validation results have confirmed the superb performance of combinatorially developed coatings over conventional coatings on the traditional scale.

  1. An integrated framework for high level design of high performance signal processing circuits on FPGAs

    NASA Astrophysics Data System (ADS)

    Benkrid, K.; Belkacemi, S.; Sukhsawas, S.

    2005-06-01

    This paper proposes an integrated framework for the high level design of high performance signal processing algorithms' implementations on FPGAs. The framework emerged from a constant need to rapidly implement increasingly complicated algorithms on FPGAs while maintaining the high performance needed in many real time digital signal processing applications. This is particularly important for application developers who often rely on iterative and interactive development methodologies. The central idea behind the proposed framework is to dynamically integrate high performance structural hardware description languages with higher level hardware languages in order to help satisfy the dual requirement of high level design and high performance implementation. The paper illustrates this by integrating two environments: Celoxica's Handel-C language, and HIDE, a structural hardware environment developed at the Queen's University of Belfast. On the one hand, Handel-C has been proven to be very useful in the rapid design and prototyping of FPGA circuits, especially control intensive ones. On the other hand, HIDE has been used extensively, and successfully, in the generation of highly optimised parameterisable FPGA cores. In this paper, this is illustrated in the construction of a scalable and fully parameterisable core for image algebra's five core neighbourhood operations, where fully floorplanned efficient FPGA configurations, in the form of EDIF netlists, are generated automatically for instances of the core. In the proposed combined framework, highly optimised data paths are invoked dynamically from within Handel-C and are synthesized using HIDE. Although the idea might seem simple prima facie, it could have serious implications for the design of future generations of hardware description languages.

  2. Protocol for Reliability Assessment of Structural Health Monitoring Systems Incorporating Model-assisted Probability of Detection (MAPOD) Approach

    DTIC Science & Technology

    2011-09-01

    a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, and a modeling and experimental methodology for efficiently addressing a wide range...

  3. Biochemical Assays of Cultured Cells

    NASA Technical Reports Server (NTRS)

    Barlow, G. H.

    1985-01-01

    Subpopulations of human embryonic kidney cells isolated from continuous flow electrophoresis experiments performed at McDonnell Douglas and on STS-8 have been analyzed. These analyses have included plasminogen activator assays using an indirect methodology on fibrin plates and a direct methodology using chromogenic substrates. Immunological studies were performed, and the conditioned media were analyzed for erythropoietin activity and human granulocyte colony stimulating factor (HGCSF) activity.

  4. Rethinking Fragile Landscapes during the Greek Crisis: Precarious Aesthetics and Methodologies in Athenian Dance Performances

    ERIC Educational Resources Information Center

    Zervou, Natalie

    2017-01-01

    The financial crisis in Greece brought about significant changes in the sociopolitical and financial landscape of the country. Severe budget cuts imposed on the arts and performing practices have given rise to a new aesthetic which has impacted the themes and methodologies of contemporary productions. To unpack this aesthetic, I explore the ways…

  5. The Backyard Human Performance Technologist: Applying the Development Research Methodology to Develop and Validate a New Instructional Design Framework

    ERIC Educational Resources Information Center

    Brock, Timothy R.

    2009-01-01

    Development research methodology (DRM) has been recommended as a viable research approach to expand the practice-to-theory/theory-to-practice literature that human performance technology (HPT) practitioners can integrate into the day-to-day work flow they already use to develop instructional products. However, little has been written about how it…

  6. Survey of Header Compression Techniques

    NASA Technical Reports Server (NTRS)

    Ishac, Joseph

    2001-01-01

    This report provides a summary of several different header compression techniques. The different techniques included are: (1) Van Jacobson's header compression (RFC 1144); (2) SCPS (Space Communications Protocol Standards) header compression (SCPS-TP, SCPS-NP); (3) Robust header compression (ROHC); and (4) the header compression techniques in RFC2507 and RFC2508. The methodology for compression and error correction for these schemes is described in the remainder of this document. All of the header compression schemes support compression over simplex links, provided that the end receiver has some means of sending data back to the sender. However, if that return path does not exist, then neither Van Jacobson's nor SCPS can be used, since both rely on TCP (Transmission Control Protocol). In addition, under link conditions of low delay and low error, all of the schemes perform as expected. However, based on the methodology of the schemes, each scheme is likely to behave differently as conditions degrade. Van Jacobson's header compression relies heavily on the TCP retransmission timer and would suffer an increase in loss propagation should the link possess a high delay and/or bit error rate (BER). The SCPS header compression scheme protects against high delay environments by avoiding delta encoding between packets. Thus, loss propagation is avoided. However, SCPS is still affected by an increased BER, since the lack of delta encoding results in larger header sizes. Next, the schemes found in RFC2507 and RFC2508 perform well for non-TCP connections in poor conditions. RFC2507 performance with TCP connections is improved by various techniques over Van Jacobson's, but still suffers a performance hit with poor link properties. Also, RFC2507 offers the ability to send TCP data without delta encoding, similar to what SCPS offers. ROHC is similar to the previous two schemes, but adds additional CRCs (cyclic redundancy checks) into headers and improves the compression schemes, which provides better tolerance of conditions with a high BER.
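
    A toy sketch of the delta-encoding idea that the schemes above either rely on (Van Jacobson) or deliberately avoid (SCPS): only header fields that changed since the previous packet are sent, so correct decompression depends on shared context, and a single lost delta desynchronizes the receiver. The field set is illustrative, not any RFC's wire format.

        # Toy sketch of delta-style header compression: only fields that changed since
        # the previous header are transmitted. This shows the context dependence that
        # causes loss propagation; it is not any specific RFC's encoding.

        FIELDS = ["src_port", "dst_port", "seq", "ack", "window"]

        def compress(prev, cur):
            # Send only the fields that differ from the stored context.
            return {f: cur[f] for f in FIELDS if prev is None or cur[f] != prev[f]}

        def decompress(context, delta):
            full = dict(context or {})
            full.update(delta)
            return full

        headers = [
            {"src_port": 4000, "dst_port": 80, "seq": 100,  "ack": 1, "window": 8760},
            {"src_port": 4000, "dst_port": 80, "seq": 1548, "ack": 1, "window": 8760},
            {"src_port": 4000, "dst_port": 80, "seq": 2996, "ack": 1, "window": 8760},
        ]

        ctx_tx = ctx_rx = None
        for h in headers:
            delta = compress(ctx_tx, h)
            print(f"sent {len(delta)}/{len(FIELDS)} fields: {delta}")
            ctx_rx = decompress(ctx_rx, delta)
            assert ctx_rx == h          # holds only while no deltas are lost
            ctx_tx = h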

  7. Transforming State-of-the-Art into Best Practice: A Guide for High-Performance Energy Efficient Buildings in India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Reshma; Ravache, Baptiste; Sartor, Dale

    India launched the Energy Conservation Building Code (ECBC) in 2007, and a revised version in 2017, as ambitious first steps towards promoting energy efficiency in the building sector. Pioneering early adopters—building owners, A&E firms, and energy consultants—have taken the lead to design customized solutions for their energy-efficient buildings. This Guide offers a synthesizing framework, critical lessons, and guidance to meet and exceed ECBC. Its whole-building lifecycle assurance framework provides a user-friendly methodology to achieve high performance in terms of energy, environmental, and societal impact. Class A offices are selected as a target typology, being a high-growth sector with significant opportunities for energy savings. The practices may be extrapolated to other commercial building sectors, as well as extended to other regions with similar cultural, climatic, construction, and developmental contexts.

  8. Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation

    NASA Astrophysics Data System (ADS)

    Potyrailo, Radislav A.; Mirsky, Vladimir M.

    New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical requirements in sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is the important cornerstone in the effort to develop new sensors. Often, sensing materials are too complex to predict their performance quantitatively in the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate new required data to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.

  9. Coating and Patterning Functional Materials for Large Area Electrofluidic Arrays

    PubMed Central

    Wu, Hao; Tang, Biao; Hayes, Robert A.; Dou, Yingying; Guo, Yuanyuan; Jiang, Hongwei; Zhou, Guofu

    2016-01-01

    Industrialization of electrofluidic devices requires both high performance coating laminates and efficient material utilization on large area substrates. Here we show that screen printing can be effectively used to provide homogeneous pin-hole free patterned amorphous fluoropolymer dielectric layers to provide both the insulating and fluidic reversibility required for devices. Subsequently, we over-coat photoresist using slit coating on this normally extremely hydrophobic layer. In this way, we are able to pattern the photoresist by conventional lithography to provide the chemical contrast required for liquids dosing by self-assembly and highly-reversible electrofluidic switching. Materials, interfacial chemistry, and processing all contribute to the provision of the required engineered substrate properties. Coating homogeneity as characterized by metrology and device performance data are used to validate the methodology, which is well-suited for transfer to high volume production in existing LCD cell-making facilities. PMID:28773826

  10. Coating and Patterning Functional Materials for Large Area Electrofluidic Arrays.

    PubMed

    Wu, Hao; Tang, Biao; Hayes, Robert A; Dou, Yingying; Guo, Yuanyuan; Jiang, Hongwei; Zhou, Guofu

    2016-08-19

    Industrialization of electrofluidic devices requires both high performance coating laminates and efficient material utilization on large area substrates. Here we show that screen printing can be effectively used to provide homogeneous pin-hole free patterned amorphous fluoropolymer dielectric layers to provide both the insulating and fluidic reversibility required for devices. Subsequently, we over-coat photoresist using slit coating on this normally extremely hydrophobic layer. In this way, we are able to pattern the photoresist by conventional lithography to provide the chemical contrast required for liquids dosing by self-assembly and highly-reversible electrofluidic switching. Materials, interfacial chemistry, and processing all contribute to the provision of the required engineered substrate properties. Coating homogeneity as characterized by metrology and device performance data are used to validate the methodology, which is well-suited for transfer to high volume production in existing LCD cell-making facilities.

  11. User-Defined Data Distributions in High-Level Programming Languages

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Zima, Hans P.

    2006-01-01

    One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
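
    As a rough illustration of what a data distribution specifies (not Chapel syntax or the Chapel API), the sketch below maps a global array index to its owning locale and local index under a simple block distribution; the function name and layout rule are assumptions for illustration only.

        # Sketch of the idea behind a user-defined data distribution: a rule mapping a
        # global index to (owning locale, local index). A plain block distribution is
        # shown; Chapel expresses this through distribution classes, not this Python API.

        def block_owner(i, n, num_locales):
            """Owner and local index of global element i in a block distribution."""
            base, extra = divmod(n, num_locales)
            # The first `extra` locales hold one extra element each.
            boundary = extra * (base + 1)
            if i < boundary:
                return i // (base + 1), i % (base + 1)
            off = i - boundary
            return extra + off // base, off % base

        n, num_locales = 10, 3
        for i in range(n):
            loc, local_i = block_owner(i, n, num_locales)
            print(f"global {i} -> locale {loc}, local {local_i}")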

  12. The effects of group supervision of nurses: a systematic literature review.

    PubMed

    Francke, Anneke L; de Graaff, Fuusje M

    2012-09-01

    To gain insight into the existing scientific evidence on the effects of group supervision for nurses. A systematic literature study of original research publications. Searches were performed in February 2010 in PubMed, CINAHL, Cochrane Library, Embase, ERIC, the NIVEL catalogue, and PsycINFO. No limitations were applied regarding date of publication, language or country. Original research publications were eligible for review when they described group supervision programmes directed at nurses; used a control group or a pre-test post-test design; and gave information about the effects of group supervision on nurse or patient outcomes. The two review authors independently assessed studies for inclusion. The methodological quality of included studies was also independently assessed by the review authors, using a check list developed by Van Tulder et al. in collaboration with the Dutch Cochrane Centre. Data related to the original publications were extracted by one review author and checked by a second review author. No statistical pooling of outcomes was performed, because there was large heterogeneity of outcomes. A total of 1087 potentially relevant references were found. After screening of the references, eight studies with a control group and nine with a pre-test post-test design were included. Most of the 17 studies included have serious methodological limitations, but four Swedish publications in the field of dementia care had high methodological quality and all point to positive effects on nurses' attitudes and skills and/or nurse-patient interactions. However, in interpreting these positive results, it must be taken into account that these four high-quality publications concern sub-studies of one 'sliced' research project using the same study sample. Moreover, these four publications combined a group supervision intervention with the introduction of individual care planning, which also hampers conclusions about the effectiveness of group supervision alone. Although there are rather a lot of indications that group supervision of nurses is effective, evidence on the effects is still scarce. Further methodologically sound research is needed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. ACCF/AHA methodology for the development of quality measures for cardiovascular technology: a report of the American College of Cardiology Foundation/American Heart Association Task Force on Performance Measures.

    PubMed

    Bonow, Robert O; Douglas, Pamela S; Buxton, Alfred E; Cohen, David J; Curtis, Jeptha P; Delong, Elizabeth; Drozda, Joseph P; Ferguson, T Bruce; Heidenreich, Paul A; Hendel, Robert C; Masoudi, Frederick A; Peterson, Eric D; Taylor, Allen J

    2011-09-27

    Consistent with the growing national focus on healthcare quality, the American College of Cardiology Foundation (ACCF) and the American Heart Association (AHA) have taken a leadership role over the past decade in developing measures of the quality of cardiovascular care by convening a joint ACCF/AHA Task Force on Performance Measures. The Task Force is charged with identifying the clinical topics appropriate for the development of performance measures and with assembling writing committees composed of clinical and methodological experts in collaboration with appropriate subspecialty societies. The Task Force has also created methodology documents that offer guidance in the development of process, outcome, composite, and efficiency measures. Cardiovascular performance measures using existing ACCF/AHA methodology are based on Class I or Class III guideline recommendations, usually with Level A evidence. These performance measures, based on evidence-based ACCF/AHA guidelines, remain the most rigorous quality measures for both internal quality improvement and public reporting. However, many of the tools for diagnosis and treatment of cardiovascular disease involve advanced technologies, such as cardiac imaging, for which there are often no underlying guideline documents. Because these technologies affect the quality of cardiovascular care and also have the potential to contribute to cardiovascular health expenditures, there is a need for more critical assessment of the use of technology, including the development of quality and performance measures in areas in which guideline recommendations are absent. The evaluation of quality in the use of cardiovascular technologies requires consideration of multiple parameters that differ from other healthcare processes. The present document describes methodology for development of 2 new classes of quality measures in these situations, appropriate use measures and structure/safety measures. Appropriate use measures are based on specific indications, processes, or parameters of care for which high-level evidence and Class I or Class III guideline recommendations may be lacking but which are addressed in ACCF appropriate use criteria documents. Structure/safety measures represent measures developed to address structural aspects of the use of healthcare technology (e.g., laboratory accreditation, personnel training, and credentialing) or quality issues related to patient safety when there are neither guideline recommendations nor appropriate use criteria. Although the strength of evidence for appropriate use measures and structure/safety measures may not be as strong as that for formal performance measures, they are quality measures that are otherwise rigorously developed, reviewed, tested, and approved in the same manner as ACCF/AHA performance measures. The ultimate goal of the present document is to provide direction in defining and measuring appropriate use (avoiding not only underuse but also overuse and misuse) and the proper application of cardiovascular technology, and to describe how such appropriate use measures and structure/safety measures might be developed for the purposes of quality improvement and public reporting. It is anticipated that this effort will help focus the national dialogue on the use of cardiovascular technology away from the current concerns about volume and cost alone and toward a more holistic emphasis on value.

  14. Performance evaluation of contrast-detail in full field digital mammography systems using ideal (Hotelling) observer vs. conventional automated analysis of CDMAM images for quality control of contrast-detail characteristics.

    PubMed

    Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia

    2015-11-01

    The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signal originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values compared with the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of the FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
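
    For a known signal in Gaussian noise, the ideal (Hotelling) observer gives SNR^2 = ds^T K^(-1) ds, with ds the mean signal difference and K the noise covariance; the sketch below evaluates this for a small synthetic disc and scans for the thickness at which the SNR crosses an assumed detectability criterion. The contrast model, noise covariance and criterion are illustrative assumptions, not data from the five FFDM units.

        import numpy as np

        # Hotelling (prewhitening ideal) observer: SNR^2 = ds^T K^-1 ds, where ds is
        # the mean signal difference and K the noise covariance. Signal and noise here
        # are synthetic; they are not measurements from any FFDM unit in the study.
        n = 8 * 8                                   # small ROI, flattened
        yy, xx = np.mgrid[0:8, 0:8]
        disc = (((xx - 3.5)**2 + (yy - 3.5)**2) <= 4).astype(float).ravel()

        def snr_for_thickness(thickness_um, noise_std=1.0):
            ds = thickness_um * 1.5 * disc          # assumed linear contrast model (a.u.)
            K = (noise_std**2) * np.eye(n)          # white-noise covariance for simplicity
            return float(np.sqrt(ds @ np.linalg.solve(K, ds)))

        # "Threshold gold thickness": smallest thickness whose SNR exceeds a criterion.
        criterion = 5.0                              # assumed detectability criterion
        for t in np.arange(0.1, 3.0, 0.1):
            if snr_for_thickness(t) >= criterion:
                print(f"threshold thickness ~ {t:.1f} um (SNR {snr_for_thickness(t):.1f})")
                break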

  15. Improving Face Verification in Photo Albums by Combining Facial Recognition and Metadata With Cross-Matching

    DTIC Science & Technology

    2017-12-01

    satisfactory performance. We do not use statistical models, and we do not create patterns that require supervised learning. Our methodology is intended for use in personal digital image...

  16. ICT Expenditures and Education Outputs/Outcomes in Selected Developed Countries: An Assessment of Relative Efficiency

    ERIC Educational Resources Information Center

    Aristovnik, Aleksander

    2013-01-01

    Purpose: The aim of the paper is to review some previous researches examining ICT efficiency and the impact of ICT on educational output/outcome as well as different conceptual and methodological issues related to performance measurement. Design/methodology/approach: This paper adopts a non-parametric methodology, i.e. data envelopment analysis…

  17. NEW METHODOLOGY FOR DEVELOPMENT OF ORODISPERSIBLE TABLETS USING HIGH-SHEAR GRANULATION PROCESS.

    PubMed

    Ali, Bahaa E; Al-Shedfat, Ramadan I; Fayed, Mohamed H; Alanazi, Fars K

    2017-05-01

    Development of an orodispersible delivery system with high mechanical strength and a low disintegration time is a significant challenge. The aim of the current work was to assess and optimize the high-shear granulation process as a new methodology for the development of orodispersible tablets with high quality attributes, using a design-of-experiments approach. A two-factor, three-level (3²) full factorial design was carried out to investigate the main and interaction effects of the independent variables, water amount (X1) and granulation time (X2), on the characteristics of the granules and of the final product, the tablet. The produced granules were analyzed for their granule size, density and flowability. Furthermore, the produced tablets were tested for weight variation, breaking force/crushing strength, friability, disintegration time and drug dissolution. Regression analysis of the multiple linear models showed close agreement between the adjusted R-squared and predicted R-squared for all granule and tablet characteristics, with differences of less than 0.2. All dependent responses of the granules and tablets were found to be impacted significantly (p < 0.05) by the two independent variables. However, water amount demonstrated the most dominant effect on all granule and tablet characteristics, as shown by its higher coefficient estimates for all selected responses. Numerical optimization using a desirability function was performed to optimize the variables under study so as to provide an orodispersible system within the USP limits with respect to mechanical properties and disintegration time. It was found that the highest desirability (0.915) could be attained at the low level of water (180 g) and a short granulation time (1.65 min). Ultimately, this study provides the formulator with helpful information for selecting the proper level of water and granulation time to provide an orodispersible system of high crushing strength and very low disintegration time when high-shear granulation is used as the method of manufacture.
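
    A compact sketch of the workflow described above: build the nine-run 3² design in coded units, fit main-effect-plus-interaction models by least squares, and score the runs with a two-response desirability function. The response values and desirability targets are invented for illustration; they are not the study's measurements.

        import itertools
        import numpy as np

        # 3^2 full factorial: water amount (X1) and granulation time (X2), coded -1/0/+1.
        levels = [-1, 0, 1]
        design = np.array(list(itertools.product(levels, levels)))      # 9 runs

        # Assumed (invented) responses for illustration only:
        crushing_strength = np.array([52, 55, 70, 50, 68, 88, 58, 80, 105])  # N
        disintegration    = np.array([10, 18, 25, 15, 24, 34, 20, 30, 45])   # s

        def fit(y):
            X1, X2 = design[:, 0], design[:, 1]
            A = np.column_stack([np.ones(9), X1, X2, X1 * X2])  # intercept, mains, interaction
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef

        print("strength model      :", np.round(fit(crushing_strength), 1))
        print("disintegration model:", np.round(fit(disintegration), 1))

        def desirability(strength, disint):
            # Larger-is-better for strength (target 100 N), smaller-is-better for
            # disintegration (limit 30 s); overall score is the geometric mean.
            d1 = np.clip(strength / 100.0, 0, 1)
            d2 = np.clip((30.0 - disint) / 30.0, 0, 1)
            return np.sqrt(d1 * d2)

        scores = desirability(crushing_strength, disintegration)
        best = int(np.argmax(scores))
        print("most desirable run (coded X1, X2):", design[best],
              "score:", round(float(scores[best]), 2))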

  18. Laser-assisted patch clamping: a methodology

    NASA Technical Reports Server (NTRS)

    Henriksen, G. H.; Assmann, S. M.; Evans, M. L. (Principal Investigator)

    1997-01-01

    Laser microsurgery can be used to perform both cell biological manipulations, such as targeted cell ablation, and molecular genetic manipulations, such as genetic transformation and chromosome dissection. In this report, we describe a laser microsurgical method that can be used either to ablate single cells or to ablate a small area (1-3 microns diameter) of the extracellular matrix. In plants and microorganisms, the extracellular matrix consists of the cell wall. While conventional patch clamping of these cells, as well as of many animal cells, requires enzymatic digestion of the extracellular matrix, we illustrate that laser microsurgery of a portion of the wall enables patch clamp access to the plasma membrane of higher plant cells remaining situated in their tissue environment. What follows is a detailed description of the construction and use of an economical laser microsurgery system, including procedures for single cell and targeted cell wall ablation. This methodology will be of interest to scientists wishing to perform cellular or subcellular ablation with a high degree of accuracy, or wishing to study how the extracellular matrix affects ion channel function.

  19. Development of garlic bioactive compounds analytical methodology based on liquid phase microextraction using response surface design. Implications for dual analysis: Cooked and biological fluids samples.

    PubMed

    Ramirez, Daniela Andrea; Locatelli, Daniela Ana; Torres-Palazzolo, Carolina Andrea; Altamirano, Jorgelina Cecilia; Camargo, Alejandra Beatriz

    2017-01-15

    Organosulphur compounds (OSCs) present in garlic (Allium sativum L.) are responsible for several biological properties. Functional food research indicates the importance of quantifying these compounds in food matrices and biological fluids. For this purpose, this paper introduces a novel methodology based on dispersive liquid-liquid microextraction (DLLME) coupled to high performance liquid chromatography with an ultraviolet detector (HPLC-UV) for the extraction and determination of organosulphur compounds in different matrices. The target analytes were allicin, (E)- and (Z)-ajoene, 2-vinyl-4H-1,2-dithiin (2-VD), diallyl sulphide (DAS) and diallyl disulphide (DADS). The microextraction technique was optimized using an experimental design, and the analytical performance was evaluated under optimum conditions. The desirability function presented an optimal value for 600 μL of chloroform as the extraction solvent, using acetonitrile as the dispersant. The method proved to be reliable, precise and accurate. It was successfully applied to determine OSCs in cooked garlic samples as well as blood plasma and digestive fluids. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. A Statistical Evaluation of the Diagnostic Performance of MEDAS-The Medical Emergency Decision Assistance System

    PubMed Central

    Georgakis, D. Christine; Trace, David A.; Naeymi-Rad, Frank; Evens, Martha

    1990-01-01

    Medical expert systems require comprehensive evaluation of their diagnostic accuracy. The usefulness of these systems is limited without established evaluation methods. We propose a new methodology for evaluating the diagnostic accuracy and the predictive capacity of a medical expert system. We have adapted to the medical domain measures that have been used in the social sciences to examine the performance of human experts in the decision making process. Thus, in addition to the standard summary measures, we use measures of agreement and disagreement, and Goodman and Kruskal's λ and τ measures of predictive association. This methodology is illustrated by a detailed retrospective evaluation of the diagnostic accuracy of the MEDAS system. In a study using 270 patients admitted to the North Chicago Veterans Administration Hospital, diagnoses produced by MEDAS are compared with the discharge diagnoses of the attending physicians. The results of the analysis confirm the high diagnostic accuracy and predictive capacity of the MEDAS system. Overall, the agreement of the MEDAS system with the “gold standard” diagnosis of the attending physician has reached a 90% level.
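
    Goodman and Kruskal's lambda measures the proportional reduction in error when the system's diagnosis is used to predict the gold-standard diagnosis; the sketch below computes it from a small invented confusion matrix (illustrative numbers, not the MEDAS study data).

        import numpy as np

        # Goodman-Kruskal lambda: proportional reduction in error when predicting the
        # gold-standard diagnosis (columns) from the system's diagnosis (rows).
        # The confusion matrix below is invented for illustration only.
        table = np.array([
            [40,  3,  2],
            [ 4, 35,  1],
            [ 2,  2, 31],
        ])

        N = table.sum()
        col_totals = table.sum(axis=0)
        errors_without = N - col_totals.max()         # always guess the modal diagnosis
        errors_with = N - table.max(axis=1).sum()     # guess the row-wise modal diagnosis
        lam = (errors_without - errors_with) / errors_without
        print(f"lambda = {lam:.2f}")                  # 1.0 = perfect prediction, 0.0 = no help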
