Sample records for optimization reliability analysis

  1. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measure of the safety probability, which can be obtained by approximate probability integration methods such as the first-order reliability method (FORM) or the second-order reliability method (SORM). The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
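
    A minimal FORM/MPP sketch of the search step this abstract describes, under assumptions of my own (a toy linear limit state already in standard normal u-space; scipy's SLSQP as the optimization search algorithm): the MPP is the point on g(u) = 0 nearest the origin, the distance is the reliability index beta, and FORM approximates the failure probability as Phi(-beta).

    ```python
    # Hypothetical toy example, not the paper's formulation.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    g = lambda u: 3.0 - u[0] - 0.5 * u[1]        # assumed limit state in u-space

    # MPP search: minimize ||u||^2 subject to g(u) = 0 (SLSQP handles the equality)
    res = minimize(lambda u: u @ u, x0=np.array([0.1, 0.1]),
                   constraints={'type': 'eq', 'fun': g})
    beta = np.sqrt(res.fun)                      # reliability index = distance to MPP
    print(f"MPP = {res.x}, beta = {beta:.3f}, Pf (FORM) = {norm.cdf(-beta):.4e}")
    ```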

  2. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with average diameter and average length of 15 inches. Structural reliability is based on axial buckling strength of the cylinder. Both Monte Carlo simulation and First Order Reliability Method are considered for reliability analysis with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure and optimization results are included in this report.

  3. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis, focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data gives a better perspective on the results than a single deterministic simulation. The next part of the research is to implement the probabilistic material properties in engineering design. The main aim of structural design is to obtain optimal solutions; however, even though a deterministic optimization problem may yield a cost-effective structure, the design becomes highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability analysis along with deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, including first-order and second-order reliability methods, followed by simulation techniques used to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed, with a new reliability analysis formulation and sensitivity analysis used to remove highly reliable constraints from the RBDO, thereby reducing computational time and function evaluations. Finally, implementations of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem are presented and discussed.
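
    As a hedged illustration of the Monte Carlo step described above (toy limit state and invented statistics, not the Kevlar® 49 / LS-DYNA model): sample the random material property per trial, evaluate the response, and take the failing fraction as the failure probability estimate.

    ```python
    # Synthetic sketch: all distributions are assumed for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    strength = rng.normal(3.6, 0.4, n)   # assumed random material property
    demand = rng.normal(2.8, 0.3, n)     # assumed load effect per simulation

    pf = np.mean(strength < demand)      # fraction of failed trials
    print(f"estimated Pf = {pf:.4f} +/- {np.sqrt(pf * (1 - pf) / n):.4f}")
    ```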

  4. A single-loop optimization method for reliability analysis with second order uncertainty

    NASA Astrophysics Data System (ADS)

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping

    2015-08-01

    Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush-Kuhn-Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
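
    To make the cost of the two loops concrete, here is a hedged sketch of the nested version the article starts from (toy linear limit state of my own; the paper's single-loop contribution replaces the inner search with its KKT conditions imposed as constraints): an outer search over the interval variable wraps a FORM computation for each candidate value.

    ```python
    # Toy nested two-loop reliability bound; problem data are invented.
    import numpy as np
    from scipy.optimize import minimize, minimize_scalar

    def beta_given_y(y):
        # Inner loop: FORM reliability index for a fixed interval variable y
        g = lambda u: 5.0 + y - u[0] - 2.0 * u[1]    # assumed limit state
        res = minimize(lambda u: u @ u, x0=np.array([0.1, 0.1]),
                       constraints={'type': 'eq', 'fun': g})
        return np.sqrt(res.fun)

    # Outer loop: minimum reliability over the interval y in [-1, 1]
    res = minimize_scalar(beta_given_y, bounds=(-1.0, 1.0), method='bounded')
    print(f"worst-case beta = {res.fun:.3f} at y = {res.x:.3f}")
    ```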

  5. Optimization of life support systems and their systems reliability

    NASA Technical Reports Server (NTRS)

    Fan, L. T.; Hwang, C. L.; Erickson, L. E.

    1971-01-01

    The identification, analysis, and optimization of life support systems and subsystems have been investigated. For each system or subsystem considered, the procedure involves the establishment of a set of system equations (or mathematical model) based on theory and experimental evidence; the analysis and simulation of the model; the optimization of the operation, control, and reliability; analysis of the sensitivity of the system based on the model; and, if possible, experimental verification of the theoretical and computational results. Research activities include: (1) modeling of air flow in a confined space; (2) review of several different gas-liquid contactors utilizing centrifugal force; (3) review of carbon dioxide reduction contactors in space vehicles and other enclosed structures; (4) application of modern optimal control theory to environmental control of confined spaces; (5) optimal control of a class of nonlinear diffusional distributed parameter systems; (6) optimization of system reliability of life support systems and subsystems; (7) modeling, simulation, and optimal control of the human thermal system; and (8) analysis and optimization of the water-vapor electrolysis cell.

  6. Researcher Biographies

    Science.gov Websites

    Research interests: mechanical system design sensitivity analysis and optimization of linear and nonlinear structural systems; reliability analysis and reliability-based design optimization; computational methods in … committee member, ISSMO; Associate Editor, Mechanics Based Design of Structures and Machines; Associate …

  7. Data mining-based coefficient of influence factors optimization of test paper reliability

    NASA Astrophysics Data System (ADS)

    Xu, Peiyao; Jiang, Huiping; Wei, Jieyao

    2018-05-01

    Testing is a significant part of the teaching process: it demonstrates the final outcome of school teaching through teachers' teaching level and students' scores. The analysis of a test paper is a complex operation with non-linear relations among paper length, time duration, and degree of difficulty. It is therefore difficult with general methods to optimize the coefficients of the influence factors under different conditions so as to obtain test papers with clearly higher reliability [1]. With data mining techniques such as Support Vector Regression (SVR) and Genetic Algorithm (GA), we can model the test paper analysis and optimize the coefficients of the impact factors for higher reliability. The test results show that the combination of SVR and GA yields an effective improvement in reliability. The optimized coefficients of the influence factors are practicable in actual application, and the whole optimizing operation can offer a model basis for test paper analysis.
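
    A hedged sketch of the SVR-plus-GA pipeline the abstract outlines, on synthetic data (the real influence factors and reliability scores are not reproduced here): fit an SVR surrogate mapping factor coefficients to reliability, then let a small GA search for coefficients that maximize the surrogate's prediction.

    ```python
    # Synthetic demonstration only; factor meanings and data are invented.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, (200, 3))      # e.g., length, duration, difficulty coefficients
    y = 0.9 - (X[:, 0] - 0.5) ** 2 - 0.5 * (X[:, 2] - 0.6) ** 2 + rng.normal(0, 0.01, 200)
    model = SVR(kernel='rbf', C=10.0).fit(X, y)   # surrogate: coefficients -> reliability

    pop = rng.uniform(0, 1, (50, 3))     # tiny GA: selection + Gaussian mutation
    for _ in range(40):
        parents = pop[np.argsort(model.predict(pop))[-25:]]
        children = np.clip(parents + rng.normal(0, 0.05, parents.shape), 0, 1)
        pop = np.vstack([parents, children])

    best = pop[np.argmax(model.predict(pop))]
    print("coefficients maximizing predicted reliability:", best.round(3))
    ```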

  8. Optimization of structures on the basis of fracture mechanics and reliability criteria

    NASA Technical Reports Server (NTRS)

    Heer, E.; Yang, J. N.

    1973-01-01

    A systematic summary of the factors involved in the optimization of a given structural configuration is part of a report resulting from a study of the analysis of the objective function. The predicted reliability of the performance of the finished structure is sharply dependent upon the results of coupon tests. The optimization analysis developed by the study also involves the expected cost of proof testing.

  9. Scheduling structural health monitoring activities for optimizing life-cycle costs and reliability of wind turbines

    NASA Astrophysics Data System (ADS)

    Hanish Nithin, Anu; Omenzetter, Piotr

    2017-04-01

    Optimization of the life-cycle costs and reliability of offshore wind turbines (OWTs) is an area of immense interest due to the widespread increase in wind power generation across the world. Most of the existing studies have used structural reliability and Bayesian pre-posterior analysis for optimization. This paper proposes an extension to the previous approaches in a framework for probabilistic optimization of the total life-cycle costs and reliability of OWTs by combining the elements of structural reliability/risk analysis (SRA) and Bayesian pre-posterior analysis with optimization through a genetic algorithm (GA). The SRA techniques are adopted to compute the probabilities of damage occurrence and failure associated with the deterioration model. The probabilities are used in the decision tree and are updated using Bayesian analysis. The output of this framework determines the optimal structural health monitoring and maintenance schedules to be implemented during the life span of OWTs while maintaining a trade-off between the life-cycle costs and the risk of structural failure. Numerical illustrations with a generic deterioration model for one monitoring exercise in the life cycle of a system are presented. Two case scenarios, namely whether to build an initially expensive but robust structure or a cheaper but more quickly deteriorating one, and whether to adopt an expensive monitoring system, are presented to aid in the decision-making process.

  10. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.

  11. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show the PFEM is a very powerful tool in determining the second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.

  12. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating a sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent, and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.

  13. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software. It was originally developed for deterministic calculation and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk, or reliability. The optimum solution, including the weight of a structure, is also obtained as a function of reliability. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, corresponding to a reliability of unity. Weight can be reduced to a small value for the most failure-prone design, with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool; the fast probabilistic integrator, or FPI module, of the NESSUS software was the probabilistic calculator; and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life airframe component made of metallic and composite materials.
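
    The inverted-S behavior has a simple quantitative core, sketched below under assumptions of my own (a normally distributed load, with weight taken proportional to the design strength; this is not the CometBoards/NESSUS computation): as reliability p approaches 1, the required strength grows with the inverse normal CDF and diverges.

    ```python
    # Minimal illustration of weight-versus-reliability growth; numbers invented.
    from scipy.stats import norm

    mu, sigma = 100.0, 15.0                   # assumed load statistics
    for p in (0.01, 0.25, 0.50, 0.75, 0.99, 0.999999):
        strength = mu + sigma * norm.ppf(p)   # design strength at reliability p
        print(f"p = {p:<9}: required strength ~ {strength:7.1f}")
    # norm.ppf(0.5) = 0 marks the center of the inverted-S; norm.ppf(p) -> inf as p -> 1.
    ```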

  14. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    NASA Astrophysics Data System (ADS)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Nowadays, micro-devices are commonly applied, especially in the automotive industry, taking advantage of combining the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices which is based on the theory of reliability-based robust design optimization. This takes into consideration the performance of a micro-device and its reliability assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by a sensitivity analysis to establish the design and uncertain domains. Genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function while simultaneously satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced via a conditionally formulated objective function with a penalty component. The yielded results were successfully verified with a global uniform search through the input design domain.

  15. Control optimization, stabilization and computer algorithms for aircraft applications

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Research related to reliable aircraft design is summarized. Topics discussed include systems reliability optimization, failure detection algorithms, analysis of nonlinear filters, design of compensators incorporating time delays, digital compensator design, estimation for systems with echoes, low-order compensator design, descent-phase controller for 4-D navigation, infinite dimensional mathematical programming problems and optimal control problems with constraints, robust compensator design, numerical methods for the Lyapunov equations, and perturbation methods in linear filtering and control.

  16. Turbine Performance Optimization Task Status

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Turner, James E. (Technical Monitor)

    2001-01-01

    The capability to optimize turbine performance and accurately predict unsteady loads will allow for increased reliability, Isp, and thrust-to-weight ratio. The development of a fast, accurate aerodynamic design, analysis, and optimization system is required.

  17. Optimizing preventive maintenance policy: A data-driven application for a light rail braking system.

    PubMed

    Corman, Francesco; Kraijema, Sander; Godjevac, Milinko; Lodewijks, Gabriel

    2017-10-01

    This article presents a case study determining the optimal preventive maintenance policy for a light rail rolling stock system in terms of reliability, availability, and maintenance costs. The maintenance policy defines one of three predefined preventive maintenance actions at fixed time-based intervals for each of the subsystems of the braking system. Based on work, maintenance, and failure data, we model the reliability degradation of the system and its subsystems under the current maintenance policy by a Weibull distribution. We then analytically determine the relation between reliability, availability, and maintenance costs. We validate the model against recorded reliability and availability and gain further insights through a dedicated sensitivity analysis. The model is then used in a sequential optimization framework to determine preventive maintenance intervals that improve on the key performance indicators. We show the potential of data-driven modelling to determine an optimal maintenance policy: the same system availability and reliability can be achieved with a 30% maintenance cost reduction by prolonging the intervals and re-grouping maintenance actions.

  18. Optimizing preventive maintenance policy: A data-driven application for a light rail braking system

    PubMed Central

    Corman, Francesco; Kraijema, Sander; Godjevac, Milinko; Lodewijks, Gabriel

    2017-01-01

    This article presents a case study determining the optimal preventive maintenance policy for a light rail rolling stock system in terms of reliability, availability, and maintenance costs. The maintenance policy defines one of three predefined preventive maintenance actions at fixed time-based intervals for each of the subsystems of the braking system. Based on work, maintenance, and failure data, we model the reliability degradation of the system and its subsystems under the current maintenance policy by a Weibull distribution. We then analytically determine the relation between reliability, availability, and maintenance costs. We validate the model against recorded reliability and availability and gain further insights through a dedicated sensitivity analysis. The model is then used in a sequential optimization framework to determine preventive maintenance intervals that improve on the key performance indicators. We show the potential of data-driven modelling to determine an optimal maintenance policy: the same system availability and reliability can be achieved with a 30% maintenance cost reduction by prolonging the intervals and re-grouping maintenance actions. PMID:29278245
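
    As a hedged sketch of the tradeoff the two records above quantify (Weibull reliability plus a classic age-replacement cost model; all parameters invented, not fitted to the light rail data): sweep the preventive interval and pick the one minimizing expected cost per unit time.

    ```python
    # Toy age-replacement optimization; Weibull and cost parameters are assumed.
    import numpy as np
    from scipy.integrate import quad

    beta, eta = 2.5, 1000.0          # assumed Weibull shape and scale (hours)
    c_prev, c_fail = 1.0, 10.0       # assumed preventive vs corrective costs

    def R(t):                        # Weibull reliability function
        return np.exp(-(t / eta) ** beta)

    def cost_rate(T):                # expected cost per unit time for interval T
        cycle_cost = c_prev * R(T) + c_fail * (1.0 - R(T))
        cycle_len = quad(R, 0.0, T)[0]
        return cycle_cost / cycle_len

    Ts = np.linspace(100.0, 3000.0, 200)
    best = min(Ts, key=cost_rate)
    print(f"optimal preventive interval ~ {best:.0f} h, cost rate {cost_rate(best):.5f}")
    ```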

  19. System Risk Assessment and Allocation in Conceptual Design

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Smith, Natasha L.; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    As aerospace systems continue to evolve in addressing newer challenges in air and space transportation, there exists a heightened priority for significant improvement in system performance, cost effectiveness, reliability, and safety. Tools, which synthesize multidisciplinary integration, probabilistic analysis, and optimization, are needed to facilitate design decisions allowing trade-offs between cost and reliability. This study investigates tools for probabilistic analysis and probabilistic optimization in the multidisciplinary design of aerospace systems. A probabilistic optimization methodology is demonstrated for the low-fidelity design of a reusable launch vehicle at two levels, a global geometry design and a local tank design. Probabilistic analysis is performed on a high fidelity analysis of a Navy missile system. Furthermore, decoupling strategies are introduced to reduce the computational effort required for multidisciplinary systems with feedback coupling.

  20. Reliability prediction of large fuel cell stack based on structure stress analysis

    NASA Astrophysics Data System (ADS)

    Liu, L. F.; Liu, B.; Wu, C. W.

    2017-09-01

    The aim of this paper is to improve the reliability of a Proton Exchange Membrane Fuel Cell (PEMFC) stack by designing the clamping force and the thickness difference between the membrane electrode assembly (MEA) and the gasket. The stack reliability is directly determined by the component reliability, which is affected by the material properties and contact stress. The component contact stress is a random variable because it is usually affected by many uncertain factors in the production and clamping process. We have investigated the influence of the parameter variation coefficient on the probability distribution of contact stress using an equivalent stiffness model and the first-order second-moment method. The optimal contact stress that keeps a component at the highest reliability level is obtained by the stress-strength interference model. To obtain this optimal contact stress between the contacting components, the component thickness and the stack clamping force are optimally designed. Finally, a detailed description is given of how to design the MEA and gasket dimensions to obtain the highest stack reliability. This work can provide valuable guidance in the design of the stack structure for a highly reliable fuel cell stack.
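
    The stress-strength interference step mentioned above reduces, for normal variables, to a one-line reliability formula; a minimal sketch with invented parameters (not the paper's PEMFC data) follows.

    ```python
    # Stress-strength interference under assumed normal distributions:
    # R = P(strength > stress) = Phi((mu_S - mu_s) / sqrt(sig_S^2 + sig_s^2)).
    from math import sqrt
    from scipy.stats import norm

    mu_strength, sig_strength = 1.2, 0.10   # assumed component strength (MPa)
    mu_stress, sig_stress = 0.9, 0.08       # assumed contact stress (MPa)

    z = (mu_strength - mu_stress) / sqrt(sig_strength ** 2 + sig_stress ** 2)
    print(f"component reliability R = {norm.cdf(z):.4f}")
    ```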

  1. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.

  2. A diameter-sensitive flow entropy method for reliability consideration in water distribution system design

    NASA Astrophysics Data System (ADS)

    Liu, Haixing; Savić, Dragan; Kapelan, Zoran; Zhao, Ming; Yuan, Yixing; Zhao, Hongbin

    2014-07-01

    Flow entropy is a measure of the uniformity of pipe flows in water distribution systems (WDSs). By maximizing flow entropy one can identify reliable layouts or connectivity in networks. To overcome the disadvantage of the common definition of flow entropy, which does not consider the impact of pipe diameter on reliability, an extended definition, termed diameter-sensitive flow entropy, is proposed. This new methodology is then assessed against other reliability methods, including Monte Carlo simulation, a pipe failure probability model, and a surrogate measure (resilience index) integrated with water demand and pipe failure uncertainty. The reliability assessment is based on a sample of WDS designs derived from an optimization process for each of two benchmark networks. Correlation analysis is used to evaluate quantitatively the relationship between entropy and reliability, and a comparative analysis between the simple flow entropy and the new method is conducted. The results demonstrate that the diameter-sensitive flow entropy shows a consistently much stronger correlation with the three reliability measures than simple flow entropy. Therefore, the new flow entropy method can be taken as a better surrogate measure for reliability and could potentially be integrated into the optimal design problem of WDSs. Sensitivity analysis results show that the velocity parameters used in the new flow entropy have no significant impact on the relationship between diameter-sensitive flow entropy and reliability.
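
    For orientation, here is a minimal sketch of the plain (not diameter-sensitive) flow entropy the abstract extends, using a simple Shannon form over outflow fractions; this is my own minimal reading with toy flows, not the paper's extended definition.

    ```python
    # Plain flow entropy S = -sum_i (q_i / Q) ln(q_i / Q); toy numbers only.
    import numpy as np

    def flow_entropy(flows):
        p = np.asarray(flows, dtype=float)
        p = p / p.sum()
        return float(-(p * np.log(p)).sum())

    print(flow_entropy([10.0, 10.0, 10.0]))   # uniform flows: maximum entropy ln(3)
    print(flow_entropy([27.0, 2.0, 1.0]))     # concentrated flow: lower entropy
    ```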

  3. Behaviour State Analysis in Rett Syndrome: Continuous Data Reliability Measurement

    ERIC Educational Resources Information Center

    Woodyatt, Gail; Marinac, Julie; Darnell, Ross; Sigafoos, Jeff; Halle, James

    2004-01-01

    Awareness of optimal behaviour states of children with profound intellectual disability has been reported in the literature as a potentially useful tool for planning intervention within this population. Some arguments have been raised, however, which question the reliability and validity of previously published work on behaviour state analysis.…

  4. Reliable Thermoelectric Module Design under Opposing Requirements from Structural and Thermoelectric Considerations

    NASA Astrophysics Data System (ADS)

    Karri, Naveen K.; Mo, Changki

    2018-06-01

    Structural reliability of thermoelectric generation (TEG) systems still remains an issue, especially for applications such as large-scale industrial or automobile exhaust heat recovery, in which TEG systems are subject to dynamic loads and thermal cycling. Traditional thermoelectric (TE) system design and optimization techniques, focused on performance alone, could result in designs that may fail during operation as the geometric requirements for optimal performance (especially the power) are often in conflict with the requirements for mechanical reliability. This study focused on reducing the thermomechanical stresses in a TEG system without compromising the optimized system performance. Finite element simulations were carried out to study the effect of TE element (leg) geometry such as leg length and cross-sectional shape under constrained material volume requirements. Results indicated that the element length has a major influence on the element stresses whereas regular cross-sectional shapes have minor influence. The impact of TE element stresses on the mechanical reliability is evaluated using brittle material failure theory based on Weibull analysis. An alternate couple configuration that relies on the industry practice of redundant element design is investigated. Results showed that the alternate configuration considerably reduced the TE element and metallization stresses, thereby enhancing the structural reliability, with little trade-off in the optimized performance. The proposed alternate configuration could serve as a potential design modification for improving the reliability of systems optimized for thermoelectric performance.

  5. NASA program decisions using reliability analysis.

    NASA Technical Reports Server (NTRS)

    Steinberg, A.

    1972-01-01

    NASA made use of the analytical outputs of reliability people to make management decisions on the Apollo program. Such decisions affected the amount of the incentive fees, how much acceptance testing was necessary, how to optimize development testing, whether to approve engineering changes, and certification of flight readiness. Examples of such analysis are discussed and related to programmatic decisions.

  6. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. Solution to stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traced out an inverted-S-shaped graph. The center of the inverted-S graph corresponded to 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure that corresponds to unity for reliability p (or p = 1). Weight can be reduced to a small value for the most failure-prone design with a reliability that approaches zero (p = 0). Reliability can be changed for different components of an airframe structure. For example, the landing gear can be designed for a very high reliability, whereas it can be reduced to a small extent for a raked wingtip. The SDO capability is obtained by combining three codes: (1) The MSC/Nastran code was the deterministic analysis tool, (2) The fast probabilistic integrator, or the FPI module of the NESSUS software, was the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.

  7. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  8. Cross Layer Design for Optimizing Transmission Reliability, Energy Efficiency, and Lifetime in Body Sensor Networks.

    PubMed

    Chen, Xi; Xu, Yixuan; Liu, Anfeng

    2017-04-19

    High transmission reliability, energy efficiency, and long lifetime are pivotal issues for wireless body area networks (WBANs). However, these performance metrics are not independent of each other, making it hard to obtain overall improvements through optimizing one single aspect. Therefore, a Cross Layer Design Optimal (CLDO) scheme is proposed to simultaneously optimize transmission reliability, energy efficiency, and lifetime of WBANs from several layers. Firstly, due to the fact that the transmission power of nodes directly influences the reliability of links, the optimized transmission power of different nodes is deduced, which is able to maximize energy efficiency in theory under the premise that requirements on delay and jitter are fulfilled. Secondly, a relay decision algorithm is proposed to choose optimized relay nodes. Using this algorithm, nodes will choose relay nodes that ensure a balance of network energy consumption, provided that all nodes transmit with optimized transmission power and the same packet size. Thirdly, the energy consumption of nodes is still unbalanced even with optimized transmission power because of their different locations in the topology of the network. In addition, packet size also has an impact on final performance metrics. Therefore, a synthesized cross layer method for optimization is proposed. With this method, the transmission power of nodes with more residual energy will be enhanced while suitable packet size is determined for different links in the network, leading to further improvements in the WBAN system. Both our comprehensive theoretical analysis and experimental results indicate that the performance of our proposed scheme is better than reported in previous studies. Relative to the relay selection and power control game (RSPCG) scheme, the CLDO scheme can enhance transmission reliability by more than 44.6% and prolong the lifetime by as much as 33.2%.

  9. Cross Layer Design for Optimizing Transmission Reliability, Energy Efficiency, and Lifetime in Body Sensor Networks

    PubMed Central

    Chen, Xi; Xu, Yixuan; Liu, Anfeng

    2017-01-01

    High transmission reliability, energy efficiency, and long lifetime are pivotal issues for wireless body area networks (WBANs). However, these performance metrics are not independent of each other, making it hard to obtain overall improvements through optimizing one single aspect. Therefore, a Cross Layer Design Optimal (CLDO) scheme is proposed to simultaneously optimize transmission reliability, energy efficiency, and lifetime of WBANs from several layers. Firstly, due to the fact that the transmission power of nodes directly influences the reliability of links, the optimized transmission power of different nodes is deduced, which is able to maximize energy efficiency in theory under the premise that requirements on delay and jitter are fulfilled. Secondly, a relay decision algorithm is proposed to choose optimized relay nodes. Using this algorithm, nodes will choose relay nodes that ensure a balance of network energy consumption, provided that all nodes transmit with optimized transmission power and the same packet size. Thirdly, the energy consumption of nodes is still unbalanced even with optimized transmission power because of their different locations in the topology of the network. In addition, packet size also has an impact on final performance metrics. Therefore, a synthesized cross layer method for optimization is proposed. With this method, the transmission power of nodes with more residual energy will be enhanced while suitable packet size is determined for different links in the network, leading to further improvements in the WBAN system. Both our comprehensive theoretical analysis and experimental results indicate that the performance of our proposed scheme is better than reported in previous studies. Relative to the relay selection and power control game (RSPCG) scheme, the CLDO scheme can enhance transmission reliability by more than 44.6% and prolong the lifetime by as much as 33.2%. PMID:28422062

  10. Reliability Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Quantitative assessments of system reliability and equivalent system mass (ESM) were made for different life support architectures based primarily on International Space Station technologies. The analysis was applied to a one-year deep-space mission. System reliability was increased by adding redundancy and spares, which added to the ESM. Results were thus obtained allowing a comparison of the ESM for each architecture at equivalent levels of reliability. Although the analysis contains numerous simplifications and uncertainties, the results suggest that achieving necessary reliabilities for deep-space missions will add substantially to the life support ESM and could influence the optimal degree of life support closure. Approaches for reducing reliability impacts were investigated and are discussed.

  11. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
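
    A hedged sketch of the binomial arithmetic behind the point-estimate demonstration (my own toy calculation, not the mh1823 software): with the 29-flaw, zero-miss criterion, the probability of passing is POD^29, which drops below 5% once the true POD falls below 0.90.

    ```python
    # Illustrative 29-of-29 demonstration arithmetic.
    from scipy.stats import binom

    n = 29
    for pod in (0.80, 0.90, 0.95, 0.99):
        ppd = binom.pmf(n, n, pod)   # P(all n flaws detected) = pod ** n
        print(f"true POD {pod:.2f}: probability of passing = {ppd:.3f}")
    # 0.90 ** 29 ~ 0.047 < 0.05: passing 29/29 demonstrates 90% POD at 95% confidence.
    ```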

  12. Reliability Constrained Priority Load Shedding for Aerospace Power System Automation

    NASA Technical Reports Server (NTRS)

    Momoh, James A.; Zhu, Jizhong; Kaddah, Sahar S.; Dolce, James L. (Technical Monitor)

    2000-01-01

    The need for improving load shedding on board the space station is one of the goals of aerospace power system automation. The optimal load-shedding function must account for several constraints, including the congestion margin determined by weighted-probability contingencies, component/system reliability indices, and generation rescheduling. The impact of different faults and the indices for computing reliability were defined before optimization. The optimum load schedule is based on the priority, value, and location of loads. An optimization strategy capable of handling discrete decision making, such as Everett's method, is proposed. We extended Everett's method to handle expected congestion margin and reliability index as constraints. To make it effective for a real-time load dispatch process, a rule-based scheme is added to the optimization method. It assists in selecting which feeder load to shed, as well as the location, value, and priority of the load; a cost-benefit analysis of the load profile is included in the scheme. The scheme is tested on a benchmark NASA system consisting of generators, loads, and a network.

  13. Approximation Model Building for Reliability & Maintainability Characteristics of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.

    2000-01-01

    This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating, and for multidisciplinary design optimization.

  14. The Reliability Mandate: Optimizing the Use of Highly Reliable Parts, Materials, and Processes (PM&P) to Maximize System Component Reliability in the Life Cycle

    DTIC Science & Technology

    2002-06-01

    projects are converted into bricks and mortar, as Figure 5 illustrates. Making major changes in LCC after projects are turned over to production is … matter experts (SMEs) in the parts, materials, and processes functional area. Data gathering and analysis were conducted through structured interviews … The analysis synthesized feedback and searched for collective issues from the various SMEs on managing PM&P Program requirements, the …

  15. Reliability models: the influence of model specification in generation expansion planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stremel, J.P.

    1982-10-01

    This paper is a critical evaluation of reliability methods used for generation expansion planning. It is shown that the methods for treating uncertainty are critical for determining the relative reliability value of expansion alternatives. It is also shown that the specification of the reliability model will not favor all expansion options equally. Consequently, the model is biased. In addition, reliability models should be augmented with an economic value of reliability (such as the cost of emergency procedures or energy not served). Generation expansion evaluations which ignore the economic value of excess reliability can be shown to be inconsistent. The conclusions are that, in general, a reliability model simplifies generation expansion planning evaluations. However, for a thorough analysis, the expansion options should be reviewed for candidates which may be unduly rejected because of the bias of the reliability model. This implies that, for a consistent formulation in an optimization framework, the reliability model should be replaced with a full economic optimization which includes the costs of emergency procedures and interruptions in the objective function.

  16. Overview of RICOR's reliability theoretical analysis, accelerated life demonstration test results and verification by field data

    NASA Astrophysics Data System (ADS)

    Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey

    2018-05-01

    The growing demand for EO applications that operate around the clock, 24/7, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and optimized Integrated Logistic Support (ILS). To meet this need, RICOR developed linear and rotary cryocoolers that successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by analysis of field data from cryocoolers operating at the system level. The following paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. To summarize the work process, reliability verification data are presented as feedback from fielded systems.

  17. Reducing maintenance costs in agreement with CNC machine tools reliability

    NASA Astrophysics Data System (ADS)

    Ungureanu, A. L.; Stan, G.; Butunoi, P. A.

    2016-08-01

    Aligning maintenance strategy with reliability is a challenge due to the need to find an optimal balance between them. Because the various methods described in the relevant literature involve laborious calculations or the use of software that can be costly, this paper proposes a method that is easier to implement on CNC machine tools. The new method, called Consequence of Failure Analysis (CFA), is based on technical and economic optimization aimed at obtaining a required level of performance with minimum investment and maintenance costs.

  18. Regulation of transmission line capacity and reliability in electric networks

    NASA Astrophysics Data System (ADS)

    Celebi, Metin

    This thesis is composed of two essays that analyze the incentives and optimal regulation of a monopolist line owner in providing capacity and reliability. Similar analyses in the economic literature resulted in under-investment by an unregulated line owner when line reliability was treated as an exogenous variable. However, reliability should be chosen on the basis of economic principles as well, taking into account not only engineering principles but also the preferences of electricity users. When reliability is treated as a choice variable, both over- and under-investment by the line owner becomes possible. The result depends on the cross-cost elasticity of line construction and on the interval in which the optimal choices of capacity take place. We present some sufficient conditions that lead to definite results about the incentives of the line owner. We also characterize the optimal regulation of the line owner under incomplete information. Our analysis shows that the existence of a line is justified for the social planner when the reliability of other lines on the network is not too high, or when the marginal cost of generation at the expensive generating plant is high. The expectation of higher demand in the future makes the regulator less likely to build the line if it will be congested and reliability of other lines is high enough. It is always optimal to have a congested line under complete information, but not necessarily under incomplete information.

  19. Efficient sensitivity analysis and optimization of a helicopter rotor

    NASA Technical Reports Server (NTRS)

    Lim, Joon W.; Chopra, Inderjit

    1989-01-01

    Aeroelastic optimization of a system essentially consists of the determination of the optimum values of design variables which minimize the objective function and satisfy certain aeroelastic and geometric constraints. The process of aeroelastic optimization analysis is illustrated. To carry out aeroelastic optimization effectively, one needs a reliable analysis procedure to determine steady response and stability of a rotor system in forward flight. The rotor dynamic analysis used in the present study, developed in-house at the University of Maryland, is based on finite elements in space and time. The analysis consists of two major phases: vehicle trim and rotor steady response (coupled trim analysis), and aeroelastic stability of the blade. For a reduction of helicopter vibration, the optimization process requires the sensitivity derivatives of the objective function and aeroelastic stability constraints. For this, the derivatives of steady response, hub loads, and blade stability roots are calculated using a direct analytical approach. An automated optimization procedure is developed by coupling the rotor dynamic analysis, design sensitivity analysis, and the constrained optimization code CONMIN.

  20. Difficult Decisions Made Easier

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA missions are extremely complex and prone to sudden, catastrophic failure if equipment falters or if an unforeseen event occurs. For these reasons, NASA trains to expect the unexpected. It tests its equipment and systems in extreme conditions, and it develops risk-analysis tests to foresee any possible problems. The Space Agency recently worked with an industry partner to develop reliability analysis software capable of modeling complex, highly dynamic systems, taking into account variations in input parameters and the evolution of the system over the course of a mission. The goal of this research was multifold. It included performance and risk analyses of complex, multiphase missions, like the insertion of the Mars Reconnaissance Orbiter; reliability analyses of systems with redundant and/or repairable components; optimization analyses of system configurations with respect to cost and reliability; and sensitivity analyses to identify optimal areas for uncertainty reduction or performance enhancement.

  1. Two tradeoffs between economy and reliability in loss of load probability constrained unit commitment

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Wang, Mingqiang; Ning, Xingyao

    2018-02-01

    Spinning reserve (SR) should be scheduled considering the balance between economy and reliability. To address the computational intractability caused by the computation of loss of load probability (LOLP), many probabilistic methods use simplified formulations of LOLP to improve computational efficiency. Two tradeoffs embedded in the SR optimization model are not explicitly analyzed in these methods. In this paper, two tradeoffs between economy and reliability, a primary tradeoff and a secondary tradeoff, in the maximum-LOLP-constrained unit commitment (UC) model are explored and analyzed in a small system and in the IEEE-RTS system. The analysis of the two tradeoffs can help in establishing new, efficient simplified LOLP formulations and new SR optimization models.
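
    As a hedged aside on the LOLP quantity itself (toy units and an invented load level, not the IEEE-RTS data): LOLP can be read off a capacity outage probability table built by convolving unit availability distributions.

    ```python
    # Toy capacity outage probability table and LOLP; unit data are assumed.
    import numpy as np

    units = [(100, 0.05), (100, 0.05), (50, 0.02)]   # (capacity MW, forced outage rate)
    cap_total = sum(c for c, _ in units)

    prob = np.zeros(cap_total + 1)
    prob[0] = 1.0                                    # start: no units added yet
    for c, q in units:
        new = prob * q                               # unit out: capacity unchanged
        new[c:] += prob[:cap_total + 1 - c] * (1 - q)  # unit in: shift up by c
        prob = new

    load = 180
    lolp = prob[:load].sum()                         # P(available capacity < load)
    print(f"LOLP at {load} MW load: {lolp:.4f}")
    ```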

  2. Power transfer systems for future navy helicopters. Final report 25 Jun 70--28 Jun 72

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossler, R.B. Jr.

    1972-11-01

    The purpose of this program was to conduct an analysis of helicopter power transfer systems (PTS), both conventional and advanced-concept types, with the objective of reducing specific weights and improving reliability beyond present values. The analysis satisfied requirements specified for a 200,000-pound cargo transport helicopter (CTH), a 70,000-pound heavy assault helicopter, and a 15,000-pound non-combat search and rescue helicopter. Four selected gearing systems (out of seven studied), optimized for lightest weight and equal reliability for the CTH using component proportioning via stress and stiffness equations, showed no significant difference in aircraft payload. All optimized PTS were approximately 70% of statistically predicted weight. A reliability increase is predicted via gearbox derating using Weibull relationships. Among advanced concepts, the Turbine Integrated Geared Rotor was competitive in weight, technology availability, and reliability increase, but was handicapped by a special engine requirement. The warm-cycle system was found not competitive. A helicopter parametric weight analysis is shown. Advanced development plans are presented for the PTS for the CTH, including the total PTS system, selected PTS components, and scale-model flight testing in a Kaman HH-2 helicopter.

  3. Structural reliability analysis under evidence theory using the active learning kriging model

    NASA Astrophysics Data System (ADS)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
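
    A hedged sketch of sign-oriented active learning with a Gaussian process (kriging) surrogate, in the spirit of the abstract's point that only the sign of the performance function matters (toy 1-D function; the U = |mu|/sigma learning criterion is a common choice in this literature and is assumed here rather than taken from the paper): points are added where the predicted sign of g is least certain.

    ```python
    # Toy active-learning loop for sign prediction of a performance function.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    g = lambda x: np.sin(3.0 * x) + 0.3 * x          # assumed performance function
    X_pool = np.linspace(-3.0, 3.0, 300).reshape(-1, 1)
    X = np.array([[-2.5], [0.0], [2.5]])             # initial design

    for _ in range(10):
        gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-6, normalize_y=True)
        gp.fit(X, g(X).ravel())
        mu, sd = gp.predict(X_pool, return_std=True)
        U = np.abs(mu) / np.maximum(sd, 1e-12)       # small U: sign most in doubt
        X = np.vstack([X, X_pool[np.argmin(U)]])     # enrich the design there

    gp.fit(X, g(X).ravel())                          # final refit on all points
    agree = np.mean(np.sign(gp.predict(X_pool)) == np.sign(g(X_pool).ravel()))
    print(f"sign agreement on the pool: {agree:.3f}")
    ```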

  4. Optimization of b-value distribution for biexponential diffusion-weighted MR imaging of normal prostate.

    PubMed

    Jambor, Ivan; Merisaari, Harri; Aronen, Hannu J; Järvinen, Jukka; Saunavaara, Jani; Kauko, Tommi; Borra, Ronald; Pesola, Marko

    2014-05-01

    To determine the optimal b-value distribution for biexponential diffusion-weighted imaging (DWI) of normal prostate using both a computer modeling approach and in vivo measurements. Optimal b-value distributions for the fit of three parameters (fast diffusion Df, slow diffusion Ds, and fraction of fast diffusion f) were determined using Monte-Carlo simulations. The optimal b-value distribution was calculated using four individual optimization methods. Eight healthy volunteers underwent four repeated 3 Tesla prostate DWI scans using both 16 equally distributed b-values and an optimized b-value distribution obtained from the simulations. The b-value distributions were compared in terms of measurement reliability and repeatability using Shrout-Fleiss analysis. At low noise levels, the optimal b-value distribution formed three separate clusters at low (0-400 s/mm2), mid-range (650-1200 s/mm2), and high b-values (1700-2000 s/mm2). Higher noise levels resulted in less pronounced clustering of b-values. The clustered optimized b-value distribution demonstrated better measurement reliability and repeatability in Shrout-Fleiss analysis compared with 16 equally distributed b-values. The optimal b-value distribution was found to be a clustered distribution with b-values concentrated in the low, mid, and high ranges and was shown to improve the estimation quality of biexponential DWI parameters in in vivo experiments.
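
    A minimal sketch of the biexponential model fit on synthetic data is shown below; the b-value clusters echo the low/mid/high ranges reported above, while the tissue parameters and noise level are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def biexp(b, S0, f, Df, Ds):
        """Biexponential DWI signal: fast and slow diffusion pools."""
        return S0 * (f * np.exp(-b * Df) + (1 - f) * np.exp(-b * Ds))

    # Clustered b-value distribution (s/mm^2) echoing the low/mid/high ranges
    b = np.array([0, 100, 300, 400, 650, 900, 1200, 1700, 1850, 2000], float)

    # Synthetic "ground truth" signal plus noise (illustrative values)
    true = dict(S0=1.0, f=0.75, Df=2.2e-3, Ds=0.4e-3)
    rng = np.random.default_rng(1)
    S = biexp(b, **true) + rng.normal(0, 0.01, b.size)

    popt, pcov = curve_fit(biexp, b, S, p0=[1.0, 0.5, 2e-3, 0.5e-3],
                           bounds=([0, 0, 1e-4, 1e-5], [2, 1, 5e-3, 2e-3]))
    print(dict(zip(["S0", "f", "Df", "Ds"], popt.round(5))))
    ```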

  5. Multidisciplinary design optimization of vehicle instrument panel based on multi-objective genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Wu, Guangqiang

    2013-03-01

    Typical multidisciplinary design optimization (MDO) has gradually been adopted to balance the lightweight, noise, vibration and harshness (NVH), and safety performance of the instrument panel (IP) structure in automotive development. Nevertheless, the plastic constitutive relation of polypropylene (PP) under different strain rates has not been taken into consideration in current reliability-based and collaborative IP MDO design. In this paper, based on tensile tests under different strain rates, the constitutive relation of polypropylene is studied. Impact simulation tests for the head and knee bolster are carried out to meet the regulations of FMVSS 201 and FMVSS 208, respectively. NVH analysis is performed to obtain mainly the natural frequencies and corresponding mode shapes, while crashworthiness analysis is employed to examine the crash behavior of the IP structure. With consideration of lightweight, NVH, and head and knee bolster impact performance, design of experiments (DOE), response surface models (RSM), and collaborative optimization (CO) are applied to realize the deterministic and reliability-based optimizations, respectively. Furthermore, based on a multi-objective genetic algorithm (MOGA), the optimal Pareto sets are computed to solve the multi-objective optimization (MOO) problem. The proposed research ensures the smoothness of the Pareto set, enhances the ability of engineers to make a comprehensive decision about multiple objectives and choose the optimal design, and improves the quality and efficiency of MDO.
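
    The Pareto-set machinery behind a MOGA can be illustrated with the non-dominated filtering step below; the objective values are random stand-ins for quantities such as mass and impact severity.

    ```python
    import numpy as np

    def pareto_front(F):
        """Indices of non-dominated rows of F (all objectives minimized)."""
        F = np.asarray(F, float)
        keep = np.ones(len(F), dtype=bool)
        for i in range(len(F)):
            # Point j dominates i if it is no worse everywhere, better somewhere
            dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
            if np.any(dominates_i):
                keep[i] = False
        return np.flatnonzero(keep)

    rng = np.random.default_rng(9)
    F = rng.uniform(0, 1, (50, 2))   # hypothetical (mass, impact metric) pairs
    print("non-dominated designs:", pareto_front(F).tolist())
    ```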

  6. Deterministic and reliability based optimization of integrated thermal protection system composite panel using adaptive sampling techniques

    NASA Astrophysics Data System (ADS)

    Ravishankar, Bharani

    Conventional space vehicles have thermal protection systems (TPS) that provide protection to an underlying structure that carries the flight loads. In an attempt to save weight, there is interest in an integrated TPS (ITPS) that combines the structural function and the TPS function. This has weight-saving potential, but complicates the design of the ITPS, which now has both thermal and structural failure modes. The main objective of this dissertation was to optimally design the ITPS subjected to thermal and mechanical loads through deterministic and reliability-based optimization. The optimization of the ITPS structure requires computationally expensive finite element analyses of the 3D ITPS (solid) model. To reduce the computational expense involved in the structural analysis, a finite element based homogenization method was employed, homogenizing the 3D ITPS model to a 2D orthotropic plate. However, it was found that homogenization was applicable only for panels that are much larger than the characteristic dimensions of the repeating unit cell in the ITPS panel. Hence, a single unit cell was used for the optimization process to reduce the computational cost. Deterministic and probabilistic optimization of the ITPS panel required evaluation of failure constraints at various design points. This further demands computationally expensive finite element analyses, which were replaced by efficient, low-fidelity surrogate models. In an optimization process, it is important to represent the constraints accurately to find the optimum design. Instead of building global surrogate models using a large number of designs, the computational resources were directed towards target regions near constraint boundaries for accurate representation of the constraints using adaptive sampling strategies. Efficient Global Reliability Analysis (EGRA) facilitates sequential sampling of design points around the region of interest in the design space. EGRA was applied to the response surface construction of the failure constraints in the deterministic and reliability-based optimization of the ITPS panel. It was shown that using adaptive sampling, the number of designs required to find the optimum was reduced drastically while improving accuracy. System reliability of the ITPS was estimated using a Monte Carlo simulation (MCS) based method. A separable Monte Carlo method was employed that allowed separable sampling of the random variables to predict the probability of failure accurately. The reliability analysis considered uncertainties in the geometry, material properties, and loading conditions of the panel, and error in finite element modeling. These uncertainties further increased the computational cost of the MCS techniques, which was also reduced by employing surrogate models. In order to estimate the error in the probability-of-failure estimate, the bootstrapping method was applied. This research work thus demonstrates optimization of the ITPS composite panel with multiple failure modes and a large number of uncertainties using adaptive sampling techniques.
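
    The separable Monte Carlo idea mentioned above can be sketched for a simple capacity-minus-demand limit state; the distributions are illustrative, and the point is that comparing every capacity sample against every demand sample reuses the same model evaluations far more efficiently than pairing them one-to-one.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    M, N = 2000, 2000
    # Illustrative independent distributions: capacity (strength), demand (stress)
    C = rng.normal(10.0, 1.0, M)      # capacity samples
    D = rng.normal(7.0, 1.5, N)       # demand samples

    # Crude MC pairs sample i with sample i (M pairs). Separable MC compares
    # every capacity sample with every demand sample (M*N pairs), which
    # sharply reduces the variance of the Pf estimate for the same number
    # of expensive model evaluations.
    pf_sep = np.mean(C[:, None] < D[None, :])
    pf_crude = np.mean(C[:N] < D)
    print(f"separable Pf = {pf_sep:.5f}, crude Pf = {pf_crude:.5f}")
    ```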

  7. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    NASA Astrophysics Data System (ADS)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents a reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem with parametric uncertainties in entry dynamics for a Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, a modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method contributes to the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and to the efficient approximation of the trajectory solution. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve computational efficiency. The cycle comprising SO, reliability assessment and constraint updates is repeated in the RBSO until the reliability requirements of constraint satisfaction are met. Finally, the RBSO is compared with the traditional DO and the traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and efficiency of the proposed method.
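
    A minimal sketch of a nonintrusive (regression-based) PCE in one standard normal variable appears below; the stand-in model and polynomial degree are illustrative, not the paper's entry dynamics.

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander
    from math import factorial

    rng = np.random.default_rng(3)
    model = lambda xi: np.exp(0.3 * xi) + 0.1 * xi ** 2   # stand-in for expensive dynamics

    deg, n = 6, 200
    xi = rng.standard_normal(n)                  # standard normal germ
    Y = model(xi)                                # model evaluations (the only cost)

    Psi = hermevander(xi, deg)                   # probabilists' Hermite basis He_0..He_deg
    c, *_ = np.linalg.lstsq(Psi, Y, rcond=None)  # regression-based (nonintrusive) PCE

    mean = c[0]                                  # E[He_k] = 0 for k >= 1
    var = sum(c[k] ** 2 * factorial(k) for k in range(1, deg + 1))  # E[He_k^2] = k!
    print(f"PCE mean = {mean:.4f}, PCE std = {var ** 0.5:.4f}")
    ```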

  8. Design optimization of a fuzzy distributed generation (DG) system with multiple renewable energy sources

    NASA Astrophysics Data System (ADS)

    Ganesan, T.; Elamvazuthi, I.; Shaari, Ku Zilati Ku; Vasant, P.

    2012-09-01

    The global rise in energy demand brings major obstacles to many energy organizations in providing an adequate energy supply. Hence, many techniques to generate cost-effective, reliable and environmentally friendly alternative energy sources are being explored. One such method is the integration of photovoltaic cells, wind turbine generators and fuel-based generators, together with storage batteries. This sort of power system is known as a distributed generation (DG) power system. However, the application of DG power systems raises certain issues such as cost effectiveness, environmental impact and reliability. The modelling as well as the optimization of this DG power system was successfully performed in previous work using Particle Swarm Optimization (PSO). The central idea of that work was to minimize cost, minimize emissions and maximize reliability (a multi-objective (MO) setting) with respect to the power balance and design requirements. In this work, we introduce a fuzzy model that takes into account the uncertain nature of certain variables in the DG system which depend on the weather conditions (such as the insolation and wind speed profiles). The MO optimization in a fuzzy environment was performed by applying the Hopfield Recurrent Neural Network (HNN). Analysis of the optimized results was then carried out.

  9. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    PubMed

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and for a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.
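
    A sketch of the windowed, Fourier-based coherence computation underlying such an analysis is given below; the window length, frequency band, and threshold are illustrative placeholders, not the optimized parameter set the study derives, and the SCP-like counter is only loosely modeled on the published index.

    ```python
    import numpy as np
    from scipy.signal import coherence

    fs = 1.0                                     # 1 Hz samples (illustrative)
    t = np.arange(0, 4096) / fs
    rng = np.random.default_rng(4)
    slow = np.sin(2 * np.pi * 0.01 * t)          # shared slow wave
    abp = 80 + 5 * slow + rng.normal(0, 1, t.size)   # synthetic ABP (mmHg)
    icp = 12 + 2 * slow + rng.normal(0, 1, t.size)   # synthetic ICP (mmHg)

    win, step, hits, total = 512, 256, 0, 0
    for s in range(0, t.size - win + 1, step):   # sliding analysis window
        f, Cxy = coherence(abp[s:s + win], icp[s:s + win], fs=fs, nperseg=128)
        band = (f >= 0.005) & (f <= 0.05)        # slow-wave band of interest
        hits += Cxy[band].max() > 0.5            # "selected correlation" event
        total += 1
    print(f"fraction of coherent windows = {hits / total:.2f}")
    ```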

  10. Inspection planning development: An evolutionary approach using reliability engineering as a tool

    NASA Technical Reports Server (NTRS)

    Graf, David A.; Huang, Zhaofeng

    1994-01-01

    This paper proposes an evolutionary approach to inspection planning which introduces various reliability engineering tools into the process and assesses system trade-offs among reliability, engineering requirements, manufacturing capability and inspection cost to establish an optimal inspection plan. The examples presented in the paper illustrate some advantages and benefits of the new approach. Through the analysis, reliability and engineering impacts due to manufacturing process capability and inspection uncertainty are clearly understood; the most cost-effective and efficient inspection plan can be established and the associated risks are well controlled; some inspection reductions and relaxations are well justified; and design feedback and changes may be initiated from the analysis conclusions to further enhance reliability and reduce cost. The approach is particularly promising as global competition and customer expectations for quality improvement are rapidly increasing.

  11. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1985-01-01

    This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided and several outstanding issues such as generic allocation and preference assessment are discussed.

  12. Reliability Analysis for AFTI-F16 SRFCS Using ASSIST and SURE

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

    This paper reports the results of a study on reliability analysis of an AFTI-F16 Self-Repairing Flight Control System (SRFCS) using the software tools SURE (Semi-Markov Unreliability Range Evaluator) and ASSIST (Abstract Semi-Markov Specification Interface to the SURE Tool). The purpose of the study is to investigate the potential utility of the software tools in the ongoing effort of the NASA Aviation Safety Program, where the class of systems must be extended beyond the electronic digital processors the tools were originally intended to serve. The study concludes that SURE and ASSIST are applicable to reliability analysis of flight control systems. They are especially efficient for sensitivity analysis that quantifies the dependence of system reliability on model parameters. The study also confirms an earlier finding on the dominant role of a parameter called failure coverage. The paper also remarks on issues related to the improvement of coverage and the optimization of redundancy level.
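
    The dominant role of failure coverage can be illustrated with a small continuous-time Markov model of a duplex system. This is a plain Python sketch, not SURE/ASSIST input; the failure rate, coverage values, and mission time are hypothetical.

    ```python
    import numpy as np
    from scipy.linalg import expm

    lam, T = 1e-4, 100.0                       # per-hour failure rate, mission time
    for c in (0.90, 0.99, 0.999):              # failure coverage values
        # States: 0 duplex, 1 simplex, 2 system failed (absorbing)
        Q = np.array([[-2 * lam, 2 * lam * c, 2 * lam * (1 - c)],
                      [0.0,      -lam,        lam],
                      [0.0,       0.0,        0.0]])
        p = expm(Q * T)[0]                     # state probabilities from duplex start
        print(f"coverage {c}: P(system failure by T) = {p[2]:.3e}")
    ```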

  13. System principles, mathematical models and methods to ensure high reliability of safety systems

    NASA Astrophysics Data System (ADS)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of components designed for detection, localization, tracking, collecting, and processing of information from systems for monitoring, telemetry, control, etc. They are required to be highly reliable so as to correctly perform data aggregation, processing and analysis for subsequent decision-making support. During the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Different component types perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of task completion and eliminates common-cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used to solve optimal redundancy problems on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.

  14. CAVEing the MMPI for an Optimism-Pessimism Scale: Seligman's Attributional Model and the Assessment of Explanatory Style.

    ERIC Educational Resources Information Center

    Colligan, Robert C.; And Others

    1994-01-01

    Developed a bipolar Minnesota Multiphasic Personality Inventory (MMPI) Optimism-Pessimism (PSM) scale based on results of the Content Analysis of Verbatim Explanations technique applied to the MMPI. Reliability and validity indices show that the PSM scale is highly accurate and consistent with Seligman's theory that a pessimistic explanatory style predicts increased…

  15. Development and Initial Validation of an Instrument to Measure Physicians' Use of, Knowledge about, and Attitudes Toward Computers

    PubMed Central

    Cork, Randy D.; Detmer, William M.; Friedman, Charles P.

    1998-01-01

    This paper describes details of four scales of a questionnaire—“Computers in Medical Care”—measuring attributes of computer use, self-reported computer knowledge, computer feature demand, and computer optimism of academic physicians. The reliability (i.e., precision, or degree to which the scale's result is reproducible) and validity (i.e., accuracy, or degree to which the scale actually measures what it is supposed to measure) of each scale were examined by analysis of the responses of 771 full-time academic physicians across four departments at five academic medical centers in the United States. The objectives of this paper were to define the psychometric properties of the scales as the basis for a future demonstration study and, pending the results of further validity studies, to provide the questionnaire and scales to the medical informatics community as a tool for measuring the attitudes of health care providers. Methodology: The dimensionality of each scale and the degree of association of each item with the attribute of interest were determined by principal components factor analysis with orthogonal varimax rotation. Weakly associated items (factor loading <.40) were deleted. The reliability of each resultant scale was computed using Cronbach's alpha coefficient. Content validity was addressed during scale construction; construct validity was examined through factor analysis and by correlational analyses. Results: Attributes of computer use, computer knowledge, and computer optimism were unidimensional, with the corresponding scales having reliabilities of .79, .91, and .86, respectively. The computer-feature demand attribute differentiated into two dimensions: the first reflecting demand for high-level functionality, with a reliability of .81, and the second demand for usability, with a reliability of .69. There were significant positive correlations between computer use, computer knowledge, and computer optimism scale scores and respondents' hands-on computer use, computer training, and self-reported computer sophistication. In addition, items posited on the computer knowledge scale to be more difficult generated significantly lower scores. Conclusion: The four scales of the questionnaire appear to measure with adequate reliability five attributes of academic physicians' attitudes toward computers in medical care: computer use, self-reported computer knowledge, demand for computer functionality, demand for computer usability, and computer optimism. Results of initial validity studies are positive, but further validation of the scales is needed. The URL of a downloadable HTML copy of the questionnaire is provided. PMID:9524349
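
    Cronbach's alpha, used above as the reliability coefficient, can be computed as sketched below on synthetic item responses.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, k_items) matrix of item scores."""
        items = np.asarray(items, float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(5)
    trait = rng.normal(size=300)                          # latent attitude
    X = trait[:, None] + rng.normal(0, 0.8, (300, 6))     # six noisy items
    print(f"alpha = {cronbach_alpha(X):.2f}")
    ```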

  16. An R package for the design, analysis and operation of reservoir systems

    NASA Astrophysics Data System (ADS)

    Turner, Sean; Ng, Jia Yi; Galelli, Stefano

    2016-04-01

    We present a new R package - named "reservoir" - which has been designed for rapid and easy routing of runoff through storage. The package comprises well-established tools for capacity design (e.g., the sequent peak algorithm), performance analysis (storage-yield-reliability and reliability-resilience-vulnerability analysis) and release policy optimization (Stochastic Dynamic Programming). Operating rules can be optimized for water supply, flood control and amenity objectives, as well as for maximum hydropower production. Storage-depth-area relationships are in-built, allowing users to incorporate evaporation from the reservoir surface. We demonstrate the capabilities of the software for global studies using thousands of reservoirs from the Global Reservoir and Dam (GRanD) database fed by historical monthly inflow time series from a 0.5 degree gridded global runoff dataset. The package is freely available through the Comprehensive R Archive Network (CRAN).
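
    The package itself is R; as a language-neutral illustration, below is a Python sketch of the classic sequent peak algorithm for capacity design, with synthetic inflows and a hypothetical constant demand.

    ```python
    import numpy as np

    def sequent_peak(inflow, demand):
        """Minimum storage capacity to meet `demand` given `inflow`
        (classic sequent peak: cumulative deficit bookkeeping)."""
        inflow = np.tile(inflow, 2)        # repeat record to handle end effects
        k, k_max = 0.0, 0.0
        for q in inflow:
            k = max(0.0, k + demand - q)   # deficit grows when demand > inflow
            k_max = max(k_max, k)
        return k_max

    rng = np.random.default_rng(6)
    monthly_inflow = rng.gamma(2.0, 50.0, 120)   # 10 years of synthetic inflow
    print(f"required capacity = {sequent_peak(monthly_inflow, 90.0):.1f}")
    ```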

  17. Removing Barriers for Effective Deployment of Intermittent Renewable Generation

    NASA Astrophysics Data System (ADS)

    Arabali, Amirsaman

    The stochastic nature of intermittent renewable resources is the main barrier to effective integration of renewable generation. This problem can be studied from feeder-scale and grid-scale perspectives. Two new stochastic methods are proposed to meet the feeder-scale controllable load with a hybrid renewable generation (including wind and PV) and energy storage system. For the first method, an optimization problem is developed whose objective function is the cost of the hybrid system, including the cost of renewable generation and storage, subject to constraints on energy storage and shifted load. A smart-grid strategy is developed to shift the load and match the renewable energy generation and controllable load. Minimizing the cost function guarantees minimum PV and wind generation installation, as well as storage capacity selection for supplying the controllable load. A confidence coefficient is allocated to each stochastic constraint which shows to what degree the constraint is satisfied. In the second method, a stochastic framework is developed for optimal sizing and reliability analysis of a hybrid power system including renewable resources (PV and wind) and an energy storage system. The hybrid power system is optimally sized to satisfy the controllable load with a specified reliability level. A load-shifting strategy is added to provide more flexibility for the system and decrease the installation cost. Load-shifting strategies and their potential impacts on the hybrid system reliability/cost analysis are evaluated through different scenarios. Using a compromise-solution method, the best compromise between reliability and cost is realized for the hybrid system. For the second problem, a grid-scale stochastic framework is developed to examine the storage application and its optimal placement for the social cost and transmission congestion relief of wind integration. Storage systems are optimally placed and adequately sized to minimize the sum of operation and congestion costs over a scheduling period. A technical assessment framework is developed to enhance the efficiency of wind integration and evaluate the economics of storage technologies and conventional gas-fired alternatives. The proposed method is used to carry out a cost-benefit analysis for the IEEE 24-bus system and determine the most economical technology. In order to mitigate the financial and technical concerns of renewable energy integration into the power system, a stochastic framework is proposed for transmission grid reinforcement studies in a power system with wind generation. A multi-stage multi-objective transmission network expansion planning (TNEP) methodology is developed which considers the investment cost, absorption of private investment and reliability of the system as the objective functions. A Non-dominated Sorting Genetic Algorithm (NSGA-II) optimization approach is used in combination with a probabilistic optimal power flow (POPF) to determine the Pareto optimal solutions considering the power system uncertainties. Using a compromise-solution method, the best final plan is then realized based on the decision maker's preferences. The proposed methodology is applied to the IEEE 24-bus Reliability Test System (RTS) to evaluate the feasibility and practicality of the developed planning strategy.

  18. Information fusion based techniques for HEVC

    NASA Astrophysics Data System (ADS)

    Fernández, D. G.; Del Barrio, A. A.; Botella, Guillermo; Meyer-Baese, Uwe; Meyer-Baese, Anke; Grecos, Christos

    2017-05-01

    To address the conflicting parameter choices in a multi-parameter H.265/HEVC encoder system, this paper analyzes a set of optimizations intended to improve the trade-off between quality, performance and power consumption for applications requiring reliability and accuracy. The method is based on Pareto optimization and has been tested at different resolutions on real-time encoders.

  19. Choosing the optimal wind turbine variant using the “ELECTRE” method

    NASA Astrophysics Data System (ADS)

    Ţişcă, I. A.; Anuşca, D.; Dumitrescu, C. D.

    2017-08-01

    This paper presents a method of choosing the “optimal” alternative, both under certainty and under uncertainty, based on relevant analysis criteria. Taking into account that a product can be regarded as a system and that the reliability of a system depends on the reliability of its components, the choice of product (the appropriate system decision) can be made using the “ELECTRE” method, depending on the level of reliability of each product. In this paper, the “ELECTRE” method is used to choose the optimal variant of a wind turbine required to equip a wind farm in western Romania. The problems to be solved relate to the reliability issues of currently operating wind turbines. A set of criteria has been proposed to compare two or more products from a range of available products: operating conditions, environmental conditions during operation, and time requirements. Using the hierarchical ELECTRE method, the optimal wind turbine variant and the order of preference of the variants are determined on the basis of the obtained concordance coefficients, with the values chosen as thresholds being arbitrary.
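
    An ELECTRE-style outranking step can be sketched as below; the variant scores, weights, and concordance/discordance thresholds are hypothetical, and the simple ELECTRE I test shown is only one member of the ELECTRE family.

    ```python
    import numpy as np

    # Hypothetical scores for four turbine variants on three benefit criteria
    # (operating suitability, environmental robustness, time/availability),
    # each scaled 0-10; weights sum to 1.
    A = np.array([[7, 8, 6],
                  [8, 6, 7],
                  [6, 7, 9],
                  [9, 5, 6]], float)
    w = np.array([0.5, 0.3, 0.2])
    c_hat, d_hat = 0.6, 0.4                  # concordance / discordance thresholds
    ranges = A.max(axis=0) - A.min(axis=0)   # per-criterion score ranges

    n = len(A)
    outranks = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            conc = w[A[i] >= A[j]].sum()                   # weight where i is at least as good
            disc = max(0.0, np.max((A[j] - A[i]) / ranges))  # worst relative disadvantage of i
            outranks[i, j] = (conc >= c_hat) and (disc <= d_hat)

    for i in range(n):
        print(f"variant {i} outranks variants {np.where(outranks[i])[0].tolist()}")
    ```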

  1. Structural Reliability Analysis and Optimization: Use of Approximations

    NASA Technical Reports Server (NTRS)

    Grandhi, Ramana V.; Wang, Liping

    1999-01-01

    This report is intended for the demonstration of function approximation concepts and their applicability in reliability analysis and design. In particular, approximations in the calculation of the safety index, failure probability and structural optimization (modification of design variables) are developed. With this scope in mind, extensive details on probability theory are avoided. Definitions relevant to the stated objectives have been taken from standard text books. The idea of function approximation is to minimize the repetitive use of computationally intensive calculations by replacing them with simpler closed-form equations, which could be nonlinear. Typically, the approximations provide good accuracy around the points where they are constructed, and they need to be periodically updated to extend their utility. There are approximations in calculating the failure probability of a limit state function. The first one, which is most commonly discussed, is how the limit state is approximated at the design point. Most of the time this could be a first-order Taylor series expansion, also known as the First Order Reliability Method (FORM), or a second-order Taylor series expansion (paraboloid), also known as the Second Order Reliability Method (SORM). From the computational procedure point of view, this step comes after the design point identification; however, the order of approximation for the probability of failure calculation is discussed first, and it is denoted by either FORM or SORM. The other approximation of interest is how the design point, or the most probable failure point (MPP), is identified. For iteratively finding this point, again the limit state is approximated. The accuracy and efficiency of the approximations make the search process quite practical for analysis-intensive approaches such as finite element methods; therefore, the crux of this research is to develop excellent approximations for MPP identification and also different approximations, including the higher-order reliability methods (HORM), for representing the failure surface. This report is divided into several parts to emphasize different segments of structural reliability analysis and design. Broadly, it consists of mathematical foundations, methods and applications. Chapter 1 discusses the fundamental definitions of probability theory, which are mostly available in standard text books. Probability density function descriptions relevant to this work are addressed. In Chapter 2, the concept and utility of function approximation are discussed for general application in engineering analysis. Various forms of function representations and the latest developments in nonlinear adaptive approximations are presented with comparison studies. Research work accomplished in reliability analysis is presented in Chapter 3. First, the definitions of the safety index and the most probable point of failure are introduced. Efficient ways of computing the safety index with fewer iterations are emphasized. In Chapter 4, the probability of failure prediction is presented using first-order, second-order and higher-order methods. System reliability methods are discussed in Chapter 5. Chapter 6 presents optimization techniques for the modification and redistribution of structural sizes for improving structural reliability. The report also contains several appendices on probability parameters.
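
    A minimal FORM sketch using the standard HL-RF iteration is given below; the limit state is a linear toy example in standard normal space, for which the exact safety index is 3/sqrt(2).

    ```python
    import numpy as np
    from scipy.stats import norm

    def form_hlrf(g, grad, n, tol=1e-8, max_it=100):
        """HL-RF iteration in standard normal space; returns the safety
        index beta and the most probable point (MPP)."""
        u = np.zeros(n)                          # start at the mean point
        for _ in range(max_it):
            gv, dg = g(u), grad(u)
            # Step onto the linearized limit state surface
            u_new = (dg @ u - gv) * dg / (dg @ dg)
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        return float(np.linalg.norm(u)), u

    # Illustrative linear limit state in standard normal variables
    g = lambda u: 3.0 - u[0] - u[1]
    grad = lambda u: np.array([-1.0, -1.0])

    beta, mpp = form_hlrf(g, grad, 2)
    print(f"beta = {beta:.4f}, MPP = {mpp}, Pf (FORM) = {norm.cdf(-beta):.4e}")
    ```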

  2. Parts and Components Reliability Assessment: A Cost Effective Approach

    NASA Technical Reports Server (NTRS)

    Lee, Lydia

    2009-01-01

    System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA) and Fault Tree Analysis (FTA), to assess risks, perform design tradeoffs, and therefore ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by the United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development and hard failure data is not yet available or manufacturers are not contractually obliged by their customers to publish the reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success in an efficient manner, at low cost, and on a tight schedule.
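
    A parts-count style roll-up can be sketched as below; the part list and failure rates are purely illustrative, not values from any particular standard or handbook.

    ```python
    # Hypothetical parts list: (part, quantity, failure rate per 1e6 hours)
    parts = [("microcontroller", 1, 0.12),
             ("ceramic capacitor", 24, 0.003),
             ("film resistor", 40, 0.001),
             ("connector", 6, 0.05),
             ("relay", 2, 0.30)]

    lam_total = sum(n * lam for _, n, lam in parts)  # series parts-count sum
    mtbf = 1e6 / lam_total                           # hours, constant failure rates assumed
    print(f"board failure rate = {lam_total:.3f} per 1e6 h, MTBF = {mtbf:,.0f} h")
    ```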

  3. Large-scale-system effectiveness analysis. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Foster, J.W.

    1979-11-01

    The objective of the research project has been the investigation and development of methods for calculating system reliability indices which have absolute and measurable significance to consumers. Such indices are a necessary prerequisite to any scheme for system optimization which includes the economic consequences of consumer service interruptions. A further area of investigation has been the joint consideration of generation and transmission in reliability studies. Methods for finding or estimating the probability distributions of some measures of reliability performance have been developed. The application of modern Monte Carlo simulation methods to compute reliability indices in generating systems has been studied.

  4. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  5. Direct labeling of serum proteins by fluorescent dye for antibody microarray.

    PubMed

    Klimushina, M V; Gumanova, N G; Metelskaya, V A

    2017-05-06

    Analysis of serum proteome by antibody microarray is used to identify novel biomarkers and to study signaling pathways including protein phosphorylation and protein-protein interactions. Labeling of serum proteins is important for optimal performance of the antibody microarray. Proper choice of fluorescent label and optimal concentration of protein loaded on the microarray ensure good quality of imaging that can be reliably scanned and processed by the software. We have optimized direct serum protein labeling using fluorescent dye Arrayit Green 540 (Arrayit Corporation, USA) for antibody microarray. Optimized procedure produces high quality images that can be readily scanned and used for statistical analysis of protein composition of the serum.

  6. Coping with occupational stress: the role of optimism and coping flexibility.

    PubMed

    Reed, Daniel J

    2016-01-01

    The current study aimed at measuring whether coping flexibility is a reliable and valid construct in a UK sample and subsequently investigating the association between coping flexibility, optimism, and psychological health - measured by perceived stress and life satisfaction. A UK university undergraduate student sample (N=95) completed an online questionnaire. The study is among the first to examine the validity and reliability of the English version of a scale measuring coping flexibility in a Western population and is also the first to investigate the association between optimism and coping flexibility. The results revealed that the scale had good reliability overall; however, factor analysis revealed no support for the existing two-factor structure of the scale. Coping flexibility and optimism were found to be strongly correlated, and hierarchical regression analyses revealed that the interaction between them predicted a large proportion of the variance in both perceived stress and life satisfaction. In addition, structural equation modeling revealed that optimism completely mediated the relationship between coping flexibility and both perceived stress and life satisfaction. The findings add to the occupational stress literature to further our understanding of how optimism is important in psychological health. Furthermore, given that optimism is a personality trait, and consequently relatively stable, the study also provides preliminary support for the potential of targeting coping flexibility to improve psychological health in Western populations. These findings must be replicated, and further analyses of the English version of the Coping Flexibility Scale are needed.

  7. Fault trees for decision making in systems analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lambert, Howard E.

    1975-10-09

    The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the most optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
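
    The cut-set arithmetic behind such importance rankings can be sketched as below: an exact top-event probability by state enumeration and a Birnbaum importance for each basic event. The fault tree and probabilities are hypothetical, and the IMPORTANCE code itself uses more scalable algorithms.

    ```python
    from itertools import product

    # Minimal cut sets over basic events A..D (hypothetical fault tree)
    cut_sets = [{"A", "B"}, {"C"}, {"B", "D"}]
    p = {"A": 0.01, "B": 0.05, "C": 0.002, "D": 0.03}
    events = sorted(p)

    def top_prob(p):
        """Exact top-event probability by enumerating basic-event states."""
        total = 0.0
        for state in product((0, 1), repeat=len(events)):
            s = dict(zip(events, state))
            if any(all(s[e] for e in cs) for cs in cut_sets):  # a cut set fully failed
                pr = 1.0
                for e in events:
                    pr *= p[e] if s[e] else 1 - p[e]
                total += pr
        return total

    print(f"P(top) = {top_prob(p):.6f}")
    for e in events:   # Birnbaum importance: dP(top)/dp_e
        hi = top_prob({**p, e: 1.0})
        lo = top_prob({**p, e: 0.0})
        print(f"I_B({e}) = {hi - lo:.6f}")
    ```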

  8. Reliability Centred Maintenance (RCM) Analysis of Laser Machine in Filling Lithos at PT X

    NASA Astrophysics Data System (ADS)

    Suryono, M. A. E.; Rosyidi, C. N.

    2018-03-01

    PT. X uses automated machines which work for sixteen hours per day. Therefore, the machines should be maintained to keep their availability high. The aim of this research is to determine maintenance tasks according to the causes of component failure using Reliability Centred Maintenance (RCM) and to determine the optimal inspection frequency which must be applied to the machines in the filling lithos process. In this research, RCM is used as an analysis tool to determine the critical component and find optimal inspection frequencies to maximize the machine's reliability. From the analysis, we found that the critical machine in the filling lithos process is the laser machine in Line 2. We then determined the causes of the machine's failures. The lastube component has the highest Risk Priority Number (RPN) among the components, which also include the power supply, lens, chiller, laser siren, encoder, conveyor, and mirror galvo. Most of the components have operational consequences, and the others have hidden failure consequences and safety consequences. Time-directed life-renewal tasks, failure finding tasks, and servicing tasks can be used to address these consequences. The results of the data analysis show that inspection must be performed once a month for the laser machine, in the form of preventive maintenance, to lower downtime.
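
    The RPN ranking step can be sketched as below; the severity, occurrence, and detection ratings are hypothetical placeholders for the study's FMEA worksheet.

    ```python
    # Hypothetical FMEA worksheet rows for the laser machine:
    # (component, severity, occurrence, detection), each rated 1-10
    fmea = [("lastube", 8, 6, 5),
            ("power supply", 7, 4, 3),
            ("chiller", 6, 4, 4),
            ("mirror galvo", 7, 3, 4),
            ("encoder", 5, 3, 3)]

    ranked = sorted(fmea, key=lambda r: r[1] * r[2] * r[3], reverse=True)
    for name, s, o, d in ranked:              # RPN = S x O x D
        print(f"{name:13s} RPN = {s * o * d}")
    ```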

  9. Reliability Analysis of Sealing Structure of Electromechanical System Based on Kriging Model

    NASA Astrophysics Data System (ADS)

    Zhang, F.; Wang, Y. M.; Chen, R. W.; Deng, W. W.; Gao, Y.

    2018-05-01

    The sealing performance of an aircraft electromechanical system has a great influence on flight safety, and the reliability of its typical seal structures has been analyzed by researchers. In this paper, we take the reciprocating seal structure as the research object for studying structural reliability. Based on finite element numerical simulation, the contact stress between the rubber sealing ring and the cylinder wall is calculated, the relationship between the contact stress and the pressure of the hydraulic medium is established, and the friction forces under different working conditions are compared. Through co-simulation, an adaptive Kriging model obtained with the EFF learning mechanism is used to describe the failure probability of the seal ring, so as to evaluate the reliability of the sealing structure. This article proposes a new approach to numerical evaluation for the reliability analysis of sealing structures, and also provides a theoretical basis for the optimal design of such structures.

  10. DAKOTA Design Analysis Kit for Optimization and Terascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.

    2010-02-24

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.

  11. Thermal-Structural Optimization of Integrated Cryogenic Propellant Tank Concepts for a Reusable Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Waters, W. Allen; Singer, Thomas N.; Haftka, Raphael T.

    2004-01-01

    A next generation reusable launch vehicle (RLV) will require thermally efficient and light-weight cryogenic propellant tank structures. Since these tanks will be weight-critical, analytical tools must be developed to aid in sizing the thickness of insulation layers and structural geometry for optimal performance. Finite element method (FEM) models of the tank and insulation layers were created to analyze the thermal performance of the cryogenic insulation layer and thermal protection system (TPS) of the tanks. The thermal conditions of ground-hold and re-entry/soak-through for a typical RLV mission were used in the thermal sizing study. A general-purpose nonlinear FEM analysis code, capable of using temperature and pressure dependent material properties, was used as the thermal analysis code. Mechanical loads from ground handling and proof-pressure testing were used to size the structural geometry of an aluminum cryogenic tank wall. Nonlinear deterministic optimization and reliability optimization techniques were the analytical tools used to size the geometry of the isogrid stiffeners and thickness of the skin. The results from the sizing study indicate that a commercial FEM code can be used for thermal analyses to size the insulation thicknesses where the temperature and pressure were varied. The results from the structural sizing study show that using combined deterministic and reliability optimization techniques can obtain alternate and lighter designs than the designs obtained from deterministic optimization methods alone.

  12. Probabilistic Multi-Scale, Multi-Level, Multi-Disciplinary Analysis and Optimization of Engine Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2000-01-01

    Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely, and engines for new aircraft, are progressively required to operate under more demanding technological and environmental requirements. Designs that effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multi-disciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multi-disciplinary refers to an open-ended set of existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include, but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission and coupled structural/thermal, various composite property simulators and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of this paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and improves reliability by an order of magnitude. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.

  13. Reliability culture at La Silla Paranal Observatory

    NASA Astrophysics Data System (ADS)

    Gonzalez, Sergio

    2010-07-01

    The Maintenance Department at the La Silla - Paranal Observatory has been an important foundation for keeping observatory operations at a good level of reliability and availability. Several strategies have been implemented and improved in order to meet these requirements and keep systems and equipment working properly when required. For that reason, one of the latest improvements has been the introduction of the concept of reliability, which involves much more than simply speaking about reliability concepts. It involves the use of technologies, data collection, data analysis, decision making, committees concentrating on the analysis of failure modes and how they can be eliminated, aligning the results with the requirements of our internal partners, and establishing steps to achieve success. Some of these steps have already been implemented: data collection, use of technologies, analysis of data, development of priority tools, committees dedicated to analyzing data, and people dedicated to reliability analysis. This has permitted us to optimize our process, analyze where we can improve, avoid functional failures, and reduce the range of failures in several systems and subsystems; all this has had a positive impact in terms of results for our Observatory. All these tools are part of the reliability culture that allows our system to operate with a high level of reliability and availability.

  14. Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management.

    PubMed

    Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E; Choi, Tsan-Ming

    2016-08-01

    In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, with hybridization of a known algorithm called NSGA-II and an adaptive population-based simulated annealing (APBSA) method is developed to solve the systems reliability optimization problems. In the first step, to create a good algorithm, we use a coevolutionary strategy. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate the appropriate parameters of the algorithm. Moreover, to examine the performance of our proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies have shown that the proposed algorithm is an effective approach for systems reliability and risk management.
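
    A bare-bones simulated annealing pass over a scalarized redundancy allocation problem is sketched below; this is a generic illustration of the ingredients involved, not the paper's NSGA-II/APBSA hybrid, and the component data and weights are hypothetical.

    ```python
    import math
    import random

    random.seed(7)
    r = [0.90, 0.85, 0.95]          # component reliabilities per subsystem
    c = [4.0, 6.0, 3.0]             # component costs
    w_rel, w_cost = 0.8, 0.2        # weights scalarizing the bi-objective

    def objective(n):
        # Series system of parallel-redundant subsystems
        R = math.prod(1 - (1 - ri) ** ni for ri, ni in zip(r, n))
        C = sum(ci * ni for ci, ni in zip(c, n))
        return w_rel * (1 - R) + w_cost * C / 60.0, R, C

    n = [1, 1, 1]
    best = cur = objective(n)[0]
    best_n = n[:]
    T = 1.0
    for _ in range(2000):                       # annealing loop
        m = n[:]
        i = random.randrange(3)
        m[i] = min(4, max(1, m[i] + random.choice((-1, 1))))  # neighbor move
        cand = objective(m)[0]
        if cand < cur or random.random() < math.exp((cur - cand) / T):
            n, cur = m, cand
            if cur < best:
                best, best_n = cur, n[:]
        T *= 0.995                              # geometric cooling

    _, R, C = objective(best_n)
    print(f"allocation {best_n}: R = {R:.4f}, cost = {C:.1f}")
    ```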

  15. Optimization Based Efficiencies in First Order Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Peck, Jeffrey A.; Mahadevan, Sankaran

    2003-01-01

    This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank-one updating technique. In problems that use commercial code as a black box, the gradient calculations are usually done using a finite difference approach, which becomes very expensive for large system models. The proposed method replaces the finite difference gradient calculations in a standard first order reliability method (FORM) with Broyden's quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results are compared to standard FORM results. It is found that BFORM typically requires fewer function evaluations than FORM to converge to the same answer.
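
    The rank-one secant update at the heart of the approach can be sketched as below; the test function is illustrative, and the update simply enforces the secant condition so that a full finite-difference gradient pass can be skipped.

    ```python
    import numpy as np

    def broyden_grad_update(d, s, y):
        """Rank-one secant update of the gradient estimate d of a scalar
        function: enforces d_new . s = y for step s and value change y."""
        return d + ((y - d @ s) / (s @ s)) * s

    # Illustrative check on g(x) = x1^2 + 3*x2 (true gradient is (2*x1, 3))
    g = lambda x: x[0] ** 2 + 3 * x[1]
    x0, x1 = np.array([1.0, 1.0]), np.array([1.1, 0.9])
    d0 = np.array([2.0, 3.0])                  # exact gradient at x0
    d1 = broyden_grad_update(d0, x1 - x0, g(x1) - g(x0))
    print("updated gradient:", d1)             # cheap stand-in for a finite-difference pass
    ```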

  16. Deterministic Design Optimization of Structures in OpenMDAO Framework

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a NASA GRC developed code. The reliability and efficiency of the OpenMDAO framework were compared and are reported here.

  17. The Role of Probabilistic Design Analysis Methods in Safety and Affordability

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    2016-01-01

    For the last several years, NASA and its contractors have been working together to build space launch systems to commercialize space. Developing commercial, affordable, and safe launch systems becomes very important and requires a paradigm shift. This paradigm shift enforces the need for an integrated systems engineering environment where cost, safety, reliability, and performance need to be considered to optimize the launch system design. In such an environment, rule-based and deterministic engineering design practices alone may not be sufficient to optimize margins and fault tolerance to reduce cost. As a result, the introduction of Probabilistic Design Analysis (PDA) methods to support current deterministic engineering design practices becomes a necessity to reduce cost without compromising reliability and safety. This paper discusses the importance of PDA methods in NASA's new commercial environment, their applications, and the key role they can play in designing reliable, safe, and affordable launch systems. More specifically, this paper discusses: 1) the involvement of NASA in PDA; 2) why PDA is needed; 3) a PDA model structure; 4) a PDA example application; and 5) the link between PDA and safety and affordability.

  18. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    ERIC Educational Resources Information Center

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  19. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    NASA Astrophysics Data System (ADS)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Because of the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimate more accurate, thus greatly improving the precision of the mixed distribution reliability model. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
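
    For concreteness, the reliability function of a mixed Weibull model is a weighted sum of Weibull survival functions, R(t) = sum_i w_i * exp[-(t/eta_i)^beta_i] with the weights summing to one. A minimal sketch with made-up weights and parameters (not the paper's engine data):

    ```python
    import numpy as np

    def mixed_weibull_reliability(t, weights, shapes, scales):
        """R(t) = sum_i w_i * exp(-(t/eta_i)**beta_i); weights must sum to 1.
        Each component is a two-parameter Weibull survival function."""
        t = np.atleast_1d(t).astype(float)
        R = np.zeros_like(t)
        for w, beta, eta in zip(weights, shapes, scales):
            R += w * np.exp(-(t / eta) ** beta)
        return R

    # Two failure modes: early wear-in (beta < 1) and wear-out (beta > 1).
    print(mixed_weibull_reliability([100, 500, 1000],
                                    weights=[0.3, 0.7],
                                    shapes=[0.8, 2.5],
                                    scales=[200.0, 1200.0]))
    ```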

  20. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-controlled parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance-discipline integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integration tasks are reliability and recurring structural costs. Significant interface designer-controlled parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and their limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  1. Optimization of deformation monitoring networks using finite element strain analysis

    NASA Astrophysics Data System (ADS)

    Alizadeh-Khameneh, M. Amin; Eshagh, Mehdi; Jensen, Anna B. O.

    2018-04-01

    An optimal design of a geodetic network can fulfill the requested precision and reliability of the network and decrease the expenses of its execution by removing unnecessary observations. The role of an optimal design is highlighted in deformation monitoring networks due to the repeatability of these networks. The core design problem is how to define the precision and reliability criteria. This paper proposes a solution where the precision criterion is defined based on the precision of the deformation parameters, i.e., the precision of strain and differential rotations. A strain analysis can be performed to obtain information about the possible deformation of a deformable object. In this study, we split an area into a number of three-dimensional finite elements with the help of the Delaunay triangulation and performed the strain analysis on each element. According to the obtained precision of the deformation parameters in each element, the precision criterion for displacement detection at each network point is then determined. The developed criterion is implemented to optimize the observations from the Global Positioning System (GPS) in the Skåne monitoring network in Sweden. The network was established in 1989 and straddles the Tornquist zone, one of the most active faults in southern Sweden. The numerical results show that 17 out of all 21 possible GPS baseline observations are sufficient to detect a minimum displacement of 3 mm at each network point.
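
    To make the per-element strain step concrete, the sketch below triangulates network points with Delaunay and recovers a strain tensor in each element from nodal displacements by fitting a linear displacement field. It is a simplified two-dimensional illustration with made-up coordinates; the paper itself works with three-dimensional elements.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    def element_strains(points, disps):
        """Per-triangle small-strain tensors from nodal displacements.
        Fit the affine field u(x) = A x + b in each Delaunay triangle;
        the strain is the symmetric part 0.5 * (A + A.T)."""
        tri = Delaunay(points)
        strains = []
        for simplex in tri.simplices:
            P, U = points[simplex], disps[simplex]
            M = np.column_stack([P, np.ones(3)])   # [x, y, 1] per node
            coeffs = np.linalg.solve(M, U)          # rows: d/dx, d/dy, const
            A = coeffs[:2].T                        # displacement gradient dU_i/dx_j
            strains.append(0.5 * (A + A.T))
        return tri.simplices, strains

    # Four monitoring points under a uniform linear deformation field.
    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    u = np.array([[0.0, 0.0], [0.001, 0.0], [0.0, 0.002], [0.001, 0.002]])
    print(element_strains(pts, u)[1][0])   # -> diag(0.001, 0.002)
    ```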

  2. Numerical aerodynamic simulation facility. Preliminary study extension

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The production of an optimized design of key elements of the candidate facility was the primary objective of this report. This was accomplished through the following tasks: (1) to further develop, optimize, and describe the functional description of the custom hardware; (2) to delineate trade-off areas between performance, reliability, availability, serviceability, and programmability; (3) to develop metrics and models for validation of the candidate system's performance; (4) to conduct a functional simulation of the system design; (5) to perform a reliability analysis of the system design; and (6) to develop the software specifications, including a user-level high-level programming language, a correspondence between the programming language and the instruction set, and an outline of the operating system requirements.

  3. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  4. A hyperspectral image optimizing method based on sub-pixel MTF analysis

    NASA Astrophysics Data System (ADS)

    Wang, Yun; Li, Kai; Wang, Jinqiang; Zhu, Yajie

    2015-04-01

    Hyperspectral imaging collects tens or hundreds of images continuously divided across the electromagnetic spectrum so that details under different wavelengths can be represented. A popular hyperspectral imaging method uses a tunable optical band-pass filter placed in front of the focal plane to acquire images at different wavelengths. To alleviate the influence of chromatic aberration in some segments of a hyperspectral series, this paper provides a hyperspectral optimizing method that uses the sub-pixel MTF to evaluate image blur. The method acquires the edge feature in the target window by means of the line spread function (LSF) to calculate the reliable position of the edge feature; the evaluation grid in each line is then interpolated from the real pixel values based on its relative position to the optimal edge, and the sub-pixel MTF is used to analyze the image in the frequency domain, which increases the MTF calculation dimension. The sub-pixel MTF evaluation is reliable, since no image rotation or pixel-value estimation is needed and no artificial information is introduced. Theoretical analysis shows that the proposed method is reliable and efficient when evaluating common real-scene images with edges of small tilt angle. It also provides a direction for subsequent hyperspectral image blur evaluation and real-time focal plane adjustment in related imaging systems.
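
    The ESF-to-LSF-to-MTF chain described above can be sketched in a few lines. The fragment below is a generic slanted-edge-style illustration (synthetic edge, hypothetical helper name), not the paper's exact sub-pixel procedure:

    ```python
    import numpy as np

    def mtf_from_esf(esf):
        """Estimate the MTF from a (super-sampled) edge spread function:
        differentiate to get the LSF, then take the normalized FFT magnitude."""
        lsf = np.diff(esf)
        lsf = lsf * np.hanning(len(lsf))   # window to reduce truncation ripple
        mtf = np.abs(np.fft.rfft(lsf))
        return mtf / mtf[0]                # normalize so MTF(0) = 1

    # Synthetic soft edge sampled at sub-pixel resolution.
    x = np.linspace(-8, 8, 128)
    esf = 0.5 * (1 + np.tanh(x / 1.5))
    print(mtf_from_esf(esf)[:8])
    ```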

  5. Methodology for safety optimization of highway cross-sections for horizontal curves with restricted sight distance.

    PubMed

    Ibrahim, Shewkar E; Sayed, Tarek; Ismail, Karim

    2012-11-01

    Several earlier studies have noted the shortcomings of existing geometric design guides, which provide deterministic standards. In these standards the safety margin of the design output is generally unknown, and there is little knowledge of the safety implications of deviating from the standards. To mitigate these shortcomings, probabilistic geometric design has been advocated, where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a mechanism for risk measurement to evaluate the safety impact of deviations from design standards. This paper applies reliability analysis to optimizing the safety of highway cross-sections. The paper presents an original methodology to select a suitable combination of cross-section elements with restricted sight distance so as to reduce collisions and yield consistent risk levels. The purpose of this optimization method is to provide designers with a proactive approach to the design of cross-section elements in order to (i) minimize the risk associated with restricted sight distance, (ii) balance the risk across the two carriageways of the highway, and (iii) reduce the expected collision frequency. A case study involving nine cross-sections that are part of two major highway developments in British Columbia, Canada, is presented. The results showed that an additional reduction in collisions can be realized by incorporating the reliability component, P(nc) (denoting the probability of non-compliance), in the optimization process. The proposed approach results in reduced and consistent risk levels for both travel directions in addition to further collision reductions. Copyright © 2012 Elsevier Ltd. All rights reserved.
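
    To illustrate the reliability component, P(nc) can be estimated by Monte Carlo as the probability that the sight distance supplied by the cross-section falls short of the distance demanded by drivers. The sketch below uses a simplified stopping-sight-distance relation and entirely made-up distribution parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def prob_noncompliance(n=200_000):
        """Monte Carlo estimate of P(nc) = P(supplied sight distance < demanded)."""
        v = rng.normal(90.0, 8.0, n)        # operating speed, km/h (made up)
        t, a = 2.5, 3.4                      # reaction time (s), deceleration (m/s^2)
        # Demanded stopping sight distance: reaction distance + braking distance.
        demand = v / 3.6 * t + (v / 3.6) ** 2 / (2 * a)
        supply = rng.normal(180.0, 15.0, n)  # sight distance supplied, m (made up)
        return np.mean(supply < demand)

    print(prob_noncompliance())
    ```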

  6. Probabilistic Analysis and Design of a Raked Wing Tip for a Commercial Transport

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Chen, Tzi-Kang; Padula, Sharon L.; Ransom, Jonathan B.; Stroud, W. Jefferson

    2008-01-01

    An approach for conducting reliability-based design and optimization (RBDO) of a Boeing 767 raked wing tip (RWT) is presented. The goal is to evaluate the benefits of RBDO for design of an aircraft substructure. A finite-element (FE) model that includes eight critical static load cases is used to evaluate the response of the wing tip. Thirteen design variables that describe the thickness of the composite skins and stiffeners are selected to minimize the weight of the wing tip. A strain-based margin of safety is used to evaluate the performance of the structure. The randomness in the load scale factor and in the strain limits is considered. Of the 13 variables, the wing-tip design was controlled primarily by the thickness of the thickest plies in the upper skins. The report includes an analysis of the optimization results and recommendations for future reliability-based studies.

  7. Design for a Crane Metallic Structure Based on Imperialist Competitive Algorithm and Inverse Reliability Strategy

    NASA Astrophysics Data System (ADS)

    Fan, Xiao-Ning; Zhi, Bo

    2017-07-01

    Uncertainties in parameters such as materials, loading, and geometry are inevitable in designing metallic structures for cranes. When considering these uncertainty factors, reliability-based design optimization (RBDO) offers a more reasonable design approach. However, existing RBDO methods for crane metallic structures are prone to low convergence speed and high computational cost. A unilevel RBDO method, combining a discrete imperialist competitive algorithm with an inverse reliability strategy based on the performance measure approach, is developed. Application of the imperialist competitive algorithm at the optimization level significantly improves the convergence speed of this RBDO method. At the reliability analysis level, the inverse reliability strategy is used to determine the feasibility of each probabilistic constraint at each design point by calculating its α-percentile performance, thereby avoiding the convergence failure, calculation error, and disproportionate computational effort encountered with conventional moment and simulation methods. Application of the RBDO method to an actual crane structure shows that the developed RBDO realizes a design with the best tradeoff between economy and safety, at roughly one-third of the convergence time and computational cost of the existing method. This paper provides a scientific and effective approach for the design of metallic structures of cranes.

  8. Microgrid Design Analysis Using Technology Management Optimization and the Performance Reliability Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamp, Jason E.; Eddy, John P.; Jensen, Richard P.

    Microgrids are a focus of localized energy production that support resiliency, security, local control, and increased access to renewable resources (among other potential benefits). The Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) Joint Capability Technology Demonstration (JCTD) program between the Department of Defense (DOD), Department of Energy (DOE), and Department of Homeland Security (DHS) resulted in the preliminary design and deployment of three microgrids at military installations. This paper is focused on the analysis process and supporting software used to determine optimal designs for energy surety microgrids (ESMs) in the SPIDERS project. There are two key pieces of software: an existing software application developed by Sandia National Laboratories (SNL) called Technology Management Optimization (TMO) and a new simulation developed for SPIDERS called the performance reliability model (PRM). TMO is a decision support tool that performs multi-objective optimization over a mixed discrete/continuous search space for which the performance measures are unrestricted in form. The PRM is able to statistically quantify the performance and reliability of a microgrid operating in islanded mode (disconnected from any utility power source). Together, these two software applications were used as part of the ESM process to generate the preliminary designs presented by the SNL-led DOE team to the DOD. Acknowledgements: Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: Mike Hightower, who has been the key driving force for Energy Surety Microgrids; Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations; Merrill Smith, U.S. Department of Energy SPIDERS Program Manager; Ross Roley and Rich Trundy from U.S. Pacific Command; Bill Waugaman and Bill Beary from U.S. Northern Command; Tarek Abdallah, Melanie Johnson, and Harold Sanborn of the U.S. Army Corps of Engineers Construction Engineering Research Laboratory; and colleagues from Sandia National Laboratories (SNL) for their reviews, suggestions, and participation in the work.

  9. Estimates Of The Orbiter RSI Thermal Protection System Thermal Reliability

    NASA Technical Reports Server (NTRS)

    Kolodziej, P.; Rasky, D. J.

    2002-01-01

    In support of the Space Shuttle Orbiter post-flight inspection, structure temperatures are recorded at selected positions on the windward, leeward, starboard, and port surfaces. Statistical analysis of this flight data and a non-dimensional load interference (NDLI) method are used to estimate the thermal reliability at positions where reusable surface insulation (RSI) is installed. In this analysis, structure temperatures that exceed the design limit define the critical failure mode. At thirty-three positions the RSI thermal reliability is greater than 0.999999 for the missions studied. This is not the overall system-level reliability of the thermal protection system installed on an Orbiter. The results from two Orbiters, OV-102 and OV-105, are in good agreement. The original RSI designs on the OV-102 Orbital Maneuvering System pods, which had low reliability, were significantly improved on OV-105. The NDLI method was also used to estimate thermal reliability from an assessment of TPS uncertainties that was completed shortly before the first Orbiter flight. Results from the flight data analysis and the pre-flight assessment agree at several positions near each other. The NDLI method is also effective for optimizing RSI designs to provide uniform thermal reliability on the acreage surface of reusable launch vehicles.

  10. Understanding Expenditure Data.

    ERIC Educational Resources Information Center

    Dyke, Frances L.

    2000-01-01

    Stresses the importance of common understandings of cost definitions and data collection in order to create reliable databases with optimal utility for inter-institutional analysis. Examines definitions of common expenditure categories, discusses cost-accumulation rules governing financial reporting, and explains differences between direct costs…

  11. Reliability Analysis and Optimal Release Problem Considering Maintenance Time of Software Components for an Embedded OSS Porting Phase

    NASA Astrophysics Data System (ADS)

    Tamura, Yoshinobu; Yamada, Shigeru

    OSS (open source software) systems, which serve as key components of critical infrastructures in our social life, are still ever-expanding. Embedded OSS systems in particular have been gaining a lot of attention in the embedded system area, e.g., Android, BusyBox, TRON, etc. However, poor handling of quality problems and customer support hinders the progress of embedded OSS. Also, it is difficult for developers to assess the reliability and portability of embedded OSS on a single-board computer. In this paper, we propose a method of software reliability assessment based on flexible hazard rates for embedded OSS. We also analyze actual data of software failure-occurrence time-intervals to show numerical examples of software reliability assessment for embedded OSS. Moreover, we compare the proposed hazard rate model for embedded OSS with typical conventional hazard rate models by using goodness-of-fit comparison criteria. Furthermore, we discuss the optimal software release problem for the porting phase based on the total expected software maintenance cost.

  12. Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers

    NASA Technical Reports Server (NTRS)

    Kenny, Sean (Technical Monitor); Wertz, Julie

    2002-01-01

    As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can be used to aid the designer in making high-level architecture decisions. Once these key components have been identified, two main approaches to improving a system through these components exist: add redundancy or improve the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two approaches, in varying orders of magnitude for each component. Therefore, this research tries to answer the question of how to divide funds, between adding redundancy and improving the reliability of components, to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics together into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.

  13. Power System Reliability Assessment by Analysing Voltage Dips on the Blue Horizon Bay 22 kV Overhead Line in the Nelson Mandela Bay Municipality

    NASA Astrophysics Data System (ADS)

    Lamour, B. G.; Harris, R. T.; Roberts, A. G.

    2010-06-01

    Power system reliability problems are very difficult to solve because the power systems are complex and geographically widely distributed and influenced by numerous unexpected events. It is therefore imperative to employ the most efficient optimization methods in solving the problems relating to reliability of the power system. This paper presents a reliability analysis and study of the power interruptions resulting from severe power outages in the Nelson Mandela Bay Municipality (NMBM), South Africa and includes an overview of the important factors influencing reliability, and methods to improve the reliability. The Blue Horizon Bay 22 kV overhead line, supplying a 6.6 kV residential sector has been selected. It has been established that 70% of the outages, recorded at the source, originate on this feeder.

  14. Optimized Vertex Method and Hybrid Reliability

    NASA Technical Reports Server (NTRS)

    Smith, Steven A.; Krishnamurthy, T.; Mason, B. H.

    2002-01-01

    A method of calculating the fuzzy response of a system is presented. This method, called the Optimized Vertex Method (OVM), is based upon the vertex method but requires considerably fewer function evaluations. The method is demonstrated by calculating the response membership function of strain-energy release rate for a bonded joint with a crack. The possibility of failure of the bonded joint was determined over a range of loads. After completing the possibilistic analysis, the possibilistic (fuzzy) membership functions were transformed to probability density functions and the probability of failure of the bonded joint was calculated. This approach is called a possibility-based hybrid reliability assessment. The possibility and probability of failure are presented and compared to a Monte Carlo Simulation (MCS) of the bonded joint.

  15. Reliability-based structural optimization: A proposed analytical-experimental study

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Nikolaidis, Efstratios

    1993-01-01

    An analytical and experimental study for assessing the potential of reliability-based structural optimization is proposed and described. In the study, competing designs obtained by deterministic and reliability-based optimization are compared. The experimental portion of the study is practical because the structure selected is a modular, actively and passively controlled truss that consists of many identical members, and because the competing designs are compared in terms of their dynamic performance and are not destroyed if failure occurs. The analytical portion of this study is illustrated on a 10-bar truss example. In the illustrative example, it is shown that reliability-based optimization can yield a design that is superior to an alternative design obtained by deterministic optimization. These analytical results provide motivation for the proposed study, which is underway.

  16. A Method to Predict the Reliability of Military Ground Vehicles Using High Performance Computing

    DTIC Science & Technology

    2006-11-01

    Krayterman U.S. Army RDECOM-TARDEC Warren, MI 48397 K.K. Choi, Ed Hardee University of Iowa Coralville, IA 52242 Byeng D. Youn Michigan...University of Iowa, performed an optimization of the design for an A-arm on a military ground vehicle (a Stryker), using no sources of uncertainty...LSF for the queueing system. 3.3 Reliability/Fatigue Analysis software We used several pieces of proprietary code from the University of Iowa

  17. Water quality parameter measurement using spectral signatures

    NASA Technical Reports Server (NTRS)

    White, P. E.

    1973-01-01

    Regression analysis is applied to the problem of measuring water quality parameters from remote sensing spectral signature data. The equations necessary to perform regression analysis are presented and methods of testing the strength and reliability of a regression are described. An efficient algorithm for selecting an optimal subset of the independent variables available for a regression is also presented.

  18. Integrated topology for an aircraft electric power distribution system using MATLAB and ILP optimization technique and its implementation

    NASA Astrophysics Data System (ADS)

    Madhikar, Pratik Ravindra

    The most crucial design feature in designing an Aircraft Electric Power Distribution System (EPDS) is reliability. In an EPDS, power is distributed from top-level generators to bottom-level loads through various sensors, actuators, and rectifiers with the help of AC and DC buses and control switches. As consumer demands are never-ending and safety is of utmost importance, loads increase and, as a result, so does the power management burden. Therefore, the design of an EPDS should be optimized for maximum efficiency. This thesis discusses an integrated tool, based on a Need Based Design method and Fault Tree Analysis (FTA), to achieve the optimum design of an EPDS that provides maximum reliability in terms of continuous connectivity, power management, and minimum cost. If an EPDS is formulated as an optimization problem, it can be solved with the help of connectivity, cost, and power constraints by using a linear solver to obtain the desired output of maximum reliability at minimum cost. Furthermore, the thesis discusses the viability and implementation of the resulting topology on typical large-aircraft specifications.

  19. System Analysis and Performance Benefits of an Optimized Rotorcraft Propulsion System

    NASA Technical Reports Server (NTRS)

    Bruckner, Robert J.

    2007-01-01

    The propulsion system of rotorcraft vehicles is the most critical system to the vehicle in terms of safety and performance. The propulsion system must provide both vertical lift and forward flight propulsion during the entire mission. Whereas propulsion is a critical element for all flight vehicles, it is particularly critical for rotorcraft due to their limited safe, unpowered landing capability. This unparalleled reliability requirement has led rotorcraft power plants down a certain evolutionary path in which the system looks and performs quite similarly to those of the 1960s. By and large, the advancements in rotorcraft propulsion have come in terms of safety and reliability and not in terms of performance. The concept of the optimized propulsion system is a means by which both reliability and performance can be improved for rotorcraft vehicles. The optimized rotorcraft propulsion system, which couples an oil-free turboshaft engine to a highly loaded gearbox that provides axial load support for the power turbine, can be designed with current laboratory-proven technology. Such a system can provide up to a 60% weight reduction of the propulsion system of rotorcraft vehicles. Several technical challenges are apparent at the conceptual design level and should be addressed with current research.

  20. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors exist in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design-parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.

  1. Framework to evaluate the worth of hydraulic conductivity data for optimal groundwater resources management in ecologically sensitive areas

    NASA Astrophysics Data System (ADS)

    Feyen, Luc; Gorelick, Steven M.

    2005-03-01

    We propose a framework that combines simulation optimization with Bayesian decision analysis to evaluate the worth of hydraulic conductivity data for optimal groundwater resources management in ecologically sensitive areas. A stochastic simulation optimization management model is employed to plan regionally distributed groundwater pumping while preserving the hydroecological balance in wetland areas. Because predictions made by an aquifer model are uncertain, groundwater supply systems operate below maximum yield. Collecting data from the groundwater system can potentially reduce predictive uncertainty and increase safe water production. The price paid for improvement in water management is the cost of collecting the additional data. Efficient data collection using Bayesian decision analysis proceeds in three stages: (1) The prior analysis determines the optimal pumping scheme and profit from water sales on the basis of known information. (2) The preposterior analysis estimates the optimal measurement locations and evaluates whether each sequential measurement will be cost-effective before it is taken. (3) The posterior analysis then revises the prior optimal pumping scheme and consequent profit, given the new information. Stochastic simulation optimization employing a multiple-realization approach is used to determine the optimal pumping scheme in each of the three stages. The cost of new data must not exceed the expected increase in benefit obtained in optimal groundwater exploitation. An example based on groundwater management practices in Florida aimed at wetland protection showed that the cost of data collection more than paid for itself by enabling a safe and reliable increase in production.

  2. A Delay-Aware and Reliable Data Aggregation for Cyber-Physical Sensing

    PubMed Central

    Zhang, Jinhuan; Long, Jun; Zhang, Chengyuan; Zhao, Guihu

    2017-01-01

    Physical information sensed by various sensors in a cyber-physical system should be collected for further operation. In many applications, data aggregation should take reliability and delay into consideration. To address these problems, a novel Tiered Structure Routing-based Delay-Aware and Reliable Data Aggregation scheme named TSR-DARDA for spherical physical objects is proposed. By dividing the spherical network constructed by dispersed sensor nodes into circular tiers with specifically designed widths and cells, TSR-DARDA tries to enable as many nodes as possible to transmit data simultaneously. In order to ensure transmission reliability, lost packets are retransmitted. Moreover, to minimize the latency while maintaining reliability for data collection, in-network aggregation and broadcast techniques are adopted to deal with the transmission between data collecting nodes in the outer layer and their parent data collecting nodes in the inner layer. Thus, the optimization problem is transformed to minimize the delay under reliability constraints by controlling the system parameters. To demonstrate the effectiveness of the proposed scheme, we have conducted extensive theoretical analysis and comparisons to evaluate the performance of TSR-DARDA. The analysis and simulations show that TSR-DARDA leads to lower delay with reliability satisfaction. PMID:28218668

  3. Subject-level reliability analysis of fast fMRI with application to epilepsy.

    PubMed

    Hao, Yongfu; Khoo, Hui Ming; von Ellenrieder, Nicolas; Gotman, Jean

    2017-07-01

    Recent studies have applied the new magnetic resonance encephalography (MREG) sequence to the study of interictal epileptic discharges (IEDs) in the electroencephalogram (EEG) of epileptic patients. However, there are no criteria to quantitatively evaluate different processing methods in order to properly use the new sequence. We evaluated different processing steps of this new sequence under the common generalized linear model (GLM) framework by assessing the reliability of results. A bootstrap sampling technique was first used to generate multiple replicated data sets; a GLM with different processing steps was then applied to obtain activation maps, and the reliability of these maps was assessed. We applied our analysis in an event-related GLM related to IEDs. A higher reliability was achieved by using a GLM with a head motion confound regressor with 24 components rather than the usual 6, with an autoregressive model of order 5, and with a canonical hemodynamic response function (HRF) rather than variable-latency or patient-specific HRFs. Comparison of activation with the IED field also favored the canonical HRF, consistent with the reliability analysis. The reliability analysis helps to optimize the processing methods for this fast fMRI sequence, in a context in which we do not know the ground truth of activation areas. Magn Reson Med 78:370-382, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  4. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... talks on HPC Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math Workshop on Mathematics for the Analysis, Simulation, and Optimization of Complex Systems Report from ASCR-BES Workshop on Data Challenges from Next Generation Facilities Public...

  5. A Comparative Analysis of Taguchi Methodology and Shainin System DoE in the Optimization of Injection Molding Process Parameters

    NASA Astrophysics Data System (ADS)

    Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik

    2017-08-01

    Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and Shainin Systems (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a company manufacturing perfume bottle caps (made of acrylic) using TM and SS to find the root cause of defects and to optimize the process parameters for minimum rejection. The experiments brought the rejection rate down from approximately 40% during trial runs to 8.57%, which is quite low, representing a successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a change in their order of significance. Also, Taguchi Methods require more experiments and consume more time than the Shainin System. The Shainin System is less complicated and easy to implement, whereas Taguchi Methods are statistically more reliable for the optimization of process parameters. Finally, the experimentation implied that DoE methods are robust and reliable in implementation as organizations attempt to improve quality through optimization.

  6. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is currently still the industry standard to use deterministic safety-margin approaches to dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
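
    A minimal sketch of the recommended building block, assuming a simple load-resistance formulation with made-up strength and stress distributions: failure occurs when resistance R falls below stress S, and the Monte Carlo estimate of the failure probability is the observed failure fraction.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def failure_probability(n=1_000_000):
        """Monte Carlo on a load-resistance model: fail when R < S.
        Lognormal strength and normal stress, illustrative parameters only."""
        R = rng.lognormal(mean=np.log(320.0), sigma=0.08, size=n)  # strength, MPa
        S = rng.normal(loc=240.0, scale=25.0, size=n)              # stress, MPa
        pf = np.mean(R < S)
        se = np.sqrt(pf * (1 - pf) / n)   # standard error of the estimate
        return pf, se

    pf, se = failure_probability()
    print(f"Pf = {pf:.2e} +/- {se:.1e}")
    ```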

  7. Distributed collaborative response surface method for mechanical dynamic assembly reliability design

    NASA Astrophysics Data System (ADS)

    Bai, Guangchen; Fei, Chengwei

    2013-11-01

    Because of the randomness of the many impact factors influencing the dynamic assembly relationships of complex machinery, the reliability analysis of dynamic assembly relationships needs to be accomplished from a probabilistic perspective. To improve the accuracy and efficiency of dynamic assembly relationship reliability analysis, mechanical dynamic assembly reliability (MDAR) theory and a distributed collaborative response surface method (DCRSM) are proposed. The mathematical model of the DCRSM is established based on the quadratic response surface function and verified by the assembly relationship reliability analysis of an aeroengine high pressure turbine (HPT) blade-tip radial running clearance (BTRRC). Through comparison of the DCRSM, the traditional response surface method (RSM), and the Monte Carlo method (MCM), the results show that the DCRSM is not only able to accomplish a computational task that is impossible for the other methods when the number of simulations exceeds 100,000, but its computational precision is also basically consistent with the MCM and improved by 0.40-4.63% relative to the RSM; furthermore, the computational efficiency of the DCRSM is up to about 188 times that of the MCM and 55 times that of the RSM under 10,000 simulations. The DCRSM is demonstrated to be a feasible and effective approach for markedly improving the computational efficiency and accuracy of MDAR analysis. Thus, the proposed research provides a promising theory and method for MDAR design and optimization, and opens a novel research direction of probabilistic analysis for developing high-performance, high-reliability aeroengines.
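
    The quadratic response surface underlying such methods can be fitted by ordinary least squares. A generic sketch with hypothetical data (not the BTRRC model):

    ```python
    import numpy as np

    def fit_quadratic_rs(X, y):
        """Least-squares fit of a quadratic response surface
        y ~ a0 + sum_i b_i x_i + sum_{i<=j} c_ij x_i x_j."""
        n, d = X.shape
        cols = [np.ones(n)]
        cols += [X[:, i] for i in range(d)]
        cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(60, 2))
    y = 1.0 + 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 1] \
        + 0.01 * rng.normal(size=60)
    # Coefficient order: [const, x0, x1, x0^2, x0*x1, x1^2].
    print(fit_quadratic_rs(X, y).round(2))
    ```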

  8. Uncertainty in the analysis of the overall equipment effectiveness on the shop floor

    NASA Astrophysics Data System (ADS)

    Rößler, M. P.; Abele, E.

    2013-06-01

    In this article an approach is presented which supports transparency regarding the effectiveness of manufacturing equipment by combining fuzzy set theory with the method of overall equipment effectiveness analysis. One of the key principles of lean production, and a fundamental task in production optimization projects, is the prior analysis of the current state of a production system using key performance indicators to derive possible future states. Current overall equipment effectiveness analysis is usually performed by cumulating different machine states by means of decentralized data collection without considering uncertainty. In manual or semi-automated plant data collection, the quality of the derived data often diverges and leads optimization teams to distorted conclusions about the real optimization potential of manufacturing equipment. The method discussed in this paper helps practitioners obtain more reliable results in the analysis phase and thus better outcomes from optimization projects. Results obtained from a case study are discussed.

  9. A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability

    PubMed Central

    Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.

    2012-01-01

    Recent research has seen intraindividual variability (IIV) become a useful technique to incorporate trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique prediction to several types of positive and negative outcomes (Ram, Rabbit, Stollery, & Nesselroade, 2005). One unanswered question regarding measuring intraindividual variability is its reliability and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that use of individual standard deviations in unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to underlying statistical assumptions allows their use as a basic research tool. PMID:22268793
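
    A toy version of such a simulation: draw each subject's true trial-to-trial variability, compute ISDs from the odd and even halves of the trials, and correlate the halves across subjects (all population values are made up):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def isd_split_half_reliability(n_subj=200, n_trials=40):
        """Each subject has a true trial-to-trial SD; estimate the ISD from
        odd and even trials and correlate the two halves across subjects."""
        true_sd = rng.uniform(0.5, 2.0, n_subj)            # population of true ISDs
        trials = rng.normal(0.0, true_sd[:, None], (n_subj, n_trials))
        isd_odd = trials[:, 0::2].std(axis=1, ddof=1)
        isd_even = trials[:, 1::2].std(axis=1, ddof=1)
        return np.corrcoef(isd_odd, isd_even)[0, 1]

    print(isd_split_half_reliability())
    ```

    A Spearman-Brown correction would be needed to convert this split-half value into full-length reliability; the simulation mirrors the study's finding that ISD reliability degrades quickly with few trials or a narrow spread of true variability.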

  10. Optimizing the Reliability and Performance of Service Composition Applications with Fault Tolerance in Wireless Sensor Networks

    PubMed Central

    Wu, Zhao; Xiong, Naixue; Huang, Yannong; Xu, Degang; Hu, Chunyang

    2015-01-01

    The services composition technology provides flexible methods for building service composition applications (SCAs) in wireless sensor networks (WSNs). The high reliability and high performance of SCAs help services composition technology promote the practical application of WSNs. The optimization methods for reliability and performance used for traditional software systems are mostly based on the instantiations of software components, which are inapplicable and inefficient in the ever-changing SCAs in WSNs. In this paper, we consider the SCAs with fault tolerance in WSNs. Based on a Universal Generating Function (UGF) we propose a reliability and performance model of SCAs in WSNs, which generalizes a redundancy optimization problem to a multi-state system. Based on this model, an efficient optimization algorithm for reliability and performance of SCAs in WSNs is developed based on a Genetic Algorithm (GA) to find the optimal structure of SCAs with fault-tolerance in WSNs. In order to examine the feasibility of our algorithm, we have evaluated the performance. Furthermore, the interrelationships between the reliability, performance and cost are investigated. In addition, a distinct approach to determine the most suitable parameters in the suggested algorithm is proposed. PMID:26561818

  11. Multivariable optimization of liquid rocket engines using particle swarm algorithms

    NASA Astrophysics Data System (ADS)

    Jones, Daniel Ray

    Liquid rocket engines are highly reliable, controllable, and efficient compared to other conventional forms of rocket propulsion. As such, they have seen wide use in the space industry and have become the standard propulsion system for launch vehicles, orbit insertion, and orbital maneuvering. Though these systems are well understood, historical optimization techniques are often inadequate due to the highly non-linear nature of the engine performance problem. In this thesis, a Particle Swarm Optimization (PSO) variant was applied to maximize the specific impulse of a finite-area combustion chamber (FAC) equilibrium flow rocket performance model by controlling the engine's oxidizer-to-fuel ratio and de Laval nozzle expansion and contraction ratios. In addition to the PSO-controlled parameters, engine performance was calculated based on propellant chemistry, combustion chamber pressure, and ambient pressure, which are provided as inputs to the program. The performance code was validated by comparison with NASA's Chemical Equilibrium with Applications (CEA) and the commercially available Rocket Propulsion Analysis (RPA) tool. Similarly, the PSO algorithm was validated by comparison with brute-force optimization, which calculates all possible solutions and subsequently determines which is the optimum. Particle Swarm Optimization was shown to be an effective optimizer capable of quick and reliable convergence for complex functions of multiple non-linear variables.
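
    The optimizer itself is generic. Below is a compact global-best PSO sketch with inertia and cognitive/social terms; the two-variable objective is a made-up stand-in for the specific-impulse model, not the thesis code:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimizer (maximization).
        f: objective; bounds: (low, high) arrays, one entry per dimension."""
        low, high = map(np.asarray, bounds)
        d = len(low)
        x = rng.uniform(low, high, (n_particles, d))
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
        gbest = pbest[pbest_val.argmax()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, d))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, low, high)
            vals = np.array([f(p) for p in x])
            improved = vals > pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[pbest_val.argmax()].copy()
        return gbest, pbest_val.max()

    # Toy surrogate for Isp as a function of O/F ratio and nozzle area ratio.
    f = lambda p: -(p[0] - 5.5) ** 2 - 0.1 * (p[1] - 40.0) ** 2
    print(pso(f, ([2.0, 10.0], [8.0, 80.0])))
    ```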

  12. Texture and haptic cues in slant discrimination: reliability-based cue weighting without statistically optimal cue combination

    NASA Astrophysics Data System (ADS)

    Rosas, Pedro; Wagemans, Johan; Ernst, Marc O.; Wichmann, Felix A.

    2005-05-01

    A number of models of depth-cue combination suggest that the final depth percept results from a weighted average of independent depth estimates based on the different cues available. The weight of each cue in such an average is thought to depend on the reliability of each cue. In principle, such a depth estimation could be statistically optimal in the sense of producing the minimum-variance unbiased estimator that can be constructed from the available information. Here we test such models by using visual and haptic depth information. Different texture types produce differences in slant-discrimination performance, thus providing a means for testing a reliability-sensitive cue-combination model with texture as one of the cues to slant. Our results show that the weights for the cues were generally sensitive to their reliability but fell short of statistically optimal combination - we find reliability-based reweighting but not statistically optimal cue combination.
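
    The statistically optimal benchmark tested here is the inverse-variance weighted average, where each cue's weight is proportional to its reliability (1/σ²). A minimal sketch with made-up slant estimates:

    ```python
    import numpy as np

    def combine_cues(estimates, variances):
        """Minimum-variance linear combination of independent cues:
        weights proportional to 1/variance; fused variance is the
        harmonic combination of the cue variances."""
        v = np.asarray(variances, dtype=float)
        w = (1.0 / v) / np.sum(1.0 / v)
        fused = np.dot(w, estimates)
        fused_var = 1.0 / np.sum(1.0 / v)
        return fused, fused_var, w

    # Visual slant estimate (reliable) and haptic estimate (noisier), degrees.
    print(combine_cues(estimates=[32.0, 38.0], variances=[4.0, 16.0]))
    ```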

  13. Behavior and neural basis of near-optimal visual search

    PubMed Central

    Ma, Wei Ji; Navalpakkam, Vidhya; Beck, Jeffrey M; van den Berg, Ronald; Pouget, Alexandre

    2013-01-01

    The ability to search efficiently for a target in a cluttered environment is one of the most remarkable functions of the nervous system. This task is difficult under natural circumstances, as the reliability of sensory information can vary greatly across space and time and is typically a priori unknown to the observer. In contrast, visual-search experiments commonly use stimuli of equal and known reliability. In a target detection task, we randomly assigned high or low reliability to each item on a trial-by-trial basis. An optimal observer would weight the observations by their trial-to-trial reliability and combine them using a specific nonlinear integration rule. We found that humans were near-optimal, regardless of whether distractors were homogeneous or heterogeneous and whether reliability was manipulated through contrast or shape. We present a neural-network implementation of near-optimal visual search based on probabilistic population coding. The network matched human performance. PMID:21552276

  14. Runtime Speculative Software-Only Fault Tolerance

    DTIC Science & Technology

    2012-06-01

    reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance...propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of...affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing

  15. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905

  16. 2007 Beyond SBIR Phase II: Bringing Technology Edge to the Warfighter

    DTIC Science & Technology

    2007-08-23

    Systems Trade-Off Analysis and Optimization Verification and Validation On-Board Diagnostics and Self-Healing Security and Anti-Tampering Rapid...verification; Safety and reliability analysis of flight and mission critical systems On-Board Diagnostics and Self-Healing Model-based monitoring and...self-healing On-board diagnostics and self-healing; Autonomic computing; Network intrusion detection and prevention Anti-Tampering and Trust

  17. Central Plant Optimization for Waste Energy Reduction (CPOWER). ESTCP Cost and Performance Report

    DTIC Science & Technology

    2016-12-01

    in the regression models. The solar radiation data did not appear reliable in the weather dataset for the location, and hence it was not used. The...and additional factors (e.g., solar insolation) may be needed to obtain a better model. 2. Inputs to optimizer: During several periods of...Location: North Carolina; Energy Consumption Cost Savings: $443,698.00; Analysis Type: FEMP; PV of total savings: $215,698.00; Base Date: April 1

  18. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.

  19. Reliable and efficient solution of genome-scale models of Metabolism and macromolecular Expression

    DOE PAGES

    Ma, Ding; Yang, Laurence; Fleming, Ronan M. T.; ...

    2017-01-18

    Currently, Constraint-Based Reconstruction and Analysis (COBRA) is the only methodology that permits integrated modeling of Metabolism and macromolecular Expression (ME) at genome-scale. Linear optimization computes steady-state flux solutions to ME models, but flux values are spread over many orders of magnitude. Data values also have greatly varying magnitudes. Furthermore, standard double-precision solvers may return inaccurate solutions or report that no solution exists. Exact simplex solvers based on rational arithmetic require a near-optimal warm start to be practical on large problems (current ME models have 70,000 constraints and variables and will grow larger). We also developed a quadruple-precision version of our linear and nonlinear optimizer MINOS, and a solution procedure (DQQ) involving Double and Quad MINOS that achieves reliability and efficiency for ME models and other challenging problems tested here. DQQ will enable extensive use of large linear and nonlinear models in systems biology and other applications involving multiscale data.

  1. Parameterization of the InVEST Crop Pollination Model to spatially predict abundance of wild blueberry (Vaccinium angustifolium Aiton) native bee pollinators in Maine, USA

    USGS Publications Warehouse

    Groff, Shannon C.; Loftin, Cynthia S.; Drummond, Frank; Bushmann, Sara; McGill, Brian J.

    2016-01-01

    Non-native honeybees historically have been managed for crop pollination; however, recent population declines draw attention to pollination services provided by native bees. We applied the InVEST Crop Pollination model, developed to predict native bee abundance from habitat resources, in Maine's wild blueberry crop landscape. We evaluated model performance with parameters informed by four approaches: 1) expert opinion; 2) sensitivity analysis; 3) sensitivity-analysis-informed model optimization; and 4) simulated annealing (uninformed) model optimization. Uninformed optimization improved model performance by 29% compared to the expert-opinion-informed model, while sensitivity-analysis-informed optimization improved model performance by 54%. This suggests that expert opinion may not result in the best parameter values for the InVEST model. The proportion of deciduous/mixed forest within 2000 m of a blueberry field also reliably predicted native bee abundance in blueberry fields; however, the InVEST model provides an efficient tool to estimate bee abundance beyond the field perimeter.

  2. Business Cases for Microgrids: Modeling Interactions of Technology Choice, Reliability, Cost, and Benefit

    NASA Astrophysics Data System (ADS)

    Hanna, Ryan

    Distributed energy resources (DERs), and increasingly microgrids, are becoming an integral part of modern distribution systems. Interest in microgrids--which are insular and autonomous power networks embedded within the bulk grid--stems largely from the vast array of flexibilities and benefits they can offer stakeholders. Managed well, they can improve grid reliability and resiliency, increase end-use energy efficiency by coupling electric and thermal loads, reduce transmission losses by generating power locally, and may reduce system-wide emissions, among other benefits. Whether these public benefits are realized, however, depends on whether private firms see a "business case", or private value, in investing. To this end, firms need models that evaluate the costs, benefits, risks, and assumptions that underlie decisions to invest. The objectives of this dissertation are to assess the business case for microgrids that provide what industry analysts forecast as the two primary drivers of market growth--providing energy services (similar to an electric utility) as well as reliability service to customers within. Prototypical first adopters are modeled--using an existing model to analyze energy services and a new model that couples that analysis with one of reliability--to explore interactions between technology choice, reliability, cost, and benefit. The new model has a bi-level hierarchy; it uses heuristic optimization to select and size DERs and analytical optimization to schedule them. It further embeds Monte Carlo simulation to evaluate reliability, as well as regression models for customer damage functions to monetize reliability. It provides least-cost microgrid configurations for utility customers who seek to reduce interruption and operating costs. Lastly, the model is used to explore the impact of such adoption on system-wide greenhouse gas emissions in California. Results indicate that there are, at present, co-benefits for emissions reductions when customers adopt and operate microgrids for private benefit, though future analysis is needed as the bulk grid continues to transition toward a less carbon-intensive system.
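
    As a concrete and heavily simplified illustration of the bi-level structure described above (not the dissertation's model), the sketch below sizes a single generic DER with an outer heuristic sweep, lets a small dispatch routine play the role of the analytical stage, and monetizes reliability by Monte Carlo sampling of outages against an assumed customer damage cost. All numbers are placeholders.

      import random

      random.seed(1)
      LOAD_KW = 100.0                     # flat customer load (assumed)
      DAMAGE_PER_KWH = 20.0               # customer damage function value (assumed)
      OUTAGES_PER_YEAR, OUTAGE_HOURS = 2, 4
      ANNUAL_COST_PER_KW = 150.0          # annualized DER capital cost (assumed)

      def operating_cost(gen_kw):
          """Inner 'analytical' stage: dispatch cheap self-generation first."""
          self_gen = min(gen_kw, LOAD_KW)
          return 8760 * (0.06 * self_gen + 0.12 * (LOAD_KW - self_gen))

      def expected_outage_cost(gen_kw, trials=2000):
          """Monte Carlo stage: cost of unserved energy during sampled outages."""
          total = 0.0
          for _ in range(trials):
              for _ in range(OUTAGES_PER_YEAR):
                  available = gen_kw * random.uniform(0.7, 1.0)  # derated backup
                  unserved = max(LOAD_KW - available, 0.0) * OUTAGE_HOURS
                  total += unserved * DAMAGE_PER_KWH
          return total / trials

      def total_cost(gen_kw):
          return (ANNUAL_COST_PER_KW * gen_kw + operating_cost(gen_kw)
                  + expected_outage_cost(gen_kw))

      best = min(range(0, 201, 10), key=total_cost)  # outer heuristic: coarse sweep
      print("least-cost DER size (kW):", best)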

  3. Reliability measurement for mixed mode failures of 33/11 kilovolt electric power distribution stations.

    PubMed

    Alwan, Faris M; Baharum, Adam; Hassan, Geehan S

    2013-01-01

    The reliability of the electrical distribution system is a contemporary research field due to diverse applications of electricity in everyday life and diverse industries. However, only a few research papers exist in the literature. This paper proposes a methodology for assessing the reliability of 33/11 kilovolt high-power stations based on average time between failures. The objective of this paper is to find the optimal fit for the failure data via time between failures. We determine the parameter estimation for all components of the station. We also estimate the reliability value of each component and the reliability value of the system as a whole. The best-fitting distribution for the time between failures is a three-parameter Dagum distribution with one scale parameter and two shape parameters. Our analysis reveals that the reliability value decreases by 38.2% every 30 days. We believe that the current paper is the first to address this issue and its analysis. Thus, the results obtained in this research reflect its originality. We also suggest the practicality of using these results for power systems, for both power system maintenance models and preventive maintenance models.
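
    For readers unfamiliar with the Dagum family, the reliability function follows directly from its CDF: R(t) = 1 - F(t) with F(t) = (1 + (t/b)^-a)^-p. The parameter values in this sketch are placeholders for illustration, not the paper's estimates.

      def dagum_reliability(t, a=2.0, b=100.0, p=0.5):
          """R(t) = 1 - F(t), with Dagum CDF F(t) = (1 + (t/b)**-a)**-p."""
          return 1.0 - (1.0 + (t / b) ** (-a)) ** (-p)

      # Reliability at successive 30-day intervals (illustrative parameters only).
      for days in (30, 60, 90):
          print(days, "days:", round(dagum_reliability(days), 3))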

  4. Reliability Measurement for Mixed Mode Failures of 33/11 Kilovolt Electric Power Distribution Stations

    PubMed Central

    Alwan, Faris M.; Baharum, Adam; Hassan, Geehan S.

    2013-01-01

    The reliability of the electrical distribution system is a contemporary research field due to diverse applications of electricity in everyday life and diverse industries. However, only a few research papers exist in the literature. This paper proposes a methodology for assessing the reliability of 33/11 kilovolt high-power stations based on average time between failures. The objective of this paper is to find the optimal fit for the failure data via time between failures. We determine the parameter estimation for all components of the station. We also estimate the reliability value of each component and the reliability value of the system as a whole. The best-fitting distribution for the time between failures is a three-parameter Dagum distribution with one scale parameter and two shape parameters. Our analysis reveals that the reliability value decreases by 38.2% every 30 days. We believe that the current paper is the first to address this issue and its analysis. Thus, the results obtained in this research reflect its originality. We also suggest the practicality of using these results for power systems, for both power system maintenance models and preventive maintenance models. PMID:23936346

  5. Optimal periodic proof test based on cost-effective and reliability criteria

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1976-01-01

    An exploratory study for the optimization of periodic proof tests for fatigue-critical structures is presented. The optimal proof load level and the optimal number of periodic proof tests are determined by minimizing the total expected (statistical average) cost, while the constraint on the allowable level of structural reliability is satisfied. The total expected cost consists of the expected cost of proof tests, the expected cost of structures destroyed by proof tests, and the expected cost of structural failure in service. It is demonstrated by numerical examples that significant cost saving and reliability improvement for fatigue-critical structures can be achieved by the application of the optimal periodic proof test. The present study is relevant to the establishment of optimal maintenance procedures for fatigue-critical structures.
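
    The trade-off can be made concrete with a toy version of the optimization: pick the proof-load ratio and number of periodic proof tests that minimize the total expected cost subject to a reliability floor. The cost and probability models below are invented for illustration and are not Yang's formulation.

      import math

      C_TEST, C_DESTROYED, C_FAILURE = 1.0, 50.0, 1000.0   # assumed unit costs
      R_MIN = 0.999                                        # allowable reliability

      def evaluate(load_ratio, n_tests):
          # Structures destroyed by proof testing rise with load and test count;
          # in-service failure probability falls with both (toy exponentials).
          p_destroy = 1 - math.exp(-5 * (load_ratio - 1) * n_tests)
          p_fail = 0.01 * math.exp(-8 * (load_ratio - 1) * n_tests)
          cost = C_TEST * n_tests + C_DESTROYED * p_destroy + C_FAILURE * p_fail
          return cost, 1 - p_fail

      feasible = [(lr, n) for lr in (1.05, 1.10, 1.15, 1.20) for n in range(1, 6)
                  if evaluate(lr, n)[1] >= R_MIN]
      best = min(feasible, key=lambda s: evaluate(*s)[0])
      print("optimal proof-load ratio and number of tests:", best)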

  6. Improving the Reliability of Technological Subsystems Equipment for Steam Turbine Unit in Operation

    NASA Astrophysics Data System (ADS)

    Brodov, Yu. M.; Murmansky, B. E.; Aronson, R. T.

    2017-11-01

    The authors present an integrated approach to improving the reliability of steam turbine units (STU), along with examples of its implementation for various STU technological subsystems. Based on statistical analysis of damage to individual turbine parts and components, on the development and application of modern repair methods and technologies, and on operational monitoring techniques, the critical components and elements of the equipment are identified and priorities are proposed for improving the reliability of STU equipment in operation. Results are presented from an analysis of malfunctions of equipment in various STU technological subsystems operating as part of power units and at cross-linked thermal power plants that resulted in turbine unit shutdown (failure). Proposals are formulated and justified for adjusting the maintenance and repair of turbine components and parts, condenser unit equipment, the regeneration subsystem, and the oil supply system; these permit increasing operational reliability, reducing the cost of STU maintenance and repair, and optimizing the timing and scope of repairs.

  7. Multi-Response Parameter Interval Sensitivity and Optimization for the Composite Tape Winding Process.

    PubMed

    Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan

    2018-01-31

    The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical performance (tensile strength) and a physical property (void content), were respectively calculated. The paper then presents an integrated methodology that combines multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. A verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products.
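
    The two-stage screening can be sketched generically. Below, a synthetic response stands in for tensile strength; an elementary-effects pass ranks all parameters globally, then a one-at-a-time sweep probes the most sensitive one locally. Nothing here comes from the paper's winding experiments; parameter names and bounds are assumptions.

      import random

      random.seed(0)
      bounds = {"tension": (10, 60), "temperature": (60, 120), "pressure": (20, 80)}

      def response(tension, temperature, pressure):
          """Synthetic stand-in for tensile strength versus process parameters."""
          return 50 + 0.8 * tension + 0.3 * temperature - 0.02 * (pressure - 40) ** 2

      def elementary_effects(n=500, step=0.05):
          """Global stage: average absolute effect of perturbing each parameter."""
          effects = {k: 0.0 for k in bounds}
          for _ in range(n):
              base = {k: random.uniform(*b) for k, b in bounds.items()}
              y0 = response(**base)
              for k, (lo, hi) in bounds.items():
                  stepped = dict(base)
                  stepped[k] = min(base[k] + step * (hi - lo), hi)
                  effects[k] += abs(response(**stepped) - y0) / n
          return effects

      print("global ranking:", elementary_effects())

      # Local stage: one-at-a-time sweep of a single parameter over its range.
      for tension in range(10, 61, 10):
          print("tension", tension, "->", round(response(tension, 90, 40), 1))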

  8. Multi-Response Parameter Interval Sensitivity and Optimization for the Composite Tape Winding Process

    PubMed Central

    Yu, Tao; Kang, Chao; Zhao, Pan

    2018-01-01

    The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical performance (tensile strength) and a physical property (void content), were respectively calculated. The paper then presents an integrated methodology that combines multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. A verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products. PMID:29385048

  9. Operation and planning of coordinated natural gas and electricity infrastructures

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaping

    Natural gas is rapidly becoming the optimal choice for fueling new generating units in the electric power system, driven by abundant natural gas supplies and environmental regulations that are expected to cause coal-fired generation retirements. The growing reliance on natural gas as a dominant fuel for electricity generation throughout North America has brought the interaction between the natural gas and power grids into sharp focus. The primary concern and motivation of this research is to address the emerging interdependency issues faced by the electric power and natural gas industries. This thesis provides a comprehensive analysis of the interactions between the two systems regarding short-term operation and long-term infrastructure planning. Natural gas and renewable energy appear complementary in many respects regarding fuel price and availability, environmental impact, resource distribution, and dispatchability. In addition, demand response holds the promise of making a significant contribution to enhancing system operations by providing incentives to customers for a flatter load profile. We investigated the coordination between natural gas-fired generation and prevailing nontraditional resources, including renewable energy and demand response, so as to provide economical options for optimizing short-term scheduling under tight natural gas delivery constraints. As the amount and dispatch of gas-fired generation increase, the long-term interdependency issue is whether there is adequate pipeline capacity to provide sufficient gas to natural gas-fired generation over the entire planning horizon while gas is widely used outside the power sector. This thesis developed a co-optimization planning model by incorporating the natural gas transportation system into the multi-year resource and transmission system planning problem. This consideration provides a more comprehensive basis for investment decisions and a more accurate assessment of system adequacy and reliability. With the growing reliance on natural gas and the widespread utilization of highly efficient combined heat and power (CHP), it is also questionable whether independently designed infrastructures can meet the potential challenges of future energy supply. To address this issue, this thesis proposed an optimization framework for sustainable multiple-energy-system expansion planning based on an energy hub model while considering energy efficiency, emissions, and reliability performance. In addition, we introduced probabilistic reliability evaluation and flow network analysis into the multiple energy system design in order to obtain an optimal and reliable network topology.

  10. Using wound care algorithms: a content validation study.

    PubMed

    Beitz, J M; van Rijswijk, L

    1999-09-01

    Valid and reliable heuristic devices facilitating optimal wound care are lacking. The objectives of this study were to establish content validation data for a set of wound care algorithms, to identify their associated strengths and weaknesses, and to gain insight into the wound care decision-making process. Forty-four registered nurse wound care experts were surveyed and interviewed at national and regional educational meetings. Using a cross-sectional study design and an 83-item, 4-point Likert-type scale, this purposive sample was asked to quantify the degree of validity of the algorithms' decisions and components. Participants' comments were tape-recorded, transcribed, and themes were derived. On a scale of 1 to 4, the mean score of the entire instrument was 3.47 (SD +/- 0.87), the instrument's Content Validity Index was 0.86, and the individual Content Validity Index of 34 of 44 participants was > 0.8. Item scores were lower for those related to packing deep wounds (P < .001). No other significant differences were observed. Qualitative data analysis revealed themes of difficulty associated with wound assessment and care issues, that is, the absence of valid and reliable definitions. The wound care algorithms studied proved valid. However, the lack of valid and reliable wound assessment and care definitions hinders optimal use of these instruments. Further research documenting their clinical use is warranted. Research-based practice recommendations should direct the development of future valid and reliable algorithms designed to help nurses provide optimal wound care.

  11. Evaluating the Effect of Minimizing Screws on Stabilization of Symphysis Mandibular Fracture by 3D Finite Element Analysis.

    PubMed

    Kharmanda, Ghias; Kharma, Mohamed-Yaser

    2017-06-01

    The objective of this work is to integrate structural optimization and reliability concepts into mini-plate fixation strategy used in symphysis mandibular fractures. The structural reliability levels are next estimated when considering a single failure mode and multiple failure modes. A 3-dimensional finite element model is developed in order to evaluate the ability of reducing the negative effect due to the stabilization of the fracture. Topology optimization process is considered in the conceptual design stage to predict possible fixation layouts. In the detailed design stage, suitable mini-plates are selected taking into account the resulting topology and different anatomical considerations. Several muscle forces are considered in order to obtain realistic predictions. Since some muscles can be cut or harmed during the surgery and cannot operate at its maximum capacity, there is a strong motivation to introduce the loading uncertainties in order to obtain reliable designs. The structural reliability is carried out for a single failure mode and multiple failure modes. The different results are validated with a clinical case of a male patient with symphysis fracture. In this case while use of the upper plate fixation with four holes, only two screws were applied to protect adjacent vital structure. This behavior does not affect the stability of the fracture. The proposed strategy to optimize bone plates leads to fewer complications and second surgeries, less patient discomfort, and shorter time of healing.

  12. Optimal Sensor Location Design for Reliable Fault Detection in Presence of False Alarms

    PubMed Central

    Yang, Fan; Xiao, Deyun; Shah, Sirish L.

    2009-01-01

    To improve fault detection reliability, sensor location should be designed according to an optimization criterion with constraints imposed by issues of detectability and identifiability. Reliability requires the minimization of undetectability and false alarm probability due to random factors in sensor readings, which is not only related to the sensor readings themselves but also affected by fault propagation. This paper introduces the reliability criteria expression based on the missed/false alarm probability of each sensor and the system topology or connectivity derived from the directed graph. The algorithm for the optimization problem is presented as a heuristic procedure. Finally, the proposed method is illustrated on a boiler system. PMID:22291524
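
    On a toy fault-propagation graph, the criterion reduces to: keep every fault detectable while minimizing the summed missed/false-alarm risk of the chosen sensors. The topology and probabilities below are invented, and the problem is small enough to enumerate exactly (the paper uses a heuristic for realistic systems).

      from itertools import combinations

      # fault -> nodes whose readings reflect it after propagation (assumed)
      coverage = {"f1": {"n1", "n2"}, "f2": {"n2", "n3"}, "f3": {"n3", "n4"}}
      # per-sensor missed/false-alarm probability (assumed)
      alarm_risk = {"n1": 0.05, "n2": 0.08, "n3": 0.04, "n4": 0.09}

      def detectable(nodes):
          """Detectability constraint: every fault is seen by some chosen sensor."""
          return all(cov & nodes for cov in coverage.values())

      candidates = (set(c) for r in range(1, len(alarm_risk) + 1)
                    for c in combinations(alarm_risk, r))
      best = min((s for s in candidates if detectable(s)),
                 key=lambda nodes: sum(alarm_risk[n] for n in nodes))
      print("optimal sensor set:", sorted(best))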

  13. AMOVA ["Accumulative Manifold Validation Analysis"]: An Advanced Statistical Methodology Designed to Measure and Test the Validity, Reliability, and Overall Efficacy of Inquiry-Based Psychometric Instruments

    ERIC Educational Resources Information Center

    Osler, James Edward, II

    2015-01-01

    This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This form of inquiry is a form of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…

  14. Reliability based design of the primary structure of oil tankers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, G.; Dogliani, M.; Guedes Soares, C.

    1996-12-31

    The present paper describes the reliability analysis carried out for two oil tankers having comparable dimensions but different designs. The scope of the analysis was to derive indications on the value of the reliability index obtained for existing, typical and well designed oil tankers, as well as to apply the tentative rule checking formulation developed within the CEC-funded SHIPREL Project. The checking formula was adopted to redesign the midships section of one of the considered ships, upgrading her in order to meet the target failure probability considered in the rule development process. The resulting structure, in view of an upgrading of the steel grade in the central part of the deck, led to a satisfactory reliability level. The results of the analysis clearly showed that a large scatter presently exists in the design safety levels of ships, even when the Classification Societies' unified requirements are satisfied. A reliability based approach for the calibration of the rules for the global strength of ships is therefore proposed, in order to assist designers and Classification Societies in the process of producing ships which are more optimized with respect to ensured safety levels. Based on the work reported in the paper, the feasibility and usefulness of a reliability based approach in the development of ship longitudinal strength requirements has been demonstrated.

  15. NREL's Energy Storage and REopt Teams Awarded $525k from TCF to Study

    Science.gov Websites

    Commercial Viability of Optimal, Reliable Building-Integrated Energy Storage, November 14, 2017

  16. Effect of Pd Surface Roughness on the Bonding Process and High Temperature Reliability of Au Ball Bonds

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Kim, H. J.; McCracken, M.; Viswanathan, G.; Pon, F.; Mayer, M.; Zhou, Y. N.

    2011-06-01

    A 0.3-μm-thick electrolytic Pd layer was plated on 1 μm of electroless Ni on 1-mm-thick polished and roughened Cu substrates with roughness values (Ra) of 0.08 μm and 0.5 μm, respectively. The rough substrates were produced by sand-blasting. Au wire bonding on the Ni/Pd surface was optimized, and the electrical reliability was investigated under a high temperature storage test (HTST) for 800 h at 250°C by measuring the ball bond contact resistance, Rc. The average Rc of optimized ball bonds on the rough substrate was 1.96 mΩ, about 40.0% higher than that on the smooth substrate. The initial bondability increased for the rougher surface, so that only half of the original ultrasonic level was required, but the reliability was not affected by surface roughness. For both substrate types, HTST caused bond healing, reducing the average Rc by about 21% and 27%, respectively. Au diffusion into the Pd layer was observed in scanning transmission electron microscopy/energy dispersive spectroscopy (STEM-EDS) line-scan analysis after HTST. It is considered that diffusion of Au, or interdiffusion between Au and Pd, can provide chemically strong bonding during HTST. This is supported by the decrease in Rc measured as the aging time increased. Cu migration was indicated in the STEM-EDS analysis, but its effect on reliability can be ignored. Au and Pd tend to form a complete solid solution at the interface and can provide reliable interconnection for high temperature (250°C) applications.

  17. Processing time tolerance-based ACO algorithm for solving job-shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Luo, Yabo; Waden, Yongo P.

    2017-06-01

    The Job Shop Scheduling Problem (JSSP) is an NP-hard problem whose uncertainty and complexity cannot be handled by linear methods. Current studies on the JSSP therefore concentrate mainly on applying different methods to improve the heuristics for optimizing the JSSP. However, many obstacles to efficient optimization of the JSSP remain, namely low efficiency and poor reliability, which can easily trap the optimization process into local optima. To address this problem, this paper studies an Ant Colony Optimization (ACO) algorithm combined with constraint-handling tactics. The problem is subdivided into three parts: (1) analysis of processing time tolerance-based constraint features in the JSSP, performed with a constraint-satisfaction model; (2) satisfying the constraints by means of consistency technology and a constraint-spreading algorithm, in order to improve the performance of the ACO algorithm; the JSSP model based on the improved ACO algorithm is then constructed; and (3) demonstration of the effectiveness of the proposed method, in terms of reliability and efficiency, through comparative experiments on benchmark problems. The results obtained by the proposed method are better, and the applied technique can be used in optimizing the JSSP.

  18. Quick-low-density parity check and dynamic threshold voltage optimization in 1X nm triple-level cell NAND flash memory with comprehensive analysis of endurance, retention-time, and temperature variation

    NASA Astrophysics Data System (ADS)

    Doi, Masafumi; Tokutomi, Tsukasa; Hachiya, Shogo; Kobayashi, Atsuro; Tanakamaru, Shuhei; Ning, Sheyang; Ogura Iwasaki, Tomoko; Takeuchi, Ken

    2016-08-01

    NAND flash memory’s reliability degrades with increasing endurance, retention-time and/or temperature. After a comprehensive evaluation of 1X nm triple-level cell (TLC) NAND flash, two highly reliable techniques are proposed. The first proposal, quick low-density parity check (Quick-LDPC), requires only one cell read in order to accurately estimate a bit-error rate (BER) that includes the effects of temperature, write and erase (W/E) cycles and retention-time. As a result, an 83% read latency reduction is achieved compared to conventional AEP-LDPC. Also, W/E cycling endurance is extended by 100% compared with conventional Bose-Chaudhuri-Hocquenghem (BCH) error-correcting code (ECC). The second proposal, dynamic threshold voltage optimization (DVO), has two parts: adaptive VRef shift (AVS) and VTH space control (VSC). AVS reduces read error and latency by adaptively optimizing the reference voltage (VRef) based on temperature, W/E cycles and retention-time. AVS stores the optimal VRef values in a table in order to enable one cell read. VSC further improves AVS by optimizing the voltage margins between VTH states. DVO reduces BER by 80%.

  19. Microgrid Analysis Tools Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jimenez, Antonio; Haase, Scott G; Mathur, Shivani

    2018-03-05

    The over-arching goal of the Alaska Microgrid Partnership is to reduce the use of total imported fuel into communities to secure all energy services by at least 50% in Alaska's remote microgrids without increasing system life cycle costs, while also improving overall system reliability, security, and resilience. One goal of the partnership is to investigate whether a combination of energy efficiency and high-contribution (from renewable energy) power systems can reduce total imported energy usage by 50% while reducing life cycle costs and improving reliability and resiliency. This presentation provides an overview of the following four renewable energy optimization tools: the Distributed Energy Resources Customer Adoption Model (DER-CAM), the Microgrid Design Toolkit (MDT), the Renewable Energy Optimization (REopt) Tool, and the Hybrid Optimization Model for Electric Renewables (HOMER). Information is drawn from the respective tool websites, the tool developers, and the authors' experience.

  20. Performance and Reliability Optimization for Aerospace Systems subject to Uncertainty and Degradation

    NASA Technical Reports Server (NTRS)

    Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl

    2004-01-01

    This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools to the aerospace vehicle design process which take into account lifecycle uncertainties. It recognizes that uncertainty between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty, and to combine these with the analytical tools used with integrated modeling. In this manner, system uncertainty analysis becomes part of the design process, and can motivate redesign. The specific program objectives were: 1. To incorporate uncertainty modeling, propagation and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions. 2. To apply modern optimization tools to guide in the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing the subsystem reliability and redundancy. The results from the second program objective are described. This report describes the work and results for the first objective: uncertainty modeling, propagation, and synthesis with integrated modeling.

  1. Optimum allocation of redundancy among subsystems connected in series. Ph.D. Thesis - Case Western Reserve Univ., Sep. 1970

    NASA Technical Reports Server (NTRS)

    Bien, D. D.

    1973-01-01

    This analysis considers the optimum allocation of redundancy in a system of serially connected subsystems in which each subsystem is of the k-out-of-n type. Redundancy is optimally allocated when: (1) reliability is maximized for given costs; or (2) costs are minimized for given reliability. Several techniques are presented for achieving optimum allocation and their relative merits are discussed. Approximate solutions in closed form were attainable only for the special case of series-parallel systems and the efficacy of these approximations is discussed.
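
    For the series-parallel special case, a greedy marginal-gain heuristic captures the flavor of case (1), maximizing reliability for a given cost. The component reliabilities, unit costs, and budget below are assumptions for illustration, not values from the thesis.

      subsystems = [  # component reliability, cost per added unit, starting units
          {"r": 0.90, "cost": 2.0, "n": 1},
          {"r": 0.80, "cost": 1.0, "n": 1},
          {"r": 0.95, "cost": 3.0, "n": 1},
      ]
      BUDGET = 12.0

      def system_reliability():
          """Series system of parallel (1-out-of-n) subsystems."""
          prod = 1.0
          for s in subsystems:
              prod *= 1.0 - (1.0 - s["r"]) ** s["n"]
          return prod

      def relative_gain(s):
          """Reliability improvement per unit cost from one more redundant unit."""
          before = 1.0 - (1.0 - s["r"]) ** s["n"]
          after = 1.0 - (1.0 - s["r"]) ** (s["n"] + 1)
          return (after / before - 1.0) / s["cost"]

      spent = sum(s["cost"] * s["n"] for s in subsystems)
      while True:
          affordable = [s for s in subsystems if spent + s["cost"] <= BUDGET]
          if not affordable:
              break
          pick = max(affordable, key=relative_gain)
          pick["n"] += 1
          spent += pick["cost"]

      print("units:", [s["n"] for s in subsystems],
            "reliability:", round(system_reliability(), 4), "cost:", spent)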

  2. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
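
    The core probabilistic step is small enough to sketch: a two-parameter Weibull model turns a stress level into a probability of failure. The Weibull modulus and characteristic strength below are illustrative placeholders, not CARES/Life data.

      import math

      def failure_probability(stress_mpa, m=10.0, sigma0=300.0, volume_ratio=1.0):
          """P_f = 1 - exp(-V (sigma / sigma_0)^m) for a uniformly stressed volume."""
          return 1.0 - math.exp(-volume_ratio * (stress_mpa / sigma0) ** m)

      for stress in (150, 200, 250, 300):
          print(stress, "MPa -> P_f =", round(failure_probability(stress), 4))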

  3. Optimization of Systems with Uncertainty: Initial Developments for Performance, Robustness and Reliability Based Designs

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of probability density functions and moments, optimization problems that pursue performance-, robustness- and reliability-based designs are studied. By specifying the desired outputs in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed-form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem and a meaningful probabilistic description of its solution is provided in the examples. In the stability control problem, the analysis shows that standard deterministic approaches lead to designs with a high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.

  4. Reliability optimization design of the gear modification coefficient based on the meshing stiffness

    NASA Astrophysics Data System (ADS)

    Wang, Qianqian; Wang, Hui

    2018-04-01

    Since the time-varying meshing stiffness of a gear system is the key factor affecting gear vibration, it is important to design the meshing stiffness to reduce vibration. Based on the effect of the gear modification coefficient on the meshing stiffness, and considering the random parameters, reliability optimization design of the gear modification is investigated. The dimension reduction and point estimation method is used to estimate the moments of the limit state function, and the reliability is obtained by the fourth-moment method. The comparison of the dynamic amplitude results before and after optimization indicates that the research is useful for the reduction of vibration and noise and the improvement of reliability.
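
    The moment-based step can be sketched with Rosenblueth-style two-point estimates: evaluate a limit-state function at the mean plus/minus one standard deviation corners, take moments, and form a reliability index (a second-moment index here; the paper refines this with the fourth moment). The gear limit state and distributions below are stand-ins, not the paper's stiffness model.

      import itertools
      import math

      mu = {"modification": 0.3, "load": 100.0}   # assumed means
      sd = {"modification": 0.05, "load": 10.0}   # assumed standard deviations

      def g(modification, load):
          """Hypothetical limit state: positive means acceptable vibration."""
          return 120.0 - load * (1.0 - 0.5 * modification)

      # Two-point estimates: evaluate g at all mu +/- sigma corners, equal weights.
      corners = itertools.product(*[(mu[k] - sd[k], mu[k] + sd[k]) for k in mu])
      vals = [g(*c) for c in corners]
      mean_g = sum(vals) / len(vals)
      var_g = sum((v - mean_g) ** 2 for v in vals) / len(vals)
      beta = mean_g / math.sqrt(var_g)
      print("second-moment reliability index beta =", round(beta, 2))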

  5. A system methodology for optimization design of the structural crashworthiness of a vehicle subjected to a high-speed frontal crash

    NASA Astrophysics Data System (ADS)

    Xia, Liang; Liu, Weiguo; Lv, Xiaojiang; Gu, Xianguang

    2018-04-01

    The structural crashworthiness design of vehicles has become an important research direction to ensure the safety of occupants. To effectively improve the structural safety of a vehicle in a frontal crash, a system methodology is presented in this study. An online support vector regression (Online-SVR) surrogate model is adopted to approximate the crashworthiness criteria, and different kernel functions are selected to enhance the accuracy of the model. The Online-SVR model is demonstrated to have the advantages of solving highly nonlinear problems and saving training costs, and can effectively be applied to vehicle structural crashworthiness design. By combining the non-dominated sorting genetic algorithm II and Monte Carlo simulation, both deterministic optimization and reliability-based design optimization (RBDO) are conducted. The optimization solutions are further validated by finite element analysis, which shows the effectiveness of the RBDO solution in the structural crashworthiness design process. The results demonstrate the advantages of using RBDO, resulting in not only increased energy absorption and decreased structural weight relative to a baseline design, but also a significant improvement in the reliability of the design.

  6. Computer-aided communication satellite system analysis and optimization

    NASA Technical Reports Server (NTRS)

    Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.

    1973-01-01

    The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP) for costing and sensitivity analysis work in the application of communication satellites to educational development is described. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of the generation of multiple beams from a single reflector system with an array of feeds; improved system costing to reflect the time value of money and growth in the earth terminal population with time, and to account for various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.

  7. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    Reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation process is often performed at different stages of system design. The allocation process often begins at the conceptual stage. As the system design develops, more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers. Applying reliability allocation techniques without understanding their limitations and assumptions can produce unrealistic results. This report addresses weighting factors, optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
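
    The weighting-factor family is easy to make concrete. In an ARINC-style apportionment for a series system, each subsystem's weight is its share of the predicted failure rate, and allocating R_i = R_sys^(w_i) reproduces the system target exactly. The failure rates below are made-up placeholders, not from the report.

      import math

      R_SYSTEM = 0.95                                        # required system reliability
      lam = {"sensor": 2e-5, "cpu": 1e-5, "actuator": 3e-5}  # predicted failure rates

      total = sum(lam.values())
      weights = {k: v / total for k, v in lam.items()}       # weights sum to 1
      allocation = {k: R_SYSTEM ** w for k, w in weights.items()}

      # For a series system the product of allocated reliabilities equals the target.
      print(allocation)
      print("product:", math.prod(allocation.values()))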

  8. Fluorescence detection of dental calculus

    NASA Astrophysics Data System (ADS)

    Gonchukov, S.; Biryukova, T.; Sukhinina, A.; Vdovin, Yu

    2010-11-01

    This work is devoted to the optimization of fluorescence-based dental calculus diagnostics in the optical spectrum. The optimal wavelengths for fluorescence excitation and registration are determined. Two spectral ranges, 620-645 nm and 340-370 nm, are the most convenient for supra- and subgingival calculus detection, respectively. A simple implementation of the differential method, which does not require a spectrometer, was investigated. Calculus detection reliability in the case of this simple implementation is higher than in the case of spectral analysis at the optimal wavelengths. The use of modulated excitation light and narrowband detection of the informative signal makes it possible to substantially decrease the diagnostic light intensity, even in comparison with the intensity used in low-level laser dental therapy.

  9. Aerothermodynamic shape optimization of hypersonic blunt bodies

    NASA Astrophysics Data System (ADS)

    Eyi, Sinan; Yumuşak, Mine

    2015-07-01

    The aim of this study is to develop a reliable and efficient design tool that can be used in hypersonic flows. The flow analysis is based on the axisymmetric Euler/Navier-Stokes and finite-rate chemical reaction equations. The equations are coupled simultaneously and solved implicitly using Newton's method. The Jacobian matrix is evaluated analytically. A gradient-based numerical optimization is used. The adjoint method is utilized for sensitivity calculations. The objective of the design is to generate a hypersonic blunt geometry that produces the minimum drag with low aerodynamic heating. Bezier curves are used for geometry parameterization. The performance of the design optimization method is demonstrated for different hypersonic flow conditions.

  10. Time-saving design of experiment protocol for optimization of LC-MS data processing in metabolomic approaches.

    PubMed

    Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine

    2013-08-06

    We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks using design of experiment (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, which is based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default setting, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized by using CCD for further improvement. The approach combining optimal parameter setting and the threshold method improved the reliability index about 9.5 times for a standards mixture and 14.5 times for human urine data, which required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
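
    The screening stage is easy to sketch. An 8-run Plackett-Burman design ranks seven two-level factors by their main effects on a reliability index; the few dominant factors would then feed a central composite design for response-surface optimization. The response below is synthetic (only two factors truly matter), not XCMS output.

      import random

      random.seed(3)

      # Standard 8-run Plackett-Burman design for up to seven two-level factors.
      PB8 = [
          [+1, +1, +1, -1, +1, -1, -1],
          [-1, +1, +1, +1, -1, +1, -1],
          [-1, -1, +1, +1, +1, -1, +1],
          [+1, -1, -1, +1, +1, +1, -1],
          [-1, +1, -1, -1, +1, +1, +1],
          [+1, -1, +1, -1, -1, +1, +1],
          [+1, +1, -1, +1, -1, -1, +1],
          [-1, -1, -1, -1, -1, -1, -1],
      ]

      def reliability_index(levels):
          """Synthetic response: factors 0 and 3 dominate, plus noise."""
          return 60 + 8 * levels[0] - 5 * levels[3] + random.gauss(0, 1)

      ys = [reliability_index(run) for run in PB8]
      # Main effect of factor j: mean response at +1 minus mean response at -1.
      effects = {j: sum(y * run[j] for run, y in zip(PB8, ys)) / (len(PB8) / 2)
                 for j in range(7)}
      for j, e in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
          print("factor", j, "effect", round(e, 2))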

  11. Analysis of reliability for multi-ring interconnection of RPR networks

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Jin, Depeng; Zeng, Lieguang; Li, Yong

    2008-11-01

    In this paper, the reliability and MTTF (Mean Time To Failure) of multi-ring RPR (Resilient Packet Ring) networks are calculated under the conditions of single-link failures, double-link failures, and no failure, respectively. The formulas contain parameters such as the total number of stations N, the number of sub-rings R, and the distribution of Ni, the number of stations in the i-th sub-ring (1 <= i <= R). The relationship between the reliability/MTTF and the parameters N, R, and Ni is analyzed. The result shows that the reliability/MTTF of the RPR multi-ring increases as the variance of Ni decreases. It is also proved, using the Lagrange multiplier method, that the reliability/MTTF is maximized when Ni = Nj (i ≠ j and 1 <= i, j <= R), i.e., the condition for optimal reliability of a multi-ring RPR is var(Ni) = 0.
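
    The symmetry result can be sketched with a simplified objective. Suppose the dominant unreliability term scales with the pairwise failure exposure inside each sub-ring, i.e. with the sum of Ni squared; this is an assumption made here for illustration, as the paper works with the exact reliability expressions. Minimizing that term under the station-count constraint already forces equal sub-rings:

      \min_{N_1,\dots,N_R} \sum_{i=1}^{R} N_i^2
      \quad \text{subject to} \quad \sum_{i=1}^{R} N_i = N,
      \qquad
      \mathcal{L} = \sum_{i=1}^{R} N_i^2 - \lambda \Big( \sum_{i=1}^{R} N_i - N \Big)
      \;\Rightarrow\; 2 N_i = \lambda
      \;\Rightarrow\; N_i = \frac{N}{R}.

    Since \sum_i N_i^2 = N^2/R + R \cdot \mathrm{var}(N_i), any spread among the sub-ring sizes only adds failure exposure, consistent with the var(Ni) = 0 condition above.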

  12. A novel approach on accelerated ageing towards reliability optimization of high concentration photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Tsanakas, John A.; Jaffre, Damien; Sicre, Mathieu; Elouamari, Rachid; Vossier, Alexis; de Salins, Jean-Edouard; Bechou, Laurent; Levrier, Bruno; Perona, Arnaud; Dollet, Alain

    2014-09-01

    This paper presents a preliminary study of a novel approach to highly accelerated ageing and reliability optimization of high concentrating photovoltaic (HCPV) cells and assemblies. The proposed approach aims to overcome several limitations of some current accelerated ageing tests (AAT) adopted to date, proposing the use of an alternative experimental set-up for performing faster and more realistic thermal cycles, under real sun, without an environmental chamber. The study also includes specific characterization techniques, applied before and after each AAT sequence, which respectively provide the initial and final diagnosis of the condition of the tested sample. The acquired data from these diagnostic/characterization methods are then used as indices to determine, both quantitatively and qualitatively, the severity of degradation and thus the ageing level of each tested HCPV assembly or cell sample. The ultimate goal of such "initial diagnosis - AAT - final diagnosis" sequences is to provide the basis for future work on the reliability analysis of the main degradation mechanisms and confident prediction of failure propagation in HCPV cells, by means of acceleration factor (AF) and mean-time-to-failure (MTTF) estimations.
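
    The acceleration factor mentioned at the end is conventionally an Arrhenius ratio between use and stress temperatures; the activation energy and temperatures below are illustrative, not values from the HCPV campaign.

      import math

      K_B = 8.617e-5  # Boltzmann constant in eV/K

      def arrhenius_af(ea_ev, t_use_c, t_stress_c):
          """AF = exp(Ea/k * (1/T_use - 1/T_stress)), temperatures in kelvin."""
          t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
          return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

      af = arrhenius_af(0.7, 60.0, 110.0)
      print("AF =", round(af, 1), "so MTTF_use = AF * MTTF_stress")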

  13. Reliability evaluation of high-performance, low-power FinFET standard cells based on mixed RBB/FBB technique

    NASA Astrophysics Data System (ADS)

    Wang, Tian; Cui, Xiaoxin; Ni, Yewen; Liao, Kai; Liao, Nan; Yu, Dunshan; Cui, Xiaole

    2017-04-01

    With shrinking transistor feature size, the fin-type field-effect transistor (FinFET) has become the most promising option in low-power circuit design due to its superior capability to suppress leakage. To support the VLSI digital system flow based on logic synthesis, we have designed an optimized high-performance, low-power FinFET standard cell library that employs the mixed FBB/RBB technique in the existing stacked structure of each cell. This paper presents the reliability evaluation of the optimized cells under process and operating-environment variations based on Monte Carlo analysis. The variations are modelled with Gaussian distributions of the device parameters, and 10000 sweeps are conducted in the simulation to obtain the statistical properties of the worst-case delay and input-dependent leakage for each cell. For comparison, a set of non-optimal cells that adopt the same topology without employing the mixed biasing technique is also generated. Experimental results show that the optimized cells achieve standard deviation reductions of up to 39.1% and 30.7% in worst-case delay and input-dependent leakage, respectively, while the normalized deviation shrinkage in worst-case delay and input-dependent leakage can be up to 98.37% and 24.13%, respectively, which demonstrates that our optimized cells are less sensitive to variability and exhibit higher reliability. Project supported by the National Natural Science Foundation of China (No. 61306040), the State Key Development Program for Basic Research of China (No. 2015CB057201), the Beijing Natural Science Foundation (No. 4152020), and the Natural Science Foundation of Guangdong Province, China (No. 2015A030313147).

  14. Modified Bayesian Kriging for Noisy Response Problems for Reliability Analysis

    DTIC Science & Technology

    2015-01-01


  15. Psychometric properties of the medical outcomes study sleep scale in Spanish postmenopausal women.

    PubMed

    Zagalaz-Anula, Noelia; Hita-Contreras, Fidel; Martínez-Amat, Antonio; Cruz-Díaz, David; Lomas-Vega, Rafael

    2017-07-01

    This study aimed to analyze the reliability and validity of the Spanish version of the Medical Outcomes Study Sleep Scale (MOS-SS), and its ability to discriminate between poor and good sleepers among a Spanish population with vestibular disorders. In all, 121 women (50-76 years old) completed the Spanish version of the MOS-SS. Internal consistency, test-retest reliability, and construct validity (exploratory factor analysis) were analyzed. Concurrent validity was evaluated using the Pittsburgh Sleep Quality Index and the 36-item Short Form Health Survey. To analyze the ability of the MOS-SS scores to discriminate between poor and good sleepers, a receiver-operating characteristic curve analysis was performed. The Spanish version of the MOS-SS showed excellent and substantial reliability in Sleep Problems Index I (two sleep disturbance items, one somnolence item, two sleep adequacy items, and awakening short of breath or with headache) and Sleep Problems Index II (four sleep disturbance items, two somnolence items, two sleep adequacy items, and awakening short of breath or with headache), respectively, and good internal consistency with optimal Cronbach's alpha values in all domains and indexes (0.70-0.90). Factor analysis suggested a coherent four-factor structure (explained variance 70%). In the concurrent validity analysis, the MOS-SS indexes correlated significantly and strongly with the Pittsburgh Sleep Quality Index total score, and moderately with the 36-item Short Form Health Survey component summaries. Several domains and the two indexes were able to discriminate significantly between poor and good sleepers (P < 0.05). Optimal cut-off points were above 20 for the "sleep disturbance" domain, and above 22.22 and 33.33 for Sleep Problems Index I and II, respectively. The Spanish version of the MOS-SS is a valid and reliable instrument, suitable for assessing sleep quality in Spanish postmenopausal women, with satisfactory general psychometric properties. It discriminates well between good and poor sleepers.
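
    The internal-consistency statistic used above is standard and worth a sketch: Cronbach's alpha computed from a respondents-by-items score matrix. The data below are synthetic, generated from one latent factor purely to exercise the formula; they are not study data.

      import numpy as np

      def cronbach_alpha(scores):
          """alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_vars = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      rng = np.random.default_rng(0)
      latent = rng.normal(size=(100, 1))                      # one latent trait
      items = latent + rng.normal(scale=0.8, size=(100, 6))   # six noisy items
      print("alpha =", round(cronbach_alpha(items), 2))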

  16. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    PubMed

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

    Cephalometric norms of the McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly facilitates this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal-occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated by the Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and with the new software. The cephalometric software was developed using Microsoft Visual C++ in Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. Validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin, confirming the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome.

  17. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
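
    The contrast between the two analyses can be sketched in a few lines: a deterministic plan is replayed many times with random launch slips and vehicle failures plus one contingency rule, and the resulting distribution of campaign length is compared against the planned value. All rates and durations below are invented, not NASA figures.

      import random

      random.seed(7)
      MISSIONS, P_FAIL, P_SLIP = 10, 0.05, 0.2   # assumed rates

      def campaign_years():
          t = 0.0
          for _ in range(MISSIONS):
              t += 1.0                           # planned one-year cadence
              if random.random() < P_SLIP:
                  t += 0.5                       # launch slip
              if random.random() < P_FAIL:
                  t += 1.0                       # contingency rule: stand down, refly
          return t

      runs = sorted(campaign_years() for _ in range(10000))
      print("deterministic plan:", float(MISSIONS), "years")
      print("median:", runs[len(runs) // 2],
            "95th percentile:", runs[int(0.95 * len(runs))])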

  18. Network reliability maximization for stochastic-flow network subject to correlated failures using genetic algorithm and tabu search

    NASA Astrophysics Data System (ADS)

    Yeh, Cheng-Ta; Lin, Yi-Kuei; Yang, Jo-Yun

    2018-07-01

    Network reliability is an important performance index for many real-life systems, such as electric power systems, computer systems and transportation systems. These systems can be modelled as stochastic-flow networks (SFNs) composed of arcs and nodes. Most system supervisors pursue network reliability maximization by finding the optimal multi-state resource assignment, in which one resource is assigned to each arc. However, a disaster may cause correlated failures among the assigned resources, affecting the network reliability. This article focuses on determining the optimal resource assignment with maximal network reliability for SFNs. To solve the problem, this study proposes a hybrid algorithm integrating the genetic algorithm and tabu search to determine the optimal assignment, called the hybrid GA-TS algorithm (HGTA), and integrates minimal paths, recursive sums of disjoint products, and the correlated binomial distribution to calculate network reliability. Several practical numerical experiments demonstrate that HGTA has better computational quality than several popular soft computing algorithms.
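
    A skeleton of the hybrid loop conveys the idea: a genetic algorithm evolves candidate resource assignments while a tabu set blocks revisits of already-explored assignments. The fitness below is a toy surrogate; in the paper it is the network reliability computed from minimal paths, recursive sums of disjoint products, and the correlated binomial distribution.

      import random

      random.seed(2)
      ARCS, RESOURCES, POP, GENS = 6, 4, 20, 200

      def fitness(assign):
          """Toy surrogate for the network reliability of a resource assignment."""
          return sum((a + 1) * (i % 3 + 1) for i, a in enumerate(assign)) / 100.0

      def mutate(assign):
          child = list(assign)
          child[random.randrange(ARCS)] = random.randrange(RESOURCES)
          return tuple(child)

      population = [tuple(random.randrange(RESOURCES) for _ in range(ARCS))
                    for _ in range(POP)]
      tabu = set()
      for _ in range(GENS):
          population.sort(key=fitness, reverse=True)
          parents, children = population[:POP // 2], []
          while len(children) < POP // 2:
              a, b = random.sample(parents, 2)
              cut = random.randrange(1, ARCS)
              child = mutate(a[:cut] + b[cut:])  # one-point crossover + mutation
              if child in tabu:                  # tabu search: skip revisited points
                  continue
              tabu.add(child)
              children.append(child)
              if len(tabu) > 500:                # bounded tabu memory
                  tabu.pop()
          population = parents + children

      best = max(population, key=fitness)
      print("best assignment:", best, "fitness:", fitness(best))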

  19. ANSYS-Based Simulation and Optimization on Temperature Field of Amorphous Ingot Made by Water Quenching

    NASA Astrophysics Data System (ADS)

    Zhao, W.; Sun, Z.; Tang, Z.; Liaw, P. K.; Li, J.; Liu, R. P.; Li, Gong

    2014-05-01

    A 2D finite element analysis of the temperature field was conducted for the creation of an amorphous ingot by vacuum water quenching. An optimized analysis document was written in the ANSYS Parametric Design Language, and the optimal design modules of ANSYS were used to study the inside diameter and wall thickness of the quartz tube, as well as the water temperature. The microstructure and the phase structure of the amorphous ingot were evaluated by scanning electron microscopy and X-ray diffraction, respectively. Results show that during the cooling process, a thinner tube wall, a smaller ingot diameter, or a lower water temperature yields a higher cooling rate at a given temperature. Moreover, the gap between the cooling rates induced by different wall thicknesses or ingot diameters narrows as the temperature decreases, whereas the gap between the cooling rates induced by different water temperatures remains constant. The process parameters for creating an amorphous ingot, optimized by finite element analysis of the temperature field, are reliable.

  20. Hawaii electric system reliability.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability, but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of the performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  1. Hawaii Electric System Reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loose, Verne William; Silva Monroy, Cesar Augusto

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability, but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of the performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  2. Branch and bound algorithm for accurate estimation of analytical isotropic bidirectional reflectance distribution function models.

    PubMed

    Yu, Chanki; Lee, Sang Wook

    2016-05-20

We present a reliable and accurate global optimization framework for estimating parameters of isotropic analytical bidirectional reflectance distribution function (BRDF) models. This approach is based on a branch and bound strategy with linear programming and interval analysis. Conventional local optimization is often very inefficient for BRDF estimation since its fitting quality is highly dependent on initial guesses due to the nonlinearity of analytical BRDF models. The algorithm presented in this paper employs L1-norm error minimization to estimate BRDF parameters in a globally optimal way and interval arithmetic to derive our feasibility problem and lower bounding function. Our method is developed for the Cook-Torrance model combined with any of several normal distribution functions, such as the Beckmann, Berry, and GGX functions. Experiments have been carried out to validate the presented method using 100 isotropic materials from the MERL BRDF database, and our experimental results demonstrate that the L1-norm minimization provides a more accurate and reliable solution than the L2-norm minimization.
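
    The full branch-and-bound search with interval analysis is beyond a short sketch, but the L1-norm fitting objective at its core can be illustrated. The snippet below fits a toy GGX-lobe model to synthetic data; the GGX form is standard, while the synthetic data, the scale parameter, and the use of a local solver (instead of the paper's global branch-and-bound) are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch of the L1-norm fitting objective only; the paper's actual
# method wraps this objective in a branch-and-bound global search with
# interval arithmetic, which is not reproduced here.

def ggx_ndf(cos_h, alpha):
    """GGX normal distribution function (isotropic)."""
    a2 = alpha ** 2
    d = cos_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / (np.pi * d ** 2)

rng = np.random.default_rng(0)
cos_h = rng.uniform(0.5, 1.0, 200)                    # toy half-angle cosines
true = 0.3 * ggx_ndf(cos_h, 0.2)                      # synthetic "measurements"
data = true + 0.01 * rng.standard_normal(cos_h.size)

def l1_loss(p):
    scale, alpha = p
    return np.abs(scale * ggx_ndf(cos_h, alpha) - data).sum()

res = minimize(l1_loss, x0=[1.0, 0.5], bounds=[(0.0, 2.0), (0.05, 1.0)],
               method="Powell")   # local, derivative-free; paper uses global B&B
print("estimated (scale, alpha):", res.x)
```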

  3. High power diode lasers emitting from 639 nm to 690 nm

    NASA Astrophysics Data System (ADS)

    Bao, L.; Grimshaw, M.; DeVito, M.; Kanskar, M.; Dong, W.; Guan, X.; Zhang, S.; Patterson, J.; Dickerson, P.; Kennedy, K.; Li, S.; Haden, J.; Martinsen, R.

    2014-03-01

There is increasing market demand for high power, reliable red lasers for display and cinema applications. Due to the fundamental material system limit in this wavelength range, red diode lasers have lower efficiency and are more temperature sensitive than 790-980 nm diode lasers. In terms of reliability, red lasers are also more sensitive to catastrophic optical mirror damage (COMD) due to the higher photon energy, so developing red lasers with both high power and high reliability is very challenging. This paper will present nLIGHT's released red products from 639 nm to 690 nm, with established high performance and long-term reliability. These diode lasers can work as stand-alone single-emitter units or efficiently integrate into our compact, passively-cooled Pearl™ fiber-coupled module architectures for higher output power and improved reliability. In order to further improve power and reliability, new chip optimizations have focused on improving epitaxial design/growth, chip configuration/processing and optical facet passivation. Initial optimization has demonstrated promising results for 639 nm diode lasers to be reliably rated at 1.5 W and 690 nm diode lasers to be reliably rated at 4.0 W. Accelerated life-testing has started, and further design optimizations are underway.

  4. Advanced power analysis methodology targeted to the optimization of a digital pixel readout chip design and its critical serial powering system

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.

    2017-02-01

    A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within RD53 collaboration, is being used to guide vital choices for the design and optimization of the next generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation with the decoupling seen by the powering network. Shunt-LDO simulations are also reported to prove the reliability at the system level.

  5. Selection of optimal sensors for predicting performance of polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Mao, Lei; Jackson, Lisa

    2016-10-01

In this paper, sensor selection algorithms are investigated based on a sensitivity analysis, and the capability of the optimal sensors to predict PEM fuel cell performance is also studied using test data. The fuel cell model is developed for generating the sensitivity matrix relating sensor measurements to fuel cell health parameters. From the sensitivity matrix, two sensor selection approaches, the largest gap method and an exhaustive brute-force search, are applied to find the optimal sensors providing reliable predictions. Based on the results, a sensor selection approach considering both sensor sensitivity and noise resistance is proposed to find the optimal sensor set of minimum size. Furthermore, the performance of the optimal sensor set is studied to predict fuel cell performance using test data from a PEM fuel cell system. Results demonstrate that with the optimal sensors, PEM fuel cell performance can be predicted with good quality.
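
    A minimal sketch of the exhaustive (brute-force) sensor-subset search mentioned above, assuming a toy sensitivity matrix; the scoring criterion used here (smallest singular value, i.e. worst-case parameter identifiability) is an assumption, as the record does not specify the paper's exact score.

```python
import numpy as np
from itertools import combinations

# Hedged sketch: pick the k-sensor subset of a sensitivity matrix S
# (rows = sensors, columns = health parameters) that maximizes the
# smallest singular value of the reduced matrix, so that all parameters
# remain distinguishable from the selected measurements.

rng = np.random.default_rng(1)
S = rng.standard_normal((8, 3))        # toy sensitivity matrix: 8 sensors, 3 parameters

def score(rows):
    # Larger smallest singular value -> better worst-case identifiability
    return np.linalg.svd(S[list(rows)], compute_uv=False).min()

k = 4                                   # desired sensor-set size
best = max(combinations(range(S.shape[0]), k), key=score)
print("best sensor subset:", best, "score:", round(score(best), 3))
```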

  6. An adequacy-constrained integrated planning method for effective accommodation of DG and electric vehicles in smart distribution systems

    NASA Astrophysics Data System (ADS)

    Tan, Zhukui; Xie, Baiming; Zhao, Yuanliang; Dou, Jinyue; Yan, Tong; Liu, Bin; Zeng, Ming

    2018-06-01

This paper presents a new integrated planning framework for effectively accommodating electric vehicles (EVs) in smart distribution systems (SDS). The proposed method incorporates various investment options available to the utility collectively, including distributed generation (DG), capacitors and network reinforcement. Using a back-propagation algorithm combined with cost-benefit analysis, the optimal network upgrade plan and the allocation and sizing of the selected components are determined, with the purpose of minimizing the total system capital and operating costs of DG and EV accommodation. Furthermore, a new iterative reliability test method is proposed. It can check the optimization results by subsequently simulating the reliability level of the planning scheme, and modify the generation reserve margin to guarantee acceptable adequacy levels for each year of the planning horizon. Numerical results based on a 32-bus distribution system verify the effectiveness of the proposed method.

  7. Hot and cold body reference noise generators from 0 to 40 GHz

    NASA Technical Reports Server (NTRS)

    Hornbostel, D. H.

    1974-01-01

    This article describes the design, development, and analysis of exceptionally accurate radiometric noise generators from 0-40 GHz to serve as standard references. Size, weight, power, and reliability are optimized to meet the requirements of NASA air- and space-borne radiometers. The radiometric noise temperature of these noise generators is, unavoidably, calculated from measured values rather than measured directly. The absolute accuracy and stability are equal to or better than those of reliable standards available for comparison. A noise generator has been developed whose measurable properties (VSWR, line loss, thermometric temperatures) have been optimized in order to minimize the effects of the uncertainty in the calculated radiometric noise temperatures. Each measurable property is evaluated and analyzed to determine the effects of the uncertainty of the measured value. Unmeasurable properties (primarily temperature gradients) are analyzed, and reasonable precautions are designed into the noise generator to guarantee that the uncertainty of the value remains within tolerable limits.

  8. Exploratory and Confirmatory Factor Analysis of the Decision Regret Scale in Recipients of Internal Cardioverter Defibrillators

    PubMed Central

    Hickman, Ronald L.; Pinto, Melissa D.; Lee, Eunsuk; Daly, Barbara J.

    2015-01-01

The Decision Regret Scale (DRS) is a five-item instrument that captures an individual’s regret associated with a healthcare decision. Cross-sectional data were collected from 109 cardiac patients who decided to receive an internal cardioverter defibrillator (ICD). Exploratory and confirmatory factor analyses, assessment of internal consistency reliability (α = .86), and discriminant validity established the DRS as a reliable and valid measure of decision regret in ICD recipients. The DRS, a psychometrically sound instrument, has relevance for clinicians and researchers vested in optimizing the decisional outcomes of ICD recipients. Future research is needed to examine the reliability and validity of the DRS in a larger and more diverse sample of ICD recipients. PMID:22679707
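
    For readers unfamiliar with the reliability statistic quoted above, a minimal sketch of Cronbach's alpha for a five-item scale follows; the simulated responses are invented and are not DRS data.

```python
import numpy as np

# Minimal sketch of Cronbach's alpha for a k-item scale such as the
# five-item DRS (rows = respondents, columns = items).

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(2)
latent = rng.standard_normal((109, 1))                  # shared "regret" factor
responses = latent + 0.6 * rng.standard_normal((109, 5))
print(f"alpha = {cronbach_alpha(responses):.2f}")
```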

  9. Structural design considerations for micromachined solid-oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Srikar, V. T.; Turner, Kevin T.; Andrew Ie, Tze Yung; Spearing, S. Mark

    Micromachined solid-oxide fuel cells (μSOFCs) are among a class of devices being investigated for portable power generation. Optimization of the performance and reliability of such devices requires robust, scale-dependent, design methodologies. In this first analysis, we consider the structural design of planar, electrolyte-supported, μSOFCs from the viewpoints of electrochemical performance, mechanical stability and reliability, and thermal behavior. The effect of electrolyte thickness on fuel cell performance is evaluated using a simple analytical model. Design diagrams that account explicitly for thermal and intrinsic residual stresses are presented to identify geometries that are resistant to fracture and buckling. Analysis of energy loss due to in-plane heat conduction highlights the importance of efficient thermal isolation in microscale fuel cell design.

  10. Batch settling curve registration via image data modeling.

    PubMed

    Derlon, Nicolas; Thürlimann, Christian; Dürrenmatt, David; Villez, Kris

    2017-05-01

To this day, obtaining reliable characterization of sludge settling properties remains a challenging and time-consuming task. Without such assessments, however, optimal design and operation of secondary settling tanks is challenging and conservative approaches will remain necessary. With this study, we show that automated sludge blanket height registration and zone settling velocity estimation is possible thanks to analysis of images taken during batch settling experiments. The experimental setup is particularly interesting for practical applications as it consists of off-the-shelf components only, no moving parts are required, and the software is released publicly. Furthermore, the proposed multivariate shape constrained spline model for image analysis appears to be a promising method for reliable sludge blanket height profile registration. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Empirical Recommendations for Improving the Stability of the Dot-Probe Task in Clinical Research

    PubMed Central

    Price, Rebecca B.; Kuckertz, Jennie M.; Siegle, Greg J.; Ladouceur, Cecile D.; Silk, Jennifer S.; Ryan, Neal D.; Dahl, Ronald E.; Amir, Nader

    2014-01-01

    The dot-probe task has been widely used in research to produce an index of biased attention based on reaction times (RTs). Despite its popularity, very few published studies have examined psychometric properties of the task, including test-retest reliability, and no previous study has examined reliability in clinically anxious samples or systematically explored the effects of task design and analysis decisions on reliability. In the current analysis, we utilized dot-probe data from three studies where attention bias towards threat-related faces was assessed at multiple (≥5) timepoints. Two of the studies were similar (adults with Social Anxiety Disorder, similar design features) while one was much more disparate (pediatric healthy volunteers, distinct task design). We explored the effects of analysis choices (e.g., bias score calculation formula, methods for outlier handling) on reliability and searched for convergence of findings across the three studies. We found that, when considering the three studies concurrently, the most reliable RT bias index utilized data from dot-bottom trials, comparing congruent to incongruent trials, with rescaled outliers, particularly after averaging across more than one assessment point. Although reliability of RT bias indices was moderate to low under most circumstances, within-session variability in bias (attention bias variability; ABV), a recently proposed RT index, was more reliable across sessions. Several eyetracking-based indices of attention bias (available in the pediatric healthy sample only) showed reliability that matched the optimal RT index (ABV). On the basis of these findings, we make specific recommendations to researchers using the dot probe, particularly those wishing to investigate individual differences and/or single-patient applications. PMID:25419646
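
    A hedged sketch of two of the RT indices discussed above: a congruent-minus-incongruent bias score and within-session attention bias variability (ABV). The bin count and normalization used for ABV here are assumptions, since published ABV definitions vary.

```python
import numpy as np

# Hedged sketch of dot-probe RT indices. Reaction times are simulated;
# the 5-bin split and mean-RT normalization for ABV are assumptions.

def bias_score(rt_incongruent, rt_congruent):
    # Positive values = faster on congruent trials (bias toward threat)
    return np.mean(rt_incongruent) - np.mean(rt_congruent)

def abv(rt_incongruent, rt_congruent, n_bins=5):
    # Variability of the bias score across successive trial bins,
    # normalized by the overall mean RT
    inc = np.array_split(rt_incongruent, n_bins)
    con = np.array_split(rt_congruent, n_bins)
    scores = [i.mean() - c.mean() for i, c in zip(inc, con)]
    all_rt = np.concatenate([rt_incongruent, rt_congruent])
    return np.std(scores) / all_rt.mean()

rng = np.random.default_rng(3)
rt_inc = rng.normal(520, 60, 40)   # toy reaction times (ms)
rt_con = rng.normal(500, 60, 40)
print(f"bias = {bias_score(rt_inc, rt_con):.1f} ms, ABV = {abv(rt_inc, rt_con):.4f}")
```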

  12. Automating Structural Analysis of Spacecraft Vehicles

    NASA Technical Reports Server (NTRS)

    Hrinda, Glenn A.

    2004-01-01

A major effort within NASA's vehicle analysis discipline has been to automate structural analysis and sizing optimization during conceptual design studies of advanced spacecraft. Traditional spacecraft structural sizing has involved detailed finite element analysis (FEA) requiring large degree-of-freedom (DOF) finite element models (FEM). Creation and analysis of these models can be time consuming and limit model size during conceptual designs. The goal is to find an optimal design that meets the mission requirements but produces the lightest structure. A structural sizing tool called HyperSizer has been successfully used in the conceptual design phase of a reusable launch vehicle and planetary exploration spacecraft. The program couples with FEA to enable system level performance assessments and weight predictions including design optimization of material selections and sizing of spacecraft members. The software's analysis capabilities are based on established aerospace structural methods for strength, stability and stiffness that produce adequately sized members and reliable structural weight estimates. The software also helps to identify potential structural deficiencies early in the conceptual design so changes can be made without wasted time. HyperSizer's automated analysis and sizing optimization increases productivity and brings standardization to a systems study. These benefits will be illustrated in examining two different types of conceptual spacecraft designed using the software. A hypersonic air breathing, single stage to orbit (SSTO), reusable launch vehicle (RLV) will be highlighted, as well as an aeroshell for a planetary exploration vehicle used for aerocapture at Mars. By showing the two different types of vehicles, the software's flexibility will be demonstrated with an emphasis on reducing aeroshell structural weight. Member sizes, concepts and material selections will be discussed, as well as the HyperSizer-based analysis methods used in optimizing the structure and the design trades required to minimize structural weight.

  13. Perceptual attraction in tool use: evidence for a reliability-based weighting mechanism.

    PubMed

    Debats, Nienke B; Ernst, Marc O; Heuer, Herbert

    2017-04-01

Humans are well able to operate tools whereby their hand movement is linked, via a kinematic transformation, to a spatially distant object moving in a separate plane of motion. An everyday example is controlling a cursor on a computer monitor. Despite these separate reference frames, the perceived positions of the hand and the object were found to be biased toward each other. We propose that this perceptual attraction is based on the principles by which the brain integrates redundant sensory information of single objects or events, known as optimal multisensory integration. That is, 1) sensory information about the hand and the tool are weighted according to their relative reliability (i.e., inverse variances), and 2) the unisensory reliabilities sum up in the integrated estimate. We assessed whether perceptual attraction is consistent with optimal multisensory integration model predictions. We used a cursor-control tool-use task in which we manipulated the relative reliability of the unisensory hand and cursor position estimates. The perceptual biases shifted according to these relative reliabilities, with an additional bias due to contextual factors that were present in experiment 1 but not in experiment 2. The biased position judgments' variances were, however, systematically larger than the predicted optimal variances. Our findings suggest that the perceptual attraction in tool use results from a reliability-based weighting mechanism similar to optimal multisensory integration, but that certain boundary conditions for optimality might not be satisfied. NEW & NOTEWORTHY Kinematic tool use is associated with a perceptual attraction between the spatially separated hand and the effective part of the tool. We provide a formal account for this phenomenon, thereby showing that the process behind it is similar to optimal integration of sensory information relating to single objects. Copyright © 2017 the American Physiological Society.
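
    The optimal-integration rule the authors test has a compact closed form: each estimate is weighted by its inverse variance, and the integrated variance is the inverse of the summed precisions. A minimal sketch with invented hand/cursor numbers:

```python
# Minimal sketch of reliability-weighted (optimal) integration of two
# position estimates. Weights are inverse variances; the integrated
# variance is below either unisensory variance. Numbers are invented.

def integrate(mu_hand, var_hand, mu_cursor, var_cursor):
    w_hand = (1 / var_hand) / (1 / var_hand + 1 / var_cursor)
    w_cursor = 1.0 - w_hand
    mu = w_hand * mu_hand + w_cursor * mu_cursor
    var = 1.0 / (1 / var_hand + 1 / var_cursor)
    return mu, var

# Toy numbers: an unreliable hand estimate and a sharper cursor estimate
mu, var = integrate(mu_hand=10.0, var_hand=4.0, mu_cursor=12.0, var_cursor=1.0)
print(f"integrated estimate = {mu:.2f}, variance = {var:.2f}")  # pulled toward cursor
```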

  14. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies, in the form of computational tools, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.

  15. Optimization of an electromagnetic linear actuator using a network and a finite element model

    NASA Astrophysics Data System (ADS)

    Neubert, Holger; Kamusella, Alfred; Lienig, Jens

    2011-03-01

Model based design optimization leads to robust solutions only if the statistical deviations of design, load and ambient parameters from nominal values are considered. We describe an optimization methodology that involves these deviations as stochastic variables for an exemplary electromagnetic actuator used to drive a Braille printer. A combined model simulates the dynamic behavior of the actuator and its non-linear load. It consists of a dynamic network model and a stationary magnetic finite element (FE) model. The network model utilizes lookup tables of the magnetic force and the flux linkage computed by the FE model. After a sensitivity analysis using design of experiment (DoE) methods and a nominal optimization based on gradient methods, a robust design optimization is performed. Selected design variables are involved in the form of their density functions. To reduce the computational effort, we use response surfaces in place of the combined system model in all stochastic analysis steps. Thus, Monte-Carlo simulations can be applied. As a result, we found an optimum system design meeting our requirements with regard to function and reliability.
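
    A minimal sketch of the Monte-Carlo-over-response-surface step described above, assuming an invented quadratic surface and made-up manufacturing tolerances; the actual actuator model and requirement values are not reproduced.

```python
import numpy as np

# Hedged sketch: once a cheap response surface (here an invented quadratic)
# replaces the coupled network/FE model, Monte-Carlo sampling over the
# stochastic design variables becomes inexpensive.

def response_surface(x1, x2):
    # Stand-in polynomial for, e.g., actuator stroke time
    return 3.0 + 0.8 * x1 - 1.2 * x2 + 0.5 * x1 * x2 + 0.3 * x1 ** 2

rng = np.random.default_rng(4)
n = 100_000
x1 = rng.normal(1.0, 0.05, n)     # design variable with manufacturing scatter
x2 = rng.normal(2.0, 0.10, n)

y = response_surface(x1, x2)
spec_limit = 2.7                  # invented requirement
print(f"mean = {y.mean():.3f}, std = {y.std():.3f}, "
      f"P(y < {spec_limit}) = {(y < spec_limit).mean():.4f}")
```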

  16. Optimal design of groundwater remediation system using a probabilistic multi-objective fast harmony search algorithm under uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun

    2014-11-01

    This study develops a new probabilistic multi-objective fast harmony search algorithm (PMOFHS) for optimal design of groundwater remediation systems under uncertainty associated with the hydraulic conductivity (K) of aquifers. The PMOFHS integrates the previously developed deterministic multi-objective optimization method, namely multi-objective fast harmony search algorithm (MOFHS) with a probabilistic sorting technique to search for Pareto-optimal solutions to multi-objective optimization problems in a noisy hydrogeological environment arising from insufficient K data. The PMOFHS is then coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, to identify the optimal design of groundwater remediation systems for a two-dimensional hypothetical test problem and a three-dimensional Indiana field application involving two objectives: (i) minimization of the total remediation cost through the engineering planning horizon, and (ii) minimization of the mass remaining in the aquifer at the end of the operational period, whereby the pump-and-treat (PAT) technology is used to clean up contaminated groundwater. Also, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology. Comprehensive analysis indicates that the proposed PMOFHS can find Pareto-optimal solutions with low variability and high reliability and is a potentially effective tool for optimizing multi-objective groundwater remediation problems under uncertainty.
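
    At the core of any Pareto-based search such as the PMOFHS is a dominance test over the two objectives (remediation cost, mass remaining). A minimal sketch with random toy designs:

```python
import numpy as np

# Minimal sketch of the Pareto-dominance filter: a design dominates
# another if it is no worse in both minimized objectives and strictly
# better in at least one. Candidate designs here are random toy points.

def pareto_front(points):
    """Return the non-dominated subset of an (n, 2) array of minimized objectives."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p) for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return points[keep]

rng = np.random.default_rng(5)
designs = rng.uniform(0, 1, (50, 2))   # columns: [remediation cost, mass remaining]
front = pareto_front(designs)
print(f"{len(front)} non-dominated designs out of {len(designs)}")
```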

  17. Ceramic Integration Technologies for Energy and Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Singh, Mrityunjay; Asthana, Ralph N.

    2007-01-01

    Robust and affordable integration technologies for advanced ceramics are required to improve the performance, reliability, efficiency, and durability of components, devices, and systems based on them in a wide variety of energy, aerospace, and environmental applications. Many thermochemical and thermomechanical factors including joint design, analysis, and optimization must be considered in integration of similar and dissimilar material systems.

  18. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    NASA Astrophysics Data System (ADS)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

Standalone photovoltaic (SAPV) systems are seen as a promising method of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple and reliable and that exhibits good performance over the system's lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at different array sizes (areas), performance curves are obtained for optimal design of the SAPV system with a high degree of reliability, in terms of autonomy, at a specified value of loss of load probability (LOLP). Based on the array to load ratio (ALR) and levelized energy cost (LEC) through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.
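
    A hedged sketch of how a loss-of-load probability (LOLP) can be simulated for one candidate SAPV design (array size plus storage); the daily energy-balance model and all system numbers are invented for illustration and are not the paper's empirical formulae.

```python
import numpy as np

# Hedged sketch: simulate the battery state of charge over a year of toy
# daily insolation and a fixed daily load, and count days the load cannot
# be served. All system numbers below are invented.

rng = np.random.default_rng(6)
days = 365
pv_yield = rng.normal(5.0, 1.5, days).clip(min=0)    # kWh/kWp/day, toy insolation
array_kwp, batt_kwh, load_kwh = 1.2, 6.0, 4.0        # candidate design and daily load

soc, unmet_days = batt_kwh, 0                        # start with a full battery
for e in pv_yield:
    soc = min(batt_kwh, soc + array_kwp * e)         # charge, clipped at capacity
    if soc >= load_kwh:
        soc -= load_kwh
    else:
        unmet_days += 1
        soc = 0.0

print(f"LOLP = {unmet_days / days:.3f}")
```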

  19. The WHOMEN’s Scale (Women’s HAART Optimism Monitoring and EvaluatioN Scale v.1) and the Association with Fertility Intentions and Sexual Behaviours Among HIV-Positive Women in Uganda

    PubMed Central

    Lima, Viviane Dias; Andia, Irene; Kabakyenga, Jerome; Mbabazi, Pamela; Emenyonu, Nneka; Patterson, Thomas L.; Hogg, Robert S.; Bangsberg, David R.

    2013-01-01

The objective of this study was to develop a reliable HAART optimism scale among HIV-positive women in Uganda and to test the scale’s validity against measures of fertility intentions, sexual activity, and unprotected sexual intercourse. We used cross-sectional survey data of 540 women (18–50 years) attending Mbarara University’s HIV clinic in Uganda. Women were asked how much they agreed or disagreed with 23 statements about HAART. Data were subjected to principal component and factor analyses. Subsequently, we tested the association between the scale and fertility intentions and sexual behaviour using the Wilcoxon rank sum test. Factor analysis yielded three factors, one of which was an eight-item HAART optimism scale with moderately high internal consistency (α = 0.70). Women who reported that they intended to have (more) children had significantly higher HAART optimism scores (median = 13.5 [IQR: 12–16]) than women who did not intend to have (more) children (median = 10.5 [IQR: 8–12]; P <0.0001). Similarly, women who were sexually active and who reported practicing unprotected sexual intercourse had significantly higher HAART optimism scores than women who were sexually abstinent or who practiced protected sexual intercourse. Our reliable and valid scale, termed the Women’s HAART Optimism Monitoring and EvaluatioN scale (WHOMEN’s scale), may be valuable to broader studies investigating the role of HAART optimism on reproductive intentions and sexual behaviours of HIV-positive women in high HIV prevalence settings. PMID:19387819

  20. Scaling Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin

    2016-01-01

For long-duration space missions outside of Earth orbit, reliability considerations will drive higher levels of redundancy and/or on-board spares for life support equipment. Component scaling will be a critical element in minimizing overall launch mass while maintaining an acceptable level of system reliability. Building on an earlier reliability study (AIAA 2012-3491), this paper considers the impact of alternative scaling approaches, including the design of technology assemblies and their individual components to maximum, nominal, survival, or other fractional requirements. The optimal level of life support system closure is evaluated for deep-space missions of varying duration using equivalent system mass (ESM) as the comparative basis. Reliability impacts are included in ESM by estimating the number of component spares required to meet a target system reliability. Common cause failures are included in the analysis. ISS and ISS-derived life support technologies are considered along with selected alternatives. This study focuses on minimizing launch mass, which may be enabling for deep-space missions.
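
    A minimal sketch of the spares-sizing step mentioned above, assuming component failures follow a Poisson process over the mission; the failure rate and reliability target are illustrative assumptions, not values from the study.

```python
from math import exp, factorial

# Hedged sketch: with random failures at rate lam (expected failures per
# mission), the probability that s spares cover all failures is a Poisson
# tail; take the smallest s meeting a target probability.

def spares_needed(lam: float, target: float) -> int:
    s = 0
    while sum(exp(-lam) * lam**k / factorial(k) for k in range(s + 1)) < target:
        s += 1
    return s

lam = 0.8          # assumed expected failures of one component over the mission
target = 0.999     # assumed required probability of having enough spares
print(f"spares required: {spares_needed(lam, target)}")
```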

  1. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI

    PubMed Central

    Churchill, Nathan W.; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C.

    2015-01-01

BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the “pipeline”) significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard “fixed” preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets. PMID:26161667

  2. The impact of symptom stability on time frame and recall reliability in CFS.

    PubMed

    Evans, Meredyth; Jason, Leonard A

This study is an investigation of the potential impact of perceived symptom stability on the recall reliability of symptom severity and frequency as reported by individuals with chronic fatigue syndrome (CFS). Symptoms were recalled using three different recall timeframes (the past week, the past month, and the past six months) and at two assessment points (with one week in between each assessment). Participants were 51 adults (45 women and 6 men), between the ages of 29 and 66, with a current diagnosis of CFS. Multilevel model (MLM) analyses were used to determine the optimal recall timeframe (in terms of test-retest reliability) for reporting symptoms perceived as variable and as stable over time. Headaches were recalled more reliably when they were reported as stable over time. Furthermore, the optimal timeframe in terms of test-retest reliability for stable symptoms was highly uniform, such that all Fukuda-defined CFS symptoms were more reliably recalled at the six-month timeframe. In contrast, the optimal timeframe for CFS symptoms perceived as variable differed across symptoms. Symptom stability and recall timeframe are important to consider in order to improve the accuracy and reliability of the current methods for diagnosing this illness.

  3. Probabilistic Design of a Plate-Like Wing to Meet Flutter and Strength Requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Krishnamurthy, T.; Mason, Brian H.; Smith, Steven A.; Naser, Ahmad S.

    2002-01-01

    An approach is presented for carrying out reliability-based design of a metallic, plate-like wing to meet strength and flutter requirements that are given in terms of risk/reliability. The design problem is to determine the thickness distribution such that wing weight is a minimum and the probability of failure is less than a specified value. Failure is assumed to occur if either the flutter speed is less than a specified allowable or the stress caused by a pressure loading is greater than a specified allowable. Four uncertain quantities are considered: wing thickness, calculated flutter speed, allowable stress, and magnitude of a uniform pressure load. The reliability-based design optimization approach described herein starts with a design obtained using conventional deterministic design optimization with margins on the allowables. Reliability is calculated using Monte Carlo simulation with response surfaces that provide values of stresses and flutter speed. During the reliability-based design optimization, the response surfaces and move limits are coordinated to ensure accuracy of the response surfaces. Studies carried out in the paper show the relationship between reliability and weight and indicate that, for the design problem considered, increases in reliability can be obtained with modest increases in weight.

  4. Probabilistic assessment of dynamic system performance. Part 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belhadj, Mohamed

    1993-01-01

Accurate prediction of dynamic system failure behavior can be important for the reliability and risk analyses of nuclear power plants, as well as for their backfitting to satisfy given constraints on overall system reliability, or optimization of system performance. Global analysis of dynamic systems through investigating the variations in the structure of the attractors of the system and the domains of attraction of these attractors as a function of the system parameters is also important for nuclear technology in order to understand the fault-tolerance as well as the safety margins of the system under consideration and to ensure a safe operation of nuclear reactors. Such a global analysis would be particularly relevant to future reactors with inherent or passive safety features that are expected to rely on natural phenomena rather than active components to achieve and maintain safe shutdown. Conventionally, failure and global analysis of dynamic systems necessitate the utilization of different methodologies which have computational limitations on the system size that can be handled. Using a Chapman-Kolmogorov interpretation of system dynamics, a theoretical basis is developed that unifies these methodologies as special cases and which can be used for a comprehensive safety and reliability analysis of dynamic systems.

  5. SuperSegger: robust image segmentation, analysis and lineage tracking of bacterial cells.

    PubMed

    Stylianidou, Stella; Brennan, Connor; Nissen, Silas B; Kuwada, Nathan J; Wiggins, Paul A

    2016-11-01

    Many quantitative cell biology questions require fast yet reliable automated image segmentation to identify and link cells from frame-to-frame, and characterize the cell morphology and fluorescence. We present SuperSegger, an automated MATLAB-based image processing package well-suited to quantitative analysis of high-throughput live-cell fluorescence microscopy of bacterial cells. SuperSegger incorporates machine-learning algorithms to optimize cellular boundaries and automated error resolution to reliably link cells from frame-to-frame. Unlike existing packages, it can reliably segment microcolonies with many cells, facilitating the analysis of cell-cycle dynamics in bacteria as well as cell-contact mediated phenomena. This package has a range of built-in capabilities for characterizing bacterial cells, including the identification of cell division events, mother, daughter and neighbouring cells, and computing statistics on cellular fluorescence, the location and intensity of fluorescent foci. SuperSegger provides a variety of postprocessing data visualization tools for single cell and population level analysis, such as histograms, kymographs, frame mosaics, movies and consensus images. Finally, we demonstrate the power of the package by analyzing lag phase growth with single cell resolution. © 2016 John Wiley & Sons Ltd.

  6. Alternative difference analysis scheme combining R -space EXAFS fit with global optimization XANES fit for X-ray transient absorption spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Fei; Tao, Ye; Zhao, Haifeng

Time-resolved X-ray absorption spectroscopy (TR-XAS), based on the laser-pump/X-ray-probe method, is powerful in capturing the change of the geometrical and electronic structure of the absorbing atom upon excitation. TR-XAS data analysis is generally performed on the laser-on minus laser-off difference spectrum. Here, a new analysis scheme is presented for the TR-XAS difference fitting in both the extended X-ray absorption fine-structure (EXAFS) and the X-ray absorption near-edge structure (XANES) regions. R-space EXAFS difference fitting could quickly provide the main quantitative structure change of the first shell. The XANES fitting part introduces a global non-derivative optimization algorithm and optimizes the local structure change in a flexible way where both the core XAS calculation package and the search method in the fitting shell are changeable. The scheme was applied to the TR-XAS difference analysis of the Fe(phen)3 spin crossover complex and yielded reliable distance change and excitation population.

  7. Alternative difference analysis scheme combining R-space EXAFS fit with global optimization XANES fit for X-ray transient absorption spectroscopy.

    PubMed

    Zhan, Fei; Tao, Ye; Zhao, Haifeng

    2017-07-01

Time-resolved X-ray absorption spectroscopy (TR-XAS), based on the laser-pump/X-ray-probe method, is powerful in capturing the change of the geometrical and electronic structure of the absorbing atom upon excitation. TR-XAS data analysis is generally performed on the laser-on minus laser-off difference spectrum. Here, a new analysis scheme is presented for the TR-XAS difference fitting in both the extended X-ray absorption fine-structure (EXAFS) and the X-ray absorption near-edge structure (XANES) regions. R-space EXAFS difference fitting could quickly provide the main quantitative structure change of the first shell. The XANES fitting part introduces a global non-derivative optimization algorithm and optimizes the local structure change in a flexible way where both the core XAS calculation package and the search method in the fitting shell are changeable. The scheme was applied to the TR-XAS difference analysis of the Fe(phen)3 spin crossover complex and yielded reliable distance change and excitation population.

  8. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    NASA Astrophysics Data System (ADS)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies like the blade-tip radial running clearance (BTRRC) of a gas turbine, a distributed collaborative probabilistic design method based on support vector machine of regression (SR), called DCSRM, is proposed by integrating the distributed collaborative response surface method with a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results yield the optimal static blade-tip clearance of the HPT for designing the BTRRC and improving the performance and reliability of the aeroengine. The comparison of methods shows that the DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.

  9. Optimization analysis of thermal management system for electric vehicle battery pack

    NASA Astrophysics Data System (ADS)

    Gong, Huiqi; Zheng, Minxin; Jin, Peng; Feng, Dong

    2018-04-01

Temperature rise in an electric vehicle battery pack can affect the cycle life, charging capability, power, energy, security and reliability of the power battery system. Computational Fluid Dynamics simulations and experiments on the charging and discharging process of the battery pack were carried out for the pack's thermal management system under continuous charging of the battery. The simulation results and the experimental data were used to verify the rationality of the Computational Fluid Dynamics calculation model. In view of the large temperature difference across the battery module in a high temperature environment, three optimization methods for the existing thermal management system of the battery pack are put forward: adjusting the installation position of the fan, optimizing the arrangement of the battery pack and reducing the fan opening temperature threshold. The feasibility of the optimization methods is proved by simulation and experiment on the thermal management system of the optimized battery pack.

  10. Validity of the posttraumatic stress disorders (PTSD) checklist in pregnant women.

    PubMed

    Gelaye, Bizu; Zheng, Yinnan; Medina-Mora, Maria Elena; Rondon, Marta B; Sánchez, Sixto E; Williams, Michelle A

    2017-05-12

The PTSD Checklist-civilian (PCL-C) is one of the most commonly used self-report measures of PTSD symptoms; however, little is known about its validity when used in pregnancy. This study aims to evaluate the reliability and validity of the PCL-C as a screen for detecting PTSD symptoms among pregnant women. A total of 3372 pregnant women who attended their first prenatal care visit in Lima, Peru participated in the study. We assessed the reliability of the PCL-C items using Cronbach's alpha. Criterion validity and performance characteristics of the PCL-C were assessed against an independent, blinded Clinician-Administered PTSD Scale (CAPS) interview using measures of sensitivity, specificity and receiver operating characteristic (ROC) curves. We tested construct validity using exploratory and confirmatory factor analytic approaches. The reliability of the PCL-C was excellent (Cronbach's alpha = 0.90). ROC analysis showed that a cut-off score of 26 offered optimal discriminatory power, with a sensitivity of 0.86 (95% CI: 0.78-0.92) and a specificity of 0.63 (95% CI: 0.62-0.65). The area under the ROC curve was 0.75 (95% CI: 0.71-0.78). A three-factor solution was extracted using exploratory factor analysis and was further complemented with three other models using confirmatory factor analysis (CFA). In a CFA, a three-factor model based on DSM-IV symptom structure had reasonable fit statistics with a comparative fit index of 0.86 and a root mean square error of approximation of 0.09. The Spanish-language version of the PCL-C may be used as a screening tool for pregnant women. The PCL-C has good reliability, criterion validity and factorial validity. The optimal cut-off score obtained by maximizing the sensitivity and specificity should be considered cautiously; women who screened positive may require further investigation to confirm PTSD diagnosis.
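
    A minimal sketch of how an "optimal" screening cut-off such as 26 is typically obtained, by maximizing Youden's J = sensitivity + specificity − 1 over candidate thresholds; the scores and diagnoses below are simulated, not study data.

```python
import numpy as np

# Minimal sketch of cut-off selection via Youden's J. Group means,
# spreads, and sample sizes are invented for illustration.

rng = np.random.default_rng(7)
scores = np.concatenate([rng.normal(22, 6, 900),    # toy non-PTSD group
                         rng.normal(34, 7, 100)])   # toy PTSD group
truth = np.concatenate([np.zeros(900, bool), np.ones(100, bool)])

best_cut, best_j = None, -1.0
for cut in range(int(scores.min()), int(scores.max()) + 1):
    pred = scores >= cut
    sens = (pred & truth).sum() / truth.sum()
    spec = (~pred & ~truth).sum() / (~truth).sum()
    if sens + spec - 1 > best_j:
        best_cut, best_j = cut, sens + spec - 1

print(f"optimal cut-off = {best_cut}, Youden J = {best_j:.2f}")
```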

  11. Automatic documentation system extension to multi-manufacturers' computers and to measure, improve, and predict software reliability

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.

    1975-01-01

    The DOMONIC system has been modified to run on the Univac 1108 and the CDC 6600 as well as the IBM 370 computer system. The DOMONIC monitor system has been implemented to gather data which can be used to optimize the DOMONIC system and to predict the reliability of software developed using DOMONIC. The areas of quality metrics, error characterization, program complexity, program testing, validation and verification are analyzed. A software reliability model for estimating program completion levels and one on which to base system acceptance have been developed. The DAVE system which performs flow analysis and error detection has been converted from the University of Colorado CDC 6400/6600 computer to the IBM 360/370 computer system for use with the DOMONIC system.

  12. Mixed Criticality Scheduling for Industrial Wireless Sensor Networks

    PubMed Central

    Jin, Xi; Xia, Changqing; Xu, Huiting; Wang, Jintao; Zeng, Peng

    2016-01-01

    Wireless sensor networks (WSNs) have been widely used in industrial systems. Their real-time performance and reliability are fundamental to industrial production. Many works have studied the two aspects, but only focus on single criticality WSNs. Mixed criticality requirements exist in many advanced applications in which different data flows have different levels of importance (or criticality). In this paper, first, we propose a scheduling algorithm, which guarantees the real-time performance and reliability requirements of data flows with different levels of criticality. The algorithm supports centralized optimization and adaptive adjustment. It is able to improve both the scheduling performance and flexibility. Then, we provide the schedulability test through rigorous theoretical analysis. We conduct extensive simulations, and the results demonstrate that the proposed scheduling algorithm and analysis significantly outperform existing ones. PMID:27589741

  13. A reliability design method for a lithium-ion battery pack considering the thermal disequilibrium in electric vehicles

    NASA Astrophysics Data System (ADS)

    Xia, Quan; Wang, Zili; Ren, Yi; Sun, Bo; Yang, Dezhen; Feng, Qiang

    2018-05-01

With the rapid development of lithium-ion battery technology in the electric vehicle (EV) industry, the lifetime of the battery cell increases substantially; however, the reliability of the battery pack is still inadequate. Because of the complexity of the battery pack, a reliability design method for a lithium-ion battery pack considering the thermal disequilibrium is proposed in this paper based on cell redundancy. Based on this method, a three-dimensional electric-thermal-flow-coupled model, a stochastic degradation model of cells under field dynamic conditions and a multi-state system reliability model of a battery pack are established. The relationships between the multi-physics coupling model, the degradation model and the system reliability model are first constructed to analyze the reliability of the battery pack, followed by analysis examples with different redundancy strategies. By comparing the reliability of battery packs with different redundant cell numbers and configurations, several conclusions for the redundancy strategy are obtained. More notably, the reliability does not monotonically increase with the number of redundant cells, because of the thermal disequilibrium effects. In this work, the 6 × 5 parallel-series configuration is found to be the optimal system structure. In addition, the effects of the cell arrangement and cooling conditions are investigated.

  14. Cascade Optimization Strategy for Aircraft and Air-Breathing Propulsion System Concepts

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Lavelle, Thomas M.; Hopkins, Dale A.; Coroneos, Rula M.

    1996-01-01

Design optimization for subsonic and supersonic aircraft and for air-breathing propulsion engine concepts has been accomplished by soft-coupling the Flight Optimization System (FLOPS) and the NASA Engine Performance Program analyzer (NEPP) to the NASA Lewis multidisciplinary optimization tool COMETBOARDS. Aircraft and engine design problems, with their associated constraints and design variables, were cast as nonlinear optimization problems with aircraft weight and engine thrust as the respective merit functions. Because of the diversity of constraint types and the overall distortion of the design space, the most reliable single optimization algorithm available in COMETBOARDS could not produce a satisfactory feasible optimum solution. Some of COMETBOARDS' unique features, which include a cascade strategy, variable and constraint formulations, and scaling devised especially for difficult multidisciplinary applications, successfully optimized the performance of both aircraft and engines. The cascade method has two principal steps: In the first, the solution initiates from a user-specified design and optimizer; in the second, the optimum design obtained in the first step, with some random perturbation, is used to begin the next specified optimizer. The second step is repeated for a specified sequence of optimizers or until a successful solution of the problem is achieved. A successful solution should satisfy the specified convergence criteria and have several active constraints but no violated constraints. The cascade strategy available in the combined COMETBOARDS, FLOPS, and NEPP design tool converges to the same global optimum solution even when it starts from different design points. This reliable and robust design tool eliminates manual intervention in the design of aircraft and of air-breathing propulsion engines where it eases the cycle analysis procedures. The combined code is also much easier to use, which is an added benefit. This paper describes COMETBOARDS and its cascade strategy and illustrates the capability of the combined design tool through the optimization of a subsonic aircraft and a high-bypass-turbofan wave-rotor-topped engine.
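
    A minimal sketch of the two-step cascade idea, assuming a toy objective and a generic optimizer sequence; COMETBOARDS' actual optimizers, scaling, and convergence checks are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch of a cascade strategy: run a sequence of optimizers,
# seeding each from the previous optimum plus a small random perturbation.
# The objective and optimizer list are toy stand-ins.

def objective(x):
    return (x[0] - 1) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2  # Rosenbrock-like

rng = np.random.default_rng(8)
x = np.array([-1.0, 2.0])                           # user-specified starting design
for i, method in enumerate(["Nelder-Mead", "Powell", "BFGS"]):
    if i > 0:
        x = x + 0.01 * rng.standard_normal(x.size)  # perturb previous optimum
    x = minimize(objective, x, method=method).x     # next optimizer in the cascade

print("cascade result:", np.round(x, 4))
```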

  15. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these constituents cannot be determined by conventional methods or, if methods are available, only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, and safety of reagents.

  16. Spanish version of the Mattis Dementia Rating Scale-2 for early detection of Alzheimer's disease and mild cognitive impairment.

    PubMed

    Boycheva, Elina; Contador, Israel; Fernández-Calvo, Bernardino; Ramos-Campos, Francisco; Puertas-Martín, Verónica; Villarejo-Galende, Alberto; Bermejo-Pareja, Félix

    2018-06-01

We aimed to analyse the clinical utility of the Mattis Dementia Rating Scale (MDRS-2) for early detection of Alzheimer's disease (AD) and amnestic mild cognitive impairment (MCI) in a sample of Spanish older adults. A total of 125 participants (age = 75.12 ± 6.83, years of education = 7.08 ± 3.57) were classified into three diagnostic groups: 45 patients with mild AD, 37 with amnestic MCI (single and multiple domain), and 43 cognitively healthy controls (HCs). Reliability, criterion validity and diagnostic accuracy of the MDRS-2 (total and subscales) were analysed. The MDRS-2 scores, adjusted by socio-demographic characteristics, were calculated through hierarchical multiple regression analysis. The global scale had adequate reliability (α = 0.736) and good criterion validity (r = 0.760, p < .001) with the Mini-Mental State Examination. The optimal cut-off point between AD patients and HCs was 124 (sensitivity [Se] = 97% and specificity [Sp] = 95%), whereas 131 (Se = 89%, Sp = 81%) was the optimal cut-off point between MCI and HCs. An optimal cut-off point of 123 had good Se (0.97), but poor Sp (0.56) to differentiate the AD and MCI groups. The Memory and Initiation/Perseveration subscales had the highest discriminative capacity between the groups. The MDRS-2 is a reliable and valid instrument for the assessment of cognitive impairment in Spanish older adults. In particular, optimal capacity emerged for the detection of early AD and MCI. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Structural design of high-performance capacitive accelerometers using parametric optimization with uncertainties

    NASA Astrophysics Data System (ADS)

    Teves, André da Costa; Lima, Cícero Ribeiro de; Passaro, Angelo; Silva, Emílio Carlos Nelli

    2017-03-01

    Electrostatic or capacitive accelerometers are among the highest volume microelectromechanical systems (MEMS) products nowadays. The design of such devices is a complex task, since they depend on many performance requirements, which are often conflicting. Therefore, optimization techniques are often used in the design stage of these MEMS devices. Because of problems with reliability, the technology of MEMS is not yet well established. Thus, in this work, size optimization is combined with the reliability-based design optimization (RBDO) method to improve the performance of accelerometers. To account for uncertainties in the dimensions and material properties of these devices, the first order reliability method is applied to calculate the probabilities involved in the RBDO formulation. Practical examples of bulk-type capacitive accelerometer designs are presented and discussed to evaluate the potential of the implemented RBDO solver.

  18. Structural design and analysis for an ultra low CTE optical bench for the Hubble Space Telescope corrective optics

    NASA Technical Reports Server (NTRS)

    Neam, Douglas C.; Gerber, John D.

    1992-01-01

The stringent stability requirements of the Corrective Optics Space Telescope Axial Replacement (COSTAR) necessitate a Deployable Optical Bench (DOB) with both a low CTE and a high resonant frequency. The DOB design consists of a monocoque thin shell structure which marries metallic machined parts with graphite epoxy formed structure. Structural analysis of the DOB has been integrated into the laminate design and optimization process. Also, the structural analytical results are compared with vibration and thermal test data to assess the reliability of the analysis.

  19. Assessment of Lower Limb Muscle Strength and Power Using Hand-Held and Fixed Dynamometry: A Reliability and Validity Study

    PubMed Central

    Perraton, Luke G.; Bower, Kelly J.; Adair, Brooke; Pua, Yong-Hao; Williams, Gavin P.; McGaw, Rebekah

    2015-01-01

Introduction Hand-held dynamometry (HHD) has never previously been used to examine isometric muscle power. Rate of force development (RFD) is often used for muscle power assessment; however, no consensus currently exists on the most appropriate method of calculation. The aim of this study was to examine the reliability of different algorithms for RFD calculation and to examine the intra-rater, inter-rater, and inter-device reliability of HHD as well as the concurrent validity of HHD for the assessment of isometric lower limb muscle strength and power. Methods 30 healthy young adults (age: 23 ± 5 yrs, male: 15) were assessed over two sessions. Isometric muscle strength and power were measured using peak force and RFD respectively using two HHDs (Lafayette Model-01165 and Hoggan microFET2) and a criterion-reference KinCom dynamometer. Statistical analysis of reliability and validity comprised intraclass correlation coefficients (ICC), Pearson correlations, concordance correlations, standard error of measurement, and minimal detectable change. Results Comparison of RFD methods revealed that a peak 200 ms moving window algorithm provided optimal reliability results. Intra-rater, inter-rater, and inter-device reliability analysis of peak force and RFD revealed mostly good to excellent reliability (coefficients ≥ 0.70) for all muscle groups. Concurrent validity analysis showed moderate to excellent relationships between HHD and fixed dynamometry for the hip and knee (ICCs ≥ 0.70) for both peak force and RFD, with mostly poor to good results shown for the ankle muscles (ICCs = 0.31–0.79). Conclusions Hand-held dynamometry has good to excellent reliability and validity for most measures of isometric lower limb strength and power in a healthy population, particularly for proximal muscle groups. To aid implementation we have created freely available software to extract these variables from data stored on the Lafayette device. Future research should examine the reliability and validity of these variables in clinical populations. PMID:26509265
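
    A minimal sketch of the peak 200 ms moving-window RFD calculation that the study found most reliable, using a simulated force trace; the sampling rate and trace shape are assumptions.

```python
import numpy as np

# Hedged sketch: slide a 200 ms window along a force-time trace and take
# the largest mean slope as the peak rate of force development (RFD).

def peak_rfd(force: np.ndarray, fs: float, window_s: float = 0.2) -> float:
    w = int(window_s * fs)                        # window length in samples
    slopes = (force[w:] - force[:-w]) / window_s  # mean slope of every window (N/s)
    return slopes.max()

fs = 1000.0                                       # assumed sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
force = 600 / (1 + np.exp(-8 * (t - 0.5)))        # toy sigmoidal rise to 600 N
print(f"peak RFD = {peak_rfd(force, fs):.0f} N/s")
```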

  20. Analyzing Responses of Chemical Sensor Arrays

    NASA Technical Reports Server (NTRS)

    Zhou, Hanying

    2007-01-01

NASA is developing a third-generation electronic nose (ENose) capable of continuous monitoring of the International Space Station's cabin atmosphere for specific, harmful airborne contaminants. Previous generations of the ENose have been described in prior NASA Tech Briefs issues. Sensor selection is critical in both (prefabrication) sensor material selection and (post-fabrication) data analysis of the ENose, which detects several analytes that are difficult to detect, or that are at very low concentration ranges. Existing sensor selection approaches usually include limited statistical measures, where selectivity is more important but reliability and sensitivity are not of concern. When reliability and sensitivity can be major limiting factors in detecting target compounds reliably, the existing approach is not able to provide meaningful selection that will actually improve data analysis results. The approach and software reported here consider more statistical measures (factors) than existing approaches for a similar purpose. The result is a more balanced and robust sensor selection from a less than ideal sensor array. The software offers quick, flexible, optimal sensor selection and weighting for a variety of purposes without a time-consuming, iterative search by performing sensor calibrations to a known linear or nonlinear model, evaluating the individual sensor's statistics, scoring the individual sensor's overall performance, finding the best sensor array size to maximize class separation, finding optimal weights for the remaining sensor array, estimating limits of detection for the target compounds, evaluating fingerprint distance between group pairs, and finding the best event-detecting sensors.

  1. Basic Principles of Electrical Network Reliability Optimization in Liberalised Electricity Market

    NASA Astrophysics Data System (ADS)

    Oleinikova, I.; Krishans, Z.; Mutule, A.

    2008-01-01

    The authors propose to select long-term solutions to the reliability problems of electrical networks at the stage of development planning. The guidelines or basic principles of such optimization are: 1) its dynamic nature; 2) development sustainability; 3) integrated solution of the problems of network development and electricity supply reliability; 4) consideration of information uncertainty; 5) concurrent consideration of the network and generation development problems; 6) application of specialized information technologies; 7) definition of requirements for independent electricity producers. In the article, the major aspects of the liberalized electricity market and its functions and tasks are reviewed, with emphasis placed on the optimization of electrical network development as a significant component of sustainable management of power systems.

  2. A Near-Term, High-Confidence Heavy Lift Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Rothschild, William J.; Talay, Theodore A.

    2009-01-01

    The use of well-understood legacy elements of the Space Shuttle system could yield a near-term, high-confidence Heavy Lift Launch Vehicle that offers significant performance, reliability, schedule, risk, cost, and work force transition benefits. A side-mount Shuttle-Derived Vehicle (SDV) concept has been defined that has major improvements over previous Shuttle-C concepts. This SDV is shown to carry crew plus large logistics payloads to the ISS, support an operationally efficient and cost-effective program of lunar exploration, and offer the potential to support commercial launch operations. This paper provides the latest data and estimates on the configurations, performance, concept of operations, reliability and safety, development schedule, risks, costs, and work force transition opportunities for this optimized side-mount SDV concept. The results presented in this paper have been based on established models and fully validated analysis tools used by the Space Shuttle Program, and are consistent with similar analysis tools commonly used throughout the aerospace industry. While these results serve as a factual basis for comparisons with other launch system architectures, no such comparisons are presented in this paper. The authors welcome comparisons between this optimized SDV and other Heavy Lift Launch Vehicle concepts.

  3. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluation and Hessian calculation of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
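
    The symmetric rank-one (SR1) update referenced above avoids recomputing the full Hessian of the performance function at every step. A minimal sketch of the standard SR1 formula with the usual skip safeguard; the quadratic test function is illustrative, not one of the paper's examples.

    ```python
    import numpy as np

    def sr1_update(B, s, y, tol=1e-8):
        """Symmetric rank-one update of Hessian approximation B.

        s = x_new - x_old (step), y = grad_new - grad_old (gradient change).
        Skips the update when the denominator is too small (standard safeguard).
        """
        r = y - B @ s
        denom = r @ s
        if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
            return B                      # keep previous approximation
        return B + np.outer(r, r) / denom

    # Example: two gradients of g(x) = x1^2 + 2*x2^2 (true Hessian diag(2, 4))
    grad = lambda x: np.array([2 * x[0], 4 * x[1]])
    x0, x1 = np.array([1.0, 1.0]), np.array([0.5, 0.2])
    B = sr1_update(np.eye(2), x1 - x0, grad(x1) - grad(x0))
    print(B)          # satisfies the secant condition B @ (x1 - x0) = y
    ```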

  4. Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm.

    PubMed

    Pickett, Stephen D; Green, Darren V S; Hunt, David L; Pardoe, David A; Hughes, Ian

    2011-01-13

    Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure-activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods.
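
    A genetic algorithm that tolerates missing assay data can simply exclude failed compounds from selection rather than assigning them a penalty score, and resubmit genomes for retest when too few results return. A toy sketch of one generation under that policy, with bit-string genomes and a made-up fitness table; this is not the authors' implementation.

    ```python
    import random

    def ga_step(population, fitness, n_keep=4, p_mut=0.1):
        """One generation of a simple GA over bit-string 'compounds'.

        fitness maps genome -> activity, or None when synthesis/testing
        failed; failed genomes are dropped from selection, not penalised.
        """
        scored = [(g, fitness[g]) for g in population
                  if fitness.get(g) is not None]
        scored.sort(key=lambda gf: gf[1], reverse=True)
        parents = [g for g, _ in scored[:n_keep]]
        if len(parents) < 2:
            return population       # too few valid results; retest this batch
        children = []
        while len(children) < len(population):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]                      # one-point crossover
            child = "".join(c if random.random() > p_mut   # bit-flip mutation
                            else str(1 - int(c)) for c in child)
            children.append(child)
        return children

    pop = ["".join(random.choice("01") for _ in range(8)) for _ in range(8)]
    fit = {g: (None if random.random() < 0.2 else g.count("1")) for g in pop}
    print(ga_step(pop, fit))   # ~20% of 'compounds' failed synthesis
    ```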

  5. Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm

    PubMed Central

    2010-01-01

    Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure-activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods. PMID:24900251

  6. Development of the hard and soft constraints based optimisation model for unit sizing of the hybrid renewable energy system designed for microgrid applications

    NASA Astrophysics Data System (ADS)

    Sundaramoorthy, Kumaravel

    2017-02-01

    Hybrid energy system (HES)-based electricity generation has become an attractive solution for rural electrification. Economically feasible and technically reliable HESs rest on a sound optimisation stage. This article discusses an optimal unit sizing model whose objective function minimises the total cost of the HES. Three typical rural sites from the southern part of India have been selected for the application of the developed optimisation methodology. Feasibility studies and sensitivity analysis on the optimal HES are discussed in detail in this article. A comparison has been carried out with the Hybrid Optimization Model for Electric Renewables (HOMER) for the three sites. The optimal HES is found to have a lower total net present cost and cost of energy than the existing method.

  7. Distributed collaborative probabilistic design of multi-failure structure with fluid-structure interaction using fuzzy neural network of regression

    NASA Astrophysics Data System (ADS)

    Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen

    2018-05-01

    To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on fuzzy neural network regression (called DCFRM) is proposed with the integration of the distributed collaborative response surface method and the fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea with DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure, and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. The comparison of methods shows that DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby enriches the theory and method of mechanical reliability design.

  8. [Orthogonal experiment using SFE-CO2 in extraction of essential oil from fresh Houttuynia cordata and analysis of essential oil by GC-MS].

    PubMed

    Meng, Jiang; Dong, Xiao-ping; Zhou, Yi-sheng; Jiang, Zhi-hong; Leung, Kelvin Sze-Yin; Zhao, Zhong-zhen

    2007-02-01

    To optimize the extraction procedure of essential oil from H. cordata using SFE-CO2 and analyze the chemical composition of the essential oil, the extraction procedure of essential oil from fresh H. cordata was optimized with an orthogonal experiment, and the essential oil was analysed by GC-MS. The optimized preparative procedure was as follows: essential oil of H. cordata was extracted at a temperature of 35 degrees C and a pressure of 15,000 kPa for 20 min. Thirty-eight chemical components were identified and their relative contents were quantified. The optimized preparative procedure is reliable and can guarantee the quality of the essential oil.

  9. Optimal service distribution in WSN service system subject to data security constraints.

    PubMed

    Wu, Zhao; Xiong, Naixue; Huang, Yannong; Gu, Qiong

    2014-08-04

    Services composition technology provides a flexible approach to building Wireless Sensor Network (WSN) Service Applications (WSA) in a service-oriented tasking system for WSN. Maintaining the data security of WSA is one of the most important goals in sensor network research. In this paper, we consider a WSN service-oriented tasking system in which the WSN Services Broker (WSB), as the resource management center, can map the service request from the user into a set of atom-services (AS) and send them to some independent sensor nodes (SN) for parallel execution. The distribution of ASs among these SNs affects the data security as well as the reliability and performance of WSA because these SNs can be of different and independent specifications. By the optimal partition of the service into ASs and their distribution among SNs, the WSB can provide the maximum possible service reliability and/or expected performance subject to data security constraints. This paper proposes an algorithm for optimal service partition and distribution based on the universal generating function (UGF) and the genetic algorithm (GA) approach. An experimental analysis is presented to demonstrate the feasibility of the suggested algorithm.
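
    The universal generating function (UGF) at the core of the proposed algorithm represents each node as a discrete performance distribution and composes nodes with a structure operator. A toy sketch, assuming two nodes whose throughputs add in parallel; the dict-based representation and all numbers are illustrative.

    ```python
    from itertools import product

    def compose(u1, u2, op):
        """Compose two u-functions (dicts perf -> prob) with operator op."""
        out = {}
        for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
            g = op(g1, g2)
            out[g] = out.get(g, 0.0) + p1 * p2
        return out

    # Two sensor nodes executing atom-services in parallel: performance adds.
    node1 = {0: 0.1, 5: 0.9}   # fails w.p. 0.1, else processes 5 tasks/s
    node2 = {0: 0.2, 3: 0.8}
    system = compose(node1, node2, lambda a, b: a + b)
    demand = 5
    reliability = sum(p for g, p in system.items() if g >= demand)
    print(system, "P(perf >= 5) =", round(reliability, 3))   # 0.9
    ```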

  10. Optimal Service Distribution in WSN Service System Subject to Data Security Constraints

    PubMed Central

    Wu, Zhao; Xiong, Naixue; Huang, Yannong; Gu, Qiong

    2014-01-01

    Services composition technology provides a flexible approach to building Wireless Sensor Network (WSN) Service Applications (WSA) in a service-oriented tasking system for WSN. Maintaining the data security of WSA is one of the most important goals in sensor network research. In this paper, we consider a WSN service-oriented tasking system in which the WSN Services Broker (WSB), as the resource management center, can map the service request from the user into a set of atom-services (AS) and send them to some independent sensor nodes (SN) for parallel execution. The distribution of ASs among these SNs affects the data security as well as the reliability and performance of WSA because these SNs can be of different and independent specifications. By the optimal partition of the service into ASs and their distribution among SNs, the WSB can provide the maximum possible service reliability and/or expected performance subject to data security constraints. This paper proposes an algorithm for optimal service partition and distribution based on the universal generating function (UGF) and the genetic algorithm (GA) approach. An experimental analysis is presented to demonstrate the feasibility of the suggested algorithm. PMID:25093346

  11. Exploration of multi-fold symmetry element-loaded superconducting radio frequency structure for reliable acceleration of low- & medium-beta ion species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Shichun; Geng, Rongli

    2015-09-01

    Reliable acceleration of low- to medium-beta proton or heavy ion species is needed for future high-current superconducting radio frequency (SRF) accelerators. Due to the high-Q nature of an SRF resonator, it is sensitive to many factors such as electron loading (from either the accelerated beam or from parasitic field-emitted electrons), mechanical vibration, and liquid helium bath pressure fluctuation, etc. To increase the stability against those factors, a mechanically strong and stable RF structure is desirable. Guided by this consideration, multi-fold symmetry element-loaded SRF structures (MFSEL), cylindrical tanks with multiple (n>=3) rod-shaped radial elements, are being explored. The top optimization goal is to improve mechanical stability. A natural consequence of this structure is a lowered ratio of the peak surface electromagnetic field to the acceleration gradient as compared to the traditional spoke cavity. A disadvantage of this new structure is an increased size for a fixed resonant frequency and optimal beta. This paper describes the optimization of the electromagnetic (EM) design and preliminary mechanical analysis for such structures.

  12. Precursor Report of Data Needs and Recommended Practices for PV Plant Availability Operations and Maintenance Reporting.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, Roger R.; Klise, Geoffrey Taylor; Balfour, John R.

    Characterizing the factors that affect reliability of a photovoltaic (PV) power plant is an important aspect of optimal asset management. This document describes the many factors that affect operation and maintenance (O&M) of a PV plant, identifies the data necessary to quantify those factors, and describes how data might be used by O&M service providers and others in the PV industry. This document lays out data needs from perspectives of reliability, availability, and key performance indicators and is intended to be a precursor for standardizing terminology and data reporting, which will improve data sharing, analysis, and ultimately PV plant performance.

  13. Mechanically and optically reliable folding structure with a hyperelastic material for seamless foldable displays

    NASA Astrophysics Data System (ADS)

    Kwon, Hyuk-Jun; Shim, HongShik; Kim, Sunkook; Choi, Woong; Chun, Youngtea; Kee, InSeo; Lee, SangYoon

    2011-04-01

    We report a mechanically and optically robust folding structure to realize a foldable active matrix organic-light-emitting-diode (AMOLED) display without a visible crease at the junction. A nonlinear stress analysis, based on a finite element method, provided an optimized design. The folding-unfolding test on the structure exhibited negligible deterioration of the relative brightness at the junction of the individual panels up to 10^5 cycles at a folding radius of 1 mm, indicating highly reliable mechanical and optical tolerances. These results demonstrate the feasibility of seamless foldable AMOLED displays, with potentially important technical implications for fabricating large-size flexible displays.

  14. Reliability Assessment of a Robust Design Under Uncertainty for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J. -W.; Newman, Perry A.

    2003-01-01

    The paper presents reliability assessment results for the robust designs under uncertainty of a 3-D flexible wing previously reported by the authors. Reliability assessments (additional optimization problems) of the active constraints at the various probabilistic robust design points are obtained and compared with the constraint values or target constraint probabilities specified in the robust design. In addition, reliability-based sensitivity derivatives with respect to design variable mean values are also obtained and shown to agree with finite difference values. These derivatives allow one to perform reliability-based design without having to obtain second-order sensitivity derivatives. However, an inner-loop optimization problem must be solved for each active constraint to find the most probable point on that constraint failure surface.
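
    The inner-loop search for the most probable point is commonly carried out with the Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration in standard normal space; a sketch with an illustrative linear limit state (the constraints in the paper are of course more complex).

    ```python
    import numpy as np

    def mpp_hlrf(g, grad_g, u0, tol=1e-6, max_iter=100):
        """Find the most probable point of g(u) = 0 in standard normal space.

        Classic HL-RF step: u_new = [(grad.u - g) / ||grad||^2] * grad.
        Returns the MPP and the reliability index beta = ||u*||.
        """
        u = np.asarray(u0, dtype=float)
        for _ in range(max_iter):
            gv, gr = g(u), grad_g(u)
            u_new = (gr @ u - gv) / (gr @ gr) * gr
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        return u, np.linalg.norm(u)

    # Illustrative limit state g(u) = 3 - u1 - u2 (linear: beta = 3/sqrt(2))
    g = lambda u: 3.0 - u[0] - u[1]
    grad = lambda u: np.array([-1.0, -1.0])
    u_star, beta = mpp_hlrf(g, grad, [0.0, 0.0])
    print("MPP:", u_star, "beta:", beta)
    ```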

  15. Optimized Biasing of Pump Laser Diodes in a Highly Reliable Metrology Source for Long-Duration Space Missions

    NASA Technical Reports Server (NTRS)

    Poberezhskiy, Ilya Y; Chang, Daniel H.; Erlig, Herman

    2011-01-01

    Optical metrology system reliability during a prolonged space mission is often limited by the reliability of the pump laser diodes. We developed a metrology laser pump module architecture that meets NASA SIM Lite instrument optical power and reliability requirements by combining the outputs of multiple single-mode pump diodes in a low-loss, high-port-count fiber coupler. We describe Monte Carlo simulations used to calculate the reliability of the laser pump module and introduce a combined laser farm aging parameter that serves as a load-sharing optimization metric. Employing these tools, we select the pump module architecture, operating conditions, and biasing approach, and perform parameter sensitivity studies to investigate the robustness of the obtained solution.
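
    The Monte Carlo reliability calculation for a module that combines several diodes in a fiber coupler can be sketched as a k-out-of-n survival estimate. The Weibull lifetime parameters, diode count, and mission length below are illustrative assumptions, not SIM Lite values.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def pump_module_reliability(n_diodes=6, k_required=4, mission_hours=87600,
                                shape=1.5, scale=3.0e5, n_trials=100_000):
        """Monte Carlo estimate of P(at least k of n diodes survive)."""
        # Sample Weibull lifetimes for every diode in every trial
        lifetimes = scale * rng.weibull(shape, size=(n_trials, n_diodes))
        alive = (lifetimes > mission_hours).sum(axis=1)   # survivors per trial
        return (alive >= k_required).mean()

    print(f"module reliability: {pump_module_reliability():.4f}")
    ```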

  16. Distribution System Reliability Analysis for Smart Grid Applications

    NASA Astrophysics Data System (ADS)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions of dollars in repairs and losses. To address these reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications in improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution software package developed by General Reliability Co.
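
    SAIFI and SAIDI, two of the indices used as measures here, have simple standard definitions; a sketch with invented outage events (EUE additionally requires load data, so it is omitted).

    ```python
    def reliability_indices(events, n_customers):
        """Compute SAIFI, SAIDI, CAIDI from interruption events.

        events: list of (customers_interrupted, duration_hours) tuples.
        SAIFI = total customer interruptions / customers served
        SAIDI = total customer-hours interrupted / customers served
        """
        ci = sum(n for n, _ in events)
        cmi = sum(n * d for n, d in events)
        saifi = ci / n_customers
        saidi = cmi / n_customers
        caidi = saidi / saifi if saifi else 0.0   # avg duration per interruption
        return saifi, saidi, caidi

    # Feeder with 1000 customers and three outage events in a year
    events = [(200, 1.5), (50, 4.0), (1000, 0.5)]
    saifi, saidi, caidi = reliability_indices(events, 1000)
    print(f"SAIFI={saifi:.2f}/yr, SAIDI={saidi:.2f} h/yr, CAIDI={caidi:.2f} h")
    ```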

  17. Determining the psychometric properties of the Enhancing Decision-making Assessment in Midwifery (EDAM) measure in a cross cultural context.

    PubMed

    Jefford, Elaine; Jomeen, Julie; Martin, Colin R

    2016-04-28

    The ability to act on and justify clinical decisions as autonomous, accountable midwifery practitioners is encompassed within many international regulatory frameworks, yet decision-making within midwifery is poorly defined. Decision-making theories from medicine and nursing may have something to offer, but fail to take into consideration midwifery context and philosophy and the decisional autonomy of women. Using an underpinning qualitative methodology, a decision-making framework was developed which identified Good Clinical Reasoning and Good Midwifery Practice as two conditions necessary to facilitate optimal midwifery decision-making during the 2nd stage of labour. This study aims to confirm the robustness of the framework and describe the development of Enhancing Decision-making Assessment in Midwifery (EDAM) as a measurement tool through testing of its factor structure, validity, and reliability. A cross-sectional design was used for instrument development, with a 2 (country: Australia/UK) x 2 (decision-making: optimal/sub-optimal) between-subjects design for instrument evaluation using exploratory and confirmatory factor analysis, internal consistency, and known-groups validity. Two 'expert' maternity panels, based in Australia and the UK and comprising 42 participants, assessed 16 real midwifery care episode vignettes using the empirically derived 26-item framework. Each item was answered on a 5-point Likert scale based on the level of agreement to which the participant felt each item was present in each of the vignettes. Participants were then asked to rate the overall decision-making (optimal/sub-optimal). After factor analysis, the framework was reduced to a 19-item EDAM measure and confirmed as two distinct scales of 'Clinical Reasoning' (CR) and 'Midwifery Practice' (MP). The CR scale comprised two subscales: 'the clinical reasoning process' and 'integration and intervention'. The MP scale also comprised two subscales: 'women's relationship with the midwife' and 'general midwifery practice'. EDAM would generally appear to be a robust, valid, and reliable psychometric instrument for measuring midwifery decision-making, which performs consistently across differing international contexts. The 'women's relationship with the midwife' subscale marginally failed to meet the threshold for determining good instrument reliability, which may be due to its brevity. Further research using larger samples and in a wider international context, to confirm the veracity of the instrument's measurement properties and its wider global utility, would be advantageous.

  18. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    NASA Astrophysics Data System (ADS)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods to improve the reliability of a system, but mutual coupling of multiple factors is often involved in the design. In this study, the Direct Search Method is introduced into the optimum redundancy configuration for design optimization, in which reliability, cost, structural weight, and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable, and is applicable to the redundancy configuration and optimization of various designs upon appropriate modification. The method thus has good practical value.
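
    A direct search over integer redundancy levels can be sketched as a greedy coordinate search that accepts any neighboring allocation improving system reliability within a cost budget. The three-subsystem series model and all numbers are illustrative, not the paper's fuze model.

    ```python
    def system_metrics(alloc, p_unit=(0.90, 0.85, 0.95), cost=(2.0, 1.5, 3.0)):
        """Series system of three subsystems; alloc[i] parallel units each."""
        rel = 1.0
        for n, p in zip(alloc, p_unit):
            rel *= 1 - (1 - p) ** n     # parallel redundancy within a subsystem
        return rel, sum(n * c for n, c in zip(alloc, cost))

    def direct_search(alloc, budget=20.0):
        """Greedy coordinate search: try +/-1 unit per subsystem, keep gains."""
        best_rel, _ = system_metrics(alloc)
        improved = True
        while improved:
            improved = False
            for i in range(len(alloc)):
                for step in (1, -1):
                    trial = list(alloc)
                    trial[i] += step
                    if trial[i] < 1:
                        continue
                    rel, c = system_metrics(trial)
                    if c <= budget and rel > best_rel:
                        alloc, best_rel, improved = trial, rel, True
        return alloc, best_rel

    alloc, rel = direct_search([1, 1, 1])
    print("allocation:", alloc, "system reliability:", round(rel, 4))
    ```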

  19. Building model analysis applications with the Joint Universal Parameter IdenTification and Evaluation of Reliability (JUPITER) API

    USGS Publications Warehouse

    Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.

    2008-01-01

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.

  20. New technologies for advanced three-dimensional optimum shape design in aeronautics

    NASA Astrophysics Data System (ADS)

    Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno

    1999-05-01

    The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes into a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are the gradient-based ones, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are contributing to making such an ambitious project, of including a state-of-the-art flow analysis code into an optimisation loop, feasible. Among those technologies, there are three important issues that this paper wishes to address: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.

  1. Non-equilibrium responses of PFPE lubricants with various atomistic/molecular architecture at elevated temperature

    NASA Astrophysics Data System (ADS)

    Chung, Pil Seung; Song, Wonyup; Biegler, Lorenz T.; Jhon, Myung S.

    2017-05-01

    During the operation of a hard disk drive (HDD), the perfluoropolyether (PFPE) lubricant experiences elastic or viscous shear/elongation deformations, which affect the performance and reliability of the HDD. Therefore, the viscoelastic responses of PFPE could provide a fingerprint analysis for designing the optimal molecular architecture of lubricants to control tribological phenomena. In this paper, we examine the rheological responses of PFPEs, including the storage (elastic) and loss (viscous) moduli (G' and G''), by monitoring the time-dependent stress-strain relationship via non-equilibrium molecular dynamics simulations. We analyzed the rheological responses using the Cox-Merz rule, and investigated the molecular structural and thermal effects on the solid-like and liquid-like behaviors of PFPEs. The temperature dependence of the endgroup agglomeration phenomena was examined, where the functional endgroups are decoupled as the temperature increases. By analyzing the relaxation processes, the molecular rheological studies will provide optimal lubricant selection criteria to enhance HDD performance and reliability for heat-assisted magnetic recording applications.
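
    The storage and loss moduli are recoverable from the steady oscillatory response: for an imposed strain gamma0*sin(wt), the shear stress follows sigma(t) = gamma0*[G'*sin(wt) + G''*cos(wt)]. A least-squares sketch on a synthetic signal; the moduli and noise level are illustrative, not PFPE values.

    ```python
    import numpy as np

    # Synthetic oscillatory shear: strain gamma(t) = gamma0 * sin(w t)
    gamma0, w = 0.01, 10.0                   # strain amplitude, rad/s
    t = np.linspace(0, 5, 5000)
    G_store, G_loss = 2.0e4, 5.0e3           # "true" moduli (Pa) for the test
    stress = gamma0 * (G_store * np.sin(w * t) + G_loss * np.cos(w * t))
    stress += np.random.normal(0, 5.0, t.size)       # measurement noise

    # Linear least squares on in-phase and out-of-phase basis functions
    A = gamma0 * np.column_stack([np.sin(w * t), np.cos(w * t)])
    (G1, G2), *_ = np.linalg.lstsq(A, stress, rcond=None)
    print(f"G' = {G1:.3e} Pa (elastic), G'' = {G2:.3e} Pa (viscous)")
    print(f"loss tangent tan(delta) = {G2 / G1:.3f}")
    ```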

  2. An expert system for integrated structural analysis and design optimization for aerospace structures

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce time to completion of structural design. An extensive literature survey in the field of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and the conceptual design for the integrated 'intelligent' structural analysis and design optimization software was then developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness for knowledge representation (especially for structural analysis and design applications), provide the ability to build very large and practical expert systems, and provide an efficient way for storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of modules of expert systems for a variety of modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies, used by a range of engineers with different levels of background and expertise. Based on the feedback obtained by such users, conclusions were developed and are provided.

  3. An expert system for integrated structural analysis and design optimization for aerospace structures

    NASA Astrophysics Data System (ADS)

    1992-04-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce time to completion of structural design. An extensive literature survey in the field of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and the conceptual design for the integrated 'intelligent' structural analysis and design optimization software was then developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness for knowledge representation (especially for structural analysis and design applications), provide the ability to build very large and practical expert systems, and provide an efficient way for storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of modules of expert systems for a variety of modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies, used by a range of engineers with different levels of background and expertise. Based on the feedback obtained by such users, conclusions were developed and are provided.

  4. Validation of the Childbirth Attitudes Questionnaire in Greek and psychosocial characteristics of pregnant women with fear of childbirth.

    PubMed

    Gourounti, Kleanthi; Kouklaki, Eleni; Lykeridou, Katerina

    2015-09-01

    Fear of childbirth has been found to be a factor that influences women's decision about their choice of delivery. The investigation of psychological factors that relate to fear of childbirth, as well as the validation of the Childbirth Attitudes Questionnaire (CAQ), can help identify pregnant women with high fear of childbirth. This study aimed to (a) translate and validate the Childbirth Attitudes Questionnaire in Greek and (b) explore how fear of childbirth was related to childbirth self-efficacy, state anxiety, depression, neuroticism, self-esteem and optimism. 145 nulliparous pregnant women in late pregnancy, attending a routine antenatal visit in a private maternity clinic, participated. All participants completed the CAQ (fear of childbirth) and other self-report questionnaires that measured childbirth self-efficacy, self-esteem, optimism, neuroticism, state anxiety and depressive symptomatology. Factor structure was investigated using principal axis factoring. Cronbach's α was used to measure internal consistency reliability. Pearson's correlation coefficients were calculated between all study variables, followed by multiple linear regression. The factor analysis suggested the existence of a one-factor structure. Construct validity was confirmed by computing correlations between the CAQ and childbirth self-efficacy, self-esteem and optimism. Multivariate analysis showed that childbirth self-efficacy, self-esteem and optimism were negatively and significantly associated with fear of childbirth. The Greek version of the CAQ is a reliable and valid measure. The clinical use of CAQ may enable midwives and other health care professionals to identify pregnant women with low childbirth confidence and to provide information and support. Copyright © 2015 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  5. Optimal Reservoir Operation using Stochastic Model Predictive Control

    NASA Astrophysics Data System (ADS)

    Sahu, R.; McLaughlin, D.

    2016-12-01

    Hydropower operations are typically designed to fulfill contracts negotiated with consumers who need reliable energy supplies, despite uncertainties in reservoir inflows. In addition to providing reliable power the reservoir operator needs to take into account environmental factors such as downstream flooding or compliance with minimum flow requirements. From a dynamical systems perspective, the reservoir operating strategy must cope with conflicting objectives in the presence of random disturbances. In order to achieve optimal performance, the reservoir system needs to continually adapt to disturbances in real time. Model Predictive Control (MPC) is a real-time control technique that adapts by deriving the reservoir release at each decision time from the current state of the system. Here an ensemble-based version of MPC (SMPC) is applied to a generic reservoir to determine both the optimal power contract, considering future inflow uncertainty, and a real-time operating strategy that attempts to satisfy the contract. Contract selection and real-time operation are coupled in an optimization framework that also defines a Pareto trade off between the revenue generated from energy production and the environmental damage resulting from uncontrolled reservoir spills. Further insight is provided by a sensitivity analysis of key parameters specified in the SMPC technique. The results demonstrate that SMPC is suitable for multi-objective planning and associated real-time operation of a wide range of hydropower reservoir systems.
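
    One step of an ensemble-based SMPC scheme can be sketched as choosing the release that minimizes the expected penalty over sampled inflow scenarios, then re-solving at the next decision time. The contract level, storage bounds, and penalty weights below are illustrative assumptions, not values from this study.

    ```python
    import numpy as np

    def smpc_release(storage, inflow_samples, contract=50.0,
                     s_min=100.0, s_max=1000.0,
                     spill_penalty=5.0, shortfall_penalty=10.0):
        """Pick the release minimising expected penalty over scenarios."""
        best_r, best_cost = 0.0, np.inf
        for r in np.linspace(0.0, 150.0, 151):          # candidate releases
            s_next = storage + inflow_samples - r       # vectorised scenarios
            spill = np.maximum(s_next - s_max, 0.0)     # uncontrolled spill
            low = np.maximum(s_min - s_next, 0.0)       # min-storage violation
            shortfall = max(contract - r, 0.0)          # missed energy contract
            cost = np.mean(spill_penalty * spill
                           + shortfall_penalty * (shortfall + low))
            if cost < best_cost:
                best_r, best_cost = r, cost
        return best_r

    rng = np.random.default_rng(0)
    inflows = rng.gamma(shape=4.0, scale=15.0, size=500)  # ensemble forecast
    print("release this period:",
          smpc_release(storage=600.0, inflow_samples=inflows))
    ```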

  6. An instrument to characterize the environment for residents' evidence-based medicine learning and practice.

    PubMed

    Mi, Misa; Moseley, James L; Green, Michael L

    2012-02-01

    Many residency programs offer training in evidence-based medicine (EBM). However, these curricula often fail to achieve optimal learning outcomes, perhaps because they neglect various contextual factors in the learning environment. We developed and validated an instrument to characterize the environment for EBM learning and practice in residency programs. An EBM Environment Scale was developed following scale development principles. A survey was administered to residents across six programs in primary care specialties at four medical centers. Internal consistency reliability was analyzed with Cronbach's coefficient alpha. Validity was assessed by comparing predetermined subscales with the survey's internal structure as assessed via factor analysis. Scores were also compared for subgroups based on residency program affiliation and residency characteristics. Out of 262 eligible residents, 124 completed the survey (response rate 47%). The overall mean score was 3.89 (standard deviation=0.56). The initial reliability analysis of the 48-item scale had a high reliability coefficient (Cronbach α=.94). Factor analysis and further item analysis resulted in a shorter 36-item scale with a satisfactory reliability coefficient (Cronbach α=.86). Scores were higher for residents with prior EBM training in medical school (4.14 versus 3.62) and in residency (4.25 versus 3.69). If further testing confirms its properties, the EBM Environment Scale may be used to understand the influence of the learning environment on the effectiveness of EBM training. Additionally, it may detect changes in the EBM learning environment in response to programmatic or institutional interventions.
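
    Cronbach's coefficient alpha, used here for internal consistency, follows directly from the item and total-score variances: alpha = k/(k-1) * (1 - sum of item variances / variance of total score). A minimal sketch with made-up Likert responses.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
        return k / (k - 1) * (1 - item_var / total_var)

    # Five respondents answering four Likert items
    scores = [[4, 5, 4, 4],
              [3, 3, 2, 3],
              [5, 5, 5, 4],
              [2, 2, 3, 2],
              [4, 4, 4, 5]]
    print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
    ```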

  7. Reduced Uncertainties in the Flutter Analysis of the Aerostructures Test Wing

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Lung, Shun-fat

    2010-01-01

    Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. A test validated finite element model can provide a reliable flutter analysis to define the flutter placard speed to which the aircraft can be flown prior to flight flutter testing. Minimizing the difference between numerical and experimental results is a type of optimization problem. Through the use of the National Aeronautics and Space Administration Dryden Flight Research Center's (Edwards, California, USA) multidisciplinary design, analysis, and optimization tool to optimize the objective function and constraints, the mass properties, the natural frequencies, and the mode shapes are matched to the target data and the mass matrix orthogonality is retained. The approach in this study has been applied to minimize the model uncertainties for the structural dynamic model of the aerostructures test wing, which was designed, built, and tested at the National Aeronautics and Space Administration Dryden Flight Research Center. A 25-percent change in flutter speed has been shown after reducing the uncertainties.

  8. Reduced Uncertainties in the Flutter Analysis of the Aerostructures Test Wing

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Lung, Shun Fat

    2011-01-01

    Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. A test validated finite element model can provide a reliable flutter analysis to define the flutter placard speed to which the aircraft can be flown prior to flight flutter testing. Minimizing the difference between numerical and experimental results is a type of optimization problem. Through the use of the National Aeronautics and Space Administration Dryden Flight Research Center's (Edwards, California) multidisciplinary design, analysis, and optimization tool to optimize the objective function and constraints, the mass properties, the natural frequencies, and the mode shapes are matched to the target data, and the mass matrix orthogonality is retained. The approach in this study has been applied to minimize the model uncertainties for the structural dynamic model of the aerostructures test wing, which was designed, built, and tested at the National Aeronautics and Space Administration Dryden Flight Research Center. A 25 percent change in flutter speed has been shown after reducing the uncertainties.

  9. Incorporation of prior information on parameters into nonlinear regression groundwater flow models: 1. Theory

    USGS Publications Warehouse

    Cooley, Richard L.

    1982-01-01

    Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.
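
    The second scale of prior information can be sketched as a penalty that shrinks parameter estimates toward the prior values rather than toward zero, as plain ridge regression would. A generic linearized sketch, not Cooley's exact formulation; the data and prior values are synthetic.

    ```python
    import numpy as np

    def prior_penalized_fit(X, y, beta_prior, lam):
        """Minimise ||y - X b||^2 + lam * ||b - beta_prior||^2 (closed form).

        Setting the gradient to zero gives
        (X'X + lam I) b = X'y + lam * beta_prior.
        """
        n_params = X.shape[1]
        A = X.T @ X + lam * np.eye(n_params)
        b = X.T @ y + lam * beta_prior
        return np.linalg.solve(A, b)

    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 2))
    beta_true = np.array([2.0, -1.0])
    y = X @ beta_true + rng.normal(0, 0.5, 30)
    beta_prior = np.array([1.8, -0.8])            # best available estimates
    for lam in (0.0, 5.0, 100.0):                 # 0 recovers ordinary LS
        print(lam, prior_penalized_fit(X, y, beta_prior, lam).round(3))
    ```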

  10. The welfare effects of integrating renewable energy into electricity markets

    NASA Astrophysics Data System (ADS)

    Lamadrid, Alberto J.

    The challenges of deploying more renewable energy sources on an electric grid are caused largely by their inherent variability. In this context, energy storage can help make the electric delivery system more reliable by mitigating this variability. This thesis analyzes a series of models for procuring electricity and ancillary services for both individuals and social planners with high penetrations of stochastic wind energy. The results obtained for an individual decision maker using stochastic optimization are ambiguous, with closed-form solutions dependent on technological parameters and no consideration of system reliability. The social planner models correctly reflect the effect of system reliability, and in the case of a Stochastic, Security-Constrained Optimal Power Flow (S-SC-OPF or SuperOPF), determine reserve capacity endogenously so that system reliability is maintained. A single-period SuperOPF shows that including ramping costs in the objective function leads to more wind spilling and increased capacity requirements for reliability. However, this model does not reflect the intertemporal trade-offs of using Energy Storage Systems (ESS) to improve reliability and mitigate wind variability. The results with the multiperiod SuperOPF determine the optimum use of storage for a typical day, and compare the effects of collocating ESS at wind sites with the same amount of storage (deferrable demand) located at demand centers. The collocated ESS has slightly lower operating costs and spills less wind generation compared to deferrable demand, but the total amount of conventional generating capacity needed for system adequacy is higher. In terms of total system costs, which include the capital cost of conventional generating capacity, the cost with deferrable demand is substantially lower because the daily demand profile is flattened and less conventional generation capacity is then needed for reliability purposes. The analysis also demonstrates that the optimum daily pattern of dispatch and reserves is seriously distorted if the stochastic characteristics of wind generation are ignored.

  11. Reliability of Wearable Inertial Measurement Units to Measure Physical Activity in Team Handball.

    PubMed

    Luteberget, Live S; Holme, Benjamin R; Spencer, Matt

    2018-04-01

    To assess the reliability and sensitivity of commercially available inertial measurement units to measure physical activity in team handball. Twenty-two handball players were instrumented with 2 inertial measurement units (OptimEye S5; Catapult Sports, Melbourne, Australia) taped together. They participated in either a laboratory assessment (n = 10) consisting of 7 team handball-specific tasks or field assessment (n = 12) conducted in 12 training sessions. Variables, including PlayerLoad™ and inertial movement analysis (IMA) magnitude and counts, were extracted from the manufacturers' software. IMA counts were divided into intensity bands of low (1.5-2.5 m·s^-1), medium (2.5-3.5 m·s^-1), high (>3.5 m·s^-1), medium/high (>2.5 m·s^-1), and total (>1.5 m·s^-1). Reliability between devices and sensitivity was established using coefficient of variation (CV) and smallest worthwhile difference (SWD). Laboratory assessment: IMA magnitude showed a good reliability (CV = 3.1%) in well-controlled tasks. CV increased (4.4-6.7%) in more-complex tasks. Field assessment: Total IMA counts (CV = 1.8% and SWD = 2.5%), PlayerLoad (CV = 0.9% and SWD = 2.1%), and their associated variables (CV = 0.4-1.7%) showed a good reliability, well below the SWD. However, the CV of IMA increased when categorized into intensity bands (2.9-5.6%). The reliability of IMA counts was good when data were displayed as total, high, or medium/high counts. A good reliability for PlayerLoad and associated variables was evident. The CV of the previously mentioned variables was well below the SWD, suggesting that OptimEye's inertial measurement unit and its software are sensitive for use in team handball.
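
    The between-device CV and SWD used to judge reliability here have standard forms: a typical-error CV from paired differences of the two taped-together units, and an SWD of 0.2 x the between-subject SD. A sketch with made-up PlayerLoad values; conventions for both statistics vary between studies.

    ```python
    import numpy as np

    def cv_and_swd(device_a, device_b):
        """Between-device CV (%) and smallest worthwhile difference (%).

        CV: typical-error-based, from paired differences of the two units.
        SWD: 0.2 x between-subject SD, as a percentage of the grand mean.
        """
        a, b = np.asarray(device_a, float), np.asarray(device_b, float)
        typical_error = (a - b).std(ddof=1) / np.sqrt(2)
        grand_mean = np.concatenate([a, b]).mean()
        cv = 100 * typical_error / grand_mean
        between_sd = ((a + b) / 2).std(ddof=1)
        swd = 100 * 0.2 * between_sd / grand_mean
        return cv, swd

    a = np.array([310, 295, 330, 288, 305, 279])   # e.g. PlayerLoad, unit 1
    b = np.array([305, 300, 326, 290, 301, 283])   # unit 2, same player
    cv, swd = cv_and_swd(a, b)
    print(f"CV = {cv:.1f}%, SWD = {swd:.1f}%  ->  reliable if CV < SWD")
    ```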

  12. DG Planning with Amalgamation of Operational and Reliability Considerations

    NASA Astrophysics Data System (ADS)

    Battu, Neelakanteshwar Rao; Abhyankar, A. R.; Senroy, Nilanjan

    2016-04-01

    Distributed generation has been playing a vital role in dealing with issues related to distribution systems. This paper presents an approach which provides the policy maker with a set of solutions for DG placement to optimize the reliability and real power loss of the system. The optimal location of a distributed generator is evaluated based on performance indices derived for a reliability index and real power loss. The proposed approach is applied to a 15-bus radial distribution system and an 18-bus radial distribution system with conventional and wind distributed generators individually.

  13. Improving fMRI reliability in presurgical mapping for brain tumours.

    PubMed

    Stevens, M Tynan R; Clarke, David B; Stroink, Gerhard; Beyea, Steven D; D'Arcy, Ryan Cn

    2016-03-01

    Functional MRI (fMRI) is becoming increasingly integrated into clinical practice for presurgical mapping. Current efforts are focused on validating data quality, with reliability being a major factor. In this paper, we demonstrate the utility of a recently developed approach that uses receiver operating characteristic-reliability (ROC-r) to: (1) identify reliable versus unreliable data sets; (2) automatically select processing options to enhance data quality; and (3) automatically select individualised thresholds for activation maps. Presurgical fMRI was conducted in 16 patients undergoing surgical treatment for brain tumours. Within-session test-retest fMRI was conducted, and ROC-reliability of the patient group was compared to a previous healthy control cohort. Individually optimised preprocessing pipelines were determined to improve reliability. Spatial correspondence was assessed by comparing the fMRI results to intraoperative cortical stimulation mapping, in terms of the distance to the nearest active fMRI voxel. The average ROC-r reliability for the patients was 0.58±0.03, as compared to 0.72±0.02 in healthy controls. For the patient group, this increased significantly to 0.65±0.02 by adopting optimised preprocessing pipelines. Co-localisation of the fMRI maps with cortical stimulation was significantly better for more reliable versus less reliable data sets (8.3±0.9 vs 29±3 mm, respectively). We demonstrated ROC-r analysis for identifying reliable fMRI data sets, choosing optimal postprocessing pipelines, and selecting patient-specific thresholds. Data sets with higher reliability also showed closer spatial correspondence to cortical stimulation. ROC-r can thus identify poor fMRI data at time of scanning, allowing for repeat scans when necessary. ROC-r analysis provides optimised and automated fMRI processing for improved presurgical mapping. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  14. [Reliability and validity of Professional Quality of Life Scale among government staff in earthquake - stricken areas in China].

    PubMed

    Dang, Weimin; Cheng, Wenhong; Ma, Hong; Lin, Jin; Wu, Baoming; Ma, Ning; Wang, Rongke; Xu, Junting; Zhou, Tianhang; Yu, Xin

    2015-06-01

    To evaluate the reliability and validity of the Professional Quality of Life Scale (ProQOL-30, 4th version, 30 items) among government staff in the Wenchuan earthquake-stricken areas, a total of 1,175 members of government staff in the areas were selected by convenience sampling and required to complete the ProQOL and the Self-Reporting Questionnaire (SRQ). The reliability and validity of the scale were evaluated by correlation analysis, t-test, and confirmatory factor analysis. Item-total correlation coefficients of the three subscales were 0.590-0.752, 0.389-0.603, and 0.340-0.647, respectively (P<0.05), and the average coefficients were 0.672, 0.482, and 0.555, respectively (P<0.05). The Cronbach's α coefficients of the three subscales were 0.864, 0.569, and 0.742, respectively, and the split-half reliabilities were 0.829, 0.490, and 0.677, respectively. The P value was 0.88 in the chi-square test of the confirmatory factor analysis model. Goodness-of-fit indices of ProQOL-30 included GFI=0.895, NFI=0.856, CFI=0.895, RMSEA=0.063, and AGFI=0.912. For ProQOL-28, an optimized version of ProQOL-30, the Cronbach's α coefficients for burnout and trauma/compassion fatigue increased to 0.616 and 0.757, respectively. The P value was 0.91 in the chi-square test of the confirmatory factor analysis model. Goodness-of-fit indices of ProQOL-28 were GFI=0.913, AGFI=0.924, NFI=0.900, CFI=0.913, and RMSEA=0.031. ProQOL-28 has good reliability and validity among government staff in the earthquake-stricken areas in China.

  15. Structural Statics Analysis and Optimization Design of Regulating Device for Air Conveyer Outlet in Coal Mine

    NASA Astrophysics Data System (ADS)

    Gong, Xiaoyan; Li, Ying; Zhang, Yongqiang

    2018-06-01

    With the enlargement of fully mechanized face excavation and long-distance driving, gas emission and dust production increase greatly. However, the direction angle, caliber, and front-back distance of current ventilation devices cannot be changed dynamically, resulting in serious accumulation in the dead zone. In this paper, a new device is proposed to solve these problems. The ANSYS finite element software was used to simulate and optimize the structural safety of the regulating device's key components. The optimization results show that the equivalent stress decreases by 49%; after optimization, the deformation and mass are 0.829 mm and 0.548 kg, which are 21% and 10% lower than before. The quality, safety, reliability, and cost of the device reach the expected standards and can meet the requirements for safe ventilation and dust suppression at the fully mechanized face.

  16. Accurate multiple sequence-structure alignment of RNA sequences using combinatorial optimization.

    PubMed

    Bauer, Markus; Klau, Gunnar W; Reinert, Knut

    2007-07-27

    The discovery of functional non-coding RNA sequences has led to an increasing interest in algorithms related to RNA analysis. Traditional sequence alignment algorithms, however, fail at computing reliable alignments of low-homology RNA sequences. The spatial conformation of RNA sequences largely determines their function, and therefore RNA alignment algorithms have to take structural information into account. We present a graph-based representation for sequence-structure alignments, which we model as an integer linear program (ILP). We sketch how we compute an optimal or near-optimal solution to the ILP using methods from combinatorial optimization, and present results on a recently published benchmark set for RNA alignments. The implementation of our algorithm yields better alignments in terms of two published scores than the other programs that we tested; this is especially the case with an increasing number of input sequences. Our program LARA is freely available for academic purposes from http://www.planet-lisa.net.

  17. Migration without migraines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, L.; Burton, A.; Lu, H.X.

    Accurate velocity models are a necessity for reliable migration results. Velocity analysis generally involves the use of methods such as normal moveout analysis (NMO), seismic traveltime tomography, or iterative prestack migration. These techniques can be effective, and each has its own advantage or disadvantage. Conventional NMO methods are relatively inexpensive but basically require simplifying assumptions about geology. Tomography is a more general method but requires traveltime interpretation of prestack data. Iterative prestack depth migration is very general but is computationally expensive. In some cases, there is the opportunity to estimate vertical velocities by use of well information. The well information can be used to optimize poststack migrations, thereby eliminating some of the time and expense of iterative prestack migration. The optimized poststack migration procedure defined here computes the velocity model which minimizes the depth differences between seismic images and formation depths at the well by using a least squares inversion method. The optimization methods described in this paper will hopefully produce "migrations without migraines."
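
    As a minimal illustration of the least-squares idea (not the paper's actual algorithm), a single migration velocity can be chosen to minimize the squared depth differences at wells; the times and depths below are invented.

        import numpy as np

        well_depths = np.array([1520.0, 2310.0, 2980.0])   # formation tops (m)
        image_times = np.array([1.00, 1.40, 1.72])         # picked one-way times (s)

        # constant-velocity depth prediction z = v * t; the least-squares v that
        # minimizes sum (z_well - v * t)^2 has the closed form below
        v_hat = (image_times @ well_depths) / (image_times @ image_times)
        residual = well_depths - v_hat * image_times
        print(f"v = {v_hat:.0f} m/s, rms depth error = "
              f"{np.sqrt(np.mean(residual**2)):.1f} m")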

  18. [Optimization for supercritical CO2 extraction with response surface methodology and component analysis of Sapindus mukorossi oil].

    PubMed

    Wu, Yan; Xiao, Xin-yu; Ge, Fa-huan

    2012-02-01

    To study the extraction conditions of Sapindus mukorossi oil by supercritical CO2 extraction and identify its components, the SFE-CO2 extraction was optimized by response surface methodology and GC-MS was used to analyse the compounds of Sapindus mukorossi oil. A model equation for the extraction rate of Sapindus mukorossi oil by supercritical CO2 extraction was established, and the optimal parameters determined from the equation were: extraction pressure 30 MPa at a temperature of 40 degrees C; separation I pressure 14 MPa at 45 degrees C; separation II pressure 6 MPa at 40 degrees C; and extraction time 60 min, giving an extraction rate of Sapindus mukorossi oil of 17.58%. Twenty-two main compounds of the oil extracted by supercritical CO2 were identified by GC-MS; unsaturated fatty acids accounted for 86.59%. This process is reliable, safe, and simple to operate, and can be used for the extraction of Sapindus mukorossi oil.
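
    A second-order response-surface fit of the kind used in such studies can be sketched with ordinary least squares; the design points and responses below are illustrative, not the paper's measurements.

        import numpy as np

        P = np.array([20, 25, 30, 30, 35, 25, 30], dtype=float)   # pressure (MPa)
        T = np.array([35, 40, 40, 45, 40, 45, 35], dtype=float)   # temperature (°C)
        y = np.array([12.1, 15.0, 17.5, 16.8, 16.0, 14.2, 15.5])  # extraction rate (%)

        # design matrix for y = b0 + b1*P + b2*T + b3*P^2 + b4*T^2 + b5*P*T
        X = np.column_stack([np.ones_like(P), P, T, P**2, T**2, P * T])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)

        def rate(p, t):
            return beta @ np.array([1.0, p, t, p * p, t * t, p * t])

        print(f"predicted rate at 30 MPa, 40 °C: {rate(30, 40):.2f}%")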

  19. Comparative study of popular objective functions for damping power system oscillations in multimachine system.

    PubMed

    Islam, Naz Niamul; Hannan, M A; Shareef, Hussain; Mohamed, Azah; Salam, M A

    2014-01-01

    Power oscillation damping controllers are designed in a linearized model with heuristic optimization techniques. Selection of the objective function is crucial for damping controller design by optimization algorithms. In this research, a comparative analysis has been carried out to evaluate the effectiveness of popular objective functions used in power system oscillation damping. A two-stage lead-lag damping controller by means of power system stabilizers is optimized using the differential search algorithm for different objective functions. Linearized model simulations are performed to compare the dominant modes' performance, and the nonlinear model is then used to evaluate the damping performance over power system oscillations. All the simulations are conducted in a two-area four-machine power system to provide a detailed analysis. The results show that the multiobjective D-shaped function is effective in terms of moving unstable and lightly damped electromechanical modes into the stable region. Thus, the D-shaped function ultimately improves overall system damping and concurrently enhances power system reliability.
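
    A D-shaped criterion of the general kind discussed confines closed-loop modes to a region bounded by a minimum damping ratio and a maximum real part. The sketch below checks the modes of a toy system matrix against assumed thresholds (0.1 and -0.5, which are illustrative, not the paper's values).

        import numpy as np

        A = np.array([[0.0, 1.0], [-4.0, -1.2]])      # toy closed-loop system matrix
        for lam in np.linalg.eigvals(A):
            sigma, omega = lam.real, lam.imag
            zeta = -sigma / np.hypot(sigma, omega)    # damping ratio of the mode
            in_d_shape = zeta >= 0.1 and sigma <= -0.5
            print(f"mode {lam:.3f}: zeta = {zeta:.3f}, inside D-shape: {in_d_shape}")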

  20. The psychometric characteristics of the revised depression attitude questionnaire (R-DAQ) in Pakistani medical practitioners: a cross-sectional study of doctors in Lahore.

    PubMed

    Haddad, Mark; Waqas, Ahmed; Sukhera, Ahmed Bashir; Tarar, Asad Zaman

    2017-07-27

    Depression is a common mental health problem and a leading contributor to the global burden of disease. The attitudes and beliefs of the public and of health professionals influence social acceptance and affect the esteem and help-seeking of people experiencing mental health problems. The attitudes of clinicians are particularly relevant to their role in accurately recognising and providing appropriate support and management of depression. This study examines the characteristics of the revised depression attitude questionnaire (R-DAQ) with doctors working in healthcare settings in Lahore, Pakistan. A cross-sectional survey was conducted in 2015 using the R-DAQ. A convenience sample of 700 medical practitioners based in six hospitals in Lahore was approached to participate in the survey. The R-DAQ structure was examined using parallel analysis from polychoric correlations. Unweighted least squares analysis (ULSA) was used for factor extraction. Model fit was estimated using goodness-of-fit indices and the root mean square of standardized residuals (RMSR), and internal consistency reliability for the overall scale and subscales was assessed using reliability estimates based on Mislevy and Bock (BILOG 3: Item Analysis and Test Scoring with Binary Logistic Models. Mooresville: Scientific Software, 55) and McDonald's omega statistic. Findings using this approach were compared with principal axis factor analysis based on a Pearson correlation matrix. 601 (86%) of the doctors approached consented to participate in the study. Exploratory factor analysis of R-DAQ scale responses demonstrated the same 3-factor structure as in the UK development study, though analyses indicated removal of 7 of the 22 items because of weak loading or poor model fit. The 3-factor solution accounted for 49.8% of the common variance. Scale reliability and internal consistency were adequate: total scale standardised alpha was 0.694; subscale reliability for professional confidence was 0.732, therapeutic optimism/pessimism was 0.638, and generalist perspective was 0.769. The R-DAQ was developed with a predominantly UK-based sample of health professionals. This study indicates that this scale functions adequately and provides a valid measure of depression attitudes for medical practitioners in Pakistan, with the same factor structure as in the scale development sample. However, optimal scale function necessitated removal of several items, with a 15-item scale enabling the most parsimonious factor solution for this population.
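
    The internal-consistency figures quoted above are Cronbach's alpha values; the statistic itself is straightforward to compute, as this sketch on synthetic Likert responses shows.

        import numpy as np

        def cronbach_alpha(items):
            """items: respondents x items matrix of scores."""
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(200, 1))                       # shared trait
        noise = rng.normal(scale=0.8, size=(200, 5))             # per-item noise
        responses = np.clip(np.round(3 + latent + noise), 1, 5)  # 5-point Likert
        print(f"alpha = {cronbach_alpha(responses):.3f}")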

  1. Genome-wide characterization and selection of expressed sequence tag simple sequence repeat primers for optimized marker distribution and reliability in peach

    USDA-ARS?s Scientific Manuscript database

    Expressed sequence tag (EST) simple sequence repeats (SSRs) in Prunus were mined, and flanking primers designed and used for genome-wide characterization and selection of primers to optimize marker distribution and reliability. A total of 12,618 contigs were assembled from 84,727 ESTs, along with 34...

  2. On the Relationship between Maximal Reliability and Maximal Validity of Linear Composites

    ERIC Educational Resources Information Center

    Penev, Spiridon; Raykov, Tenko

    2006-01-01

    A linear combination of a set of measures is often sought as an overall score summarizing subject performance. The weights in this composite can be selected to maximize its reliability or to maximize its validity, and the optimal choice of weights is in general not the same for these two optimality criteria. We explore several relationships…

  3. Defect design of insulation systems for photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.

    1981-01-01

    A defect-design approach to sizing electrical insulation systems for terrestrial photovoltaic modules is presented. It consists of gathering voltage-breakdown statistics on various thicknesses of candidate insulation films where, for a designated voltage, module failure probabilities for enumerated thickness and number-of-layer film combinations are calculated. Cost analysis then selects the most economical insulation system. A manufacturing yield problem is solved to exemplify the technique. Results for unaged Mylar suggest using fewer layers of thicker films. Defect design incorporates effects of flaws in optimal insulation system selection, and obviates choosing a tolerable failure rate, since the optimization process accomplishes that. Exposure to weathering and voltage stress reduces the voltage-withstanding capability of module insulation films. Defect design, applied to aged polyester films, promises to yield reliable, cost-optimal insulation systems.
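
    The core probability calculation behind defect design can be illustrated with a toy model: if flaws occur independently in each layer, the chance that every layer is flawed at the same critical site falls geometrically with the number of layers. All numbers below are hypothetical.

        p_flaw = 0.05    # probability a single layer is flawed at a critical site
        sites = 1000     # independent critical sites per module

        for n_layers in (1, 2, 3):
            p_site = p_flaw ** n_layers             # all layers flawed at one site
            p_module = 1 - (1 - p_site) ** sites    # failure anywhere on the module
            print(f"{n_layers} layer(s): module failure probability = {p_module:.4f}")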

  4. Benchmark analysis of forecasted seasonal temperature over different climatic areas

    NASA Astrophysics Data System (ADS)

    Giunta, G.; Salerno, R.; Ceppi, A.; Ercolani, G.; Mancini, M.

    2015-12-01

    From a long-term perspective, an improvement of seasonal forecasting, which is often exclusively based on climatology, could provide a new capability for the management of energy resources in a time scale of just a few months. This paper regards a benchmark analysis in relation to long-term temperature forecasts over Italy in the year 2010, comparing the eni-kassandra meteo forecast (e-kmf®) model, the Climate Forecast System-National Centers for Environmental Prediction (CFS-NCEP) model, and the climatological reference (based on 25-year data) with observations. Statistical indexes are used to understand the reliability of the prediction of 2-m monthly air temperatures with a perspective of 12 weeks ahead. The results show how the best performance is achieved by the e-kmf® system which improves the reliability for long-term forecasts compared to climatology and the CFS-NCEP model. By using the reliable high-performance forecast system, it is possible to optimize the natural gas portfolio and management operations, thereby obtaining a competitive advantage in the European energy market.
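
    A benchmark of this kind reduces to computing skill metrics for each forecast source against observations. The sketch below compares the RMSE of a forecast and a climatological reference on invented monthly temperatures; a positive skill score means the forecast beats climatology.

        import numpy as np

        obs = np.array([4.1, 6.0, 9.8, 13.5, 18.2, 22.0])          # observed (°C)
        forecast = np.array([4.8, 6.4, 9.1, 14.0, 17.5, 21.2])
        climatology = np.array([5.0, 7.0, 10.5, 14.5, 18.8, 22.5])

        def rmse(pred, target):
            return float(np.sqrt(np.mean((pred - target) ** 2)))

        print(f"forecast RMSE:    {rmse(forecast, obs):.2f}")
        print(f"climatology RMSE: {rmse(climatology, obs):.2f}")
        print(f"skill vs climatology: "
              f"{1 - rmse(forecast, obs) / rmse(climatology, obs):.2f}")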

  5. Design optimization under uncertainty and speed variability for a piezoelectric energy harvester powering a tire pressure monitoring sensor

    NASA Astrophysics Data System (ADS)

    Toghi Eshghi, Amin; Lee, Soobum; Kazem Sadoughi, Mohammad; Hu, Chao; Kim, Young-Cheol; Seo, Jong-Ho

    2017-10-01

    Energy harvesting (EH) technologies to power small-sized electronic devices are attracting great attention. Wasted energy in a vehicle's rotating tire has great potential to enable self-powered tire pressure monitoring sensors (TPMS). Piezoelectric energy harvesters can be used to collect vibrational energy and power such systems. Due to the harsh acceleration in a rotating tire, a design tradeoff needs to be studied to prolong the harvester's fatigue life as well as to ensure sufficient power generation. However, designs produced by traditional deterministic design optimization (DDO) do not show reliable performance, owing to the lack of consideration of various uncertainty factors (e.g., manufacturing tolerances, material properties, and loading conditions). In this study, we address a new EH design formulation that considers the uncertainty in car speed, dimensional tolerances, and material properties, and solve this design problem using reliability-based design optimization (RBDO). The RBDO problem is formulated to maximize compactness and minimize weight of a TPMS harvester while satisfying power and durability requirements. A transient analysis has been done to measure the time-varying response of the EH, such as power generation, dynamic strain, and stress. A conservative design formulation is proposed to consider the expected power from varied speed and the stress at higher speed. When compared to the DDO, the RBDO results show that the reliability of the EH is increased significantly by sacrificing the objective function. Finally, an experimental test has been conducted to demonstrate the merits of the RBDO design over DDO.
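
    The reliability assessments inside an RBDO loop are often plain Monte Carlo estimates of a constraint's satisfaction probability. The sketch below estimates the probability that harvested power meets a requirement under uncertain speed and material scatter; the power model and all distributions are invented stand-ins, not the authors' harvester model.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        speed = rng.normal(80.0, 10.0, n)        # km/h, uncertain vehicle speed
        coeff = rng.normal(1.0e-3, 1.0e-4, n)    # mW per (km/h)^2, material scatter

        power = coeff * speed**2                 # toy power model
        p_req = 5.0                              # required power (mW)
        print(f"P(power >= {p_req} mW) ~= {np.mean(power >= p_req):.4f}")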

  6. Methodological aspects of an adaptive multidirectional pattern search to optimize speech perception using three hearing-aid algorithms

    NASA Astrophysics Data System (ADS)

    Franck, Bas A. M.; Dreschler, Wouter A.; Lyzenga, Johannes

    2004-12-01

    In this study we investigated the reliability and convergence characteristics of an adaptive multidirectional pattern search procedure, relative to a nonadaptive multidirectional pattern search procedure. The procedure was designed to optimize three speech-processing strategies: noise reduction, spectral enhancement, and spectral lift. The search is based on a paired-comparison paradigm, in which subjects evaluated the listening comfort of speech-in-noise fragments. The procedural and nonprocedural factors that influence the reliability and convergence of the procedure are studied using various test conditions, which combine different tests, initial settings, background noise types, and step size configurations. Seven normal-hearing subjects participated in this study. The results indicate that the reliability of the optimization strategy may benefit from the use of an adaptive step size. Decreasing the step size increases accuracy, while increasing the step size can be beneficial in creating clear perceptual differences in the comparisons. The reliability also depends on the starting point, stop criterion, step size constraints, background noise, algorithms used, and the presence of drifting cues and suboptimal settings. There appears to be a trade-off between reliability and convergence, i.e., when the step size is enlarged the reliability improves, but the convergence deteriorates.
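
    The generic mechanism of an adaptive pattern search, expanding the step after success and contracting it after failure, can be sketched on a stand-in objective (the real procedure optimizes paired-comparison listening comfort, which is not reproduced here):

        import numpy as np

        def f(x):                                # stand-in objective to minimize
            return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

        x, step = np.zeros(2), 1.0
        while step > 1e-3:
            improved = False
            for d in np.vstack([np.eye(2), -np.eye(2)]):   # poll +/- each axis
                if f(x + step * d) < f(x):
                    x, improved = x + step * d, True
            step = step * 2.0 if improved else step / 2.0  # adapt the step size
        print(x, f(x))    # converges near (1.0, -0.5)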

  7. Reliability-based design optimization of reinforced concrete structures including soil-structure interaction using a discrete gravitational search algorithm and a proposed metamodel

    NASA Astrophysics Data System (ADS)

    Khatibinia, M.; Salajegheh, E.; Salajegheh, J.; Fadaee, M. J.

    2013-10-01

    A new discrete gravitational search algorithm (DGSA) and a metamodelling framework are introduced for reliability-based design optimization (RBDO) of reinforced concrete structures. The RBDO of structures with soil-structure interaction (SSI) effects is investigated in accordance with performance-based design. The proposed DGSA is based on the standard gravitational search algorithm (GSA) and optimizes the structural cost under deterministic and probabilistic constraints. The Monte Carlo simulation (MCS) method is considered the most reliable method for estimating failure probabilities. In order to reduce the computational time of MCS, the proposed metamodelling framework is employed to predict the responses of the SSI system in the RBDO procedure. The metamodel consists of a weighted least squares support vector machine (WLS-SVM) and a wavelet kernel function, which is called WWLS-SVM. Numerical results demonstrate the efficiency and computational advantages of DGSA and the proposed metamodel for RBDO of reinforced concrete structures.

  8. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability-based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method has the capability to take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. Multiple design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time having the capability of handling rank-ordered design objectives or goals. For simulation purposes, the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem in sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using: (i) the linear extended interior penalty function method algorithm; and (ii) Powell's conjugate directions method. Both single- and multi-objective numerical test cases are included, demonstrating the design tool's capabilities as applied to this design problem.
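
    The sequential unconstrained minimization idea can be illustrated by folding a constraint into the objective with a growing penalty weight (an exterior penalty, used here as a generic stand-in for the report's interior penalty formulation); the toy problem below is not the pressure vessel model.

        import numpy as np
        from scipy.optimize import minimize

        def weight(x):                 # objective to minimize
            return x[0] ** 2 + x[1] ** 2

        def margin(x):                 # constraint: margin(x) >= 0 required
            return x[0] + x[1] - 2.0

        x0 = np.array([3.0, 3.0])
        for r in (1.0, 10.0, 100.0, 1000.0):   # growing penalty weight
            penalized = lambda x, r=r: weight(x) + r * min(0.0, margin(x)) ** 2
            x0 = minimize(penalized, x0, method="Nelder-Mead").x
        print(x0)   # approaches (1, 1), the constrained optimum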

  9. Reliability Analysis of Differential Relay as Main Protection Transformer Using Fuzzy Logic Algorithm

    NASA Astrophysics Data System (ADS)

    Mulyadi, Y.; Sucita, T.; Sumarto; Alpani, M.

    2018-02-01

    Electricity supply demand is increasing every year, requiring PT. PLN (Persero) to provide optimal customer service and satisfaction. Optimal service depends on the performance of the power system equipment, especially the transformers. A power transformer is an electrical device that transforms electricity from high voltage to low voltage or vice versa. The electrical power system, however, is never free of disturbances, including those affecting the transformer, but disturbances can be minimized by the protection system. The main protection of a transformer is the differential relay. The differential relay works on Kirchhoff's law, where inflows equal outflows; if excessive currents caused by a disturbance appear, the relay operates. The relay itself, however, can also suffer degraded performance. Therefore, this project aims to analyze the reliability of the differential relay on transformers in three different substations. Referring to the standard applied by the transmission line protection authority, the differential relay shall have slope characteristics of 30% in the first slope and 80% in the second slope when using two slopes, or 80% when using one slope, with an instantaneous operating time and the corresponding ratio. The results show that the Siemens differential relay has a reliable slope characteristic, with a value of 30 in the fuzzy logic system. Meanwhile, the ABB differential relay is only 80% reliable, because two experiments showed it to be unreliable. For operating time, all the differential relays are instantaneous, with a value of 0.06 in the fuzzy logic system. For ratio, the ABB differential relays have a better value than the other brands, with a value of 151 in the fuzzy logic system.
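
    A dual-slope differential characteristic of the kind referenced (30% first slope, 80% second slope) can be sketched directly; the pickup and breakpoint values below are illustrative assumptions, not the substations' settings.

        def relay_trips(i_diff, i_restraint, pickup=0.2, breakpoint=2.0):
            """Dual-slope check: returns True if the relay should operate."""
            if i_restraint <= breakpoint:
                threshold = max(pickup, 0.30 * i_restraint)       # 30% first slope
            else:
                threshold = 0.30 * breakpoint + 0.80 * (i_restraint - breakpoint)
            return i_diff > threshold

        print(relay_trips(0.5, 1.0))   # True: 0.5 exceeds the 30% slope (0.3)
        print(relay_trips(0.5, 3.0))   # False: threshold on the 80% slope is 1.4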

  10. Application of Plackett-Burman and Doehlert designs for optimization of selenium analysis in plasma with electrothermal atomic absorption spectrometry.

    PubMed

    El Ati-Hellal, Myriam; Hellal, Fayçal; Hedhili, Abderrazek

    2014-10-01

    The aim of this study was the optimization of selenium determination in plasma samples with electrothermal atomic absorption spectrometry (ETAAS) using experimental design methodology. Eleven variables potentially influencing selenium analysis in human blood plasma by ETAAS were evaluated with a Plackett-Burman experimental design. These factors were selected from the sample preparation, furnace program, and chemical modification steps. Both absorbance and background signals were chosen as responses in the screening approach. A Doehlert design was used for method optimization. Results showed that only the ashing temperature had a statistically significant effect on the selected responses. Optimization with the Doehlert design allowed the development of a reliable method for selenium analysis with ETAAS. Samples were diluted 1/10 with 0.05% (v/v) TritonX-100+2.5% (v/v) HNO3 solution. Optimized ashing and atomization temperatures for the nickel modifier were 1070°C and 2270°C, respectively. A detection limit of 2.1 μg L(-1) Se was obtained. Accuracy of the method was checked by the analysis of selenium in Seronorm™ Trace element quality control serum level 1. The developed procedure was applied for the analysis of total selenium in fifteen plasma samples with the standard addition method. Concentrations ranged between 24.4 and 64.6 μg L(-1), with a mean of 42.6 ± 4.9 μg L(-1). The use of experimental designs allowed the development of a cheap and accurate method for selenium analysis in plasma that could be applied routinely in clinical laboratories. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
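
    Screening designs such as Plackett-Burman estimate main effects as the difference between mean responses at the high and low levels of each coded factor. The sketch below uses a small two-level factorial as a stand-in, with made-up responses in which the first factor (labeled ashing temperature purely for illustration) dominates.

        import numpy as np
        from itertools import product

        X = np.array(list(product([-1, 1], repeat=3)))   # 8 runs, 3 coded factors
        y = np.array([10.1, 10.3, 10.2, 10.5, 12.9, 13.2, 13.0, 13.4])

        effects = 2 * (X.T @ y) / len(y)   # mean(high level) - mean(low level)
        for name, e in zip(["ashing T", "atomization T", "modifier"], effects):
            print(f"{name}: effect = {e:+.2f}")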

  11. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.

  12. Automated Design of a High-Velocity Channel

    DTIC Science & Technology

    2006-05-01

    using Newton's method. 2.2.2 Groundwater Applications: Optimization methods are also very useful for solving groundwater problems. Townley et al. [Townley 85] apply present computational algorithms to steady and transient models for groundwater flow. The aquifer storage coefficients, transmissivities... Reliability Analysis", Water Resources Research, Vol. 28, No. 12, December 1992, pp. 3269-3280. [Townley 85] Townley, L. R. and Wilson, J. L.

  13. Finite element based electric motor design optimization

    NASA Technical Reports Server (NTRS)

    Campbell, C. Warren

    1993-01-01

    The purpose of this effort was to develop a finite element code for the analysis and design of permanent magnet electric motors. These motors would drive electromechanical actuators in advanced rocket engines. The actuators would control fuel valves and thrust vector control systems. Refurbishing the hydraulic systems of the Space Shuttle after each flight is costly and time consuming. Electromechanical actuators could replace hydraulics, improve system reliability, and reduce down time.

  14. Characterization of interfade duration for satellite communication systems design and optimization in a temperate climate

    NASA Astrophysics Data System (ADS)

    Jorge, Flávio; Riva, Carlo; Rocha, Armando

    2016-03-01

    The characterization of the fade dynamics on Earth-satellite links is an important subject when designing the so-called fade mitigation techniques that contribute to the proper reliability of satellite communication systems and the customers' quality of service (QoS). The interfade duration, defined as the period between two consecutive fade events, has been only poorly analyzed using limited data sets, but its complete characterization would enable the design and optimization of satellite communication systems by estimating the system requirements to recover in time before the next propagation impairment. Depending on this analysis, several actions can be taken to ensure service maintenance. In this paper we present for the first time a detailed and comprehensive analysis of the statistical properties of interfade events based on 9 years of in-excess attenuation measurements at Ka band (19.7 GHz) with the very high availability required to build a reliable data set, particularly for the longer interfade duration events. The number of years necessary to reach statistical stability of the interfade duration is also evaluated for the first time, providing a reference when assessing the relevance of results published in the past. The study is carried out in Aveiro, Portugal, which has a temperate Mediterranean climate with oceanic influences.
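
    Extracting interfade durations from an attenuation time series amounts to measuring the gaps between consecutive threshold exceedances. The sketch below does this on a toy series; the threshold and sampling interval are arbitrary.

        import numpy as np

        atten = np.array([0.5, 1.2, 3.4, 2.8, 0.9, 0.7, 4.1, 3.9, 1.0, 0.6])  # dB
        threshold, dt = 2.0, 1.0       # fade threshold (dB), sample interval (s)

        in_fade = atten >= threshold
        durations, gap = [], 0
        for fading in in_fade:
            if fading:
                if gap:                # a fade just ended a non-fade run
                    durations.append(gap * dt)
                gap = 0
            else:
                gap += 1
        # the first entry is the pre-first-fade gap, not an interfade duration
        print(durations[1:])           # -> [2.0]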

  15. Optimization of multiplexed PCR on an integrated microfluidic forensic platform for rapid DNA analysis.

    PubMed

    Estes, Matthew D; Yang, Jianing; Duane, Brett; Smith, Stan; Brooks, Carla; Nordquist, Alan; Zenhausern, Frederic

    2012-12-07

    This study reports the design, prototyping, and assay development of multiplexed polymerase chain reaction (PCR) on a plastic microfluidic device. Amplification of 17 DNA loci is carried out directly on-chip as part of a system for continuous workflow processing from sample preparation (SP) to capillary electrophoresis (CE). For enhanced performance of on-chip PCR amplification, improved control systems have been developed making use of customized Peltier assemblies, valve actuators, software, and amplification chemistry protocols. Multiple enhancements to the microfluidic chip design have been enacted to improve the reliability of sample delivery through the various on-chip modules. This work has been enabled by the encapsulation of PCR reagents into a solid phase material through an optimized Solid Phase Encapsulating Assay Mix (SPEAM) bead-based hydrogel fabrication process. SPEAM bead technology is reliably coupled with precise microfluidic metering and dispensing for efficient amplification and subsequent DNA short tandem repeat (STR) fragment analysis. This provides a means of on-chip reagent storage suitable for microfluidic automation, with the long shelf-life necessary for point-of-care (POC) or field deployable applications. This paper reports the first high quality 17-plex forensic STR amplification from a reference sample in a microfluidic chip with preloaded solid phase reagents, that is designed for integration with up and downstream processing.

  16. Warranty optimisation based on the prediction of costs to the manufacturer using neural network model and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Stamenkovic, Dragan D.; Popovic, Vladimir M.

    2015-02-01

    Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master the techniques for warranty cost prediction according to the reliability characteristics of the product. In this paper a combination free replacement and pro rata warranty policy is analysed as warranty model for one type of light bulbs. Since operating conditions have a great impact on product reliability, they need to be considered in such analysis. A neural network model is used to predict light bulb reliability characteristics based on the data from the tests of light bulbs in various operating conditions. Compared with a linear regression model used in the literature for similar tasks, the neural network model proved to be a more accurate method for such prediction. Reliability parameters obtained in this way are later used in Monte Carlo simulation for the prediction of times to failure needed for warranty cost calculation. The results of the analysis make possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In such a way, the manufacturer can lower the costs and increase the profit.
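
    The Monte Carlo step can be sketched with Weibull lifetimes standing in for the neural-network-derived reliability characteristics; the parameters and costs below are invented, and at most one claim per unit is counted for simplicity.

        import numpy as np

        rng = np.random.default_rng(1)
        shape, scale = 1.8, 1500.0          # Weibull life parameters (hours)
        warranty, claim_cost = 1000.0, 4.0  # warranty length (h), cost per claim
        n_sims = 100_000

        lifetimes = scale * rng.weibull(shape, n_sims)
        # expected warranty cost per unit sold
        print(f"expected cost: {claim_cost * np.mean(lifetimes < warranty):.3f}")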

  17. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  18. Critical evaluation of sample pretreatment techniques.

    PubMed

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  19. Nodal failure index approach to groundwater remediation design

    USGS Publications Warehouse

    Lee, J.; Reeves, H.W.; Dowding, C.H.

    2008-01-01

    Computer simulations often are used to design and to optimize groundwater remediation systems. We present a new computationally efficient approach that calculates the reliability of remedial design at every location in a model domain with a single simulation. The estimated reliability and other model information are used to select a best remedial option for given site conditions, conceptual model, and available data. To evaluate design performance, we introduce the nodal failure index (NFI) to determine the number of nodal locations at which the probability of success is below the design requirement. The strength of the NFI approach is that selected areas of interest can be specified for analysis and the best remedial design determined for this target region. An example application of the NFI approach using a hypothetical model shows how the spatial distribution of reliability can be used for a decision support system in groundwater remediation design. © 2008 ASCE.
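
    The NFI itself is a simple count, as the sketch below shows on a synthetic per-node success-probability field.

        import numpy as np

        rng = np.random.default_rng(7)
        p_success = rng.uniform(0.6, 1.0, size=(50, 50))   # per-node reliability
        requirement = 0.90

        nfi = int(np.sum(p_success < requirement))
        print(f"NFI = {nfi} of {p_success.size} nodes below the requirement")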

  20. Developing a Measure of General Academic Ability: An Application of Maximal Reliability and Optimal Linear Combination to High School Students' Scores

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.; Raykov, Tenko; AL-Qataee, Abdullah Ali

    2015-01-01

    This article is concerned with developing a measure of general academic ability (GAA) for high school graduates who apply to colleges, as well as with the identification of optimal weights of the GAA indicators in a linear combination that yields a composite score with maximal reliability and maximal predictive validity, employing the framework of…

  1. Research on the optimal structure configuration of dither RLG used in skewed redundant INS

    NASA Astrophysics Data System (ADS)

    Gao, Chunfeng; Wang, Qi; Wei, Guo; Long, Xingwu

    2016-05-01

    The actual combat effectiveness of weapon equipment is restricted by the performance of the Inertial Navigation System (INS), especially in situations requiring high reliability, such as fighters, satellites, and submarines. Through the use of skewed sensor geometries, redundancy techniques have been applied to reduce the cost and improve the reliability of the INS. In this paper, the structure configuration and the inertial sensor characteristics of a Skewed Redundant Strapdown Inertial Navigation System (SRSINS) using dithered Ring Laser Gyroscopes (RLG) are analyzed. Because of dither coupling effects, the system measurement errors can be amplified when the individual gyro dither frequencies are near one another or when the structure of the SRSINS is unreasonable. Based on the characteristics of the RLG, research on the coupled vibration of dithered RLGs in the SRSINS is carried out. On the principles of optimal navigation performance, optimal reliability, and optimal cost-effectiveness, a comprehensive evaluation scheme for the inertial sensor configuration of the SRSINS is given.

  2. Nonlinear Rayleigh wave inversion based on the shuffled frog-leaping algorithm

    NASA Astrophysics Data System (ADS)

    Sun, Cheng-Yu; Wang, Yan-Yan; Wu, Dun-Shi; Qin, Xiao-Jun

    2017-12-01

    At present, near-surface shear wave velocities are mainly calculated through Rayleigh wave dispersion-curve inversions in engineering surface investigations, but the required calculations pose a highly nonlinear global optimization problem. In order to alleviate the risk of falling into a local optimal solution, this paper introduces a new global optimization method, the shuffled frog-leaping algorithm (SFLA), into the Rayleigh wave dispersion-curve inversion process. SFLA is a swarm-intelligence-based algorithm that simulates a group of frogs searching for food. It uses few parameters, achieves rapid convergence, and is capable of effective global search. In order to test the reliability and calculation performance of SFLA, noise-free and noisy synthetic datasets were inverted. We conducted a comparative analysis with other established algorithms using the noise-free dataset, and then tested the ability of SFLA to cope with data noise. Finally, we inverted a real-world example to examine the applicability of SFLA. Results from both synthetic and field data demonstrated the effectiveness of SFLA in the interpretation of Rayleigh wave dispersion curves. We found that SFLA is superior to the established methods in terms of both reliability and computational efficiency, so it offers great potential for improving our ability to solve geophysical inversion problems.
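
    A compact, simplified rendering of the frog-leaping mechanism (one jump per memeplex per shuffle, on a toy objective rather than a dispersion-curve misfit) looks like this:

        import numpy as np

        def f(x):                                   # toy objective to minimize
            return np.sum((x - 1.5) ** 2)

        rng = np.random.default_rng(3)
        frogs = rng.uniform(-5, 5, size=(30, 2))    # population of candidate models
        n_plex = 3

        for _ in range(200):                        # shuffles
            frogs = frogs[np.argsort([f(x) for x in frogs])]   # rank by fitness
            for m in range(n_plex):
                plex = frogs[m::n_plex]             # deal ranked frogs to memeplexes
                best, worst = plex[0], plex[-1]
                trial = worst + rng.uniform() * (best - worst)  # leap toward best
                plex[-1] = trial if f(trial) < f(worst) else rng.uniform(-5, 5, 2)
        best = min(frogs, key=f)
        print(best, f(best))                        # converges near (1.5, 1.5)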

  3. Development of Long-Pulse Heating and Current Drive Actuators and Operational Techniques Compatible with a High-Z Divertor and First Wall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Guiding

    Accurate measurement of the edge electron density profile is essential to optimizing antenna coupling and assessment of impurity contamination in studying long-pulse plasma heating and current drive in fusion devices. Measurement of the edge density profile has been demonstrated on US fusion devices such as C-Mod, DIII-D, and TFTR amongst many devices, and has been used for RF loading and impurity modeling calculations for many years. University of Science and Technology of China (USTC) has recently installed a density profile reflectometer system on the EAST fusion device at the Institute of Plasma Physics, Chinese Academy of Sciences in China, based on the University of California Los Angeles (UCLA)-designed reflectometer system on the DIII-D fusion device at General Atomics Company in San Diego, California. UCLA has been working with USTC to optimize the existing microwave antenna, waveguide system, microwave electronics, and data analysis to produce reliable edge density profiles. During the past budget year, progress has been made in all three major areas: effort to achieve reliable system operations under various EAST operational conditions, effort to optimize system performance, and effort to provide quality density profiles into EAST's database routinely.

  4. Stability of steady hand force production explored across spaces and methods of analysis.

    PubMed

    de Freitas, Paulo B; Freitas, Sandra M S F; Lewis, Mechelle M; Huang, Xuemei; Latash, Mark L

    2018-06-01

    We used the framework of the uncontrolled manifold (UCM) hypothesis and explored the reliability of several outcome variables across different spaces of analysis during a very simple four-finger accurate force production task. Fourteen healthy, young adults performed the accurate force production task with each hand on 3 days. Small spatial finger perturbations were generated by the "inverse piano" device three times per trial (lifting the fingers 1 cm/0.5 s and lowering them). The data were analyzed using the following main methods: (1) computation of indices of the structure of inter-trial variance and motor equivalence in the space of finger forces and finger modes, and (2) analysis of referent coordinates and apparent stiffness values for the hand. Maximal voluntary force and the index of enslaving (unintentional finger force production) showed good to excellent reliability. Strong synergies stabilizing total force were reflected in both structure of variance and motor equivalence indices. Variance within the UCM and the index of motor equivalent motion dropped over the trial duration and showed good to excellent reliability. Variance orthogonal to the UCM and the index of non-motor equivalent motion dropped over the 3 days and showed poor to moderate reliability. Referent coordinate and apparent stiffness indices co-varied strongly and both showed good reliability. In contrast, the computed index of force stabilization showed poor reliability. The findings are interpreted within the scheme of neural control with referent coordinates involving the hierarchy of two basic commands, the r-command and c-command. The data suggest natural drifts in the finger force space, particularly within the UCM. We interpret these drifts as reflections of a trade-off between stability and optimization of action. The implications of these findings for the UCM framework and future clinical applications are explored in the discussion. Indices of the structure of variance and motor equivalence show good reliability and can be recommended for applied studies.
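
    For this task the UCM analysis has a particularly simple form: the UCM for total force F = f1 + f2 + f3 + f4 is the hyperplane on which the finger forces sum to a constant, so variance is split by projecting de-meaned forces onto the unit normal (1,1,1,1)/2 and its complement. The sketch below uses synthetic trial data, not the study's recordings.

        import numpy as np

        rng = np.random.default_rng(5)
        forces = rng.normal(2.5, 0.4, size=(40, 4))    # trials x fingers (N)
        dev = forces - forces.mean(axis=0)             # de-mean across trials

        normal = np.ones(4) / 2.0                      # unit normal to the UCM
        v_ort = np.mean((dev @ normal) ** 2)           # variance orthogonal to UCM
        v_ucm = np.mean(np.sum(dev ** 2, axis=1)) - v_ort
        # per-dimension: 1 dimension orthogonal to the UCM, 3 within it
        print(f"Vort = {v_ort:.4f}, Vucm/dof = {v_ucm / 3:.4f}")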

  5. Finite element analysis of wirelessly interrogated implantable bio-MEMS

    NASA Astrophysics Data System (ADS)

    Dissanayake, Don W.; Al-Sarawi, Said F.; Lu, Tien-Fu; Abbott, Derek

    2008-12-01

    Wirelessly interrogated bio-MEMS devices are becoming more popular as they address challenges such as improving diagnosis, monitoring, and patient wellbeing. The authors present here a passive, low-power, small-area device that can be interrogated wirelessly using a uniquely coded signal for secure and reliable operation. The proposed new approach relies on converting the interrogating coded signal to a surface acoustic wave that is then correlated with an embedded code. The suggested method is implemented to operate a micropump, which consists of a specially designed corrugated microdiaphragm to modulate the fluid flow in microchannels. A finite element analysis of the micropump operation is presented and its performance analysed. Design parameters of the diaphragm were fine-tuned for optimal performance, and different polymer-based materials were used in various parts of the micropump to allow for better flexibility and high reliability.

  6. Automated forensic DNA purification optimized for FTA card punches and identifiler STR-based PCR analysis.

    PubMed

    Tack, Lois C; Thomas, Michelle; Reich, Karl

    2007-03-01

    Forensic labs globally face the same problem: a growing need to process a greater number and wider variety of samples for DNA analysis. The same forensic lab can be tasked all at once with processing mixed casework samples from crime scenes, convicted offender samples for database entry, and tissue from tsunami victims for identification. Besides flexibility in the robotic system chosen for forensic automation, there is a need, for each sample type, to develop new methodology that is not only faster but also more reliable than past procedures. FTA is a chemical treatment of paper, unique to Whatman Bioscience, used for the stabilization and storage of biological samples. Here, the authors describe optimization of the Whatman FTA Purification Kit protocol for use with the AmpFlSTR Identifiler PCR Amplification Kit.

  7. Optimized production planning model for a multi-plant cultivation system under uncertainty

    NASA Astrophysics Data System (ADS)

    Ke, Shunkui; Guo, Doudou; Niu, Qingliang; Huang, Danfeng

    2015-02-01

    An inexact multi-constraint programming model under uncertainty was developed by incorporating a production plan algorithm into the crop production optimization framework under the multi-plant collaborative cultivation system. In the production plan, orders from the customers are assigned to a suitable plant under the constraints of plant capabilities and uncertainty parameters to maximize profit and achieve customer satisfaction. The developed model and solution method were applied to a case study of a multi-plant collaborative cultivation system to verify its applicability. As determined in the case analysis involving different orders from customers, the period of plant production planning and the interval between orders can significantly affect system benefits. Through the analysis of uncertain parameters, reliable and practical decisions can be generated using the suggested model of a multi-plant collaborative cultivation system.

  8. Need for Cost Optimization of Space Life Support Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Anderson, Grant

    2017-01-01

    As the nation plans manned missions that go far beyond Earth orbit to Mars, there is an urgent need for a robust, disciplined systems engineering methodology that can identify an optimized Environmental Control and Life Support (ECLSS) architecture for long duration deep space missions. But unlike the previously used Equivalent System Mass (ESM), the method must be inclusive of all driving parameters and emphasize the economic analysis of life support system design. The key parameter for this analysis is Life Cycle Cost (LCC). LCC takes into account the cost for development and qualification of the system, launch costs, operational costs, maintenance costs and all other relevant and associated costs. Additionally, an effective methodology must consider system technical performance, safety, reliability, maintainability, crew time, and other factors that could affect the overall merit of the life support system.

  9. Error-based analysis of optimal tuning functions explains phenomena observed in sensory neurons.

    PubMed

    Yaeli, Steve; Meir, Ron

    2010-01-01

    Biological systems display impressive capabilities in effectively responding to environmental signals in real time. There is increasing evidence that organisms may indeed be employing near optimal Bayesian calculations in their decision-making. An intriguing question relates to the properties of optimal encoding methods, namely determining the properties of neural populations in sensory layers that optimize performance, subject to physiological constraints. Within an ecological theory of neural encoding/decoding, we show that optimal Bayesian performance requires neural adaptation which reflects environmental changes. Specifically, we predict that neuronal tuning functions possess an optimal width, which increases with prior uncertainty and environmental noise, and decreases with the decoding time window. Furthermore, even for static stimuli, we demonstrate that dynamic sensory tuning functions, acting at relatively short time scales, lead to improved performance. Interestingly, the narrowing of tuning functions as a function of time was recently observed in several biological systems. Such results set the stage for a functional theory which may explain the high reliability of sensory systems, and the utility of neuronal adaptation occurring at multiple time scales.

  10. Rasch Analysis of the 9-Item Shared Decision Making Questionnaire in Women With Breast Cancer.

    PubMed

    Wu, Tzu-Yi; Chen, Cheng-Te; Huang, Yi-Jing; Hou, Wen-Hsuan; Wang, Jung-Der; Hsieh, Ching-Lin

    2018-04-19

    Shared decision making (SDM) is a best practice to help patients make optimal decisions in the process of healthcare, especially for women diagnosed with breast cancer, who face a heavy burden of long-term treatment. To promote successful SDM, it is crucial to assess the level of perceived involvement in SDM in women with breast cancer. The aims of this study were to apply Rasch analysis to examine the construct validity and person reliability of the 9-item Shared Decision Making Questionnaire (SDM-Q-9) in women with breast cancer. The construct validity of the SDM-Q-9 was confirmed when the items fit the Rasch model's assumptions of unidimensionality: (1) infit and outfit mean squares ranged from 0.6 to 1.4; (2) the unexplained variance of the first dimension of the principal component analysis was less than 20%. Person reliability was calculated. A total of 212 participants were recruited in this study. Item 1 did not fit the model's assumptions and was deleted. The unidimensionality of the remaining 8 items (SDM-Q-8) was supported with good item fit (infit and outfit mean squares ranging from 0.6 to 1.3) and very low unexplained variance of the first dimension (5.3%) of the principal component analysis. The person reliability of the SDM-Q-8 was 0.90. The SDM-Q-8 was unidimensional and had good person reliability in women with breast cancer. The SDM-Q-8 has shown its potential for assessing the level of perceived involvement in SDM in women with breast cancer for both research and clinical purposes.

  11. Optimal clustering of MGs based on droop controller for improving reliability using a hybrid of harmony search and genetic algorithms.

    PubMed

    Abedini, Mohammad; Moradi, Mohammad H; Hosseinian, S M

    2016-03-01

    This paper proposes a novel method to address reliability and technical problems of microgrids (MGs), based on designing a number of self-adequate autonomous sub-MGs by adopting an MG clustering approach. A multi-objective optimization problem is developed in which power loss reduction, voltage profile improvement, and reliability enhancement are the objective functions. To solve the optimization problem, a hybrid algorithm named HS-GA is provided, based on the genetic and harmony search algorithms, and a load flow method is given to model different types of DGs as droop controllers. The performance of the proposed method is evaluated in two case studies. The results provide support for the performance of the proposed method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Design and comparative analysis of 10 MW class superconducting wind power generators according to different types of superconducting wires

    NASA Astrophysics Data System (ADS)

    Sung, Hae-Jin; Kim, Gyeong-Hun; Kim, Kwangmin; Park, Minwon; Yu, In-Keun; Kim, Jong-Yul

    2013-11-01

    Wind turbine concepts can be classified into geared and gearless types. The gearless type is more attractive due to the advantages of a simplified drive train, increased energy yield, and higher reliability, because the gearbox is omitted; omitting the gearbox also reduces the overall weight of the turbine. However, because of low-speed operation, this type has disadvantages such as the large diameter and heavy weight of the generator. A superconducting (SC) wind power generator can reduce the weight and volume of a wind power system. The properties of superconducting wire differ considerably between manufacturers. This paper considers the design and comparative analysis of 10 MW class SC wind power generators according to different types of SC wires. Superconducting Synchronous Generators (SCSGs) using YBCO and Bi-2223 wires are optimized by an optimal design method. The magnetic characteristics of the SCSGs are investigated using a finite element method program. The optimized specifications of the SCSGs are discussed in detail, and the optimization processes can be used effectively to develop large-scale wind power generation systems.

  13. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    NASA Astrophysics Data System (ADS)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.

  14. Energy Storage Applications in Power Systems with Renewable Energy Generation

    NASA Astrophysics Data System (ADS)

    Ghofrani, Mahmoud

    In this dissertation, we propose new operational and planning methodologies for power systems with renewable energy sources. A probabilistic optimal power flow (POPF) is developed to model wind power variations and evaluate the power system operation with intermittent renewable energy generation. The methodology is used to calculate the operating and ramping reserves that are required to compensate for power system uncertainties. Distributed wind generation is introduced as an operational scheme to take advantage of the spatial diversity of renewable energy resources and reduce wind power fluctuations using low or uncorrelated wind farms. The POPF is demonstrated using the IEEE 24-bus system where the proposed operational scheme reduces the operating and ramping reserve requirements and operation and congestion cost of the system as compared to operational practices available in the literature. A stochastic operational-planning framework is also proposed to adequately size, optimally place and schedule storage units within power systems with high wind penetrations. The method is used for different applications of energy storage systems for renewable energy integration. These applications include market-based opportunities such as renewable energy time-shift, renewable capacity firming, and transmission and distribution upgrade deferral in the form of revenue or reduced cost and storage-related societal benefits such as integration of more renewables, reduced emissions and improved utilization of grid assets. A power-pool model which incorporates the one-sided auction market into POPF is developed. The model considers storage units as market participants submitting hourly price bids in the form of marginal costs. This provides an accurate market-clearing process as compared to the 'price-taker' analysis available in the literature where the effects of large-scale storage units on the market-clearing prices are neglected. Different case studies are provided to demonstrate our operational-planning framework and economic justification for different storage applications. A new reliability model is proposed for security and adequacy assessment of power networks containing renewable resources and energy storage systems. The proposed model is used in combination with the operational-planning framework to enhance the reliability and operability of wind integration. The proposed framework optimally utilizes the storage capacity for reliability applications of wind integration. This is essential for justification of storage deployment within regulated utilities where the absence of market opportunities limits the economic advantage of storage technologies over gas-fired generators. A control strategy is also proposed to achieve the maximum reliability using energy storage systems. A cost-benefit analysis compares storage technologies and conventional alternatives to reliably and efficiently integrate different wind penetrations and determines the most economical design. Our simulation results demonstrate the necessity of optimal storage placement for different wind applications. This dissertation also proposes a new stochastic framework to optimally charge and discharge electric vehicles (EVs) to mitigate the effects of wind power uncertainties. Vehicle-to-grid (V2G) service for hedging against wind power imbalances is introduced as a novel application for EVs. 
This application enhances the predictability of wind power and reduces the power imbalances between the scheduled output and actual power. An Auto Regressive Moving Average (ARMA) wind speed model is developed to forecast the wind power output. Driving patterns of EVs are stochastically modeled and the EVs are clustered in the fleets of similar daily driving patterns. Monte Carlo Simulation (MCS) simulates the system behavior by generating samples of system states using the wind ARMA model and EVs driving patterns. A Genetic Algorithm (GA) is used in combination with MCS to optimally coordinate the EV fleets for their V2G services and minimize the penalty cost associated with wind power imbalances. The economic characteristics of automotive battery technologies and costs of V2G service are incorporated into a cost-benefit analysis which evaluates the economic justification of the proposed V2G application. Simulation results demonstrate that the developed algorithm enhances wind power utilization and reduces the penalty cost for wind power under-/over-production. This offers potential revenues for the wind producer. Our cost-benefit analysis also demonstrates that the proposed algorithm will provide the EV owners with economic incentives to participate in V2G services. The proposed smart scheduling strategy develops a sustainable integrated electricity and transportation infrastructure.
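
    The ARMA wind model mentioned above can be written as an explicit recursion; the ARMA(2,1) coefficients, mean speed, and noise level below are illustrative, not the dissertation's fitted values.

        import numpy as np

        rng = np.random.default_rng(11)
        phi1, phi2, theta = 0.8, -0.15, 0.3    # ARMA(2,1) coefficients (assumed)
        mean_speed, sigma = 8.0, 1.2           # m/s

        n = 1000
        e = rng.normal(0.0, sigma, n)          # white-noise innovations
        x = np.zeros(n)
        for t in range(2, n):
            x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + e[t] + theta * e[t - 1]
        speed = np.clip(mean_speed + x, 0.0, None)   # wind speed cannot be negative
        print(speed[:5])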

  15. Interphase Thermomechanical Reliability and Optimization for High-Performance Ti Metal Laminates

    DTIC Science & Technology

    2011-12-19

    Thermomechanical Reliability and Optimization for High-Performance Ti FA9550-08-l-0015 Metal Laminates Sb. GRANT NUMBER Program Manager: Dr Joycelyn Harrison...OSR-VA-TR-2012-0202 12. DISTRIBUTION/AVAILABILITY STATEMENT A 13. SUPPLEMENTARY NOTES 14. ABSTRACT Hybrid laminated composites such as titanium...graphite (TiGr) laminates are an emerging class of structural materials with the potential to enable a new generation of efficient, high-performance

  16. Reliability-based management of buried pipelines considering external corrosion defects

    NASA Astrophysics Data System (ADS)

    Miran, Seyedeh Azadeh

    Corrosion is one of the main deteriorating mechanisms that degrade energy pipeline integrity, due to the corrosive fluids or gases transferred and interaction with a corrosive environment. Corrosion defects are usually detected by periodic inspections using in-line inspection (ILI) methods. In order to ensure pipeline safety, this study develops a cost-effective maintenance strategy that consists of three aspects: corrosion growth model development using ILI data, time-dependent performance evaluation, and optimal inspection interval determination. In particular, the proposed study is applied to a cathodically protected buried steel pipeline located in Mexico. First, a time-dependent power-law formulation is adopted to probabilistically characterize growth of the maximum depth and length of the external corrosion defects. Dependency between defect depth and length is considered in the model development, and generation of corrosion defects over time is characterized by a homogeneous Poisson process. The unknown parameters of the growth models are evaluated from the ILI data through Bayesian updating with the Markov Chain Monte Carlo (MCMC) simulation technique. The proposed corrosion growth models can be used when either matched or non-matched defects are available, and have the ability to consider newly generated defects since the last inspection. Results of this part of the study show that both depth and length growth models can predict damage quantities reasonably well, and a strong correlation between defect depth and length is found. Next, time-dependent system failure probabilities are evaluated using the developed corrosion growth models considering prevailing uncertainties, where three failure modes, namely small leak, large leak, and rupture, are considered. Performance of the pipeline is evaluated through the failure probability per km (called a sub-system), where each sub-system is considered a series system of detected and newly generated defects within that sub-system. A sensitivity analysis is also performed to determine the parameters incorporated in the growth models to which the reliability of the studied pipeline is most sensitive. The reliability analysis results suggest that newly generated defects should be considered in calculating failure probability, especially for prediction of the long-term performance of the pipeline, and that the impact of the statistical uncertainty in the model parameters is significant and should be considered in the reliability analysis. Finally, with the evaluated time-dependent failure probabilities, a life-cycle cost analysis is conducted to determine the optimal inspection interval of the studied pipeline. The expected total life-cycle cost consists of the construction cost and the expected costs of inspections, repair, and failure. A repair is conducted when the failure probability from any described failure mode exceeds a pre-defined probability threshold after each inspection. Moreover, this study also investigates the impact of repair threshold values and unit costs of inspection and failure on the expected total life-cycle cost and optimal inspection interval through a parametric study. The analysis suggests that a smaller inspection interval leads to higher inspection costs, but can lower the failure cost; the repair cost is less significant compared to the inspection and failure costs.
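
    The growth-model class described, power-law depth growth plus homogeneous-Poisson generation of new defects, can be sketched in a few lines; all parameters below are invented rather than the Bayesian posterior values estimated from the ILI data.

        import numpy as np

        rng = np.random.default_rng(9)
        a, b, t0 = 0.35, 0.9, 5.0     # depth(t) = a * (t - t0)^b, in mm and years
        rate = 2.0                    # new defects per km per year (Poisson)

        def depth(t):
            return a * np.maximum(t - t0, 0.0) ** b

        years = np.arange(6, 16)
        new_defects = rng.poisson(rate, size=len(years))   # generated each year
        print(f"depth at year 15: {depth(15.0):.2f} mm; "
              f"new defects over 10 years: {new_defects.sum()}")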

  17. FAST: a framework for simulation and analysis of large-scale protein-silicon biosensor circuits.

    PubMed

    Gu, Ming; Chakrabartty, Shantanu

    2013-08-01

    This paper presents a computer aided design (CAD) framework for verification and reliability analysis of protein-silicon hybrid circuits used in biosensors. It is envisioned that similar to integrated circuit (IC) CAD design tools, the proposed framework will be useful for system level optimization of biosensors and for discovery of new sensing modalities without resorting to laborious fabrication and experimental procedures. The framework referred to as FAST analyzes protein-based circuits by solving inverse problems involving stochastic functional elements that admit non-linear relationships between different circuit variables. In this regard, FAST uses a factor-graph netlist as a user interface and solving the inverse problem entails passing messages/signals between the internal nodes of the netlist. Stochastic analysis techniques like density evolution are used to understand the dynamics of the circuit and estimate the reliability of the solution. As an example, we present a complete design flow using FAST for synthesis, analysis and verification of our previously reported conductometric immunoassay that uses antibody-based circuits to implement forward error-correction (FEC).
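
    As a toy illustration of the factor-graph message passing the abstract alludes to (not the FAST tool or its netlist format), the sketch below runs sum-product on the smallest non-trivial factor graph; the variables and factor tables are invented.

        import numpy as np

        # Two binary variables X, Y linked by one pairwise factor f(x, y),
        # each with a unary "evidence" factor.
        f_xy = np.array([[0.9, 0.1],
                         [0.2, 0.8]])      # pairwise factor table f(x, y)
        g_x = np.array([0.6, 0.4])         # evidence on X
        h_y = np.array([0.3, 0.7])         # evidence on Y

        # Sum-product: a variable-to-factor message is the product of the other
        # incoming messages; a factor-to-variable message sums the factor over
        # the remaining variables.
        m_x_to_f = g_x                     # X's only other neighbor is evidence
        m_f_to_y = f_xy.T @ m_x_to_f       # sum over x of f(x, y) * m(x)

        marg_y = m_f_to_y * h_y            # combine with Y's evidence
        marg_y /= marg_y.sum()             # normalize to a distribution
        print("P(Y) =", marg_y)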

  18. A hybrid approach to near-optimal launch vehicle guidance

    NASA Technical Reports Server (NTRS)

    Leung, Martin S. K.; Calise, Anthony J.

    1992-01-01

    This paper evaluates a proposed hybrid analytical/numerical approach to launch-vehicle guidance for ascent to orbit injection. The feedback-guidance approach is based on a piecewise nearly analytic zero-order solution evaluated using a collocation method. The zero-order solution is then improved through a regular perturbation analysis, wherein the neglected dynamics are corrected in the first-order term. For real-time implementation, the guidance approach requires solving a set of small-dimension nonlinear algebraic equations and performing quadrature. Assessment of performance and reliability is carried out through closed-loop simulation of a vertically launched two-stage heavy-lift vehicle ascending to low Earth orbit. The solutions are compared with optimal solutions generated from a multiple shooting code. In the example, the guidance approach delivers over 99.9 percent of optimal performance and terminal constraint accuracy.

  19. Reliability-based optimization of maintenance scheduling of mechanical components under fatigue

    PubMed Central

    Beaurepaire, P.; Valdebenito, M.A.; Schuëller, G.I.; Jensen, H.A.

    2012-01-01

    This study presents the optimization of the maintenance scheduling of mechanical components under fatigue loading. Cracks in damaged structures may be detected during non-destructive inspection and subsequently repaired. Fatigue crack initiation and growth show inherent variability, as does the outcome of inspection activities. The problem is addressed within the framework of reliability-based optimization. The initiation and propagation of fatigue cracks are efficiently modeled using cohesive zone elements. The applicability of the method is demonstrated by a numerical example, which involves a plate with two holes subject to alternating stress. PMID:23564979
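
    A hedged sketch of the arithmetic behind such scheduling decisions: a textbook Paris-law crack-growth model with a random initial flaw and a probability-of-detection curve, used here in place of the paper's cohesive-zone mechanics; all parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(7)

        # Paris law: da/dN = C * (dK)^m, with dK = Y * dS * sqrt(pi * a).
        C, m_exp, Y, dS = 5e-11, 3.0, 1.0, 150.0   # invented material/load values
        a_crit = 0.05                              # critical crack size (m)
        insp_every = 200_000                       # cycles between inspections
        horizon = 1_000_000                        # service life (cycles)
        block = 10_000                             # integration step (cycles)

        def pod(a):
            """Probability of detecting a crack of size a at an inspection."""
            return 1.0 - np.exp(-a / 0.005)

        def new_crack():
            return rng.lognormal(mean=np.log(1e-3), sigma=0.3)

        failures = 0
        n_sim = 5000
        for _ in range(n_sim):
            a, n, failed = new_crack(), 0, False
            while n < horizon and not failed:
                # Integrate Paris law over one inspection interval (coarse Euler).
                for _ in range(insp_every // block):
                    dK = Y * dS * np.sqrt(np.pi * a)
                    a += C * dK ** m_exp * block
                    if a >= a_crit:
                        failed = True
                        break
                n += insp_every
                # Inspection: a detected crack is repaired (reset to a new flaw).
                if not failed and n < horizon and rng.random() < pod(a):
                    a = new_crack()
            failures += failed
        print("estimated failure probability:", failures / n_sim)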

  20. Concurrent validation and reliability of digital image analysis of granulation tissue color for clinical pressure ulcers.

    PubMed

    Iizaka, Shinji; Sugama, Junko; Nakagami, Gojiro; Kaitani, Toshiko; Naito, Ayumi; Koyanagi, Hiroe; Matsuo, Junko; Kadono, Takafumi; Konya, Chizuko; Sanada, Hiromi

    2011-01-01

    Granulation tissue color is one indicator for pressure ulcer (PU) assessment. However, it entails a subjective evaluation only, and quantitative methods have not been established. We developed color indicators from digital image analysis and investigated their concurrent validity and reliability for clinical PUs. A cross-sectional study was conducted on 47 patients with 55 full-thickness PUs. After color calibration, a wound photograph was converted into three images representing red color: erythema index (EI), modified erythema index with additional color calibration (granulation red index [GRI]), and , which represents the artificially created red-green axis of L(*) a(*) b(*) color space. The mean intensity of the granulation tissue region and the percentage of pixels exceeding the optimal cutoff intensity (% intensity) were calculated. Mean GRI (ρ=0.39, p=0.007) and (ρ=0.55, p<0.001), as well as their % intensity indicators, showed positive correlations with a(*) measured by tristimulus colorimeter, but erythema index did not. They were correlated with hydroxyproline concentration in wound fluid, healthy granulation tissue area, and blood hemoglobin level. Intra- and interrater reliability of the indicator calculation using both GRI and had an intraclass correlation coefficient >0.9. GRI and from digital image analysis can quantitatively evaluate granulation tissue color of clinical PUs. © 2011 by the Wound Healing Society.

  1. An optimal ultrasonographic diagnostic test for early gout: A prospective controlled study

    PubMed Central

    Petraitis, Mykolas; Apanaviciene, Indre; Virviciute, Dalia; Baranauskaite, Asta

    2017-01-01

    Objective To identify the optimal sites for classification of early gout by ultrasonography. Methods Sixty patients with monosodium urate crystal-proven gout (25 with early gout [≤2-year symptom duration], 35 with late gout [>2-year symptom duration], and 36 normouricemic healthy controls) from one centre were prospectively evaluated. Standardized blinded ultrasound examination of 36 joints and the triceps and patellar tendons was performed to identify tophi and the double contour (DC) sign. Results Ultrasonographic sensitivity was lower in early than late gout. Binary logistic regression analysis showed that two ultrasonographic signs (tophi in the first metatarsophalangeal joint [odds ratio, 16.46] and the DC sign in the ankle [odds ratio, 25.18]) significantly contributed to the final model for early gout diagnosis (sensitivity and specificity of 84% and 81%, respectively). The inter-reader reliability kappa value for the DC sign and tophi was 0.712. Conclusions Four-joint investigation (both first metatarsophalangeal joints for tophi and both ankles for the DC sign) is feasible and reliable and could be proposed as a screening test for early ultrasonographic gout classification in daily practice. PMID:28617199

  2. An optimal ultrasonographic diagnostic test for early gout: A prospective controlled study.

    PubMed

    Norkuviene, Eleonora; Petraitis, Mykolas; Apanaviciene, Indre; Virviciute, Dalia; Baranauskaite, Asta

    2017-08-01

    Objective To identify the optimal sites for classification of early gout by ultrasonography. Methods Sixty patients with monosodium urate crystal-proven gout (25 with early gout [≤2-year symptom duration], 35 with late gout [>2-year symptom duration], and 36 normouricemic healthy controls) from one centre were prospectively evaluated. Standardized blinded ultrasound examination of 36 joints and the triceps and patellar tendons was performed to identify tophi and the double contour (DC) sign. Results Ultrasonographic sensitivity was lower in early than late gout. Binary logistic regression analysis showed that two ultrasonographic signs (tophi in the first metatarsophalangeal joint [odds ratio, 16.46] and the DC sign in the ankle [odds ratio, 25.18]) significantly contributed to the final model for early gout diagnosis (sensitivity and specificity of 84% and 81%, respectively). The inter-reader reliability kappa value for the DC sign and tophi was 0.712. Conclusions Four-joint investigation (both first metatarsophalangeal joints for tophi and both ankles for the DC sign) is feasible and reliable and could be proposed as a screening test for early ultrasonographic gout classification in daily practice.

  3. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and to use risk as a tradeable resource for making robust and reliable decisions, in the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations aimed at eliminating and mitigating these risk elements in early phase design.

  4. Establishing high resolution melting analysis: method validation and evaluation for c-RET proto-oncogene mutation screening.

    PubMed

    Benej, Martin; Bendlova, Bela; Vaclavikova, Eliska; Poturnajova, Martina

    2011-10-06

    Reliable and effective primary screening of mutation carriers is a key condition for common diagnostic use. The objective of this study was to validate high resolution melting (HRM) analysis for routine primary mutation screening and to accomplish its optimization, evaluation and validation. Due to their heterozygous nature, germline point mutations of the c-RET proto-oncogene, associated with multiple endocrine neoplasia type 2 (MEN2), are suitable for HRM analysis. Early identification of mutation carriers has a major impact on patients' survival due to the early onset of medullary thyroid carcinoma (MTC) and its resistance to conventional therapy. The authors performed a series of validation assays according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidelines for validation of analytical procedures, along with appropriate design and optimization experiments. After validation, HRM was utilized for primary screening of 28 pathogenic c-RET mutations distributed among nine exons of the c-RET gene. The validation experiments confirm the repeatability, robustness, accuracy and reproducibility of HRM. All pathogenic c-RET gene variants were detected, with no false-positive or false-negative results. The data provide basic information about the design, establishment and validation of HRM for primary screening of genetic variants in order to distinguish heterozygous point mutation carriers from wild-type sequence carriers. HRM analysis is a powerful and reliable tool for rapid and cost-effective primary screening, e.g., of c-RET germline and/or sporadic mutations, and can be used as a first-line diagnostic tool.

  5. Clinically orientated classification incorporating shoulder balance for the surgical treatment of adolescent idiopathic scoliosis.

    PubMed

    Elsebaie, H B; Dannawi, Z; Altaf, F; Zaidan, A; Al Mukhtar, M; Shaw, M J; Gibson, A; Noordeen, H

    2016-02-01

    The achievement of shoulder balance is an important measure of successful scoliosis surgery. No previously described classification system has taken shoulder balance into account. We propose a simple classification system for AIS based on two components: the curve type and the shoulder level. Altogether, three curve types have been defined according to the size and location of the curves; each curve pattern is subdivided into type A or B depending on the shoulder level. This classification was tested for interobserver reproducibility and intraobserver reliability. A retrospective analysis of the radiographs of 232 consecutive AIS patients treated surgically between 2005 and 2009 was also performed. Three major types and six subtypes were identified. Type I accounted for 30 %, type II for 28 % and type III for 42 %. The retrospective analysis showed that three patients developed a decompensation that required extension of the fusion. One case developed worsening of shoulder balance requiring further surgery. The mean kappa coefficients for interobserver reproducibility ranged from 0.89 to 0.952, while the mean kappa value for intraobserver reliability was 0.964, indicating good-to-excellent reliability. The treatment algorithm guides the spinal surgeon to achieve optimal curve correction and postoperative shoulder balance whilst fusing the smallest number of spinal segments. The high interobserver reproducibility and intraobserver reliability make it an invaluable tool for describing scoliosis curves in everyday clinical practice.

  6. A Cross-Layer Optimized Opportunistic Routing Scheme for Loss-and-Delay Sensitive WSNs

    PubMed Central

    Xu, Xin; Yuan, Minjiao; Liu, Xiao; Cai, Zhiping; Wang, Tian

    2018-01-01

    In wireless sensor networks (WSNs), communication links are typically error-prone and unreliable, so providing reliable and timely data routing for loss- and delay-sensitive applications is a challenging issue. Additionally, with specific thresholds in practical applications, loss and delay sensitivity implies requirements for high reliability and low delay. Opportunistic Routing (OR) has been well studied in WSNs as a way to improve reliability over error-prone and unreliable wireless communication links, where the transmission power is assumed to be identical across the whole network. In this paper, a Cross-layer Optimized Opportunistic Routing (COOR) scheme is proposed to improve communication link reliability and reduce delay for loss- and delay-sensitive WSNs. The main contribution of the COOR scheme is to make full use of the remaining energy in the network to increase the transmission power of most nodes, which provides higher communication reliability or a longer transmission distance. Two optimization strategies of the COOR scheme, referred to as COOR(R) and COOR(P), are proposed to improve network performance. In the case of increased transmission power, the COOR(R) strategy chooses a next-hop candidate node with higher communication reliability at the same distance compared with traditional opportunistic routing. Since the reliability of data transmission is improved, the delay of data reaching the sink is reduced by shortening the communication time between candidate nodes. On the other hand, the COOR(P) strategy prefers a node with the same communication reliability at a longer distance. As a result, network performance is improved for the following reasons: (a) the delay is reduced because fewer hops are needed to reach the sink when the transmission distance is longer; (b) the reliability is improved because it is the product of the per-hop reliabilities along the routing path, and the hop count is reduced while each hop is as reliable as in the traditional method. After analyzing the energy consumption of the network in detail, the value of the optimized transmission power in different areas is given. On the basis of a large number of experiments and theoretical analyses, the results show that the COOR scheme increases communication reliability by 36.62–87.77%, decreases delay by 21.09–52.48%, and balances the energy consumption of 86.97% of the nodes in the WSNs. PMID:29751589

  7. A Cross-Layer Optimized Opportunistic Routing Scheme for Loss-and-Delay Sensitive WSNs.

    PubMed

    Xu, Xin; Yuan, Minjiao; Liu, Xiao; Liu, Anfeng; Xiong, Neal N; Cai, Zhiping; Wang, Tian

    2018-05-03

    In wireless sensor networks (WSNs), communication links are typically error-prone and unreliable, so providing reliable and timely data routing for loss- and delay-sensitive applications is a challenging issue. Additionally, with specific thresholds in practical applications, loss and delay sensitivity implies requirements for high reliability and low delay. Opportunistic Routing (OR) has been well studied in WSNs as a way to improve reliability over error-prone and unreliable wireless communication links, where the transmission power is assumed to be identical across the whole network. In this paper, a Cross-layer Optimized Opportunistic Routing (COOR) scheme is proposed to improve communication link reliability and reduce delay for loss- and delay-sensitive WSNs. The main contribution of the COOR scheme is to make full use of the remaining energy in the network to increase the transmission power of most nodes, which provides higher communication reliability or a longer transmission distance. Two optimization strategies of the COOR scheme, referred to as COOR(R) and COOR(P), are proposed to improve network performance. In the case of increased transmission power, the COOR(R) strategy chooses a next-hop candidate node with higher communication reliability at the same distance compared with traditional opportunistic routing. Since the reliability of data transmission is improved, the delay of data reaching the sink is reduced by shortening the communication time between candidate nodes. On the other hand, the COOR(P) strategy prefers a node with the same communication reliability at a longer distance. As a result, network performance is improved for the following reasons: (a) the delay is reduced because fewer hops are needed to reach the sink when the transmission distance is longer; (b) the reliability is improved because it is the product of the per-hop reliabilities along the routing path, and the hop count is reduced while each hop is as reliable as in the traditional method. After analyzing the energy consumption of the network in detail, the value of the optimized transmission power in different areas is given. On the basis of a large number of experiments and theoretical analyses, the results show that the COOR scheme increases communication reliability by 36.62–87.77%, decreases delay by 21.09–52.48%, and balances the energy consumption of 86.97% of the nodes in the WSNs.
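
    The path-level arithmetic in point (b) is easy to make concrete. The snippet below (not the COOR protocol itself; all numbers invented) shows why fewer, longer hops can raise end-to-end reliability and cut delay when per-hop reliability is held fixed.

        # Illustrative arithmetic only: end-to-end reliability is the product
        # of per-hop reliabilities, so fewer, longer hops can win even when
        # each individual hop is no more reliable.
        def path_metrics(per_hop_reliability: float, hops: int, per_hop_delay_ms: float):
            reliability = per_hop_reliability ** hops
            delay_ms = hops * per_hop_delay_ms
            return reliability, delay_ms

        # Baseline: 10 short hops; COOR(P)-style choice: 6 longer hops at the
        # same per-hop reliability (achieved by raising transmission power).
        print(path_metrics(0.95, 10, 12.0))   # -> (0.598..., 120.0)
        print(path_metrics(0.95, 6, 12.0))    # -> (0.735..., 72.0)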

  8. Optimized approach for Ion Proton RNA sequencing reveals details of RNA splicing and editing features of the transcriptome.

    PubMed

    Brown, Roger B; Madrid, Nathaniel J; Suzuki, Hideaki; Ness, Scott A

    2017-01-01

    RNA-sequencing (RNA-seq) has become the standard method for unbiased analysis of gene expression but also provides access to more complex transcriptome features, including alternative RNA splicing, RNA editing, and even detection of fusion transcripts formed through chromosomal translocations. However, differences in library methods can adversely affect the ability to recover these different types of transcriptome data. For example, some methods have bias for one end of transcripts or rely on low-efficiency steps that limit the complexity of the resulting library, making detection of rare transcripts less likely. We tested several commonly used methods of RNA-seq library preparation and found vast differences in the detection of advanced transcriptome features, such as alternatively spliced isoforms and RNA editing sites. By comparing several different protocols available for the Ion Proton sequencer and by utilizing detailed bioinformatics analysis tools, we were able to develop an optimized random primer based RNA-seq technique that is reliable at uncovering rare transcript isoforms and RNA editing features, as well as fusion reads from oncogenic chromosome rearrangements. The combination of optimized libraries and rapid Ion Proton sequencing provides a powerful platform for the transcriptome analysis of research and clinical samples.

  9. Family System of Advanced Charring Ablators for Planetary Exploration Missions

    NASA Technical Reports Server (NTRS)

    Congdon, William M.; Curry, Donald M.

    2005-01-01

    Advanced Ablators Program Objectives: 1) Flight-ready (TRL-6) ablative heat shields for deep-space missions; 2) Diversity of selection from a family-system approach; 3) Minimum-weight systems with high reliability; 4) Optimized formulations and processing; 5) Fully characterized properties; and 6) Low-cost manufacturing. Definition and integration of candidate lightweight structures. Test and analysis database to support flight-vehicle engineering. Results from production scale-up studies and production-cost analyses.

  10. Purification of Derivatized Oligosaccharides by Solid Phase Extraction for Glycomic Analysis

    PubMed Central

    Zhang, Qiwei; Li, Henghui; Feng, Xiaojun; Liu, Bi-Feng; Liu, Xin

    2014-01-01

    Profiling of glycans released from proteins is complex and important. To enhance detection sensitivity, chemical derivatization is required for the analysis of carbohydrates. Due to the interference of excess reagents, a simple and reliable purification method is usually necessary for the derivatized oligosaccharides. Various SPE-based methods have been applied for the clean-up process. To demonstrate the differences among these methods, seven types of self-packed SPE cartridges were systematically compared in this study. The optimized conditions were determined for each type of cartridge, and microcrystalline cellulose was found to be the most appropriate SPE material for the purification of derivatized oligosaccharides. Normal phase HPLC analysis of derivatized maltoheptaose was realized with a detection limit of 0.12 pmol (S/N = 3) and a recovery of over 70%. With the optimized SPE method, relative quantification of N-glycans from model glycoproteins was carried out accurately, and over 40 N-glycans from human serum samples were determined, without distinguishing isomers. Due to its high stability and sensitivity, the microcrystalline cellulose cartridge shows potential for applications in glycomics analysis. PMID:24705408

  11. Optimizing the switching time for 400 kV SF6 circuit breakers

    NASA Astrophysics Data System (ADS)

    Ciulica, D.

    2018-01-01

    This paper presents a real-time voltage and current analysis for optimizing the point-on-wave switching instant of an SF6 circuit breaker. Circuit breakers play an important role in power systems, providing protection for equipment in substations within transmission networks. The SF6 circuit breaker is a key piece of equipment in power systems, used at voltages up to 400 kV due to its excellent performance. Controlled switching is used to eliminate transients and electrodynamic and dielectric stresses in the network during manual switching of capacitor banks, shunt reactors and power transformers. These effects reduce the reliability and lifetime of the equipment installed on the network, and may lead to erroneous protection operation.
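
    Controlled (point-on-wave) switching reduces those transients by timing the making instant. Below is a minimal sketch of the timing arithmetic only, assuming an uncharged capacitor bank (target: a voltage zero), a phase reference where t = 0 coincides with a voltage zero, and an invented breaker operating time; it is not the paper's method.

        import math

        F = 50.0                      # system frequency (Hz)
        PERIOD = 1.0 / F
        T_OPERATE = 0.045             # breaker mechanical operating time (s), assumed

        def next_close_command(t_now: float) -> float:
            """Time to issue the close command so contacts touch at a voltage zero."""
            t_touch = t_now + T_OPERATE            # earliest possible making instant
            # Voltage zeros occur every half cycle, at multiples of PERIOD / 2.
            k = math.ceil(t_touch / (PERIOD / 2))
            t_target = k * (PERIOD / 2)            # next zero at or after t_touch
            return t_target - T_OPERATE

        print(next_close_command(0.0123))  # command instant in seconds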

  12. Hologram interferometry in automotive component vibration testing

    NASA Astrophysics Data System (ADS)

    Brown, Gordon M.; Forbes, Jamie W.; Marchi, Mitchell M.; Wales, Raymond R.

    1993-02-01

    An ever increasing variety of automotive component vibration testing is being pursued at Ford Motor Company, U.S.A. The driving force for use of hologram interferometry in these tests is the continuing need to design component structures to meet more stringent functional performance criteria. Parameters such as noise and vibration, sound quality, and reliability must be optimized for the lightest weight component possible. Continually increasing customer expectations and regulatory pressures on fuel economy and safety mandate that vehicles be built from highly optimized components. This paper includes applications of holographic interferometry for powertrain support structure tuning, body panel noise reduction, wiper system noise and vibration path analysis, and other vehicle component studies.

  13. A tabu search evolutionary algorithm for multiobjective optimization: Application to a bi-criterion aircraft structural reliability problem

    NASA Astrophysics Data System (ADS)

    Long, Kim Chenming

    Real-world engineering optimization problems often require the consideration of multiple conflicting and noncommensurate objectives, subject to nonconvex constraint regions in a high-dimensional decision space. Further challenges occur for combinatorial multiobjective problems in which the decision variables are not continuous. Traditional multiobjective optimization methods of operations research, such as weighting and epsilon constraint methods, are ill-suited to solving these complex, multiobjective problems. This has given rise to the application of a wide range of metaheuristic optimization algorithms, such as evolutionary, particle swarm, simulated annealing, and ant colony methods, to multiobjective optimization. Several multiobjective evolutionary algorithms have been developed, including the strength Pareto evolutionary algorithm (SPEA) and the non-dominated sorting genetic algorithm (NSGA), for determining the Pareto-optimal set of non-dominated solutions. Although numerous researchers have developed a wide range of multiobjective optimization algorithms, there is a continuing need to construct computationally efficient algorithms with an improved ability to converge to globally non-dominated solutions along the Pareto-optimal front for complex, large-scale, multiobjective engineering optimization problems. This is particularly important when the multiple objective functions and constraints of the real-world system cannot be expressed in explicit mathematical representations. This research presents a novel metaheuristic evolutionary algorithm for complex multiobjective optimization problems, which combines the metaheuristic tabu search algorithm with the evolutionary algorithm (TSEA), as embodied in genetic algorithms. TSEA is successfully applied to bicriteria (i.e., structural reliability and retrofit cost) optimization of the aircraft tail structure fatigue life, which increases its reliability by prolonging fatigue life. A comparison for this application of the proposed algorithm, TSEA, with several state-of-the-art multiobjective optimization algorithms reveals that TSEA outperforms these algorithms by providing retrofit solutions with greater reliability for the same costs (i.e., closer to the Pareto-optimal front) after the algorithms are executed for the same number of generations. This research also demonstrates that TSEA competes with and, in some situations, outperforms state-of-the-art multiobjective optimization algorithms such as NSGA II and SPEA 2 when applied to classic bicriteria test problems in the technical literature and other complex, sizable real-world applications. The successful implementation of TSEA contributes to the safety of aeronautical structures by providing a systematic way to guide aircraft structural retrofitting efforts, as well as a potentially useful algorithm for a wide range of multiobjective optimization problems in engineering and other fields.
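
    A schematic sketch of the combination the abstract describes (an evolutionary loop whose mutation step consults a tabu list of recently visited solutions, with Pareto dominance driving selection), run on an invented bi-objective bit-string problem; it is a stand-in, not the TSEA implementation.

        import random
        from collections import deque

        random.seed(0)

        # Toy bi-objective problem on bit strings: f1 = ones count (a crude
        # reliability proxy, maximize), f2 = weighted cost of set bits (minimize).
        COSTS = [random.uniform(1, 3) for _ in range(16)]

        def objectives(x):
            f1 = sum(x)
            f2 = sum(c for c, b in zip(COSTS, x) if b)
            return f1, f2

        def dominates(a, b):
            return a[0] >= b[0] and a[1] <= b[1] and a != b

        tabu = deque(maxlen=50)            # recently visited genotypes

        def mutate(x):
            """Flip one bit, retrying while the result is tabu-listed."""
            for _ in range(20):
                y = list(x)
                y[random.randrange(len(y))] ^= 1
                if tuple(y) not in tabu:
                    return tuple(y)
            return tuple(y)                # give up after 20 tries

        pop = [tuple(random.randint(0, 1) for _ in range(16)) for _ in range(30)]
        for _ in range(200):
            children = [mutate(random.choice(pop)) for _ in range(30)]
            tabu.extend(children)
            union = list(set(pop) | set(children))
            scored = [(x, objectives(x)) for x in union]
            # Keep the non-dominated solutions, padding back to population size.
            front = [x for x, fx in scored
                     if not any(dominates(fy, fx) for _, fy in scored)]
            pop = (front + random.sample(union, len(union)))[:30]

        print(sorted({objectives(x) for x in pop}))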

  14. Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco

    NASA Astrophysics Data System (ADS)

    Bounoua, Z.; Mechaqrane, A.

    2018-05-01

    An accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of the hourly GHI measurements, then eliminated the erroneous values, which are generally due to measurement or cosine-effect errors. The statistical analysis shows that the annual mean daily value of GHI is approximately 5 kWh/m²/day. Daily and monthly mean values and other parameters are also calculated.

  15. Reliability and validity of Edinburgh visual gait score as an evaluation tool for children with cerebral palsy.

    PubMed

    Del Pilar Duque Orozco, Maria; Abousamra, Oussama; Church, Chris; Lennon, Nancy; Henley, John; Rogers, Kenneth J; Sees, Julieanne P; Connor, Justin; Miller, Freeman

    2016-09-01

    Assessment of gait abnormalities in cerebral palsy (CP) is challenging, and access to instrumented gait analysis is not always feasible. Therefore, many observational gait analysis scales have been devised. This study aimed to evaluate the interobserver reliability, intraobserver reliability, and validity of the Edinburgh visual gait score (EVGS). Videos of 30 children with spastic CP (10 each in GMFCS levels I, II, and III; ages 6-12 years) were reviewed by 7 raters. Three observers had a high level of experience in gait analysis (10+ years), two had a medium level (2-5 years), and two had no previous experience (orthopedic fellows). Interobserver reliability was evaluated using the percentage of complete agreement and kappa values. Criterion validity was evaluated by comparing EVGS scores with 3DGA data taken from the same video visit. Interobserver agreement was 60-90%, and kappa values were 0.18-0.85 for the 17 items in the EVGS. Reliability was higher for distal segments (foot/ankle/knee 63-90%; trunk/pelvis/hip 60-76%), with greater experience (high 66-91%, medium 62-90%, no experience 41-87%), with more EVGS practice (first 10 videos 52-88%, last 10 videos 64-97%), and when used with higher functioning children (GMFCS I 65-96%, II 58-90%, III 35-65%). Intraobserver agreement was 64-92%. Agreement between EVGS and 3DGA was 52-73%. We believe that having the EVGS as part of a standardized gait evaluation helps to optimize visual scoring. The EVGS can be a supportive tool that adds quantitative data, rather than only qualitative assessment, to a video-only gait evaluation. Copyright © 2016 Elsevier B.V. All rights reserved.
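
    Agreement statistics of the kind reported above can be computed in a few lines. Below is a minimal two-rater Cohen's kappa on invented scores; the study's multi-rater figures would need an extension such as Fleiss' kappa.

        from collections import Counter

        def cohens_kappa(r1, r2):
            """Cohen's kappa for two raters scoring the same items."""
            n = len(r1)
            categories = set(r1) | set(r2)
            p_obs = sum(a == b for a, b in zip(r1, r2)) / n
            c1, c2 = Counter(r1), Counter(r2)
            p_exp = sum(c1[c] * c2[c] for c in categories) / n**2
            return (p_obs - p_exp) / (1 - p_exp)

        # Invented example: two raters scoring 10 videos on a 0-2 item scale.
        rater1 = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2]
        rater2 = [0, 1, 2, 0, 0, 2, 1, 2, 0, 2]
        print(round(cohens_kappa(rater1, rater2), 3))   # -> 0.706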

  16. Advanced propulsion engine assessment based on a cermet reactor

    NASA Technical Reports Server (NTRS)

    Parsley, Randy C.

    1993-01-01

    A preferred Pratt & Whitney conceptual Nuclear Thermal Rocket Engine (NTRE) has been designed based on the fundamental NASA priorities of safety, reliability, cost, and performance. The basic philosophy underlying the design of the XNR2000 is the utilization of the most reliable form of ultrahigh temperature nuclear fuel and development of a core configuration which is optimized for uniform power distribution, operational flexibility, power maneuverability, weight, and robustness. The P&W NTRE system employs a fast spectrum, cermet fueled reactor configured in an expander cycle to ensure maximum operational safety. The cermet fuel form provides retention of fuel and fission products as well as high strength. A high level of confidence is provided by benchmark analysis and independent evaluations.

  17. A Network Coding Based Hybrid ARQ Protocol for Underwater Acoustic Sensor Networks

    PubMed Central

    Wang, Hao; Wang, Shilian; Zhang, Eryang; Zou, Jianbin

    2016-01-01

    Underwater Acoustic Sensor Networks (UASNs) have attracted increasing interest in recent years due to their extensive commercial and military applications. However, the harsh underwater channel poses many challenges for the design of reliable underwater data transport protocols. In this paper, we propose an energy-efficient data transport protocol based on network coding and hybrid automatic repeat request (NCHARQ) to ensure reliability, efficiency and availability in UASNs. Moreover, an adaptive window length estimation algorithm is designed to optimize the throughput and energy consumption tradeoff. The algorithm can adaptively change the code rate and is insensitive to environmental change. Extensive simulations and analysis show that NCHARQ significantly reduces energy consumption with short end-to-end delay. PMID:27618044

  18. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.
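
    A toy version of the reliability-based tuning idea (not the paper's hybrid sampling/asymptotic estimator): choose a scalar feedback gain that minimizes a Monte Carlo estimate of the probability of violating a pole-placement requirement for a first-order plant with a random parameter. The plant, requirement, and distributions are all invented.

        import numpy as np

        rng = np.random.default_rng(3)

        # Plant: dx/dt = a*x + u with uncertain a ~ N(0.5, 0.2); control u = -k*x.
        # Requirement: closed-loop pole a - k < -1 (time constant under 1 s).
        def violation_probability(k, n=100_000):
            a = rng.normal(0.5, 0.2, size=n)
            return np.mean(a - k >= -1.0)

        # Penalize large gains (actuator effort) while minimizing failure chance.
        gains = np.linspace(0.0, 4.0, 81)
        scores = [violation_probability(k) + 0.02 * k**2 for k in gains]
        best = gains[int(np.argmin(scores))]
        print("chosen gain:", best, "P(violation):", violation_probability(best))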

  19. [Methods of statistical analysis in differential diagnostics of the degree of brain glioma anaplasia during preoperative stage].

    PubMed

    Glavatskiĭ, A Ia; Guzhovskaia, N V; Lysenko, S N; Kulik, A V

    2005-12-01

    The authors propose a method for preoperative diagnosis of the degree of supratentorial brain glioma anaplasia using statistical analysis methods. It relies on a comprehensive examination of 934 patients with degree I-IV anaplasia who were treated at the Institute of Neurosurgery from 1990 to 2004. The use of statistical analysis methods for differential diagnosis of the degree of brain glioma anaplasia may optimize the diagnostic algorithm, increase the reliability of the obtained data and, in some cases, avoid unwarranted operative interventions. Clinically important signs for statistical analysis methods aimed at preoperative diagnosis of brain glioma anaplasia have been defined.

  20. System Design under Uncertainty: Evolutionary Optimization of the Gravity Probe-B Spacecraft

    NASA Technical Reports Server (NTRS)

    Pullen, Samuel P.; Parkinson, Bradford W.

    1994-01-01

    This paper discusses the application of evolutionary random-search algorithms (simulated annealing and genetic algorithms) to the problem of spacecraft design under performance uncertainty. Traditionally, spacecraft performance uncertainty has been measured by reliability. Published algorithms for reliability optimization are seldom used in practice because they oversimplify reality. The algorithm developed here uses random-search optimization to allow us to model the problem more realistically. Monte Carlo simulations are used to evaluate the objective function for each trial design solution. These methods have been applied to the Gravity Probe-B (GP-B) spacecraft being developed at Stanford University for launch in 1999. Results of the algorithm developed here for GP-B are shown, and their implications for design optimization by evolutionary algorithms are discussed.
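
    A compact sketch of the approach described above, pairing simulated annealing with a Monte Carlo objective evaluation; the "design" here is an invented vector of per-subsystem redundancy levels with made-up failure and mass figures, not the GP-B problem.

        import math
        import random

        random.seed(42)

        FAIL_P = [0.05, 0.10, 0.02, 0.08]   # per-unit failure probabilities (invented)
        MASS = [4.0, 2.5, 6.0, 3.0]         # per-unit masses (invented)

        def mission_success(design, trials=2000):
            """Monte Carlo estimate: the mission succeeds if every subsystem
            has at least one surviving redundant unit."""
            ok = 0
            for _ in range(trials):
                if all(any(random.random() > p for _ in range(n))
                       for n, p in zip(design, FAIL_P)):
                    ok += 1
            return ok / trials

        def objective(design):
            # Maximize success probability, penalize total mass.
            return mission_success(design) - 0.01 * sum(n * m for n, m in zip(design, MASS))

        design = [1, 1, 1, 1]
        score = objective(design)
        T = 1.0
        for _ in range(300):
            cand = design[:]
            i = random.randrange(4)
            cand[i] = max(1, min(4, cand[i] + random.choice([-1, 1])))
            s = objective(cand)
            if s > score or random.random() < math.exp((s - score) / T):
                design, score = cand, s
            T *= 0.99                        # geometric cooling schedule
        print("design:", design, "score:", round(score, 3))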

  1. Many-objective optimization and visual analytics reveal key trade-offs for London's water supply

    NASA Astrophysics Data System (ADS)

    Matrosov, Evgenii S.; Huskova, Ivana; Kasprzyk, Joseph R.; Harou, Julien J.; Lambert, Chris; Reed, Patrick M.

    2015-12-01

    In this study, we link a water resource management simulator to multi-objective search to reveal the key trade-offs inherent in planning a real-world water resource system. We consider new supplies and demand management (conservation) options while seeking to elucidate the trade-offs between the best portfolios of schemes to satisfy projected water demands. Alternative system designs are evaluated using performance measures that minimize capital and operating costs and energy use while maximizing resilience, engineering and environmental metrics, subject to supply reliability constraints. Our analysis shows that many-objective evolutionary optimization coupled with state-of-the-art visual analytics can help planners discover more diverse water supply system designs and better understand their inherent trade-offs. The approach is used to explore future water supply options for the Thames water resource system (including London's water supply). New supply options include a new reservoir, water transfers, artificial recharge, wastewater reuse and brackish groundwater desalination. Demand management options include leakage reduction, compulsory metering and seasonal tariffs. The Thames system's Pareto-approximate portfolios cluster into distinct groups of water supply options; for example, implementing a pipe refurbishment program leads to higher capital costs but greater reliability. This study highlights that traditional least-cost, reliability-constrained design of water supply systems masks asset combinations whose benefits only become apparent when more planning objectives are considered.

  2. Three-dimensional Numerical Analysis on Blade Response of Vertical Axis Tidal Current Turbine Under Operational Condition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ye; Karri, Naveen K.; Wang, Qi

    Tidal power as a large-scale renewable source of energy has been receiving significant attention recently because of its advantages over wind and other renewable energy sources. The technology used to harvest energy from tidal currents is called a tidal current turbine. Though some of the principles of wind turbine design are applicable to tidal current turbines, the design of the latter needs additional considerations, such as cavitation damage and corrosion, for the long-term reliability of such turbines. Depending upon the orientation of the axis, tidal current turbines can be classified as vertical axis or horizontal axis turbines. Existing studies on the vertical axis tidal current turbine focus more on the hydrodynamic aspects of the turbine than on the structural aspects. This paper summarizes our recent efforts to study the integrated hydrodynamic and structural aspects of vertical axis tidal current turbines. After reviewing existing methods of modeling tidal current turbines, we developed a hybrid approach that combines the discrete vortex method with the finite element method to simulate the integrated hydrodynamic and structural response of a vertical axis turbine. This hybrid method was initially employed to analyze a typical three-blade vertical axis turbine. The power coefficient was used to evaluate the hydrodynamic performance, and critical deflection was considered to evaluate the structural reliability. A sensitivity analysis was also conducted with various turbine height-to-radius ratios. The results indicate that both the power output and the failure probability increase with the turbine height, suggesting a necessity for optimal design. An attempt to optimize a three-blade vertical axis turbine design with the hybrid method yielded a turbine height-to-radius ratio (H/R) of about 3.0 for reliable maximum power output.

  3. On the Effectiveness of Nature-Inspired Metaheuristic Algorithms for Performing Phase Equilibrium Thermodynamic Calculations

    PubMed Central

    Fateen, Seif-Eddeen K.; Bonilla-Petriciolet, Adrian

    2014-01-01

    The search for reliable and efficient global optimization algorithms for solving phase stability and phase equilibrium problems in applied thermodynamics is an ongoing area of research. In this study, we evaluated and compared the reliability and efficiency of eight selected nature-inspired metaheuristic algorithms for solving difficult phase stability and phase equilibrium problems. These algorithms are the cuckoo search (CS), intelligent firefly (IFA), bat (BA), artificial bee colony (ABC), MAKHA, a hybrid between monkey algorithm and krill herd algorithm, covariance matrix adaptation evolution strategy (CMAES), magnetic charged system search (MCSS), and bare bones particle swarm optimization (BBPSO). The results clearly showed that CS is the most reliable of all methods as it successfully solved all thermodynamic problems tested in this study. CS proved to be a promising nature-inspired optimization method to perform applied thermodynamic calculations for process design. PMID:24967430

  4. On the effectiveness of nature-inspired metaheuristic algorithms for performing phase equilibrium thermodynamic calculations.

    PubMed

    Fateen, Seif-Eddeen K; Bonilla-Petriciolet, Adrian

    2014-01-01

    The search for reliable and efficient global optimization algorithms for solving phase stability and phase equilibrium problems in applied thermodynamics is an ongoing area of research. In this study, we evaluated and compared the reliability and efficiency of eight selected nature-inspired metaheuristic algorithms for solving difficult phase stability and phase equilibrium problems. These algorithms are the cuckoo search (CS), intelligent firefly (IFA), bat (BA), artificial bee colony (ABC), MAKHA, a hybrid between monkey algorithm and krill herd algorithm, covariance matrix adaptation evolution strategy (CMAES), magnetic charged system search (MCSS), and bare bones particle swarm optimization (BBPSO). The results clearly showed that CS is the most reliable of all methods as it successfully solved all thermodynamic problems tested in this study. CS proved to be a promising nature-inspired optimization method to perform applied thermodynamic calculations for process design.

  5. Optimal Implementations for Reliable Circadian Clocks

    NASA Astrophysics Data System (ADS)

    Hasegawa, Yoshihiko; Arita, Masanori

    2014-09-01

    Circadian rhythms are acquired through evolution to increase the chances for survival through synchronizing with the daylight cycle. Reliable synchronization is realized through two trade-off properties: regularity to keep time precisely, and entrainability to synchronize the internal time with daylight. We find by using a phase model with multiple inputs that achieving the maximal limit of regularity and entrainability entails many inherent features of the circadian mechanism. At the molecular level, we demonstrate the role sharing of two light inputs, phase advance and delay, as is well observed in mammals. At the behavioral level, the optimal phase-response curve inevitably contains a dead zone, a time during which light pulses neither advance nor delay the clock. We reproduce the results of phase-controlling experiments entrained by two types of periodic light pulses. Our results indicate that circadian clocks are designed optimally for reliable clockwork through evolution.
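
    The regularity/entrainability trade-off can be explored with a phase-only toy model. The sketch below entrains a clock with a free-running period of 24.3 h using an invented phase-response curve containing advance and delay regions separated by a dead zone; the parameters are illustrative and unrelated to the paper's optimal solutions.

        def prc(phase):
            """Invented phase-response curve (hours of shift per light pulse):
            advances in late subjective night, delays in early night, and a
            dead zone across the subjective day (phase is in [0, 1))."""
            if 0.25 <= phase < 0.75:            # subjective day: dead zone
                return 0.0
            if phase < 0.25:                    # late night: phase advance
                return 2.0 * (0.25 - phase)
            return -2.0 * (phase - 0.75)        # early night: phase delay

        period = 24.3                            # free-running period (h)
        phase = 0.37                             # internal phase at external dawn
        for day in range(60):
            phase = (phase + 24.0 / period) % 1.0        # one external day of drift
            phase = (phase + prc(phase) / period) % 1.0  # light pulse at dawn
            if day >= 57:
                print(f"day {day}: phase at dawn = {phase:.3f}")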

  6. Quantification of larval zebrafish motor function in multiwell plates using open-source MATLAB applications.

    PubMed

    Zhou, Yangzhong; Cattley, Richard T; Cario, Clinton L; Bai, Qing; Burton, Edward A

    2014-07-01

    This article describes a method to quantify the movements of larval zebrafish in multiwell plates, using the open-source MATLAB applications LSRtrack and LSRanalyze. The protocol comprises four stages: generation of high-quality, flatly illuminated video recordings with exposure settings that facilitate object recognition; analysis of the resulting recordings using tools provided in LSRtrack to optimize tracking accuracy and motion detection; analysis of tracking data using LSRanalyze or custom MATLAB scripts; and implementation of validation controls. The method is reliable, automated and flexible, requires <1 h of hands-on work for completion once optimized and shows excellent signal:noise characteristics. The resulting data can be analyzed to determine the following: positional preference; displacement, velocity and acceleration; and duration and frequency of movement events and rest periods. This approach is widely applicable to the analysis of spontaneous or stimulus-evoked zebrafish larval neurobehavioral phenotypes resulting from a broad array of genetic and environmental manipulations, in a multiwell plate format suitable for high-throughput applications.

  7. Grid sensitivity for aerodynamic optimization and flow analysis

    NASA Technical Reports Server (NTRS)

    Sadrehaghighi, I.; Tiwari, S. N.

    1993-01-01

    After reviewing the relevant literature, it is apparent that one aspect of aerodynamic sensitivity analysis, namely grid sensitivity, has not been investigated extensively. The grid sensitivity algorithms in most of these studies are based on structural design models. Such models, although sufficient for preliminary or conceptual design, are not acceptable for detailed design analysis. Careless grid sensitivity evaluations would introduce gradient errors within the sensitivity module, thereby corrupting the overall optimization process. The development of an efficient and reliable grid sensitivity module, with special emphasis on aerodynamic applications, appears essential. The organization of this study is as follows. The physical and geometric representations of a typical model are derived in chapter 2. The grid generation algorithm and boundary grid distribution are developed in chapter 3. Chapter 4 discusses the theoretical formulation and the aerodynamic sensitivity equation. The method of solution is provided in chapter 5. The results are presented and discussed in chapter 6. Finally, some concluding remarks are provided in chapter 7.

  8. Computationally efficient stochastic optimization using multiple realizations

    NASA Astrophysics Data System (ADS)

    Bayer, P.; Bürger, C. M.; Finkel, M.

    2008-02-01

    The presented study is concerned with computationally efficient methods for solving stochastic optimization problems involving multiple equally probable realizations of uncertain parameters. A new and straightforward technique is introduced that is based on dynamically ordering the stack of realizations during the search procedure. The rationale is that a small number of critical realizations govern the output of a reliability-based objective function. Using a problem typical of water supply well field design, several variants of this "stack ordering" approach are tested. The results are statistically assessed in terms of optimality and nominal reliability. This study demonstrates that simply ordering a given set of 500 realizations while applying an evolutionary search algorithm can save about half of the model runs without compromising the optimization procedure. More advanced variants of stack ordering can, if properly configured, save more than 97% of the computational effort that would be required if the entire set of realizations were considered. The findings are promising for similar problems of water management and reliability-based design in general, and particularly for non-convex problems that require heuristic search techniques.
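
    A sketch of the stack-ordering idea as the abstract describes it: each candidate design is checked against realizations sorted by how often they proved critical for earlier candidates, stopping as soon as the reliability target is decided. The objective, ensemble, and threshold below are invented.

        import random

        random.seed(5)

        # Invented ensemble: each realization is a random "demand" the design must meet.
        realizations = [random.gauss(10, 2) for _ in range(500)]
        criticality = [0] * len(realizations)   # how often each realization failed a design
        order = list(range(len(realizations)))

        def reliability_ok(design_capacity, target=0.95):
            """Check the design against realizations, most critical first;
            exit early once too many failures make the target unreachable."""
            allowed_failures = int(len(realizations) * (1 - target))
            failures = 0
            order.sort(key=lambda i: -criticality[i])   # dynamic stack ordering
            for evaluated, i in enumerate(order, start=1):
                if design_capacity < realizations[i]:
                    failures += 1
                    criticality[i] += 1
                    if failures > allowed_failures:
                        return False, evaluated          # decided early
            return True, len(realizations)

        # A search loop would call this repeatedly; failing designs get rejected
        # after only a handful of model runs once the stack is well ordered.
        for cap in [9.0, 11.0, 13.0, 13.5]:
            ok, runs = reliability_ok(cap)
            print(f"capacity {cap}: feasible={ok}, model runs used={runs}")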

  9. Subsystem Analysis/Optimization for the X-34 Main Propulsion System

    NASA Technical Reports Server (NTRS)

    McDonald, J. P.; Hedayat, A.; Brown, T. M.; Knight, K. C.; Champion, R. H., Jr.

    1998-01-01

    The Orbital Sciences Corporation X-34 vehicle demonstrates technologies and operations key to future reusable launch vehicles. The general flight performance goal of this unmanned rocket plane is Mach 8 flight at an altitude of 250,000 feet. The Main Propulsion System (MPS) supplies liquid propellants to the main engine, which provides the primary thrust for attaining mission goals. Major MPS design and operational goals are aircraft-like ground operations, quick turnaround between missions, and low initial/operational costs. Analyses related to optimal MPS subsystem design are reviewed in this paper. A pressurization system trade weighs maintenance/reliability concerns against those for safety in a comparison of designs using pressure regulators versus orifices to control pressurant flow. A propellant dump/feed system analysis weighs the issues of maximum allowable vehicle landing weight, trajectory, and MPS complexity to arrive at a final configuration for propellant dump/feed systems.

  10. Numerical analysis and design optimization of supersonic after-burning with strut fuel injectors for scramjet engines

    NASA Astrophysics Data System (ADS)

    Candon, M. J.; Ogawa, H.

    2018-06-01

    Scramjets are a class of hypersonic airbreathing engines that offer promise for economical, reliable and high-speed access-to-space and atmospheric transport. The expanding flow in the scramjet nozzle comprises unburned hydrogen. An after-burning scheme can be used to effectively utilize the remaining hydrogen by supplying additional oxygen into the nozzle, aiming to augment the thrust. This paper presents the results of a single-objective design optimization for a strut fuel injection scheme considering four design variables, with the objective of maximizing thrust augmentation. Thrust is found to be augmented significantly owing to a combination of aerodynamic and combustion effects. Further understanding and physical insight have been gained by performing variance-based global sensitivity analysis, scrutinizing the nozzle flowfields, analyzing the distributions and contributions of the forces acting on the nozzle wall, and examining the combustion efficiency.
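
    Variance-based global sensitivity analysis of the sort mentioned above is commonly done with Sobol indices. A minimal sketch using a standard pick-and-freeze first-order estimator on an invented four-variable stand-in response, not the paper's CFD model:

        import numpy as np

        rng = np.random.default_rng(11)

        def model(x):
            """Invented stand-in response: thrust gain as a function of four
            normalized injector design variables."""
            return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.3 * x[:, 0] * x[:, 2] + 0.1 * x[:, 3]

        n, d = 20_000, 4
        A = rng.random((n, d))
        B = rng.random((n, d))
        fA, fB = model(A), model(B)
        var = np.var(np.concatenate([fA, fB]))

        # First-order Sobol index for each variable j: freeze column j from B.
        for j in range(d):
            ABj = A.copy()
            ABj[:, j] = B[:, j]
            fABj = model(ABj)
            S1 = np.mean(fB * (fABj - fA)) / var
            print(f"S1[x{j}] ~ {S1:.3f}")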

  11. Requirements analysis and preliminary design of a robotic assistant for reconstructive microsurgery.

    PubMed

    Vanthournhout, L; Herman, B; Duisit, J; Château, F; Szewczyk, J; Lengelé, B; Raucent, B

    2015-08-01

    Microanastomosis is a microsurgical gesture that involves suturing two very small blood vessels together. This gesture is used in many operations such as auto-grafting of avulsed limbs, pediatric surgery, and reconstructive surgery, including breast reconstruction by free flap. When vessels have diameters smaller than one millimeter, hand tremor makes movements difficult to control. This paper introduces our preliminary steps towards robotic assistance to help surgeons perform microanastomosis under optimal conditions, in order to increase gesture quality and reliability even on smaller diameters. A general needs assessment and an experimental motion analysis were performed to define the requirements of the robot. The geometric parameters of the kinematic structure were then optimized to fulfill specific objectives. A prototype of the robot is currently being designed and built in order to provide a sufficient increase in accuracy without prolonging the duration of the procedure.

  12. A Novel Protocol for Model Calibration in Biological Wastewater Treatment

    PubMed Central

    Zhu, Ao; Guo, Jianhua; Ni, Bing-Jie; Wang, Shuying; Yang, Qing; Peng, Yongzhen

    2015-01-01

    Activated sludge models (ASMs) have been widely used for process design, operation and optimization in wastewater treatment plants. However, it is still a challenge to achieve an efficient calibration for reliable application using conventional approaches. Here, we propose a novel calibration protocol, the Numerical Optimal Approaching Procedure (NOAP), for the systematic calibration of ASMs. The NOAP consists of three key steps in an iterative scheme: i) global factor sensitivity analysis for factor fixing; ii) pseudo-global parameter correlation analysis for detecting non-identifiable factors; and iii) formation of a parameter subset through estimation using a genetic algorithm. The validity and applicability are confirmed using experimental data obtained from two independent wastewater treatment systems, including a sequencing batch reactor and a continuous stirred-tank reactor. The results indicate that the NOAP can effectively determine the optimal parameter subset and successfully perform model calibration and validation for these two different systems. The proposed NOAP is expected to be used for the automatic calibration of ASMs and could potentially be applied to other ordinary differential equation models. PMID:25682959

  13. Optimizing 4-Dimensional Magnetic Resonance Imaging Data Sampling for Respiratory Motion Analysis of Pancreatic Tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stemkens, Bjorn, E-mail: b.stemkens@umcutrecht.nl; Tijssen, Rob H.N.; Senneville, Baudouin D. de

    2015-03-01

    Purpose: To determine the optimum sampling strategy for retrospective reconstruction of 4-dimensional (4D) MR data for nonrigid motion characterization of tumor and organs at risk for radiation therapy purposes. Methods and Materials: For optimization, we compared 2 surrogate signals (external respiratory bellows and internal MRI navigators) and 2 MR sampling strategies (Cartesian and radial) in terms of image quality and robustness. Using the optimized protocol, 6 pancreatic cancer patients were scanned to calculate the 4D motion. Region of interest analysis was performed to characterize the respiratory-induced motion of the tumor and organs at risk simultaneously. Results: The MRI navigator was found to be a more reliable surrogate for pancreatic motion than the respiratory bellows signal. Radial sampling is most benign for undersampling artifacts and intraview motion. Motion characterization revealed interorgan and interpatient variation, as well as heterogeneity within the tumor. Conclusions: A robust 4D-MRI method, based on clinically available protocols, is presented and successfully applied to characterize the abdominal motion in a small number of pancreatic cancer patients.

  14. Optimization and validation of a method using UHPLC-fluorescence for the analysis of polycyclic aromatic hydrocarbons in cold-pressed vegetable oils.

    PubMed

    Silva, Simone Alves da; Sampaio, Geni Rodrigues; Torres, Elizabeth Aparecida Ferraz da Silva

    2017-04-15

    Among the different food categories, oils and fats are important sources of exposure to polycyclic aromatic hydrocarbons (PAHs), a group of organic chemical contaminants. The use of a validated method is essential to obtain reliable analytical results, since legislation establishes maximum limits in different foods. The objective of this study was to optimize and validate a method for the quantification of four PAHs [benzo(a)anthracene, chrysene, benzo(b)fluoranthene, benzo(a)pyrene] in vegetable oils. The samples were submitted to liquid-liquid extraction, followed by solid-phase extraction, and analyzed by ultra-high performance liquid chromatography. Under the optimized conditions, the validation parameters were evaluated according to the INMETRO Guidelines: linearity (r² > 0.99), selectivity (no matrix interference), limits of detection (0.08-0.30 μg kg⁻¹) and quantification (0.25-1.00 μg kg⁻¹), recovery (80.13-100.04%), and repeatability and intermediate precision (<10% RSD). The method was found to be adequate for routine analysis of PAHs in the vegetable oils evaluated. Copyright © 2016. Published by Elsevier Ltd.
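
    Detection and quantification limits in such validations are commonly derived from the calibration curve as LOD = 3.3·σ/S and LOQ = 10·σ/S (σ is the residual standard deviation, S the slope). A sketch of that arithmetic with invented calibration data, not the paper's figures:

        import numpy as np

        # Invented calibration data: concentration (ug/kg) vs. peak area.
        conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        area = np.array([12.1, 23.8, 49.5, 97.2, 196.4])

        # Least-squares line: area = slope * conc + intercept.
        slope, intercept = np.polyfit(conc, area, 1)
        residuals = area - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)           # residual standard deviation

        lod = 3.3 * sigma / slope               # limit of detection
        loq = 10.0 * sigma / slope              # limit of quantification
        print(f"slope={slope:.2f}, LOD={lod:.3f} ug/kg, LOQ={loq:.3f} ug/kg")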

  15. Application of Hyperspectral Imaging to Detect Sclerotinia sclerotiorum on Oilseed Rape Stems

    PubMed Central

    Kong, Wenwen; Zhang, Chu; Huang, Weihao

    2018-01-01

    Hyperspectral imaging covering the spectral range of 384–1034 nm, combined with chemometric methods, was used to detect Sclerotinia sclerotiorum (SS) on oilseed rape stems using two sample sets (60 healthy and 60 infected stems each). Second derivative spectra and PCA loadings were used to select the optimal wavelengths. Discriminant models were built and compared to detect SS on oilseed rape stems, including partial least squares-discriminant analysis, radial basis function neural network, support vector machine and extreme learning machine. The discriminant models using full spectra and optimal wavelengths showed good performance, with classification accuracies of over 80% for the calibration and prediction sets. Comparing all developed models, the best classification accuracies for the calibration and prediction sets were over 90%. The similarity of the selected optimal wavelengths also indicated the feasibility of using hyperspectral imaging to detect SS on oilseed rape stems. The results indicated that hyperspectral imaging could be used as a fast, non-destructive and reliable technique to detect plant diseases on stems. PMID:29300315

  16. Psychometric properties of the self-report Malay version of the Pediatric Quality of Life (PedsQL™) 4.0 Generic Core Scales among multiethnic Malaysian adolescents.

    PubMed

    Ainuddin, Husna A; Loh, Siew Yim; Chinna, Karuthan; Low, Wah Yun; Roslani, April Camilla

    2015-06-01

    Adolescence is the potential period for growth and optimal functioning, but developmental issues such as the transition from childhood to adulthood create stress and affect the adolescent's quality of life (QOL). However, there is a lack of research tools for measuring adolescents' QOL in Malaysia. The aim of the study was to determine the validity and reliability of the self-report Malay version of the pediatric QOL (PedsQL™) 4.0 Generic Core Scales in assessing the QOL of Malaysian adolescents. A cross-sectional study design using the 23-item self-report Malay version of the PedsQL 4.0 Generic Core Scales was administered to a convenience cluster sample (n = 297 adolescents) from a secondary school. The internal consistency reliability had Cronbach's α values ranging from .70 to .89. Factor analysis reported a six-factor structure via principal axis factoring. In conclusion, the self-report Malay version of the pediatric QOL 4.0 Generic Core Scales is a reliable and valid tool to measure the QOL of multiethnic Malaysian adolescents.
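
    The internal-consistency figures quoted above come from Cronbach's alpha. A minimal sketch of the computation on hypothetical Likert-scale responses (the scores below are invented, not study data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 items
scores = np.array([[4, 5, 4, 4],
                   [3, 3, 2, 3],
                   [5, 5, 5, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5],
                   [3, 2, 3, 3]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```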

  17. Diabetes screening in overweight and obese children and adolescents: choosing the right test.

    PubMed

    Ehehalt, Stefan; Wiegand, Susanna; Körner, Antje; Schweizer, Roland; Liesenkötter, Klaus-Peter; Partsch, Carl-Joachim; Blumenstock, Gunnar; Spielau, Ulrike; Denzer, Christian; Ranke, Michael B; Neu, Andreas; Binder, Gerhard; Wabitsch, Martin; Kiess, Wieland; Reinehr, Thomas

    2017-01-01

    Type 2 diabetes can occur without any symptoms, and health problems associated with the disease are serious. Screening tests allowing an early diagnosis are desirable. However, the optimal screening test for diabetes in obese youth remains controversial. We performed an observational multicenter analysis including 4848 (2668 female) overweight and obese children aged 7 to 17 years without previously known diabetes. Using HbA1c and OGTT as diagnostic criteria, 2.4% (n = 115, 55 female) could be classified as having diabetes. Within this group, 68.7% had HbA1c levels ≥48 mmol/mol (≥6.5%). FPG ≥126 mg/dl (≥7.0 mmol/l) and/or 2-h glucose levels ≥200 mg/dl (≥11.1 mmol/l) were found in 46.1%. Out of the 115 cases fulfilling the OGTT and/or HbA1c criteria for diabetes, diabetes was confirmed in 43.5%. For FPG, the ROC analysis revealed an optimal threshold of 98 mg/dl (5.4 mmol/l) (sensitivity 70%, specificity 88%). For HbA1c, the best cut-off value was 42 mmol/mol (6.0%) (sensitivity 94%, specificity 93%). HbA1c seems to be more reliable than OGTT for diabetes screening in overweight and obese children and adolescents. The optimal HbA1c threshold for identifying patients with diabetes was found to be 42 mmol/mol (6.0%). What is Known: • The prevalence of obesity is increasing and health problems related to type 2 DM can be serious. However, an optimal screening test for diabetes in obese youth seems to be controversial in the literature. What is New: • In our study, the ROC analysis revealed for FPG an optimal threshold of 98 mg/dl (5.4 mmol/l, sensitivity 70%, specificity 88%) and for HbA1c a best cut-off value of 42 mmol/mol (6.0%, sensitivity 94%, specificity 93%) to detect diabetes. Thus, in overweight and obese children and adolescents, HbA1c seems to be a more reliable screening tool than OGTT.
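
    The optimal FPG and HbA1c thresholds above come from ROC analysis. A minimal sketch of finding such a cut-off by maximizing Youden's J on synthetic screening data (the distributions below are invented stand-ins, not the study cohort):

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
# Synthetic stand-in: HbA1c-like values (%) for non-diabetic vs. diabetic
hba1c = np.concatenate([rng.normal(5.4, 0.4, 500),   # no diabetes
                        rng.normal(6.8, 0.8, 50)])   # diabetes
label = np.concatenate([np.zeros(500), np.ones(50)])

fpr, tpr, thresholds = roc_curve(label, hba1c)
j = tpr - fpr                  # Youden's J = sensitivity + specificity - 1
best = np.argmax(j)
print(f"optimal cut-off: {thresholds[best]:.2f} %")
print(f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```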

  18. Maximizing the reliability of genomic selection by optimizing the calibration set of reference individuals: comparison of methods in two diverse groups of maize inbreds (Zea mays L.).

    PubMed

    Rincent, R; Laloë, D; Nicolas, S; Altmann, T; Brunel, D; Revilla, P; Rodríguez, V M; Moreno-Gonzalez, J; Melchinger, A; Bauer, E; Schoen, C-C; Meyer, N; Giauffret, C; Bauland, C; Jamin, P; Laborde, J; Monod, H; Flament, P; Charcosset, A; Moreau, L

    2012-10-01

    Genomic selection refers to the use of genotypic information for predicting breeding values of selection candidates. A prediction formula is calibrated with the genotypes and phenotypes of reference individuals constituting the calibration set. The size and the composition of this set are essential parameters affecting the prediction reliabilities. The objective of this study was to maximize reliabilities by optimizing the calibration set. Different criteria based on the diversity or on the prediction error variance (PEV) derived from the realized additive relationship matrix-best linear unbiased predictions model (RA-BLUP) were used to select the reference individuals. For the latter, we considered the mean of the PEV of the contrasts between each selection candidate and the mean of the population (PEVmean) and the mean of the expected reliabilities of the same contrasts (CDmean). These criteria were tested with phenotypic data collected on two diversity panels of maize (Zea mays L.) genotyped with a 50k SNP array. In the two panels, samples chosen based on CDmean gave higher reliabilities than random samples for various calibration set sizes. CDmean also appeared superior to PEVmean, which can be explained by the fact that it takes into account the reduction of variance due to the relatedness between individuals. Selected samples were close to optimality for a wide range of trait heritabilities, which suggests that the strategy presented here can efficiently sample subsets in panels of inbred lines. A script to optimize reference samples based on CDmean is available on request.
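
    A hedged sketch of the PEV/CD idea under a simplified GBLUP model: it scores each candidate's own breeding value rather than the paper's contrasts with the population mean, ignores fixed effects, and uses an invented toy relationship matrix, so it illustrates the criterion rather than reproducing the authors' script.

```python
import numpy as np

def cd_mean(G, calib_idx, h2=0.5):
    """Mean reliability (CD) of candidate breeding values given a
    calibration set, under a GBLUP model with heritability h2.

    G         : (n, n) realized additive relationship matrix
    calib_idx : indices of the phenotyped (calibration) individuals
    """
    n = G.shape[0]
    var_g, var_e = h2, 1.0 - h2
    Z = np.zeros((len(calib_idx), n))
    Z[np.arange(len(calib_idx)), calib_idx] = 1.0
    # Mixed-model coefficient matrix for the breeding values u
    C = Z.T @ Z / var_e + np.linalg.inv(G) / var_g
    pev = np.diag(np.linalg.inv(C))          # prediction error variance
    cd = 1.0 - pev / (var_g * np.diag(G))    # reliability per candidate
    return cd.mean()

# Toy relationship matrix for 8 inbreds; compare two calibration sets
rng = np.random.default_rng(2)
L = rng.normal(size=(8, 8))
G = L @ L.T / 8 + np.eye(8) * 0.5
for calib in ([0, 1, 2, 3], [0, 2, 4, 6]):
    print(calib, round(cd_mean(G, calib), 3))
```

    The set with the higher mean CD would be preferred as the calibration set, which is the selection logic the CDmean criterion formalizes.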

  19. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures

    PubMed Central

    Theobald, Douglas L.; Wuttke, Deborah S.

    2008-01-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. PMID:16777907

  20. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  1. Design of a setup for 252Cf neutron source for storage and analysis purpose

    NASA Astrophysics Data System (ADS)

    Hei, Daqian; Zhuang, Haocheng; Jia, Wenbao; Cheng, Can; Jiang, Zhou; Wang, Hongtao; Chen, Da

    2016-11-01

    252Cf is a reliable isotopic neutron source that is widely used in the prompt gamma ray neutron activation analysis (PGNAA) technique. A cylindrical barrel made of polymethyl methacrylate and filled with boric acid solution was designed for the storage and application of a 5 μg 252Cf neutron source. The size of the setup was optimized with a Monte Carlo code. Experiments showed that the setup reduced the doses to below the allowable limit. The intensity and collimating radius of the neutron beam could also be adjusted through different collimators.

  2. Response surface optimization of medium components for naringinase production from Staphylococcus xylosus MAK2.

    PubMed

    Puri, Munish; Kaur, Aneet; Singh, Ram Sarup; Singh, Anubhav

    2010-09-01

    Response surface methodology was used to optimize the fermentation medium for enhancing naringinase production by Staphylococcus xylosus. The first step of this process involved the individual adjustment and optimization of various medium components at shake flask level. Sources of carbon (sucrose) and nitrogen (sodium nitrate), as well as an inducer (naringin) and pH levels, were all found to be important factors significantly affecting naringinase production. In the second step, a two-level full factorial central composite design was applied to determine the optimal levels of each of the significant variables. A second-order polynomial was derived by multiple regression analysis on the experimental data. Using this methodology, the optimum values for the critical components were obtained as follows: sucrose, 10.0%; sodium nitrate, 10.0%; pH 5.6; biomass concentration, 1.58%; and naringin, 0.50% (w/v). Under optimal conditions, the experimental naringinase production was 8.45 U/mL. The determination coefficients (R²) were 0.9908 and 0.9950 for naringinase activity and biomass production, respectively, indicating an adequate degree of reliability in the model.
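
    A minimal sketch of the second step, assuming a generic central composite design: fit a second-order polynomial by regression and locate its stationary point as the candidate optimum. The runs and responses below are invented two-factor stand-ins, not the study's multi-factor data.

```python
import numpy as np

# Hypothetical central composite runs: x1 = sucrose (%), x2 = pH
x1 = np.array([5, 5, 15, 15, 10, 10, 10, 2.9, 17.1, 10, 10])
x2 = np.array([4.5, 6.5, 4.5, 6.5, 5.5, 5.5, 5.5, 5.5, 5.5, 4.1, 6.9])
y = np.array([5.1, 5.8, 6.0, 6.4, 8.2, 8.4, 8.3, 4.9, 6.1, 5.5, 6.2])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface: solve the gradient = 0 system
A = np.array([[2 * b[4], b[3]],
              [b[3], 2 * b[5]]])
opt = np.linalg.solve(A, -b[1:3])
print("stationary point (sucrose %, pH):", np.round(opt, 2))

# R^2 of the fit, the kind of figure used above to judge model reliability
resid = y - X @ b
r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
print(f"R^2 = {r2:.4f}")
```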

  3. Evaluating information content of SNPs for sample-tagging in re-sequencing projects.

    PubMed

    Hu, Hao; Liu, Xiang; Jin, Wenfei; Hilger Ropers, H; Wienker, Thomas F

    2015-05-15

    Sample-tagging is designed for identification of accidental sample mix-up, which is a major issue in re-sequencing studies. In this work, we develop a model to measure the information content of SNPs, so that we can optimize a panel of SNPs that approaches the maximal information for discrimination. The analysis shows that as few as 60 optimized SNPs can differentiate the individuals in a population as large as the present world, and only 30 optimized SNPs are in practice sufficient for labeling up to 100 thousand individuals. In simulated populations of 100 thousand individuals, the average Hamming distance generated by the optimized set of 30 SNPs is larger than 18, and the duality frequency is lower than 1 in 10 thousand. This strategy of sample discrimination proves robust for large sample sizes and across different datasets. The optimized sets of SNPs are designed for Whole Exome Sequencing, and a program is provided for SNP selection, allowing for customized SNP numbers and genes of interest. The sample-tagging plan based on this framework will improve re-sequencing projects in terms of reliability and cost-effectiveness.
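
    A small simulation in the spirit of the reported numbers: with 30 informative biallelic SNPs near allele frequency 0.5, the expected pairwise Hamming distance is roughly 30 × 0.625 ≈ 18.75, consistent with the "larger than 18" figure. The genotype model below is an invented stand-in, not the paper's optimized panel.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n, m = 500, 30                    # individuals, optimized SNPs
# Allele frequencies near 0.5 carry the most information per SNP
freqs = rng.uniform(0.4, 0.6, m)
geno = rng.binomial(2, freqs, size=(n, m))   # 0/1/2 genotype codes

# Pairwise Hamming distance: number of SNPs at which two tags differ
dists = np.array([np.count_nonzero(geno[i] != geno[j])
                  for i, j in combinations(range(n), 2)])
print("average Hamming distance:", round(dists.mean(), 2))
print("minimum Hamming distance:", dists.min())
print("identical pairs (dualities):", int((dists == 0).sum()))
```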

  4. Strategies and Approaches to TPS Design

    NASA Technical Reports Server (NTRS)

    Kolodziej, Paul

    2005-01-01

    Thermal protection systems (TPS) insulate planetary probes and Earth re-entry vehicles from the aerothermal heating experienced during hypersonic deceleration to the planet's surface. The systems are typically designed with some additional capability to compensate both for variations in the TPS material and for uncertainties in the heating environment. This additional capability, or robustness, also provides a surge capability for operating under abnormally severe conditions for a short period of time, and for unexpected events, such as meteoroid impact damage, that would detract from the nominal performance. Strategies and approaches to developing robust designs must also minimize mass because an extra kilogram of TPS displaces one kilogram of payload. Because aircraft structures must be optimized for minimum mass, reliability-based design approaches for mechanical components exist that minimize mass. Adapting these existing approaches to TPS component design takes advantage of the extensive work, knowledge, and experience from nearly fifty years of reliability-based design of mechanical components. A Non-Dimensional Load Interference (NDLI) method for calculating the thermal reliability of TPS components is presented in this lecture and applied to several examples. A sensitivity analysis from an existing numerical simulation of a carbon phenolic TPS provides insight into the effects of the various design parameters, and is used to demonstrate how sensitivity analysis may be used with NDLI to develop reliability-based designs of TPS components.
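
    The NDLI method itself is non-dimensional; the sketch below shows only the underlying load-interference calculation for normal variables, R = Φ(β) with β = (μ_C − μ_L)/√(σ_C² + σ_L²), on invented TPS temperature numbers rather than the lecture's examples.

```python
from math import sqrt
from statistics import NormalDist

def interference_reliability(mu_cap, sd_cap, mu_load, sd_load):
    """Stress-strength (load-interference) reliability for normal variables:
    R = P(capability > load) = Phi(beta)."""
    beta = (mu_cap - mu_load) / sqrt(sd_cap**2 + sd_load**2)
    return beta, NormalDist().cdf(beta)

# Hypothetical TPS numbers: allowable vs. predicted bondline temperature (K)
beta, R = interference_reliability(mu_cap=560, sd_cap=15,
                                   mu_load=500, sd_load=20)
print(f"safety index beta = {beta:.2f}, reliability = {R:.6f}")
```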

  5. The Pareidolia Test: A Simple Neuropsychological Test Measuring Visual Hallucination-Like Illusions.

    PubMed

    Mamiya, Yasuyuki; Nishio, Yoshiyuki; Watanabe, Hiroyuki; Yokoi, Kayoko; Uchiyama, Makoto; Baba, Toru; Iizuka, Osamu; Kanno, Shigenori; Kamimura, Naoto; Kazui, Hiroaki; Hashimoto, Mamoru; Ikeda, Manabu; Takeshita, Chieko; Shimomura, Tatsuo; Mori, Etsuro

    2016-01-01

    Visual hallucinations are a core clinical feature of dementia with Lewy bodies (DLB), and this symptom is important in the differential diagnosis and prediction of treatment response. The pareidolia test is a tool that evokes visual hallucination-like illusions, and these illusions may be a surrogate marker of visual hallucinations in DLB. We created a simplified version of the pareidolia test and examined its validity and reliability to establish the clinical utility of this test. The pareidolia test was administered to 52 patients with DLB, 52 patients with Alzheimer's disease (AD) and 20 healthy controls (HCs). We assessed the test-retest/inter-rater reliability using the intra-class correlation coefficient (ICC) and the concurrent validity using the Neuropsychiatric Inventory (NPI) hallucinations score as a reference. A receiver operating characteristic (ROC) analysis was used to evaluate the sensitivity and specificity of the pareidolia test to differentiate DLB from AD and HCs. The pareidolia test required approximately 15 minutes to administer, exhibited good test-retest/inter-rater reliability (ICC of 0.82), and moderately correlated with the NPI hallucinations score (rs = 0.42). Using an optimal cut-off score set according to the ROC analysis, the pareidolia test differentiated DLB from AD with a sensitivity of 81% and a specificity of 92%. Our study suggests that the simplified version of the pareidolia test is a valid and reliable surrogate marker of visual hallucinations in DLB.

  6. Honing process optimization algorithms

    NASA Astrophysics Data System (ADS)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are described, and key concepts are emphasized, such as the formulation of the optimization task for honing operations, the optimal structure of honing working cycles, stepped and stepless honing cycles, and the simulation of processing and its purpose. It is noted that the reliability of the mathematical model determines the quality parameters of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece over a sufficiently wide operating range and can be used to operate the CNC machine CC743.

  7. Reliability-based optimization of an active vibration controller using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Saraygord Afshari, Sajad; Pourtakdoust, Seid H.

    2017-04-01

    Many modern industrialized systems, such as aircraft, rotating turbines and satellite booms, cannot perform their desired tasks accurately if their structural vibrations are not controlled properly. Structural health monitoring and online reliability calculations are emerging means of handling system-imposed uncertainties. Since stochastic forcing is unavoidable in most engineering systems, it often needs to be taken into account in the control design process. In this research, smart material technology is utilized for structural health monitoring and control in order to keep the system in a reliable performance range. In this regard, a reliability-based cost function is assigned for both controller gain optimization and sensor placement. The proposed scheme is implemented and verified for a wing section. Comparison of the frequency responses shows the potential applicability of the presented technique.

  8. Optimization of turning process through the analytic flank wear modelling

    NASA Astrophysics Data System (ADS)

    Del Prete, A.; Franchi, R.; De Lorenzis, D.

    2018-05-01

    In the present work, the approach used for the optimization of the process capabilities for Oil&Gas components machining will be described. These components are machined by turning of stainless steel castings workpieces. For this purpose, a proper Design Of Experiments (DOE) plan has been designed and executed: as output of the experimentation, data about tool wear have been collected. The DOE has been designed starting from the cutting speed and feed values recommended by the tools manufacturer; the depth of cut parameter has been maintained as a constant. Wear data have been obtained by means of the observation of the tool flank wear under an optical microscope: the data acquisition has been carried out at regular intervals of working times. Through statistical data and regression analysis, analytical models of the flank wear and the tool life have been obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, under a constraint on a defined flank wear level. The technique used to solve the optimization problem is a Multi Objective Particle Swarm Optimization (MOPS). The optimization results, validated by the execution of a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.

  9. Welded joints integrity analysis and optimization for fiber laser welding of dissimilar materials

    NASA Astrophysics Data System (ADS)

    Ai, Yuewei; Shao, Xinyu; Jiang, Ping; Li, Peigen; Liu, Yang; Liu, Wei

    2016-11-01

    Dissimilar materials welded joints provide many advantages in power, automotive, chemical, and spacecraft industries. The weld bead integrity which is determined by process parameters plays a significant role in the welding quality during the fiber laser welding (FLW) of dissimilar materials. In this paper, an optimization method by taking the integrity of the weld bead and weld area into consideration is proposed for FLW of dissimilar materials, the low carbon steel and stainless steel. The relationships between the weld bead integrity and process parameters are developed by the genetic algorithm optimized back propagation neural network (GA-BPNN). The particle swarm optimization (PSO) algorithm is taken for optimizing the predicted outputs from GA-BPNN for the objective. Through the optimization process, the desired weld bead with good integrity and minimum weld area are obtained and the corresponding microstructure and microhardness are excellent. The mechanical properties of the optimized joints are greatly improved compared with that of the un-optimized welded joints. Moreover, the effects of significant factors are analyzed based on the statistical approach and the laser power (LP) is identified as the most significant factor on the weld bead integrity and weld area. The results indicate that the proposed method is effective for improving the reliability and stability of welded joints in the practical production.

  10. The T/R modules for phased-array antennas

    NASA Astrophysics Data System (ADS)

    Peignet, Colette; Mancuso, Yves; Resneau, J. Claude

    1990-09-01

    The concept of phased array radar is critically dependent on the availability of compact, reliable and low-power-consumption Transmitter/Receiver (T/R) modules. An overview is given of two major programs currently at the development stage within the Thomson group and of three major development axes (electrical concept optimization, packaging, and size reduction). The technical feasibility of the concept was proven, and the three major axes were highlighted on the basis of reliability, power-added efficiency, and RF test optimization.

  11. A hybrid Jaya algorithm for reliability-redundancy allocation problems

    NASA Astrophysics Data System (ADS)

    Ghavidel, Sahand; Azizivahed, Ali; Li, Li

    2018-04-01

    This article proposes an efficient improved hybrid Jaya algorithm based on time-varying acceleration coefficients (TVACs) and the learning phase introduced in teaching-learning-based optimization (TLBO), named the LJaya-TVAC algorithm, for solving various types of nonlinear mixed-integer reliability-redundancy allocation problems (RRAPs) and standard real-parameter test functions. RRAPs include series, series-parallel, complex (bridge) and overspeed protection systems. The search power of the proposed LJaya-TVAC algorithm for finding the optimal solutions is first tested on the standard real-parameter unimodal and multi-modal functions with dimensions of 30-100, and then tested on various types of nonlinear mixed-integer RRAPs. The results are compared with the original Jaya algorithm and the best results reported in the recent literature. The optimal results obtained with the proposed LJaya-TVAC algorithm provide evidence for its better and acceptable optimization performance compared to the original Jaya algorithm and other reported optimal results.
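
    For orientation, a minimal sketch of the plain Jaya update the hybrid builds on, x′ = x + r₁(best − |x|) − r₂(worst − |x|); the TVACs and the TLBO learning phase of LJaya-TVAC are omitted, and the test function and parameters are assumptions rather than the article's benchmarks.

```python
import numpy as np

def jaya(f, lo, hi, pop=20, iters=200, seed=0):
    """Plain Jaya: move each solution toward the best and away from the
    worst; a candidate replaces its parent only if it improves."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        best, worst = X[fit.argmin()], X[fit.argmax()]
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        Xn = np.clip(X + r1 * (best - np.abs(X))
                       - r2 * (worst - np.abs(X)), lo, hi)
        fn = np.apply_along_axis(f, 1, Xn)
        improved = fn < fit
        X[improved], fit[improved] = Xn[improved], fn[improved]
    return X[fit.argmin()], fit.min()

# Sphere function as a stand-in test problem
sphere = lambda x: float(np.sum(x**2))
lo, hi = np.full(5, -10.0), np.full(5, 10.0)
x_best, f_best = jaya(sphere, lo, hi)
print("best f =", round(f_best, 6))
```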

  12. MaRS Project

    NASA Technical Reports Server (NTRS)

    Aruljothi, Arunvenkatesh

    2016-01-01

    The Space Exploration Division of the Safety and Mission Assurances Directorate is responsible for reducing the risk to Human Space Flight Programs by providing system safety, reliability, and risk analysis. The Risk & Reliability Analysis branch plays a part in this by utilizing Probabilistic Risk Assessment (PRA) and Reliability and Maintainability (R&M) tools to identify possible types of failure and effective solutions. A continuous effort of this branch is MaRS, or the Mass and Reliability System, a tool that was the focus of this internship. Future long-duration space missions will have to find a balance between the mass and reliability of their spare parts. They will be unable to take spares of everything and will have to determine what is most likely to require maintenance and spares. Currently there is no database that combines mass and reliability data of low-level space-grade components; MaRS aims to be the first database to do this. The data in MaRS are based on the hardware flown on the International Space Station (ISS). The components on the ISS have a long history and are well documented, making them the perfect source. Currently, MaRS is a functioning Excel workbook database; the backend is complete and only requires optimization. MaRS has been populated with all the assemblies and their components that are used on the ISS; the failures of these components are updated regularly. This project was a continuation of the efforts of previous intern groups. Once complete, R&M engineers working on future space flight missions will be able to quickly access failure and mass data on assemblies and components, allowing them to make important decisions and tradeoffs.

  13. [Optimization of one-step pelletization technology of Biqiu granules by Plackett-Burman design and Box-Behnken response surface methodology].

    PubMed

    Zhang, Yan-jun; Liu, Li-li; Hu, Jun-hua; Wu, Yun; Chao, En-xiang; Xiao, Wei

    2015-11-01

    First, with the qualified rate of granules as the evaluation index, significant influencing factors were screened by Plackett-Burman design. Then, with the qualified rate and moisture content as the evaluation indexes, the significant factors affecting the one-step pelletization technology were further optimized by Box-Behnken design; the experimental data were fitted by multiple regression to a second-order polynomial equation, and response surface methodology was used for predictive analysis of the optimal technology. The best conditions were as follows: inlet air temperature of 85 °C, sample introduction speed of 33 r·min⁻¹, and density of the concentrated liquid of 1.10. The one-step pelletization technology of Biqiu granules optimized by Plackett-Burman design and Box-Behnken response surface methodology was stable and feasible with good predictability, which provides a reliable basis for the industrialized production of Biqiu granules.

  14. HOMER: The hybrid optimization model for electric renewable

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lilienthal, P.; Flowers, L.; Rossmann, C.

    1995-12-31

    Hybrid renewable systems are often more cost-effective than grid extensions or isolated diesel generators for providing power to remote villages. There are a wide variety of hybrid systems being developed for village applications that have differing combinations of wind, photovoltaics, batteries, and diesel generators. Due to variations in loads and resources, determining the most appropriate combination of these components for a particular village is a difficult modelling task. To address this design problem the National Renewable Energy Laboratory has developed the Hybrid Optimization Model for Electric Renewables (HOMER). Existing models are either too detailed for screening analysis or too simple for reliable estimation of performance. HOMER is a design optimization model that determines the configuration, dispatch, and load management strategy that minimizes life-cycle costs for a particular site and application. This paper describes the HOMER methodology and presents representative results.

  15. Motivations for genetic testing for lung cancer risk among young smokers.

    PubMed

    O'Neill, Suzanne C; Lipkus, Isaac M; Sanderson, Saskia C; Shepperd, James; Docherty, Sharron; McBride, Colleen M

    2013-11-01

    To examine why young people might want to undergo genetic susceptibility testing for lung cancer despite knowing that tested gene variants are associated with small increases in disease risk. The authors used a mixed-method approach to evaluate motives for and against genetic testing and the association between these motivations and testing intentions in 128 college students who smoke. Exploratory factor analysis yielded four reliable factors: Test Scepticism, Test Optimism, Knowledge Enhancement and Smoking Optimism. Test Optimism and Knowledge Enhancement correlated positively with intentions to test in bivariate and multivariate analyses (ps < 0.001). Test Scepticism correlated negatively with testing intentions in multivariate analyses (p < 0.05). Open-ended questions assessing testing motivations generally replicated themes of the quantitative survey. In addition to learning about health risks, young people may be motivated to seek genetic testing for reasons such as gaining knowledge about new genetic technologies more broadly.

  16. A simulation model to estimate the cost and effectiveness of alternative dialysis initiation strategies.

    PubMed

    Lee, Chris P; Chertow, Glenn M; Zenios, Stefanos A

    2006-01-01

    Patients with end-stage renal disease (ESRD) require dialysis to maintain survival. The optimal timing of dialysis initiation in terms of cost-effectiveness has not been established. We developed a simulation model of individuals progressing towards ESRD and requiring dialysis. It can be used to analyze dialysis strategies and scenarios. It was embedded in an optimization framework to derive improved strategies. Actual (historical) and simulated survival curves and hospitalization rates were virtually indistinguishable. The model overestimated transplantation costs (10%), but this was related to confounding by Medicare coverage. To assess the model's robustness, we examined several dialysis strategies while input parameters were perturbed. Under all 38 scenarios, relative rankings remained unchanged. An improved policy for a hypothetical patient was derived using an optimization algorithm. The model produces reliable results and is robust. It enables the cost-effectiveness analysis of dialysis strategies.

  17. Scheduling for energy and reliability management on multiprocessor real-time systems

    NASA Astrophysics Data System (ADS)

    Qi, Xuan

    Scheduling algorithms for multiprocessor real-time systems have been studied for years, with many well-recognized algorithms proposed. However, this is still an evolving research area and many problems remain open due to their intrinsic complexity. With the emergence of multicore processors, it is necessary to re-investigate the scheduling problems and design/develop efficient algorithms for better system utilization, low scheduling overhead, high energy efficiency, and better system reliability. Focusing on cluster scheduling with optimal global schedulers, we study the utilization bound and scheduling overhead for a class of cluster-optimal schedulers. Then, taking energy/power consumption into consideration, we develop energy-efficient scheduling algorithms for real-time systems, especially for the proliferating embedded systems with limited energy budgets. As the commonly deployed energy-saving technique (e.g., dynamic voltage and frequency scaling (DVFS)) significantly affects system reliability, we study schedulers that have intelligent mechanisms to recuperate system reliability to satisfy the quality assurance requirements. Extensive simulation is conducted to evaluate the performance of the proposed algorithms in terms of reduction of scheduling overhead, energy saving, and reliability improvement. The simulation results show that the proposed reliability-aware power management schemes can preserve system reliability while still achieving substantial energy savings.

  18. Reliability Based Design for a Raked Wing Tip of an Airframe

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2011-01-01

    A reliability-based optimization methodology has been developed to design the raked wing tip of the Boeing 767-400 extended range airliner, which is made of composite and metallic materials. The design is formulated for an accepted level of risk or reliability. The design variables, the weight and the constraints became functions of reliability. Uncertainties in the load, strength and material properties, as well as in the design variables, were modeled as random parameters with specified distributions, such as normal, Weibull or Gumbel functions. The objective function and each constraint, or failure mode, became derived functions of the risk level. Solution of the problem produced the optimum design with weight, variables and constraints as functions of the risk level. Optimum weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to a 50 percent probability of success, or one failure in two samples. Under some assumptions, this design would be quite close to the deterministic optimum solution. The weight increased when reliability exceeded 50 percent, and decreased when the reliability was compromised. A design can be selected depending on the level of risk acceptable in a given situation. The optimization process achieved up to a 20-percent reduction in weight over the traditional design.
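
    A one-variable illustration of the weight-versus-reliability trade-off described above, not the paper's wing-tip model: for a bar with normally distributed load and strength, the smallest cross-section (hence weight) meeting each target reliability follows from the safety index. All numbers below are invented.

```python
import numpy as np
from math import sqrt
from statistics import NormalDist
from scipy.optimize import brentq

# Hypothetical bar under random load P and random strength R (normal models)
mu_R, sd_R = 400.0, 30.0        # strength, MPa
mu_P, sd_P = 5.0e4, 7.5e3       # load, N

def beta_of_area(A):
    """Safety index of the limit state g = R - P/A for section area A (mm^2)."""
    return (mu_R - mu_P / A) / sqrt(sd_R**2 + (sd_P / A)**2)

# Find the smallest area meeting each target reliability; weight ~ area
for R_target in (0.5, 0.9, 0.99, 0.999):
    beta_t = NormalDist().inv_cdf(R_target)
    A = brentq(lambda a: beta_of_area(a) - beta_t, 120.0, 5000.0)
    print(f"reliability {R_target}: A = {A:7.1f} mm^2")
```

    Running the sketch reproduces the qualitative behavior in the abstract: required area (weight) grows as the target reliability rises above 50 percent and shrinks when reliability is compromised.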

  19. Optimization of Adaptive Intraply Hybrid Fiber Composites with Reliability Considerations

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1994-01-01

    The reliability with bounded distribution parameters (mean, standard deviation) was maximized and the reliability-based cost was minimized for adaptive intra-ply hybrid fiber composites by using a probabilistic method. The probabilistic method accounts for all naturally occurring uncertainties including those in constituent material properties, fabrication variables, structure geometry, and control-related parameters. Probabilistic sensitivity factors were computed and used in the optimization procedures. For actuated change in the angle of attack of an airfoil-like composite shell structure with an adaptive torque plate, the reliability was maximized to 0.9999 probability, with constraints on the mean and standard deviation of the actuation material volume ratio (percentage of actuation composite material in a ply) and the actuation strain coefficient. The reliability-based cost was minimized for an airfoil-like composite shell structure with an adaptive skin and a mean actuation material volume ratio as the design parameter. At a 0.9 mean actuation material volume ratio, the minimum cost was obtained.

  20. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE PAGES

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    2017-08-01

    Variability in the physical characteristics of feedstock has a relevant effect on the reactor's reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality, and of the preprocessing operations required to meet biomass specifications, on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates harvesting, collection, transportation, and storage costs while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. In the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. In the second phase, the baseline IBSAL model is extended so that the cost of meeting, and/or the penalization for failing to meet, specifications is considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, costs related to activities intended to improve the quality of feedstock, and the penalization cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  1. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    Variability in the physical characteristics of feedstock has a relevant effect on the reactor's reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality, and of the preprocessing operations required to meet biomass specifications, on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates harvesting, collection, transportation, and storage costs while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and the costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. In the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. In the second phase, the baseline IBSAL model is extended so that the cost of meeting, and/or the penalization for failing to meet, specifications is considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, costs related to activities intended to improve the quality of feedstock, and the penalization cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  2. PAT: predictor for structured units and its application for the optimization of target molecules for the generation of synthetic antibodies.

    PubMed

    Jeon, Jouhyun; Arnold, Roland; Singh, Fateh; Teyra, Joan; Braun, Tatjana; Kim, Philip M

    2016-04-01

    The identification of structured units in a protein sequence is an important first step for most biochemical studies. Importantly for this study, the identification of stable structured regions is a crucial first step in generating novel synthetic antibodies. While many approaches to find domains or predict structured regions exist, important limitations remain, such as the optimization of domain boundaries and the lack of identification of non-domain structured units. Moreover, no integrated tool exists to find and optimize structural domains within protein sequences. Here, we describe a new tool, PAT (http://www.kimlab.org/software/pat), that can efficiently identify both domains (with optimized boundaries) and non-domain putative structured units. PAT automatically analyzes various structural properties, evaluates the folding stability, and reports possible structural domains in a given protein sequence. To evaluate the reliability of PAT, we applied it to the identification of antibody target molecules, based on the notion that soluble and well-defined protein secondary and tertiary structures are appropriate target molecules for synthetic antibodies. PAT is an efficient and sensitive tool to identify structured units. A performance analysis shows that PAT can characterize structurally well-defined regions in a given sequence and outperforms other efforts to define reliable domain boundaries. Specifically, PAT successfully identifies experimentally confirmed target molecules for antibody generation. PAT also offers pre-calculated results for 20,210 human proteins to accelerate common queries. PAT can therefore help to investigate large-scale structured domains and improve the success rate of synthetic antibody generation.

  3. The Factor Structure of the Spiritual Well-Being Scale in Veterans Experienced Chemical Weapon Exposure.

    PubMed

    Sharif Nia, Hamid; Pahlevan Sharif, Saeed; Boyle, Christopher; Yaghoobzadeh, Ameneh; Tahmasbi, Bahram; Rassool, G Hussein; Taebei, Mozhgan; Soleimani, Mohammad Ali

    2018-04-01

    This study aimed to determine the factor structure of the spiritual well-being among a sample of the Iranian veterans. In this methodological research, 211 male veterans of Iran-Iraq warfare completed the Paloutzian and Ellison spiritual well-being scale. Maximum likelihood (ML) with oblique rotation was used to assess domain structure of the spiritual well-being. The construct validity of the scale was assessed using confirmatory factor analysis (CFA), convergent validity, and discriminant validity. Reliability was evaluated with Cronbach's alpha, Theta (θ), and McDonald Omega (Ω) coefficients, intra-class correlation coefficient (ICC), and construct reliability (CR). Results of ML and CFA suggested three factors which were labeled "relationship with God," "belief in fate and destiny," and "life optimism." The ICC, coefficients of the internal consistency, and CR were >.7 for the factors of the scale. Convergent validity and discriminant validity did not fulfill the requirements. The Persian version of spiritual well-being scale demonstrated suitable validity and reliability among the veterans of Iran-Iraq warfare.

  4. An enhanced performance through agent-based secure approach for mobile ad hoc networks

    NASA Astrophysics Data System (ADS)

    Bisen, Dhananjay; Sharma, Sanjeev

    2018-01-01

    This paper proposes an agent-based secure enhanced performance approach (AB-SEP) for mobile ad hoc network. In this approach, agent nodes are selected through optimal node reliability as a factor. This factor is calculated on the basis of node performance features such as degree difference, normalised distance value, energy level, mobility and optimal hello interval of node. After selection of agent nodes, a procedure of malicious behaviour detection is performed using fuzzy-based secure architecture (FBSA). To evaluate the performance of the proposed approach, comparative analysis is done with conventional schemes using performance parameters such as packet delivery ratio, throughput, total packet forwarding, network overhead, end-to-end delay and percentage of malicious detection.

  5. The Improvement of Particle Swarm Optimization: a Case Study of Optimal Operation in Goupitan Reservoir

    NASA Astrophysics Data System (ADS)

    Li, Haichen; Qin, Tao; Wang, Weiping; Lei, Xiaohui; Wu, Wenhui

    2018-02-01

    Due to its weakness in maintaining diversity and reaching the global optimum, standard particle swarm optimization has not performed well in optimal reservoir operation. To solve this problem, this paper introduces the downhill simplex method to work together with standard particle swarm optimization. The application of this approach to the optimal operation of the Goupitan reservoir shows that the improved method has better accuracy and higher reliability at a small additional cost.
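
    A hedged sketch of the hybrid idea: standard PSO for global search, then a downhill-simplex (Nelder-Mead) polish of the swarm's best solution. The test function and coefficients are assumptions, not the Goupitan reservoir model.

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_pso(f, lo, hi, pop=30, iters=100, seed=0):
    """Standard PSO followed by a Nelder-Mead refinement of the best point."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    X = rng.uniform(lo, hi, (pop, dim))
    V = np.zeros((pop, dim))
    P, pf = X.copy(), np.apply_along_axis(f, 1, X)   # personal bests
    g = P[pf.argmin()].copy()                        # global best
    w, c1, c2 = 0.7, 1.5, 1.5                        # assumed coefficients
    for _ in range(iters):
        r1, r2 = rng.random((pop, dim)), rng.random((pop, dim))
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)
        fx = np.apply_along_axis(f, 1, X)
        better = fx < pf
        P[better], pf[better] = X[better], fx[better]
        g = P[pf.argmin()].copy()
    # Local refinement with the downhill simplex method
    res = minimize(f, g, method="Nelder-Mead")
    return res.x, res.fun

rosen = lambda x: float((1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2)
x, fx = hybrid_pso(rosen, np.array([-2.0, -2.0]), np.array([2.0, 2.0]))
print("x* =", np.round(x, 4), "f* =", round(fx, 8))
```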

  6. Reliable Adaptive Data Aggregation Route Strategy for a Trade-off between Energy and Lifetime in WSNs

    PubMed Central

    Guo, Wenzhong; Hong, Wei; Zhang, Bin; Chen, Yuzhong; Xiong, Naixue

    2014-01-01

    Mobile security is one of the most fundamental problems in Wireless Sensor Networks (WSNs). The data transmission path can be compromised by disabled nodes. To construct a secure and reliable network, it is of great importance to design an adaptive route strategy that optimizes the energy consumption and network lifetime costs of aggregation. In this paper, we address the reliable data-aggregation route problem for WSNs. Firstly, to ensure nodes work properly, we propose a data aggregation route algorithm that improves energy efficiency in the WSN. The route construction, achieved through discrete particle swarm optimization (DPSO), saves node energy costs. Then, to balance the network load and establish a reliable network, an adaptive route algorithm with minimal energy and maximum lifetime is proposed. Since this is a non-linear constrained multi-objective optimization problem, we propose a DPSO whose multi-objective fitness function combines a phenotype sharing function and a penalty function to find available routes. Experimental results show that, compared with other tree routing algorithms, our algorithm can effectively reduce energy consumption and trade off energy consumption against network lifetime. PMID:25215944

  7. Autism detection in early childhood (ADEC): reliability and validity data for a Level 2 screening tool for autistic disorder.

    PubMed

    Nah, Yong-Hwee; Young, Robyn L; Brewer, Neil; Berlingeri, Genna

    2014-03-01

    The Autism Detection in Early Childhood (ADEC; Young, 2007) was developed as a Level 2 clinician-administered autistic disorder (AD) screening tool that was time-efficient, suitable for children under 3 years, easy to administer, and suitable for persons with minimal training and experience with AD. A best estimate clinical Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) diagnosis of AD was made for 70 children using all available information and assessment results, except for the ADEC data. A screening study compared these children on the ADEC with 57 children with other developmental disorders and 64 typically developing children. Results indicated high internal consistency (α = .91). Interrater reliability and test-retest reliability of the ADEC were also adequate. ADEC scores reliably discriminated different diagnostic groups after controlling for nonverbal IQ and Vineland Adaptive Behavior Composite scores. Construct validity (using exploratory factor analysis) and concurrent validity using performance on the Autism Diagnostic Observation Schedule (Lord et al., 2000), the Autism Diagnostic Interview-Revised (Le Couteur, Lord, & Rutter, 2003), and DSM-IV-TR criteria were also demonstrated. Signal detection analysis identified the optimal ADEC cutoff score, with the ADEC identifying all children who had an AD (N = 70, sensitivity = 1.0) but overincluding children with other disabilities (N = 13, specificity ranging from .74 to .90). Together, the reliability and validity data indicate that the ADEC has potential to be established as a suitable and efficient screening tool for infants with AD.

  8. Autonomous Energy Grids | Grid Modernization | NREL

    Science.gov Websites

    Autonomous energy grids control themselves using advanced machine learning and simulation to create resilient, reliable, and affordable optimized energy systems. Current frameworks to monitor, control, and optimize large-scale energy systems draw on optimization theory, control theory, big data analytics, and complex-system theory and modeling.

  9. Psychometric properties of the Medical Student Well-Being Index among medical students in a Malaysian medical school.

    PubMed

    Yusoff, Muhamad Saiful Bahri; Yaacob, Mohd Jamil; Naing, Nyi Nyi; Esa, Ab Rahman

    2013-02-01

    This study evaluated the convergent, discriminant, construct, concurrent and discriminative validity of the Medical Student Well-Being Index (MSWBI), as well as its internal consistency and the optimal cut-off total scores to detect at least moderate levels of general psychological distress, stress, anxiety and depression symptoms. A cross-sectional study was done on 171 medical students. The MSWBI and DASS-21 were administered and returned immediately upon completion. Confirmatory factor analysis, reliability analysis, ROC analysis and the Pearson correlation test were applied to assess the psychometric properties of the MSWBI. A total of 168 (98.2%) medical students responded. The goodness-of-fit indices showed the MSWBI had a good construct (χ² = 6.14, p = 0.803, RMSEA < 0.001, RMR = 0.004, GFI = 0.99, AGFI = 0.97, CFI = 1.00, IFI = 1.02, TLI = 1.04). The Cronbach's alpha value was 0.69, indicating an acceptable level of internal consistency. Pearson correlation coefficients and ROC analysis suggested each MSWBI item showed adequate convergent and discriminant validity. Its optimal cut-off scores to detect at least moderate levels of general psychological distress, stress, anxiety, and depression were 1.5, 2.5, 1.5 and 2.5, respectively, with sensitivity and specificity ranging from 62 to 80% and areas under the ROC curve ranging from 0.71 to 0.83. This study showed that the MSWBI has a good level of psychometric properties. An MSWBI score of more than 2 can be considered to indicate significant psychological distress. The MSWBI is a valid and reliable screening instrument to assess the psychological distress of medical students.

  10. Ejectors of power plants turbine units efficiency and reliability increasing

    NASA Astrophysics Data System (ADS)

    Aronson, K. E.; Ryabchikov, A. Yu.; Kuptsov, V. K.; Murmanskii, I. B.; Brodov, Yu. M.; Zhelonkin, N. V.; Khaet, S. I.

    2017-11-01

    The functioning of steam turbine condensation systems strongly influences the efficiency and reliability of a power plant. Condenser operation is supported by the main ejectors, which maintain the vacuum level in the condenser. Developing methods to increase the efficiency and reliability of ejector operation is a pressing problem in modern power engineering. The paper presents a statistical analysis of ejector breakdowns revealed during repairs and of the influence of such damage on steam turbine operating reliability. It is determined that about 3% of steam turbine equipment breakdowns are ejector breakdowns, while about 7% of turbine breakdowns are caused by various ejector malfunctions. Design solutions that have been developed and approved to improve ejector performance indices are presented. The intercoolers are housed in separate cases, so that the air-steam mixture cannot migrate from high-pressure to low-pressure zones, and the maintainability of the apparatus is improved. The use of U-type tubes compensates for the thermal expansion of the intercooler tubes and increases the heat-transfer area. The nozzle mounting design makes it possible to change the distance between the nozzle and the mixing chamber (the nozzle exit position) to optimize operating performance. Experimental studies of more than 30 serial ejectors, as well as of the high-efficiency three-stage ejector EPO-3-80 designed by the authors, were carried out under operating conditions. The measurement scheme of the designed ejector includes 21 indicators. Results of experimental tests with different nozzle exit positions of the EPO-3-80 stream devices are presented, and the pressure of the primary stream (water steam) is optimized. The experimental data agree well with the calculation results.

  11. Analysis of the Multi Strategy Goal Programming for Micro-Grid Based on Dynamic ant Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Qiu, J. P.; Niu, D. X.

    The micro-grid is one of the key technologies for future energy supply. Taking the economic planning, reliability, and environmental protection of the micro-grid as a basis, multi-strategy goal programming problems are analyzed for a micro-grid containing wind power, solar power, battery storage, and a micro gas turbine. Mathematical models of the generation characteristics and energy dissipation of each source are established, and the multi-objective micro-grid planning function under different operating strategies is converted into a single-objective model based on the AHP method. An example analysis shows that, in combination with the dynamic ant colony/genetic hybrid algorithm, the optimal power output of this model can be obtained.

  12. Risk-based maintenance of ethylene oxide production facilities.

    PubMed

    Khan, Faisal I; Haddara, Mahmoud R

    2004-05-20

    This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of the many likely failure scenarios, the most probable ones are subjected to detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and probability analyses. The calculated risk is compared against known acceptance criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Out of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both the societal and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis was also undertaken to study the impact of changing the reliability distribution model, as well as of errors in the distribution parameters, on the maintenance interval.
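
    A crude sketch of the "reverse" step, assuming risk grows roughly linearly with the inspection interval through the textbook λτ/2 unavailability approximation, so the interval meeting an acceptable risk follows by inverting risk(τ) = (λτ/2)·C. The unit data and the acceptance criterion below are invented, and the study itself used full fault trees and consequence models rather than this shortcut.

```python
# Hypothetical unit data: failure rate (per year), consequence cost ($)
units = {
    "ethylene pipeline": (0.10, 5.0e6),
    "reactor":           (0.02, 8.0e6),
    "compressor":        (0.05, 1.0e6),
}
acceptable_risk = 2.0e4     # $/yr, assumed acceptance criterion

for name, (lam, cons) in units.items():
    # With inspection every tau years, mean unavailability ~ lam*tau/2,
    # so annual risk ~ (lam*tau/2) * cons; invert for the interval.
    tau = 2 * acceptable_risk / (lam * cons)
    print(f"{name:18s} risk-based interval = {tau * 12:5.1f} months")
```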

  13. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization

    PubMed Central

    Chen, Qingkui; Zhao, Deyu; Wang, Jingjuan

    2017-01-01

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes’ diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services. PMID:28777325

  14. RGCA: A Reliable GPU Cluster Architecture for Large-Scale Internet of Things Computing Based on Effective Performance-Energy Optimization.

    PubMed

    Fang, Yuling; Chen, Qingkui; Xiong, Neal N; Zhao, Deyu; Wang, Jingjuan

    2017-08-04

    This paper aims to develop a low-cost, high-performance and high-reliability computing system to process large-scale data using common data mining algorithms in the Internet of Things (IoT) computing environment. Considering the characteristics of IoT data processing, similar to mainstream high performance computing, we use a GPU (Graphics Processing Unit) cluster to achieve better IoT services. Firstly, we present an energy consumption calculation method (ECCM) based on WSNs. Then, using the CUDA (Compute Unified Device Architecture) Programming model, we propose a Two-level Parallel Optimization Model (TLPOM) which exploits reasonable resource planning and common compiler optimization techniques to obtain the best blocks and threads configuration considering the resource constraints of each node. The key to this part is dynamic coupling Thread-Level Parallelism (TLP) and Instruction-Level Parallelism (ILP) to improve the performance of the algorithms without additional energy consumption. Finally, combining the ECCM and the TLPOM, we use the Reliable GPU Cluster Architecture (RGCA) to obtain a high-reliability computing system considering the nodes' diversity, algorithm characteristics, etc. The results show that the performance of the algorithms significantly increased by 34.1%, 33.96% and 24.07% for Fermi, Kepler and Maxwell on average with TLPOM and the RGCA ensures that our IoT computing system provides low-cost and high-reliability services.

  15. Validity of the Malay version of the Internet Addiction Test: a study on a group of medical students in Malaysia.

    PubMed

    Guan, Ng Chong; Isa, Saramah Mohammed; Hashim, Aili Hanim; Pillai, Subash Kumar; Harbajan Singh, Manveen Kaur

    2015-03-01

    The use of the Internet has been increasing dramatically over the decade in Malaysia. Excessive usage of the Internet has led to a phenomenon called Internet addiction. There is a need for a reliable, valid, and simple-to-use scale to measure Internet addiction in the Malaysian population for clinical practice and research purposes. The aim of this study was to validate the Malay version of the Internet Addiction Test, using a sample of 162 medical students. The instrument displayed good internal consistency (Cronbach's α = .91), parallel reliability (intraclass coefficient = .88, P < .001), and concurrent validity with the Compulsive Internet Use Scale (Pearson's correlation = .84, P < .001). Receiver operating characteristic analysis showed that 43 was the optimal cutoff score to discriminate students with and without Internet dependence. Principal component analysis with varimax rotation identified a 5-factor model. The Malay version of the Internet Addiction Test appears to be a valid instrument for assessing Internet addiction in Malaysian university students.

  16. Reliability Analysis of Main-axis Control System of the Equatorial Antarctica Astronomical Telescope Based on Fault Tree

    NASA Astrophysics Data System (ADS)

    LI, Y.; Yang, S. H.

    2017-05-01

    The Antarctic astronomical telescopes operate year-round at the unattended South Pole, with only one maintenance opportunity each year. Due to the complexity of their optical, mechanical, and electrical systems, the telescopes are hard to maintain and require multi-tasker expedition teams, which means that particular attention must be paid to the reliability of the Antarctic telescopes. Based on the fault mechanisms and fault modes of the main-axis control system of the equatorial Antarctic astronomical telescope AST3-3 (Antarctic Schmidt Telescopes 3-3), the method of fault tree analysis is introduced in this article, and the importance degree of the top event is obtained from the structural importance degrees of the bottom events. From these results, hidden problems and weak links can be effectively identified, indicating directions for improving the stability of the system and optimizing its design.

  17. Creep-rupture reliability analysis

    NASA Technical Reports Server (NTRS)

    Peralta-Duran, A.; Wirsching, P. H.

    1984-01-01

    A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time-temperature parameters (TTPs) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and the Manson-Haferd parameters, but it can be applied to any of the TTPs. A method is developed for evaluating material-dependent constants for the TTPs. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long-term behavior. The uncertainty in predicting long-term behavior from short-term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state-of-the-art reliability methods to the design of components under creep.
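
    For example, with the Larson-Miller parameter LMP = T(C + log10 t_r), the material constant C can be optimized by least squares so the rupture data collapse onto a single master curve; a sketch on synthetic placeholder data:

    ```python
    # Sketch: optimizing the Larson-Miller constant C by least squares so that
    # LMP = T*(C + log10(t_r)) collapses rupture data onto one master curve.
    # The data below are synthetic placeholders, not creep test results.
    import numpy as np
    from scipy.optimize import minimize_scalar

    T = np.array([800., 850., 900., 950.])        # temperature, K
    t_r = np.array([1e4, 3e3, 8e2, 2e2])          # rupture time, h
    stress = np.array([200., 180., 160., 140.])   # MPa

    def scatter(C):
        lmp = T * (C + np.log10(t_r))
        # residual scatter around a linear master curve log(stress) vs LMP
        fit = np.polyval(np.polyfit(lmp, np.log(stress), 1), lmp)
        return np.sum((fit - np.log(stress)) ** 2)

    C_opt = minimize_scalar(scatter, bounds=(10, 30), method="bounded").x
    print(f"optimized Larson-Miller constant C = {C_opt:.2f}")
    ```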

  18. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Uncertainty quantification and reliability assessment in operational oil spill forecast modeling system.

    PubMed

    Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao

    2017-03-15

    As oil transport increases in the Texas bays, the greater risk of ship collisions, and of the oil spill accidents that follow, will become a challenge. To minimize the ecological damage and optimize rapid response, emergency managers need to be informed as soon as possible after a spill of how fast and where the oil will spread. State-of-the-art operational oil spill forecast modeling systems have moved oil spill response to a new stage. However, uncertainty in the predicted data inputs often compromises the reliability of the forecast results, leading to misdirection in contingency planning. Understanding forecast uncertainty and reliability thus becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters for generating forecast probability maps. The oil spill forecast uncertainty is then quantified by comparing the forecast probability map with the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, helping emergency managers improve the capability of real-time operational oil spill response and impact assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
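
    A minimal sketch of the Monte Carlo step described above: an ensemble of randomly perturbed trajectories is reduced to a forecast probability map. The drift model below is a crude stand-in, not HyosPy or the paper's hydrodynamics.

    ```python
    # Sketch: ensemble of perturbed spill trajectories -> probability map.
    # Advection velocity and noise level are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    N, STEPS = 500, 48                  # ensemble size, time steps
    grid = np.zeros((50, 50))
    for _ in range(N):
        pos = np.array([25.0, 25.0])    # release point (grid units)
        for _ in range(STEPS):
            pos += np.array([0.3, 0.1]) + rng.normal(0, 0.4, 2)  # drift + noise
        i, j = np.clip(pos.astype(int), 0, 49)
        grid[i, j] += 1                 # count ensemble members per cell

    prob_map = grid / N                 # P(oil reaches each cell in 48 steps)
    print("max cell probability:", prob_map.max())
    ```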

  20. Validity and reliability of bilingual English-Arabic version of Schutte self report emotional intelligence scale in an undergraduate Arab medical student sample.

    PubMed

    Naeem, Naghma; Muijtjens, Arno

    2015-04-01

    The psychological construct of emotional intelligence (EI), its theoretical models, measurement instruments and applications have been the subject of several research studies in health professions education. The objective of the current study was to investigate the factorial validity and reliability of a bilingual version of the Schutte Self Report Emotional Intelligence Scale (SSREIS) in an undergraduate Arab medical student population. The study was conducted during April-May 2012. A cross-sectional survey design was employed. A sample (n = 467) was obtained from undergraduate medical students belonging to the male and female medical colleges of King Saud University, Riyadh, Saudi Arabia. Exploratory and confirmatory factor analyses were performed using SPSS 16.0 and AMOS 4.0 statistical software to determine the factor structure. Reliability was determined using Cronbach's alpha statistics. The results obtained using an undergraduate Arab medical student sample supported a multidimensional, three-factor structure of the SSREIS. The three factors are Optimism, Awareness-of-Emotions and Use-of-Emotions. The reliabilities (Cronbach's alpha) for the three subscales were 0.76, 0.72 and 0.55, respectively. Emotional intelligence is thus a multifactorial (three-factor) construct. The bilingual version of the SSREIS is a valid and reliable measure of trait emotional intelligence in an undergraduate Arab medical student population.

  1. Advanced CHP Control Algorithms: Scope Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Brambley, Michael R.

    2006-04-28

    The primary objective of this multiyear project is to develop algorithms for combined heat and power systems to ensure optimal performance, increase reliability, and lead to the goal of clean, efficient, reliable and affordable next generation energy systems.

  2. Using multivariate regression modeling for sampling and predicting chemical characteristics of mixed waste in old landfills.

    PubMed

    Brandstätter, Christian; Laner, David; Prantl, Roman; Fellner, Johann

    2014-12-01

    Municipal solid waste landfills pose a threat to the environment and human health, especially old landfills which lack facilities for the collection and treatment of landfill gas and leachate. Consequently, missing information about emission flows prevents site-specific environmental risk assessments. To overcome this gap, combining waste sampling and analysis with statistical modeling is one option for estimating present and future emission potentials. Optimizing the tradeoff between investigation costs and reliable results requires knowledge of both the number of samples to be taken and the variables to be analyzed. This article aims to identify the optimal number of waste samples and variables in order to predict a larger set of variables. We therefore introduce a multivariate linear regression model and tested its applicability in two case studies. Landfill A was used to set up and calibrate the model, based on 50 waste samples and twelve variables. The calibrated model was applied to Landfill B, comprising 36 waste samples and twelve variables, with four predictor variables. The case study results are twofold: first, reliable and accurate prediction of the twelve variables can be achieved with knowledge of four predictor variables (LOI, EC, pH and Cl). For the second landfill, Landfill B, only ten full measurements would be needed for a reliable prediction of most response variables. The four predictor variables entail comparably low analytical costs in comparison to the full set of measurements. This cost reduction could be used to increase the number of samples, yielding an improved understanding of the spatial waste heterogeneity in landfills. In conclusion, future application of the developed model could improve the reliability of predicted emission potentials. The model could become a standard screening tool for old landfills if its applicability and reliability were tested in additional case studies. Copyright © 2014 Elsevier Ltd. All rights reserved.
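
    A sketch of the modeling idea, assuming a generic multivariate linear regression; the data are random placeholders, not the landfill measurements:

    ```python
    # Sketch: predict the full set of waste variables from four cheap
    # predictors (e.g. LOI, EC, pH, Cl). Placeholder data, assumed dimensions.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    X = rng.normal(size=(50, 4))              # 50 samples x 4 predictors
    B = rng.normal(size=(4, 8))
    Y = X @ B + rng.normal(0, 0.1, (50, 8))   # 8 response variables

    model = LinearRegression().fit(X, Y)      # one multivariate linear model
    X_new = rng.normal(size=(10, 4))          # 10 cheap measurements, new site
    print(model.predict(X_new).shape)         # -> (10, 8) predicted responses
    ```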

  3. Development of a chromatographic method with multi-criteria decision making design for simultaneous determination of nifedipine and atenolol in content uniformity testing.

    PubMed

    Ahmed, Sameh; Alqurshi, Abdulmalik; Mohamed, Abdel-Maaboud Ismail

    2018-07-01

    A new robust and reliable high-performance liquid chromatography (HPLC) method with a multi-criteria decision making (MCDM) approach was developed to allow simultaneous quantification of atenolol (ATN) and nifedipine (NFD) in content uniformity testing. Felodipine (FLD) was used as an internal standard (I.S.) in this study. A novel marriage between a new interactive response optimizer and an HPLC method was suggested for multiple response optimization of target responses. The interactive response optimizer was used as a decision and prediction tool for the optimal settings of target responses, according to specified criteria, based on Derringer's desirability. Four independent variables were considered in this study: acetonitrile percentage, buffer pH, buffer concentration, and column temperature. Eight responses were optimized: the retention times of ATN, NFD, and FLD; the resolutions between ATN/NFD and NFD/FLD; and the plate numbers for ATN, NFD, and FLD. Multiple regression analysis was applied in order to assess the influences of the most significant variables on the regression models. The experimental design was set to give minimum retention times and maximum resolutions and plate numbers. The interactive response optimizer allowed prediction of optimum conditions according to these criteria with a good composite desirability value of 0.98156. The developed method was validated according to the International Conference on Harmonization (ICH) guidelines with the aid of the experimental design. The developed MCDM-HPLC method showed superior robustness and resolution in a short analysis time, allowing successful simultaneous content uniformity testing of ATN and NFD in marketed capsules. The current work presents an interactive response optimizer as an efficient platform to optimize, predict responses, and validate HPLC methodology with a tolerable design space for assay in quality control laboratories. Copyright © 2018 Elsevier B.V. All rights reserved.
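
    A sketch of the Derringer-type desirability combination underlying such optimizers; the response values, targets and limits below are illustrative assumptions:

    ```python
    # Sketch of Derringer desirability: map each response onto [0, 1] and
    # combine by geometric mean. All numbers are illustrative placeholders.
    import numpy as np

    def d_minimize(y, best, worst):
        """Desirability for a response to be minimized (e.g. retention time)."""
        return np.clip((worst - y) / (worst - best), 0, 1)

    def d_maximize(y, worst, best):
        """Desirability for a response to be maximized (e.g. resolution)."""
        return np.clip((y - worst) / (best - worst), 0, 1)

    # e.g. retention time (minimize), resolution and plate number (maximize)
    d = np.array([d_minimize(3.2, best=2.0, worst=7.0),
                  d_maximize(2.4, worst=1.5, best=2.5),
                  d_maximize(9500, worst=4000, best=10000)])
    composite = d.prod() ** (1 / len(d))   # composite desirability D
    print(f"D = {composite:.3f}")
    ```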

  4. Re-evaluation of the reported experimental values of the heat of vaporization of N-methylacetamide

    PubMed Central

    MacKerell, Alexander D.; Shim, Ji Hyun; Anisimov, Victor M.

    2010-01-01

    The accuracy of empirical force fields is inherently related to the quality of the target data used for optimization of the model. With the heat of vaporization (ΔHvap) of N-methylacetamide (NMA), a range of values have been reported as target data for optimization of the nonbond parameters associated with the peptide bond in proteins. In the present work, the original experimental data and Antoine constants used for the determination of the ΔHvap of NMA are reanalyzed. Based on this analysis, the wide range of ΔHvap values reported in the literature are shown to be due to incorrect reporting of the temperatures at which the original values were extracted and limitations in the quality of experimental vapor pressure-temperature data over a wide range of temperatures. Taking these problems into account, a consistent ΔHvap value is extracted from three studies for which experimental data are available. This analysis suggests that the most reliable value for ΔHvap is 13.0±0.1 at 410 K for use in force field optimization studies. The present results also indicate that similar analyses, including analysis of Antoine constants alone, may be of utility when reported ΔHvap values are not consistent for a given neat liquid. PMID:20445813

  5. Nanowire growth process modeling and reliability models for nanodevices

    NASA Astrophysics Data System (ADS)

    Fathi Aghdam, Faranak

    Nowadays, nanotechnology is becoming an inescapable part of everyday life. The big barrier in front of its rapid growth is our inability to produce nanoscale materials in a reliable and cost-effective way. In fact, the current yield of nano-devices is very low (around 10%), which makes fabrication of nano-devices very expensive and uncertain. To overcome this challenge, the first and most important step is to investigate how to control nano-structure synthesis variations. The main directions of reliability research in nanotechnology can be classified either from a material perspective or from a device perspective. The first direction focuses on restructuring materials and/or optimizing process conditions at the nano-level (nanomaterials). The other direction is linked to nano-devices and includes the creation of nano-electronic and electro-mechanical systems at nano-level architectures by taking into account the reliability of future products. In this dissertation, we have investigated two topics covering both nano-materials and nano-devices. In the first research work, we have studied the optimization of one of the most important nanowire growth processes using statistical methods. Research on nanowire growth with patterned arrays of catalyst has shown that the wire-to-wire spacing is an important factor affecting the quality of the resulting nanowires. To improve the process yield and the length uniformity of fabricated nanowires, it is important to reduce the resource competition between nanowires during the growth process. We have proposed a physical-statistical nanowire-interaction model considering the shadowing effect and shared substrate diffusion area to determine the optimal pitch that would ensure the minimum competition between nanowires. A sigmoid function is used in the model, and the least squares estimation method is used to estimate the model parameters. The estimated model is then used to determine the optimal spatial arrangement of catalyst arrays. This work is an early attempt that uses a physical-statistical modeling approach to studying selective nanowire growth for the improvement of process yield. In the second research work, the reliability of nano-dielectrics is investigated. As electronic devices get smaller, reliability issues pose new challenges due to the unknown underlying physics of failure (i.e., failure mechanisms and modes). This necessitates new reliability analysis approaches for nano-scale devices. One of the most important nano-devices is the transistor, which is subject to various failure mechanisms. Dielectric breakdown is known to be the most critical one and has become a major barrier to reliable circuit design at the nano-scale. Due to the need for aggressive downscaling of transistors, dielectric films are being made extremely thin, and this has led to adopting high-permittivity (high-k) dielectrics as an alternative to the widely used SiO2 in recent years. Since most time-dependent dielectric breakdown test data on bilayer stacks show significant deviations from a Weibull trend, we have proposed two new approaches to modeling the time to breakdown of bi-layer high-k dielectrics. In the first approach, we have used a marked space-time self-exciting point process to model the defect generation rate.
A simulation algorithm is used to generate defects within the dielectric space, and an optimization algorithm is employed to minimize the Kullback-Leibler divergence between the empirical distribution obtained from the real data and the one based on the simulated data to find the best parameter values and to predict the total time to failure. The novelty of the presented approach lies in using a conditional intensity for trap generation in dielectric that is a function of time, space and size of the previous defects. In addition, in the second approach, a k-out-of-n system framework is proposed to estimate the total failure time after the generation of more than one soft breakdown.
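
    For the k-out-of-n framework mentioned above, the survival probability has a closed binomial form; a minimal sketch with illustrative numbers:

    ```python
    # Sketch of the k-out-of-n reliability idea: the system works while at
    # least k of its n elements survive. n, k and p below are illustrative.
    from math import comb

    def k_out_of_n_reliability(n, k, p):
        """P(at least k of n elements survive), each surviving with prob p."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    print(k_out_of_n_reliability(n=5, k=3, p=0.9))   # ~0.991
    ```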

  6. Mathematical programming models for the economic design and assessment of wind energy conversion systems

    NASA Astrophysics Data System (ADS)

    Reinert, K. A.

    The use of linear decision rules (LDR) and chance-constrained programming (CCP) to optimize the performance of wind energy conversion clusters coupled to storage systems is described. Storage is modelled by LDR and output by CCP. The linear allocation rule and linear release rule prescribe the size of, and optimize, a storage facility with a bypass. Chance constraints are introduced to treat reliability explicitly, in terms of an appropriate value from an inverse cumulative distribution function. Details of the deterministic programming structure and a sample problem involving a 500 kW and a 1.5 MW WECS are provided, considering an installed cost of $1/kW. Four demand patterns and three levels of reliability are analyzed to optimize the generator choice and the storage configuration for base-load and peak operating conditions. Deficiencies in the ability to predict reliability and to account for serial correlations are noted in the model, which is nevertheless considered useful for narrowing WECS design options.

  7. The optimization on flow scheme of helium liquefier with genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, H. R.; Xiong, L. Y.; Peng, N.; Liu, L. Q.

    2017-01-01

    There are several ways to organize the flow scheme of a helium liquefier, such as arranging the expanders in parallel (reverse Brayton stage) or in series (modified Brayton stages). In this paper, the inlet mass flows and temperatures of the expanders in the Collins cycle are optimized using a genetic algorithm (GA). Results show that the maximum liquefaction rate can be obtained when the system is working at the optimal parameters. However, the reliability of the system is poor due to the high wheel speed of the first turbine. The study shows that the scheme in which expanders are arranged in series with heat exchangers between them has higher operational reliability but lower plant efficiency under the same conditions. Considering both liquefaction rate and system stability, another flow scheme is put forward in the hope of resolving this dilemma. The three configurations are compared from different aspects, namely economic cost, heat exchanger size, system reliability and exergy efficiency. In addition, the effect of the heat capacity ratio on heat transfer efficiency is discussed. A conclusion on choosing the liquefier configuration is given at the end, which is meaningful for the optimal design of helium liquefiers.

  8. Optimization of chromatographic conditions for determination of aflatoxin B1, B2, G1 and G2 by using liquid chromatography-mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Ramadhaningtyas, Dillani Putri; Aryana, Nurhani; Aristiawan, Yosi; Styarini, Dyah

    2017-11-01

    The optimization of the instrument conditions and chromatographic separation for the analysis of aflatoxins B1, B2, G1 and G2 using liquid chromatography tandem mass spectrometry was conducted with the aim of providing more accurate and reliable analysis results. Aflatoxins are known to be a serious threat to human health, as they are classified as carcinogenic compounds. Aflatoxins B1, B2, G1 and G2 were selected due to their extensive contamination of various agricultural commodities. The best chromatographic separation was obtained using a C-18 column with gradient elution of 5 mM ammonium acetate and 0.1% formic acid in methanol at a 7-minute runtime. The linearity of the detector was satisfactory, with coefficients of determination of 0.9994, 0.9996, 0.9998 and 0.9987 for aflatoxins B1, G1, B2, and G2, respectively, in the concentration range from 1 to 20 ng/g. The quantifier ions selected for aflatoxins B1, B2, G1 and G2 were m/z 285.1, 259, 243 and 313, respectively. The instrument precision at these quantifier ions was also satisfactory, with %RSD around 3.4 to 6.8%. The optimized method presented in this study can be used for further sample analysis.

  9. Optimizing multiple reliable forward contracts for reservoir allocation using multitime scale streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Lu, Mengqian; Lall, Upmanu; Robertson, Andrew W.; Cook, Edward

    2017-03-01

    Streamflow forecasts at multiple time scales provide a new opportunity for reservoir management to address competing objectives. Market instruments such as forward contracts with specified reliability are considered as a tool that may help address the perceived risk associated with the use of such forecasts in lieu of traditional operation and allocation strategies. A water allocation process that enables multiple contracts for water supply and hydropower production with different durations, while maintaining a prescribed level of flood risk reduction, is presented. The allocation process is supported by an optimization model that considers multitime scale ensemble forecasts of monthly streamflow and flood volume over the upcoming season and year, the desired reliability and pricing of proposed contracts for hydropower and water supply. It solves for the size of contracts at each reliability level that can be allocated for each future period, while meeting target end of period reservoir storage with a prescribed reliability. The contracts may be insurable, given that their reliability is verified through retrospective modeling. The process can allow reservoir operators to overcome their concerns as to the appropriate skill of probabilistic forecasts, while providing water users with short-term and long-term guarantees as to how much water or energy they may be allocated. An application of the optimization model to the Bhakra Dam, India, provides an illustration of the process. The issues of forecast skill and contract performance are examined. A field engagement of the idea is useful to develop a real-world perspective and needs a suitable institutional environment.

  10. Predictive validity and reliability of the Braden scale for risk assessment of pressure ulcers in an intensive care unit.

    PubMed

    Lima-Serrano, M; González-Méndez, M I; Martín-Castaño, C; Alonso-Araujo, I; Lima-Rodríguez, J S

    2018-03-01

    Contribution to the validation of the Braden scale in patients admitted to the ICU, based on an analysis of its reliability and predictive validity. An analytical, observational, longitudinal prospective study was carried out. Intensive Care Unit, Hospital Virgen del Rocío, Seville (Spain). Patients aged 18 years or older and admitted for over 24 hours to the ICU were included. Patients with pressure ulcers upon admission were excluded. A total of 335 patients were enrolled in two study periods of one month each. None. The presence of grade I-IV pressure ulcers was regarded as the main or dependent variable. Three categories were considered (demographic, clinical and prognostic) for the remaining variables. The incidence of patients who developed pressure ulcers was 8.1%. The proportions of grade I and grade II pressure ulcers were 40.6% and 59.4%, respectively, with the sacrum as the most frequently affected location. Cronbach's alpha coefficient in the assessments considered indicated good to moderate reliability. In the three evaluations made, a cutoff point of 12 proved optimal for the assessments of the first and second days of admission. For the assessment of the day with the minimum score, the optimal cutoff point was 10. The Braden scale shows insufficient predictive validity and poor precision for cutoff points of both 18 and 16, which are those accepted in the different clinical scenarios. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  11. Minimizing water consumption when producing hydropower

    NASA Astrophysics Data System (ADS)

    Leon, A. S.

    2015-12-01

    In 2007, hydropower accounted for only 16% of the world's electricity production, with other renewable sources totaling 3%. Thus, it is not surprising that when alternatives are evaluated for new energy developments, there is a strong impulse toward fossil fuel or nuclear energy as opposed to renewable sources. However, as hydropower schemes are often part of a multipurpose water resources development project, they can often help to finance other components of the project. In addition, hydropower systems and their associated dams and reservoirs provide human well-being benefits, such as flood control and irrigation, and societal benefits such as increased recreational activities and improved navigation. Furthermore, hydropower, due to its associated reservoir storage, can provide flexibility and reliability for energy production in integrated energy systems. The storage capability of hydropower systems acts as a regulating mechanism by which other intermittent and variable renewable energy sources (wind, wave, solar) can play a larger role in providing electricity of commercial quality. Minimizing water consumption for producing hydropower is critical, given that overuse of water for energy production may result in a shortage of water for other purposes such as irrigation, navigation or fish passage. This paper presents a dimensional analysis for finding the optimal flow discharge and optimal penstock diameter when designing impulse and reaction water turbines for hydropower systems. The objective of this analysis is to provide general insights for minimizing water consumption when producing hydropower. The analysis is based on the geometric and hydraulic characteristics of the penstock, the total hydraulic head and the desired power production. As part of this analysis, various dimensionless relationships between power production, flow discharge and head losses were derived. These relationships were used to draw general insights on determining the optimal flow discharge and optimal penstock diameter. For instance, it was found that for minimizing water consumption, the ratio of head loss to gross head should not exceed about 15%. Two examples of application illustrate the procedure for determining the optimal flow discharge and optimal penstock diameter for impulse and reaction turbines.
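
    A hedged sketch of the trade-off analyzed above, using the standard hydropower relation P = ρ g Q (H - h_f) η with Darcy-Weisbach losses; all parameter values are illustrative, not the paper's:

    ```python
    # Sketch: pick the smallest discharge that meets a power target, and check
    # the resulting head-loss ratio h_f/H. Geometry and friction factor are
    # illustrative assumptions.
    import numpy as np

    rho, g, eta = 1000.0, 9.81, 0.9               # water density, gravity, efficiency
    H, L, D, f = 100.0, 500.0, 1.5, 0.02          # gross head, penstock L, D, friction

    Q = np.linspace(1, 25, 200)                   # candidate discharges, m^3/s
    V = Q / (np.pi * D**2 / 4)                    # penstock velocity
    h_f = f * (L / D) * V**2 / (2 * g)            # Darcy-Weisbach head loss
    P = rho * g * Q * (H - h_f) * eta / 1e6       # power, MW

    target = 8.0                                  # required power, MW (assumed)
    feasible = Q[P >= target]
    Q_min = feasible.min()                        # least water meeting the target
    print(f"Q = {Q_min:.2f} m^3/s, h_f/H = {h_f[P >= target][0] / H:.1%}")
    ```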

  12. Thin-film module circuit design: Practical and reliability aspects

    NASA Technical Reports Server (NTRS)

    Daiello, R. V.; Twesme, E. N.

    1985-01-01

    This paper addresses several aspects of the design and construction of submodules based on thin-film amorphous silicon (a-Si) p-i-n solar cells. Starting from presently attainable single-cell characteristics and a realistic set of specifications, practical module designs are discussed from the viewpoints of efficient design, fabrication requirements, and reliability concerns. The examples center mostly on series-interconnected modules of the superstrate type, with detailed discussion of each portion of the structure in relation to its influence on module efficiency. Emphasis is placed on engineering topics such as area coverage, optimal geometries, and cost and reliability. Practical constraints on achieving optimal designs, along with some examples of potential pitfalls in the manufacture and subsequent performance of a-Si modules, are discussed.

  13. Shape optimization and CAD

    NASA Technical Reports Server (NTRS)

    Rasmussen, John

    1990-01-01

    Structural optimization has attracted attention since the days of Galileo. Olhoff and Taylor have produced an excellent overview of the classical research within this field. However, interest in structural optimization has increased greatly during the last decade due to the advent of reliable general numerical analysis methods and the computer power necessary to use them efficiently. This has created the possibility of developing general numerical systems for shape optimization. Several authors, e.g., Esping; Braibant & Fleury; Bennet & Botkin; Botkin, Yang, and Bennet; and Stanton, have published practical and successful applications of general optimization systems. Ding and Homlein have produced extensive overviews of available systems. Furthermore, a number of commercial optimization systems based on well-established finite element codes have been introduced. Systems like ANSYS, IDEAS, OASIS, and NISAOPT are widely known examples. In parallel to this development, the technology of computer-aided design (CAD) has gained a large influence on the design process in mechanical engineering. CAD technology has already lived through a rapid development driven by the drastically growing capabilities of digital computers. However, the systems of today are still considered only the first generation of a long line of computer integrated manufacturing (CIM) systems. The systems to come will offer an integrated environment for design, analysis, and fabrication of products of almost any character. Thus, the CAD system could be regarded as simply a database for geometrical information, equipped with a number of tools to help the user in the design process. Among these tools are facilities for structural analysis and optimization, as well as standard CAD features like drawing, modeling, and visualization tools. The state of the art of structural optimization is that a large number of mathematical and mechanical techniques are available for the solution of individual problems. By implementing collections of the available techniques in general software systems, operational environments for structural optimization have been created. The forthcoming years must bring solutions to the problem of integrating such systems into more general design environments. The result of this work should be CAD systems for rational design in which structural optimization is one important design tool among many others.

  14. Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Tsai, F. T.

    2012-12-01

    Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of these uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer framework, where each layer targets one source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of models at each level. To account for uncertainty, we employ chance-constrained (CC) programming for stochastic remediation design. Chance-constrained programming has traditionally been used to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance-constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, the HBMA-CC approach was applied to a remediation design in a synthetic aquifer. The design used a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances, for two reasons. First, when considering only the single best model, variances that stem from uncertainty in the model structure are ignored. Second, considering a best model with a non-dominant model weight may underestimate or overestimate prediction variances by ignoring other plausible propositions. Chance constraints allow developing a remediation design with a desired reliability. However, when considering only the single best model, the calculated reliability will differ from the desired reliability. We calculated the reliability of the design for the models at different levels of HBMA. The results showed that, moving toward the top layers of HBMA, the calculated reliability converges to the chosen reliability. We employed chance-constrained optimization along with the HBMA framework to find the optimal location and pumpage for the scavenger well. The results showed that, using models at different levels of the HBMA framework, the optimal location of the scavenger well remained the same, but the optimal extraction rate was altered. Thus, we concluded that the optimal pumping rate is sensitive to the prediction variance. Also, the prediction variance changes with the extraction rate: a very high extraction rate causes the prediction variances of chloride concentration at the production wells to approach zero regardless of which HBMA models are used.
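
    For a normally distributed prediction, a chance constraint P(c ≤ c_max) ≥ β has the deterministic equivalent μ + z_β σ ≤ c_max; the sketch below combines this with BMA-style mean and variance pooling. All numbers are illustrative assumptions, not values from the study.

    ```python
    # Sketch: deterministic equivalent of a chance constraint on a predicted
    # concentration, with a model-averaged mean and HBMA-style total variance.
    from scipy.stats import norm

    beta, c_max = 0.95, 250.0                 # desired reliability, limit (assumed)
    weights = [0.5, 0.3, 0.2]                 # BMA model weights (assumed)
    means = [210.0, 230.0, 240.0]             # per-model predicted means
    variances = [100.0, 150.0, 120.0]         # per-model prediction variances

    mu = sum(w * m for w, m in zip(weights, means))
    # total variance = within-model + between-model components
    var = sum(w * (v + (m - mu) ** 2) for w, m, v in zip(weights, means, variances))

    feasible = mu + norm.ppf(beta) * var ** 0.5 <= c_max
    print(f"mu={mu:.1f}, sd={var**0.5:.1f}, reliable at {beta:.0%}: {feasible}")
    ```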

  15. The German Version of the Herth Hope Index (HHI-D): Development and Psychometric Properties.

    PubMed

    Geiser, Franziska; Zajackowski, Katharina; Conrad, Rupert; Imbierowicz, Katrin; Wegener, Ingo; Herth, Kaye A; Urbach, Anne Sarah

    2015-01-01

    The importance of hope is evident in clinical oncological care. Hope is associated with psychological and also physical functioning. However, there is still a dearth of empirical research on hope as a multidimensional concept. The Herth Hope Index is a reliable and valid instrument for the measurement of hope and is available in many languages. Until now no authorized German translation has been published and validated. After translation, the questionnaire was completed by 192 patients with different tumor entities in radiation therapy. Reliability, concurrent validity, and factor structure of the questionnaire were determined. Correlations were high with depression and anxiety as well as optimism and pessimism. As expected, correlations with coping styles were moderate. Internal consistency and test-retest reliability were satisfactory. We could not replicate the original 3-factor model. Application of the scree plot criterion in an exploratory factor analysis resulted in a single-factor structure. The Herth Hope Index - German Version (HHI-D) is a short, reliable, and valid instrument for the assessment of hope in patient populations. We recommend using only the HHI-D total score until further research gives more insights into possible factorial solutions and subscales. © 2015 S. Karger GmbH, Freiburg.

  16. TU-FG-201-01: 18-Month Clinical Experience of a Linac Daily Quality Assurance (QA) Solution Using Only EPID and OBI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, B; Sun, B; Yaddanapudi, S

    Purpose: To describe the clinical use of a Linear Accelerator (Linac) daily quality assurance (QA) system using only EPID and OBI, to assess its reliability over an 18-month period, and to improve the robustness of the system based on QA failure analysis. Methods: A DailyQA solution utilizing an in-house designed phantom, combined EPID and OBI image acquisitions, and a web-based data analysis and reporting system was commissioned and used in our clinic to measure the geometric, dosimetry and imaging components of a Varian Truebeam Linac. During an 18-month period (335 working days), the Daily QA results, including the output constancy, beam flatness and symmetry, uniformity, TPR20/10, and MV and KV imaging quality, were collected and analyzed. For the output constancy measurement, an independent monthly QA system with an ionization chamber (IC) and annual/incidental TG51 measurements with an ADCL IC were performed and cross-compared to the Daily QA system. Thorough analyses were performed on the recorded QA failures to evaluate the machine performance, optimize the data analysis algorithm, adjust the tolerance settings and improve the training procedure to prevent future failures. Results: A clinical workflow including beam delivery, data analysis, QA report generation and physics approval was established and optimized to suit daily clinical operation. The output tests over the 335 working days cross-correlated with the monthly QA system within 1.3% and with the TG51 results within 1%. QA passed on the first attempt on 236 of the 335 days. Based on the QA failure analysis, the Gamma criterion was revised from (1%, 1 mm) to (2%, 1 mm), considering both QA accuracy and efficiency. The data analysis algorithm was improved to handle multiple entries for a repeated test. Conclusion: We described our 18-month clinical experience with a novel DailyQA system using only EPID and OBI. The long-term data presented demonstrate that the system is suitable and reliable for Linac daily QA.

  17. Zebrafish tracking using convolutional neural networks.

    PubMed

    Xu, Zhiping; Cheng, Xi En

    2017-02-17

    Keeping identity for a long term after occlusion is still an open problem in the video tracking of zebrafish-like model animals, and accurate animal trajectories are the foundation of behaviour analysis. We utilize the highly accurate object recognition capability of a convolutional neural network (CNN) to distinguish fish of the same congener, even though these animals are indistinguishable to the human eye. We used data augmentation and an iterative CNN training method to optimize the accuracy for our classification task, achieving surprisingly accurate trajectories for zebrafish groups of different sizes and ages over different time spans. This work will make further behaviour analysis more reliable.

  18. Zebrafish tracking using convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Xu, Zhiping; Cheng, Xi En

    2017-02-01

    Keeping identity for a long term after occlusion is still an open problem in the video tracking of zebrafish-like model animals, and accurate animal trajectories are the foundation of behaviour analysis. We utilize the highly accurate object recognition capability of a convolutional neural network (CNN) to distinguish fish of the same congener, even though these animals are indistinguishable to the human eye. We used data augmentation and an iterative CNN training method to optimize the accuracy for our classification task, achieving surprisingly accurate trajectories for zebrafish groups of different sizes and ages over different time spans. This work will make further behaviour analysis more reliable.

  19. A continuous optimization approach for inferring parameters in mathematical models of regulatory networks.

    PubMed

    Deng, Zhimin; Tian, Tianhai

    2014-07-29

    Advances in systems biology have produced a large number of sophisticated mathematical models for describing the dynamic properties of complex biological systems. One of the major steps in developing a mathematical model is to estimate its unknown parameters from experimentally measured quantities. However, experimental conditions limit the amount of data available for mathematical modelling, and the number of unknown parameters may be larger than the number of observations. This imbalance between experimental data and unknown parameters makes reverse-engineering problems particularly challenging. To address the issue of inadequate experimental data, we propose a continuous optimization approach for making reliable inference of model parameters. The approach first uses spline interpolation to generate continuous functions of the system dynamics as well as the first- and second-order derivatives of these functions. The expanded dataset is the basis for inferring the unknown model parameters using various continuous optimization criteria, including the error of the simulation only, the error of both the simulation and the first derivative, or the error of the simulation as well as the first and second derivatives. We use three case studies to demonstrate the accuracy and reliability of the proposed approach. Compared with the corresponding discrete criteria using experimental data at the measurement time points only, numerical results for the ERK kinase activation module show that the continuous absolute-error criteria using both the function and higher-order derivatives generate estimates with better accuracy. This result is also supported by the second and third case studies, for the G1/S transition network and the MAP kinase pathway, respectively. This suggests that the continuous absolute-error criteria lead to more accurate estimates than the corresponding discrete criteria. We also study the robustness of these three models to examine the reliability of the estimates. Simulation results show that the models with parameters estimated using continuous fitness functions have better robustness properties than those using the corresponding discrete fitness functions. The inference studies and robustness analysis suggest that the proposed continuous optimization criteria are effective and robust for estimating unknown parameters in mathematical models.
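
    A minimal sketch of the expanded-data idea, assuming a simple logistic model in place of the paper's networks: spline-interpolate the sparse measurements, then fit the parameters against the spline's values and derivatives:

    ```python
    # Sketch: spline-based continuous fitting. The ODE (logistic growth) and
    # data below are stand-ins for the paper's regulatory network models.
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.optimize import minimize

    t_obs = np.array([0., 1., 2., 4., 6., 8.])
    x_obs = 1 / (1 + 9 * np.exp(-0.8 * t_obs))   # data from x' = r*x*(1 - x/K)

    spl = CubicSpline(t_obs, x_obs)
    t_dense = np.linspace(0, 8, 200)             # continuous expansion
    x, dx = spl(t_dense), spl(t_dense, 1)        # values and first derivative

    def cost(theta):
        r, K = theta
        # absolute error of the model's derivative against the spline's
        return np.abs(r * x * (1 - x / K) - dx).sum()

    fit = minimize(cost, x0=[0.5, 1.5], method="Nelder-Mead")
    print("estimated (r, K):", fit.x)            # true values ~ (0.8, 1.0)
    ```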

  20. Absolute variation of the mechanical characteristics of halloysite reinforced polyurethane nanocomposites complemented by Taguchi and ANOVA approaches

    NASA Astrophysics Data System (ADS)

    Gaaz, Tayser Sumer; Sulong, Abu Bakar; Kadhum, Abdul Amir H.; Nassir, Mohamed H.; Al-Amiery, Ahmed A.

    The variation of the measured mechanical properties of halloysite nanotube (HNT)-reinforced thermoplastic polyurethane (TPU) at different HNT loadings was used as a tool for analysis. The preparation of HNT-TPU nanocomposites was performed under four controlled parameters of mixing temperature, mixing speed, mixing time, and HNT loading, at three levels each, to satisfy the Taguchi method's L9 orthogonal array, aiming to optimize these parameters for the best measurements of tensile strength, Young's modulus, and tensile strain (known as responses). The maximum variation of the experimental results for each response was determined and analysed based on the optimized results predicted by the Taguchi method and ANOVA. It was found that the maximum absolute variations of the three responses are 69%, 352%, and 126%, respectively. The analysis showed that preparation for optimized tensile strength requires 1 wt.% HNT loading (excluding 2 wt.% and 3 wt.%), a mixing temperature of 190 °C (excluding 200 °C and 210 °C), and a mixing speed of 30 rpm (excluding 40 rpm and 50 rpm). In addition, the analysis determined that the mixing time of 20 min has no effect on the preparation. This analysis was supported by ANOVA, FESEM images, and DSC results. Seemingly, the agglomeration and distribution of HNTs in the nanocomposite play an important role in the process. The outcome of the analysis could be considered a very important step towards establishing the reliability of the Taguchi method.

  1. A comparative study and application of continuously variable transmission to a single main rotor heavy lift helicopter

    NASA Astrophysics Data System (ADS)

    Hameer, Sameer

    Rotorcraft transmission design is limited by empirical weight trends that are proportional to the power/torque raised to the two-thirds power, coupled with the relative inexperience industry has with the employment of variable speed transmissions in heavy lift helicopters on the order of 100,000 lbs gross weight and 30,000 installed horsepower. The advanced rotorcraft transmission program objectives are to reduce transmission weight by at least 25%, reduce sound pressure levels by at least 10 dB, achieve a 5000 hr mean time between removals, and incorporate the use of split torque technology in the rotorcraft drivetrains of the future. The major obstacle that challenges rotorcraft drivetrain design is the selection, design, and optimization of a variable speed transmission with the goal of achieving a 50% reduction in rotor speed and the ability to handle high torque with lightweight gears, as opposed to using a two-speed transmission, which has inherent structural problems and is highly unreliable due to its embodiment of a traction-type transmission and a complex clutch and brake system. This thesis selects a non-traction pericyclic continuously variable transmission (P-CVT) as the best approach for a single main rotor heavy lift helicopter. The objective is to target and overcome the above-mentioned obstacle in drivetrain design. Overcoming this obstacle advances the state of the art of drivetrain design over the existing planetary and split torque transmissions currently used in helicopters. The goal of the optimization process was to decrease weight, decrease noise, increase efficiency, and increase safety and reliability. The objective function minimized the weight, and the major constraint is the tooth bending stress of the facegears. The most important parameters of the optimization process are weight, maintainability, and reliability, which are cross-functionally related to each other, and these parameters are related to the torques and operating speeds. The analysis of the split torque type P-CVT achieved weight reductions of 42.5% and 40.7% over planetary and split torque transmissions, respectively. In addition, a 19.5 dB sound pressure level reduction was achieved using active gear struts, and the use of a fabricated steel truss-like housing provided higher maintainability and reliability, lower cost, and lower weight than the cast magnesium housing currently employed in helicopters. The static finite element analysis of the split torque type P-CVT, both 2-D and 3-D, yielded stresses below the allowable bending stress of the material. The goal of the finite element analysis is to verify that the designed product meets its functional requirements. The safety assessment of the split torque type P-CVT yielded a 99% probability of mission success based on a Monte Carlo simulation using stochastic Petri net analysis and a failure hazard analysis. This was followed by an FTA/RBD analysis, which yielded an overall system failure rate of 140.35 failures per million hours, and a preliminary certification and certification timeline were outlined. The use of spherical facegears and pericyclic kinematics has advanced the state of the art in drivetrain design, primarily in the reduction of weight and noise coupled with high safety, reliability, and efficiency.

  2. The Pareidolia Test: A Simple Neuropsychological Test Measuring Visual Hallucination-Like Illusions

    PubMed Central

    Mamiya, Yasuyuki; Nishio, Yoshiyuki; Watanabe, Hiroyuki; Yokoi, Kayoko; Uchiyama, Makoto; Baba, Toru; Iizuka, Osamu; Kanno, Shigenori; Kamimura, Naoto; Kazui, Hiroaki; Hashimoto, Mamoru; Ikeda, Manabu; Takeshita, Chieko; Shimomura, Tatsuo; Mori, Etsuro

    2016-01-01

    Background Visual hallucinations are a core clinical feature of dementia with Lewy bodies (DLB), and this symptom is important in the differential diagnosis and prediction of treatment response. The pareidolia test is a tool that evokes visual hallucination-like illusions, and these illusions may be a surrogate marker of visual hallucinations in DLB. We created a simplified version of the pareidolia test and examined its validity and reliability to establish the clinical utility of this test. Methods The pareidolia test was administered to 52 patients with DLB, 52 patients with Alzheimer's disease (AD) and 20 healthy controls (HCs). We assessed the test-retest/inter-rater reliability using the intra-class correlation coefficient (ICC) and the concurrent validity using the Neuropsychiatric Inventory (NPI) hallucinations score as a reference. A receiver operating characteristic (ROC) analysis was used to evaluate the sensitivity and specificity of the pareidolia test in differentiating DLB from AD and HCs. Results The pareidolia test required approximately 15 minutes to administer, exhibited good test-retest/inter-rater reliability (ICC of 0.82), and moderately correlated with the NPI hallucinations score (rs = 0.42). Using an optimal cut-off score set according to the ROC analysis, the pareidolia test differentiated DLB from AD with a sensitivity of 81% and a specificity of 92%. Conclusions Our study suggests that the simplified version of the pareidolia test is a valid and reliable surrogate marker of visual hallucinations in DLB. PMID:27171377

  3. Design of a modular digital computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A design tradeoff study is reported for a modular spaceborne computer system that is responsive to many mission types and phases. The computer uses redundancy to maximize reliability, and multiprocessing to maximize processing capacity. Fault detection and recovery features provide optimal reliability.

  4. Cavitating Propeller Performance in Inclined Shaft Conditions with OpenFOAM: PPTC 2015 Test Case

    NASA Astrophysics Data System (ADS)

    Gaggero, Stefano; Villa, Diego

    2018-05-01

    In this paper, we present our analysis of the non-cavitating and cavitating unsteady performances of the Potsdam Propeller Test Case (PPTC) in oblique flow. For our calculations, we used the Reynolds-averaged Navier-Stokes equation (RANSE) solver from the open-source OpenFOAM libraries. We selected the homogeneous mixture approach to solve for multiphase flow with phase change, using the volume of fluid (VoF) approach to solve the multiphase flow and modeling the mass transfer between vapor and water with the Schnerr-Sauer model. Comparing the model results with the experimental measurements collected during the Second Workshop on Cavitation and Propeller Performance - SMP'15 enabled our assessment of the reliability of the open-source calculations. Comparisons with the numerical data collected during the workshop enabled further analysis of the reliability of different flow solvers from which we produced an overview of recommended guidelines (mesh arrangements and solver setups) for accurate numerical prediction even in off-design conditions. Lastly, we propose a number of calculations using the boundary element method developed at the University of Genoa for assessing the reliability of this dated but still widely adopted approach for design and optimization in the preliminary stages of very demanding test cases.

  5. Preliminary reliability and validity of Persian version of the Family Dermatology Life Quality Index (FDLQI).

    PubMed

    Safizadeh, H; Nakhaee, N; Shamsi-Meymandi, S; Pourdamghan, N; Basra, M K A

    2014-04-01

    Family Dermatology Life Quality Index (FDLQI) is an instrument for assessing the quality of life of family members of dermatologic patients. The aim of this study was to describe the cultural adaptation of this questionnaire and to assess psychometric properties of the Persian version. At first, the questionnaire was translated into Persian, and then back-translation was performed. The whole cycle was repeated until a consensus was reached about the optimal translation. In the next step, cognitive debriefing was performed, and after approval of the Persian version by FDLQI developers, it was distributed among 100 family members of dermatological patients in order to evaluate its validity and reliability. Mean age of participants was 37.1 years (±12.3). Mean score of FDLQI was 15.4 (±5.5) with maximum and minimum scores of 30 and 6, respectively. The quality of life of studied participants showed no significant difference based on age-group, sex, educational level and the family relationship. Cronbach's alpha was calculated as 0.87. Exploratory factor analysis revealed a one-factor solution that accounted for 40.7 % of the variance. The unidimensional model was supported by confirmatory factor analysis. The results of the present study showed that the Persian version of FDLQI has acceptable factorial validity and internal consistency reliability.

  6. Sweetener blend optimization by using mixture design methodology and the electronic tongue.

    PubMed

    Waldrop, Megan E; Ross, Carolyn F

    2014-09-01

    Utilizing more than one sweetener has been shown to be an effective way to substitute sucrose in food products. The objective of this study was to apply the augmented simplex-centroid mixture design for the optimization of acceptable sweetener blends using coconut sugar, agave, and stevia. Sweetener blends were evaluated in aqueous solutions and gluten-free granola bars by a trained panel and consumers (n = 60). Significant differences were found between sweetener mixtures in solutions by both panelists and consumers (P < 0.05). Taste profiles for the sweetener solutions were also generated using the electronic tongue. Most consumer and trained intensity ratings were highly correlated (R² ≥ 0.79) with the electronic tongue taste profile analysis. Granola bars were also found to be significantly different (P < 0.05), with consumers preferring coconut sugar mixtures. Using contour plots and desirability function analysis, an optimal sweetener combination was found for a granola bar formulation of 89.9% coconut sugar, 6.1% agave, and 4% stevia. These results indicate that a mixture design can be a reliable way to develop new sweetener blends for product development. © 2014 Institute of Food Technologists®

  7. Performance analysis of complex repairable industrial systems using PSO and fuzzy confidence interval based methodology.

    PubMed

    Garg, Harish

    2013-03-01

    The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs by utilizing available resources and uncertain data. For this, an availability-cost optimization model has been constructed for determining the optimal design parameters for improving the system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters, which affect the system performance, are obtained in the form of fuzzy membership functions by the proposed confidence interval based fuzzy Lambda-Tau (CIBFLT) methodology. The results computed by CIBFLT are compared with the existing fuzzy Lambda-Tau methodology. Sensitivity analysis of the system MTBF has also been addressed. The methodology has been illustrated through a case study of the washing unit, a main part of the paper industry. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and displays plant performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for the removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values corroborate well with extensive experimental investigations and remained consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
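
    The two agreement statistics quoted above are straightforward to compute. A minimal sketch with hypothetical prediction/observation pairs (not the plant's data), using Willmott's index of agreement and a mean absolute relative error:

```python
import numpy as np

def willmott_d(pred, obs):
    """Willmott's index of agreement, d in [0, 1] (1 = perfect agreement)."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    o_mean = obs.mean()
    num = ((pred - obs) ** 2).sum()
    den = ((np.abs(pred - o_mean) + np.abs(obs - o_mean)) ** 2).sum()
    return 1.0 - num / den

def relative_error(pred, obs):
    """Mean absolute relative error between predictions and observations."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return np.mean(np.abs(pred - obs) / np.abs(obs))

# Hypothetical observed vs. model-predicted process values
obs = np.array([41.2, 44.8, 50.1, 55.7, 60.3])
pred = np.array([40.5, 45.9, 49.2, 56.4, 61.8])
print(f"d = {willmott_d(pred, obs):.3f}, RE = {relative_error(pred, obs):.3f}")
```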

  9. designGG: an R-package and web tool for the optimal design of genetical genomics experiments.

    PubMed

    Li, Yang; Swertz, Morris A; Vera, Gonzalo; Fu, Jingyuan; Breitling, Rainer; Jansen, Ritsert C

    2009-06-18

    High-dimensional biomolecular profiling of genetically different individuals in one or more environmental conditions is an increasingly popular strategy for exploring the functioning of complex biological systems. The optimal design of such genetical genomics experiments in a cost-efficient and effective way is not trivial. This paper presents designGG, an R package for designing optimal genetical genomics experiments. A web implementation for designGG is available at http://gbic.biol.rug.nl/designGG. All software, including source code and documentation, is freely available. DesignGG allows users to intelligently select and allocate individuals to experimental units and conditions such as drug treatment. The user can maximize the power and resolution of detecting genetic, environmental and interaction effects in a genome-wide or local mode by giving more weight to genome regions of special interest, such as previously detected phenotypic quantitative trait loci. This will help to achieve high power and more accurate estimates of the effects of interesting factors, and thus yield a more reliable biological interpretation of data. DesignGG is applicable to linkage analysis of experimental crosses, e.g. recombinant inbred lines, as well as to association analysis of natural populations.

  10. Intersubject variability and intrasubject reproducibility of 12-lead ECG metrics: Implications for human verification.

    PubMed

    Jekova, Irena; Krasteva, Vessela; Leber, Remo; Schmid, Ramun; Twerenbold, Raphael; Müller, Christian; Reichlin, Tobias; Abächerli, Roger

    Electrocardiogram (ECG) biometrics is an advanced technology, not yet covered by guidelines on the criteria, features and leads for maximal authentication accuracy. This study aims to define the minimal set of morphological metrics in the 12-lead ECG by optimization towards high reliability and security, and to validate it in a person verification model across a large population. A standard 12-lead resting ECG database from 574 non-cardiac patients with two remote recordings (>1 year apart) was used. A commercial ECG analysis module (Schiller AG) measured 202 morphological features, including lead-specific amplitudes, durations, ST-metrics, and axes. The coefficient of variation (CV, intersubject variability) and the percent mean absolute difference (PMAD, intrasubject reproducibility) defined the optimization (PMAD/CV → min) and restriction (CV < 30%) criteria for selecting the most stable and distinctive features. Linear discriminant analysis (LDA) validated the non-redundant feature set for person verification. Maximal LDA verification sensitivity (85.3%) and specificity (86.4%) were validated for 11 optimal features: R-amplitude (I, II, V1, V2, V3, V5), S-amplitude (V1, V2), negative T-amplitude (aVR), and R-duration (aVF, V1). Copyright © 2016 Elsevier Inc. All rights reserved.
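
    The selection criteria described above (PMAD/CV minimized, subject to CV < 30%) can be sketched as follows; the data are synthetic stand-ins for the 574-patient database, and the authors' exact normalization may differ:

```python
import numpy as np

def screen_features(rec1, rec2, cv_max=30.0):
    """Rank features by PMAD/CV given two recording sessions (subjects x features).
    CV (%): inter-subject variability; PMAD (%): intra-subject test-retest difference."""
    pooled = np.vstack([rec1, rec2])
    cv = 100.0 * pooled.std(axis=0, ddof=1) / np.abs(pooled.mean(axis=0))
    pmad = 100.0 * np.abs(rec1 - rec2).mean(axis=0) / np.abs(pooled.mean(axis=0))
    score = pmad / cv                     # small = stable within, distinctive between subjects
    keep = np.where(cv < cv_max)[0]       # restriction criterion from the study
    return keep[np.argsort(score[keep])]  # best candidates first

# Hypothetical data: 50 subjects x 8 morphological features, two sessions
rng = np.random.default_rng(1)
base = rng.normal(1.0, 0.2, size=(50, 8))
session1 = base + rng.normal(0, 0.02, size=base.shape)
session2 = base + rng.normal(0, 0.02, size=base.shape)
print("feature ranking:", screen_features(session1, session2))
```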

  11. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Coroneos, Rula; Patnaik, Surya N.

    2011-01-01

    A stochastic design optimization (SDO) methodology has been developed to design airframe structural components made of metallic and composite materials. The design method accommodates uncertainties in load, strength, and material properties that are defined by distribution functions with mean values and standard deviations. A response parameter, such as a failure mode, thereby becomes a function of reliability. The primitive variables, such as thermomechanical loads, material properties, and failure theories, as well as sizing variables such as the depth of a beam or the thickness of a membrane, are treated as random parameters with specified distribution functions defined by mean values and standard deviations.

  12. Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making.

    PubMed

    Aitchison, Laurence; Bang, Dan; Bahrami, Bahador; Latham, Peter E

    2015-10-01

    Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we tested whether confidence reports reflect heuristic computations (e.g. the magnitude of sensory data) or Bayes optimal ones (i.e. how likely a decision is to be correct given the sensory data). In a standard design in which subjects were first asked to make a decision, and only then gave their confidence, subjects were mostly Bayes optimal. In contrast, in a less-commonly used design in which subjects indicated their confidence and decision simultaneously, they were roughly equally likely to use the Bayes optimal strategy or to use a heuristic but suboptimal strategy. Our results suggest that, while people's confidence reports can reflect Bayes optimal computations, even a small unusual twist or additional element of complexity can prevent optimality.
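
    For intuition, in a simple equal-prior Gaussian discrimination task the Bayes-optimal confidence is a fixed function of the magnitude of the evidence. A generic illustration (not the paper's full two-interval model; mu, sigma and x are hypothetical):

```python
import numpy as np

def bayes_confidence(x, mu, sigma):
    """P(decision correct | evidence x) for stimuli at -mu and +mu with equal
    priors and Gaussian noise; the decision is sign(x). From Bayes' rule the
    posterior odds of the chosen option are exp(2 * mu * |x| / sigma**2)."""
    return 1.0 / (1.0 + np.exp(-2.0 * mu * np.abs(x) / sigma**2))

# Hypothetical trial: stimulus strength mu = 1, noise sigma = 1.5, evidence x = 0.8
print(f"confidence = {bayes_confidence(0.8, mu=1.0, sigma=1.5):.2f}")
```

    A magnitude heuristic that reports confidence as a monotone function of |x| alone can mimic this mapping when sigma is fixed, which is why separating the two accounts requires carefully designed tasks.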

  13. Optimization strategies for radiation induced grafting of 4-vinylpyridine onto poly(ethylene-co-tetrafluoroethene) film using Box-Behnken design

    NASA Astrophysics Data System (ADS)

    Mahmoud Nasef, Mohamed; Shamsaei, Ezzatollah; Ghassemi, Payman; Ahmed Aly, Amgad; Hamid Yahaya, Abdul

    2012-04-01

    The radiation induced grafting of 4-vinylpyridine (4-VP) onto poly(ethylene-co-tetrafluoroethene) (ETFE) film was optimized using the Box-Behnken factorial design available in the response surface method (RSM). The grafting parameters (absorbed dose, monomer concentration, grafting time and reaction temperature) were varied to quantify their effects on the grafting yield (GY). The validity of the statistical model was supported by the small deviation between the predicted (GY = 61%) and experimental (GY = 57%) values. The optimum conditions for enhancing GY were determined to be a monomer concentration of 48 vol%, an absorbed dose of 64 kGy, a reaction time of 4 h and a temperature of 68 °C. A comparison was made between the optimization model developed for the present grafting system and that for the grafting of 1-vinylimidazole (1-VIm) onto ETFE to confirm the validity and reliability of the Box-Behnken design for the optimization of various radiation induced grafting reactions. Fourier transform infrared spectroscopy (FTIR), thermogravimetric analysis (TGA) and X-ray diffraction (XRD) were used to investigate the properties of the obtained films and provide evidence of grafting.
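
    For reference, a Box-Behnken design in coded units runs each pair of factors through a 2² factorial while holding the remaining factors at their mid-level, plus replicated center points. A minimal generator (a generic construction; the number of center runs is illustrative):

```python
from itertools import combinations, product

def box_behnken(k, center_runs=3):
    """Coded design matrix (rows of -1/0/+1) for a k-factor Box-Behnken design."""
    runs = []
    for i, j in combinations(range(k), 2):     # every pair of factors
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * k for _ in range(center_runs)])
    return runs

# Four factors, as in the grafting study: dose, monomer conc., time, temperature
design = box_behnken(4)
print(len(design), "runs")  # 24 factorial-pair runs + 3 center runs = 27
for row in design[:4]:
    print(row)
```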

  14. Control Parameters Optimization Based on Co-Simulation of a Mechatronic System for an UA-Based Two-Axis Inertially Stabilized Platform.

    PubMed

    Zhou, Xiangyang; Zhao, Beilei; Gong, Guohao

    2015-08-14

    This paper presents a method based on co-simulation of a mechatronic system to optimize the control parameters of a two-axis inertially stabilized platform (ISP) system applied in an unmanned airship (UA), by which high control performance and reliability of the ISP system are achieved. First, a three-dimensional structural model of the ISP is built using the three-dimensional parametric CAD software SOLIDWORKS®; then, to analyze the system's kinematic and dynamic characteristics under operating conditions, dynamics modeling is conducted using the multi-body dynamics software ADAMS™, and the main dynamic parameters such as displacement, velocity, acceleration and reaction curves are obtained through simulation analysis. These dynamic parameters are then input into the established MATLAB® SIMULINK® controller to simulate and test the performance of the control system. By these means, the ISP control parameters are optimized. To verify the method, experiments were carried out by applying the optimized parameters to the control system of a two-axis ISP. The results show that co-simulation using virtual prototyping (VP) is effective in obtaining optimized ISP control parameters, eventually leading to high ISP control performance.

  15. Control Parameters Optimization Based on Co-Simulation of a Mechatronic System for an UA-Based Two-Axis Inertially Stabilized Platform

    PubMed Central

    Zhou, Xiangyang; Zhao, Beilei; Gong, Guohao

    2015-01-01

    This paper presents a method based on co-simulation of a mechatronic system to optimize the control parameters of a two-axis inertially stabilized platform (ISP) system applied in an unmanned airship (UA), by which high control performance and reliability of the ISP system are achieved. First, a three-dimensional structural model of the ISP is built using the three-dimensional parametric CAD software SOLIDWORKS®; then, to analyze the system's kinematic and dynamic characteristics under operating conditions, dynamics modeling is conducted using the multi-body dynamics software ADAMS™, and the main dynamic parameters such as displacement, velocity, acceleration and reaction curves are obtained through simulation analysis. These dynamic parameters are then input into the established MATLAB® SIMULINK® controller to simulate and test the performance of the control system. By these means, the ISP control parameters are optimized. To verify the method, experiments were carried out by applying the optimized parameters to the control system of a two-axis ISP. The results show that co-simulation using virtual prototyping (VP) is effective in obtaining optimized ISP control parameters, eventually leading to high ISP control performance. PMID:26287210

  16. Crossover versus mutation: a comparative analysis of the evolutionary strategy of genetic algorithms applied to combinatorial optimization problems.

    PubMed

    Osaba, E; Carballedo, R; Diaz, F; Onieva, E; de la Iglesia, I; Perallos, A

    2014-01-01

    Since their first formulation, genetic algorithms (GAs) have been one of the most widely used techniques for solving combinatorial optimization problems. The basic structure of GAs is well known to the scientific community, and thanks to their easy application and good performance, GAs are the focus of many research works annually. Although there have been many studies analyzing various concepts of GAs, few studies in the literature objectively analyze the influence of using blind crossover operators on combinatorial optimization problems. For this reason, a thorough study of that influence is conducted in this paper. The study is based on a comparison of nine techniques applied to four well-known combinatorial optimization problems. Six of the techniques are GAs with different configurations, and the remaining three are evolutionary algorithms that focus exclusively on the mutation process. Finally, to compare these results reliably, a statistical study is performed using the normal distribution z-test.
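
    The final statistical comparison can be sketched as a two-sample z-test on mean objective values across repeated runs; the run data below are synthetic placeholders, not the paper's results:

```python
import numpy as np

def z_test(a, b):
    """Two-sample z-test on mean objective values from repeated runs of two techniques."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
    return (a.mean() - b.mean()) / se  # |z| > 1.96 -> significant at the 5% level (two-sided)

# Hypothetical best tour lengths from 30 runs each of a GA and a mutation-only EA
rng = np.random.default_rng(7)
ga = rng.normal(7560, 40, 30)
ea = rng.normal(7650, 45, 30)
print(f"z = {z_test(ga, ea):.2f}")
```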

  17. Crossover versus Mutation: A Comparative Analysis of the Evolutionary Strategy of Genetic Algorithms Applied to Combinatorial Optimization Problems

    PubMed Central

    Osaba, E.; Carballedo, R.; Diaz, F.; Onieva, E.; de la Iglesia, I.; Perallos, A.

    2014-01-01

    Since their first formulation, genetic algorithms (GAs) have been one of the most widely used techniques for solving combinatorial optimization problems. The basic structure of GAs is well known to the scientific community, and thanks to their easy application and good performance, GAs are the focus of many research works annually. Although there have been many studies analyzing various concepts of GAs, few studies in the literature objectively analyze the influence of using blind crossover operators on combinatorial optimization problems. For this reason, a thorough study of that influence is conducted in this paper. The study is based on a comparison of nine techniques applied to four well-known combinatorial optimization problems. Six of the techniques are GAs with different configurations, and the remaining three are evolutionary algorithms that focus exclusively on the mutation process. Finally, to compare these results reliably, a statistical study is performed using the normal distribution z-test. PMID:25165731

  18. On the optimization of electromagnetic geophysical data: Application of the PSO algorithm

    NASA Astrophysics Data System (ADS)

    Godio, A.; Santilano, A.

    2018-01-01

    The particle swarm optimization (PSO) algorithm solves constrained multi-parameter problems and is suitable for the simultaneous optimization of linear and nonlinear problems, under the assumption that the forward modeling is based on a good understanding of the ill-posed geophysical inverse problem. We apply PSO to the geophysical inverse problem of inferring an Earth model, i.e. the electrical resistivity at depth, consistent with the observed geophysical data. The method does not require an initial model and can easily be constrained according to external information for each single sounding. The optimization process for estimating the model parameters from the electromagnetic soundings focuses on the choice of the objective function to be minimized. We discuss the possibility of introducing vertical and lateral constraints into the objective function, with an Occam-like regularization. A sensitivity analysis allowed us to check the performance of the algorithm. The reliability of the approach is tested on synthetic and real Audio-Magnetotelluric (AMT) and Long Period MT data. The method appears able to solve complex problems and allows us to estimate the a posteriori distribution of the model parameters.
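
    A minimal global-best PSO of the kind described might look as follows; the toy misfit function, bounds and hyperparameters are illustrative stand-ins for a real resistivity forward model and its constraints:

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO minimizing `objective` within box `bounds` (dim x 2)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.apply_along_axis(objective, 1, x)
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)  # enforce the box constraints
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Toy misfit: recover a two-layer "resistivity" model from a synthetic target
target = np.array([50.0, 300.0])
misfit = lambda m: np.sum((np.log10(m) - np.log10(target)) ** 2)
best, f = pso(misfit, np.array([[1.0, 1000.0], [1.0, 1000.0]]))
print(best, f)
```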

  19. Simulation-Driven Design Approach for Design and Optimization of Blankholder

    NASA Astrophysics Data System (ADS)

    Sravan, Tatipala; Suddapalli, Nikshep R.; Johan, Pilthammar; Mats, Sigvant; Christian, Johansson

    2017-09-01

    Reliable design of stamping dies is desired for efficient and safe production. Today, the design of stamping dies is based mostly on casting feasibility, although it can also be based on criteria such as fatigue, stiffness, safety and economy. The current work presents an approach built on Simulation Driven Design, enabling design optimization to address this issue. A structural finite element model of a stamping die, used to produce doors for Volvo V70/S80 car models, is studied. This die had developed cracks during its usage. To understand the stress distribution in the stamping die, structural analysis of the die is conducted and critical regions with high stresses are identified. The results from the structural FE models are compared with analytical calculations of the fatigue properties of the material. To arrive at an optimum design with increased stiffness and lifetime, topology and free-shape optimization are performed. In the optimization routine, the identified critical regions of the die are set as design variables; other optimization variables are set to maintain the manufacturability of the resulting stamping die. Thereafter, a CAD model is built from the geometrical results of the topology and free-shape optimizations, and this CAD model is subjected to structural analysis to visualize the new stress distribution. The process is iterated until a satisfactory result is obtained. The final results show a reduction in stress levels of 70% with a more homogeneous distribution. Even though the mass of the die increased by 17%, overall a stiffer die with a longer lifetime is obtained. Finally, by reflecting on the entire process, a coordinated approach for handling such situations efficiently is presented.

  20. The operations of quantum logic gates with pure and mixed initial states.

    PubMed

    Chen, Jun-Liang; Li, Che-Ming; Hwang, Chi-Chuan; Ho, Yi-Hui

    2011-04-07

    The implementations of quantum logic gates realized by the rovibrational states of a ¹²C¹⁶O molecule in the X(¹Σ⁺) electronic ground state are investigated. Optimal laser fields are obtained using modified multi-target optimal control theory (MTOCT), which combines the maxima of the cost functional and the fidelity for the state and the quantum process. The projection operator technique, together with modified MTOCT, is used to obtain the optimal laser fields. If the initial states of the quantum gate are pure, the states at the target time approach the ideal target states well; if the initial states are mixed, they do not. The process fidelity is introduced to investigate the reliability of the quantum gate operation driven by the optimal laser field. We found that the quantum gates operate reliably whether the initial states are pure or mixed.

  1. Development and optimization of a stove-powered thermoelectric generator

    NASA Astrophysics Data System (ADS)

    Mastbergen, Dan

    Almost a third of the world's population still lacks access to electricity. Most of these people use biomass stoves for cooking which produce significant amounts of wasted thermal energy, but no electricity. Less than 1% of this energy in the form of electricity would be adequate for basic tasks such as lighting and communications. However, an affordable and reliable means of accomplishing this is currently nonexistent. The goal of this work is to develop a thermoelectric generator to convert a small amount of wasted heat into electricity. Although this concept has been around for decades, previous attempts have failed due to insufficient analysis of the system as a whole, leading to ineffective and costly designs. In this work, a complete design process is undertaken including concept generation, prototype testing, field testing, and redesign/optimization. Detailed component models are constructed and integrated to create a full system model. The model encompasses the stove operation, thermoelectric module, heat sinks, charging system and battery. A 3000 cycle endurance test was also conducted to evaluate the effects of operating temperature, module quality, and thermal interface quality on the generator's reliability, lifetime and cost effectiveness. The results from this testing are integrated into the system model to determine the lowest system cost in $/Watt over a five year period. Through this work the concept of a stove-based thermoelectric generator is shown to be technologically and economically feasible. In addition, a methodology is developed for optimizing the system for specific regional stove usage habits.

  2. Award-Winning CARES/Life Ceramics Durability Evaluation Software Is Making Advanced Technology Accessible

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software developed at the NASA Lewis Research Center eases this task by providing a tool that uses probabilistic reliability analysis techniques to optimize the design and manufacture of brittle material components. CARES/Life is an integrated package that predicts the probability of a monolithic ceramic component's failure as a function of its time in service. It couples commercial finite element programs, which resolve a component's temperature and stress distribution, with reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. These routines are based on calculations of the probabilistic nature of the brittle material's strength.

  3. Validity and reliability of the Spanish-language version of the self-administered Leeds Assessment of Neuropathic Symptoms and Signs (S-LANSS) pain scale.

    PubMed

    López-de-Uralde-Villanueva, I; Gil-Martínez, A; Candelas-Fernández, P; de Andrés-Ares, J; Beltrán-Alacreu, H; La Touche, R

    2016-12-08

    The self-administered Leeds Assessment of Neuropathic Symptoms and Signs (S-LANSS) scale is a tool designed to identify patients with pain with neuropathic features. To assess the validity and reliability of the Spanish-language version of the S-LANSS scale. Our study included a total of 182 patients with chronic pain to assess the convergent and discriminant validity of the S-LANSS; the sample was increased to 321 patients to evaluate construct validity and reliability. The validated Spanish-language version of the ID-Pain questionnaire was used as the criterion variable. All participants completed the ID-Pain, the S-LANSS, and the Numerical Rating Scale for pain. Discriminant validity was evaluated by analysing sensitivity, specificity, and the area under the receiver operating characteristic curve (AUC). Construct validity was assessed with factor analysis and by comparing the odds ratio of each S-LANSS item to the total score. Convergent validity and reliability were evaluated with Pearson's r and Cronbach's alpha, respectively. The optimal cut-off point for the S-LANSS was ≥12 points (AUC = 0.89; sensitivity = 88.7%; specificity = 76.6%). Factor analysis yielded one factor; furthermore, all items contributed significantly to the positive total score on the S-LANSS (P < .05). The S-LANSS showed a significant correlation with ID-Pain (r = 0.734; α = 0.71). The Spanish-language version of the S-LANSS is valid and reliable for identifying patients with chronic pain with neuropathic features. Copyright © 2016 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
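
    Optimal cut-off points of this kind are commonly chosen by maximizing Youden's J along the receiver operating characteristic curve. A sketch with synthetic scores (the data and class separation are hypothetical, not the study's):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def optimal_cutoff(scores, labels):
    """Cut-off maximizing Youden's J = sensitivity + specificity - 1."""
    fpr, tpr, thresholds = roc_curve(labels, scores)
    best = (tpr - fpr).argmax()
    return thresholds[best], tpr[best], 1.0 - fpr[best]

# Hypothetical S-LANSS totals: 1 = neuropathic features (per the criterion), 0 = not
rng = np.random.default_rng(3)
labels = rng.integers(0, 2, 300)
scores = np.where(labels == 1, rng.normal(16, 4, 300), rng.normal(8, 4, 300))
cut, sens, spec = optimal_cutoff(scores, labels)
print(f"AUC = {roc_auc_score(labels, scores):.2f}, cutoff >= {cut:.0f}, "
      f"sens = {100 * sens:.1f}%, spec = {100 * spec:.1f}%")
```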

  4. Quantification of larval zebrafish motor function in multi-well plates using open-source MATLAB® applications

    PubMed Central

    Zhou, Yangzhong; Cattley, Richard T.; Cario, Clinton L.; Bai, Qing; Burton, Edward A.

    2014-01-01

    This article describes a method to quantify the movements of larval zebrafish in multi-well plates, using the open-source MATLAB® applications LSRtrack and LSRanalyze. The protocol comprises four stages: generation of high-quality, flatly-illuminated video recordings with exposure settings that facilitate object recognition; analysis of the resulting recordings using tools provided in LSRtrack to optimize tracking accuracy and motion detection; analysis of tracking data using LSRanalyze or custom MATLAB® scripts; implementation of validation controls. The method is reliable, automated and flexible, requires less than one hour of hands-on work for completion once optimized, and shows excellent signal:noise characteristics. The resulting data can be analyzed to determine: positional preference; displacement, velocity and acceleration; duration and frequency of movement events and rest periods. This approach is widely applicable to analyze spontaneous or stimulus-evoked zebrafish larval neurobehavioral phenotypes resulting from a broad array of genetic and environmental manipulations, in a multi-well plate format suitable for high-throughput applications. PMID:24901738

  5. Optimized multiparametric flow cytometric analysis of circulating endothelial cells and their subpopulations in peripheral blood of patients with solid tumors: a technical analysis.

    PubMed

    Zhou, Fangbin; Zhou, Yaying; Yang, Ming; Wen, Jinli; Dong, Jun; Tan, Wenyong

    2018-01-01

    Circulating endothelial cells (CECs) and their subpopulations could be potential novel biomarkers for various malignancies. However, reliable enumerable methods are warranted to further improve their clinical utility. This study aimed to optimize a flow cytometric method (FCM) assay for CECs and subpopulations in peripheral blood for patients with solid cancers. An FCM assay was used to detect and identify CECs. A panel of 60 blood samples, including 44 metastatic cancer patients and 16 healthy controls, were used in this study. Some key issues of CEC enumeration, including sample material and anticoagulant selection, optimal titration of antibodies, lysis/wash procedures of blood sample preparation, conditions of sample storage, sufficient cell events to enhance the signal, fluorescence-minus-one controls instead of isotype controls to reduce background noise, optimal selection of cell surface markers, and evaluating the reproducibility of our method, were integrated and investigated. Wilcoxon and Mann-Whitney U tests were used to determine statistically significant differences. In this validation study, we refined a five-color FCM method to detect CECs and their subpopulations in peripheral blood of patients with solid tumors. Several key technical issues regarding preanalytical elements, FCM data acquisition, and analysis were addressed. Furthermore, we clinically validated the utility of our method. The baseline levels of mature CECs, endothelial progenitor cells, and activated CECs were higher in cancer patients than healthy subjects (P < 0.01). However, there was no significant difference in resting CEC levels between healthy subjects and cancer patients (P = 0.193). We integrated and comprehensively addressed significant technical issues found in previously published assays and validated the reproducibility and sensitivity of our proposed method. Future work is required to explore the potential of our optimized method in clinical oncologic applications.

  6. Choosing the best image processing method for masticatory performance assessment when using two-coloured specimens.

    PubMed

    Vaccaro, G; Pelaez, J I; Gil, J A

    2016-07-01

    Objective masticatory performance assessment using two-coloured specimens relies on image processing techniques; however, just a few approaches have been tested and no comparative studies are reported. The aim of this study was to present a selection procedure for the optimal image analysis method for masticatory performance assessment with a given two-coloured chewing gum. Dentate participants (n = 250; 25 ± 6.3 years) chewed red-white chewing gums for 3, 6, 9, 12, 15, 18, 21 and 25 cycles (2000 samples). Digitalised images of retrieved specimens were analysed using 122 image processing methods (IPMs) based on feature extraction algorithms (pixel values and histogram analysis). All IPMs were tested following the criteria of: normality of measurements (Kolmogorov-Smirnov), ability to detect differences among mixing states (ANOVA corrected with post hoc Bonferroni) and moderate-to-high correlation with the number of cycles (Spearman's rho). The optimal IPM was chosen using multiple criteria decision analysis (MCDA). Measurements provided by all IPMs proved to be normally distributed (P < 0.05), 116 proved sensitive to mixing states (P < 0.05), and 35 showed moderate-to-high correlation with the number of cycles (|ρ| > 0.5; P < 0.05). The variance of the histogram of the hue showed the highest correlation with the number of cycles (ρ = 0.792; P < 0.0001) and the highest MCDA score (optimal). The proposed procedure proved to be reliable and able to select the optimal approach among multiple IPMs. This experiment may be reproduced to identify the optimal approach for each case of locally available test foods. © 2016 John Wiley & Sons Ltd.
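
    The three screening criteria can be sketched directly with scipy; the measurements below are synthetic, and details such as the exact normality test variant and post hoc corrections used by the authors are not reproduced:

```python
import numpy as np
from scipy import stats

def screen_ipm(measurements, cycles, alpha=0.05, rho_min=0.5):
    """Apply the study's three screening criteria to one IPM's measurements."""
    groups = [measurements[cycles == c] for c in np.unique(cycles)]
    # 1) normality (Kolmogorov-Smirnov on standardized data, per cycle group)
    normal_ok = all(
        stats.kstest((g - g.mean()) / g.std(ddof=1), "norm").pvalue > alpha
        for g in groups)
    # 2) ability to separate mixing states (one-way ANOVA across cycle groups)
    anova_ok = stats.f_oneway(*groups).pvalue < alpha
    # 3) moderate-to-high monotone association with the number of cycles
    rho, p = stats.spearmanr(cycles, measurements)
    corr_ok = abs(rho) > rho_min and p < alpha
    return normal_ok and anova_ok and corr_ok, rho

# Hypothetical hue-variance measurements over chewing cycles
rng = np.random.default_rng(5)
cycles = np.repeat([3, 6, 9, 12, 15, 18, 21, 25], 25)
meas = 1.0 / (1.0 + 0.1 * cycles) + rng.normal(0, 0.05, cycles.size)
print(screen_ipm(meas, cycles))
```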

  7. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.

  8. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advance of imaging technology has brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, studying their interactions, and introducing a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error; and there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we will achieve a lower variance than naïve averaging. Simulated experiments are used to validate theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
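
    The claim about optimal linear fusion follows from the standard inverse-variance weighting result for unbiased estimates with independent errors:

```latex
\hat{x} = \sum_{k=1}^{K} w_k x_k, \qquad
w_k = \frac{\sigma_k^{-2}}{\sum_{j=1}^{K} \sigma_j^{-2}}, \qquad
\operatorname{Var}(\hat{x}) = \Bigl(\sum_{k=1}^{K} \sigma_k^{-2}\Bigr)^{-1}
\le \operatorname{Var}\Bigl(\frac{1}{K}\sum_{k=1}^{K} x_k\Bigr)
  = \frac{1}{K^2}\sum_{k=1}^{K} \sigma_k^{2}.
```

    Equality holds only when all scan variances are equal; a scan whose voxel anisotropy aligns with the lesion elongation has smaller measurement variance and therefore receives a larger weight, consistent with the analysis above.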

  9. Design and multiphysics analysis of a 176 MHz continuous-wave radio-frequency quadrupole

    NASA Astrophysics Data System (ADS)

    Kutsaev, S. V.; Mustapha, B.; Ostroumov, P. N.; Barcikowski, A.; Schrage, D.; Rodnizki, J.; Berkovits, D.

    2014-07-01

    We have developed a new design for a 176 MHz cw radio-frequency quadrupole (RFQ) for the SARAF upgrade project. At this frequency, the proposed design is a conventional four-vane structure. The main design goals are to provide the highest possible shunt impedance while limiting the required rf power to about 120 kW for reliable cw operation, and the length to about 4 meters. If built as designed, the proposed RFQ will be the first four-vane cw RFQ built as a single cavity (no resonant coupling required) that does not require π-mode stabilizing loops or dipole rods. For this, we rely on very detailed 3D simulations of all aspects of the structure and the level of machining precision achieved on the recently developed ATLAS upgrade RFQ. A full 3D model of the structure including vane modulation was developed. The design was optimized using electromagnetic and multiphysics simulations. Following the choice of the vane type and geometry, the vane undercuts were optimized to produce a flat field along the structure. The final design has good mode separation and should not need dipole rods if built as designed, but their effect was studied in the case of manufacturing errors. The tuners were also designed and optimized to tune the main mode without affecting the field flatness. Following the electromagnetic (EM) design optimization, a multiphysics engineering analysis of the structure was performed. The multiphysics analysis is a coupled electromagnetic, thermal and mechanical analysis. The cooling channels, including their paths and sizes, were optimized based on the limiting temperature and deformation requirements. The frequency sensitivity to the RFQ body and vane cooling water temperatures was carefully studied in order to use it for frequency fine-tuning. Finally, an inductive rf power coupler design based on the ATLAS RFQ coupler was developed and simulated. The EM design optimization was performed using CST Microwave Studio and the results were verified using both HFSS and ANSYS. The engineering analysis was performed using HFSS and ANSYS and most of the results were verified using the newly developed CST Multiphysics package.

  10. OPTIMIZING EXPOSURE MEASUREMENT TECHNIQUES

    EPA Science Inventory

    The research reported in this task description addresses one of a series of interrelated NERL tasks with the common goal of optimizing the predictive power of low cost, reliable exposure measurements for the planned Interagency National Children's Study (NCS). Specifically, we w...

  11. Implications of scaling on static RAM bit cell stability and reliability

    NASA Astrophysics Data System (ADS)

    Coones, Mary Ann; Herr, Norm; Bormann, Al; Erington, Kent; Soorholtz, Vince; Sweeney, John; Phillips, Michael

    1993-01-01

    In order to lower manufacturing costs and increase performance, static random access memory (SRAM) bit cells are scaled progressively toward submicron geometries. The reliability of an SRAM is highly dependent on bit cell stability. Smaller memory cells with less capacitance and restoring current make the array more susceptible to failures from defectivity, alpha-particle hits, and other instability and leakage mechanisms. Improving long-term reliability while migrating to higher density devices makes the task of building in and improving reliability increasingly difficult. Reliability requirements for high density SRAMs are very demanding, with failure rates of less than 100 failures per billion device hours (100 FITs) being a common criterion. Design techniques for increasing bit cell stability and manufacturability must be implemented in order to build in this level of reliability. Several types of analyses are performed to benchmark the performance of the SRAM device. Examples of these analysis techniques, presented here, include DC parametric measurements of test structures, functional bit mapping of the circuit used to characterize the entire distribution of bits, electrical microprobing of weak and/or failing bits, and system and accelerated soft error rate measurements. These tests allow process and design improvements to be evaluated prior to implementation on the final product. The results are used to provide comprehensive bit cell characterization, which can then be compared to device models and adjusted accordingly to provide optimized cell stability versus cell size for a particular technology. The result is designed-in reliability, which can be accomplished during the early stages of product development.

  12. Wind farm optimization using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Ituarte-Villarreal, Carlos M.

    In recent years, the wind power industry has focused its efforts on solving the Wind Farm Layout Optimization (WFLO) problem. Wind resource assessment is a pivotal step in optimizing wind-farm design and siting, and in determining whether a project is economically feasible. In the present work, three different optimization methods are proposed for the solution of the WFLO: (i) a modified viral system algorithm applied to the optimization of the proper location of the components in a wind farm to maximize the energy output given a stated wind environment of the site. The optimization problem is formulated as the minimization of energy cost per unit produced and applies a penalization for the lack of system reliability. The viral system algorithm utilized in this research solves three well-known problems in the wind-energy literature; (ii) a new multiple-objective evolutionary algorithm to obtain optimal placement of wind turbines while considering the power output, cost, and reliability of the system. The algorithm presented is based on evolutionary computation, and the objective functions considered are the maximization of power output, the minimization of wind farm cost and the maximization of system reliability. The final solution to this multiple-objective problem is presented as a set of Pareto solutions; and (iii) a hybrid viral-based optimization algorithm adapted to find the proper component configuration for a wind farm, with the introduction of the universal generating function (UGF) analytical approach to discretize the different operating or mechanical levels of the wind turbines in addition to the various wind speed states. The proposed methodology considers the specific probability functions of the wind resource to describe its behavior and to account for the stochastic behavior of the renewable energy components, aiming to increase their power output and the reliability of these systems. The developed heuristic considers a variable number of system components and wind turbines with different operating characteristics and sizes, giving a more heterogeneous model that can deal with changes in the layout and in the power generation requirements over time. Moreover, the approach evaluates the impact of the wind-wake effect of the wind turbines upon one another to describe and evaluate the reduction in the power production capacity of the system depending on the layout distribution of the wind turbines.

  13. Optimal orientation in flows: providing a benchmark for animal movement strategies.

    PubMed

    McLaren, James D; Shamoun-Baranes, Judy; Dokter, Adriaan M; Klaassen, Raymond H G; Bouten, Willem

    2014-10-06

    Animal movements in air and water can be strongly affected by experienced flow. While various flow-orientation strategies have been proposed and observed, their performance in variable flow conditions remains unclear. We apply control theory to establish a benchmark for time-minimizing (optimal) orientation. We then define optimal orientation for movement in steady flow patterns and, using dynamic wind data, for short-distance mass movements of thrushes (Turdus sp.) and 6000 km non-stop migratory flights by great snipes, Gallinago media. Relative to the optimal benchmark, we assess the efficiency (travel speed) and reliability (success rate) of three generic orientation strategies: full compensation for lateral drift, vector orientation (single-heading movement) and goal orientation (continually heading towards the goal). Optimal orientation is characterized by detours to regions of high flow support, especially when flow speeds approach and exceed the animal's self-propelled speed. In strong predictable flow (short distance thrush flights), vector orientation adjusted to flow on departure is nearly optimal, whereas for unpredictable flow (inter-continental snipe flights), only goal orientation was near-optimally reliable and efficient. Optimal orientation provides a benchmark for assessing efficiency of responses to complex flow conditions, thereby offering insight into adaptive flow-orientation across taxa in the light of flow strength, predictability and navigation capacity.

  14. Optimal orientation in flows: providing a benchmark for animal movement strategies

    PubMed Central

    McLaren, James D.; Shamoun-Baranes, Judy; Dokter, Adriaan M.; Klaassen, Raymond H. G.; Bouten, Willem

    2014-01-01

    Animal movements in air and water can be strongly affected by experienced flow. While various flow-orientation strategies have been proposed and observed, their performance in variable flow conditions remains unclear. We apply control theory to establish a benchmark for time-minimizing (optimal) orientation. We then define optimal orientation for movement in steady flow patterns and, using dynamic wind data, for short-distance mass movements of thrushes (Turdus sp.) and 6000 km non-stop migratory flights by great snipes, Gallinago media. Relative to the optimal benchmark, we assess the efficiency (travel speed) and reliability (success rate) of three generic orientation strategies: full compensation for lateral drift, vector orientation (single-heading movement) and goal orientation (continually heading towards the goal). Optimal orientation is characterized by detours to regions of high flow support, especially when flow speeds approach and exceed the animal's self-propelled speed. In strong predictable flow (short distance thrush flights), vector orientation adjusted to flow on departure is nearly optimal, whereas for unpredictable flow (inter-continental snipe flights), only goal orientation was near-optimally reliable and efficient. Optimal orientation provides a benchmark for assessing efficiency of responses to complex flow conditions, thereby offering insight into adaptive flow-orientation across taxa in the light of flow strength, predictability and navigation capacity. PMID:25056213

  15. Uncertainty-Based Multi-Objective Optimization of Groundwater Remediation Design

    NASA Astrophysics Data System (ADS)

    Singh, A.; Minsker, B.

    2003-12-01

    Management of groundwater contamination is a cost-intensive undertaking filled with conflicting objectives and substantial uncertainty. A critical source of this uncertainty in groundwater remediation design problems comes from the hydraulic conductivity values of the aquifer, upon which the predictions of flow and transport of contaminants depend. For a remediation solution to be reliable in practice, it is important that it is robust to the potential error in the model predictions. This work focuses on incorporating such uncertainty within a multi-objective optimization framework, to obtain solutions that are both reliable and Pareto-optimal. Previous research has shown that small amounts of sampling within a single-objective genetic algorithm can produce highly reliable solutions. With multiple objectives, however, the noise can interfere with the basic operations of a multi-objective solver, such as determining non-domination of individuals, diversity preservation, and elitism. This work proposes several approaches to improve the performance of noisy multi-objective solvers. These include a simple averaging approach, taking samples across the population (which we call extended averaging), and a stochastic optimization approach. All the approaches are tested on standard multi-objective benchmark problems and a hypothetical groundwater remediation case study; the best-performing approach is then tested on a field-scale case at Umatilla Army Depot.

  16. Reliability and paste process optimization of eutectic and lead-free for mixed packaging

    NASA Technical Reports Server (NTRS)

    Ramkumar, S. M.; Ganeshan, V.; Thenalur, K.; Ghaffarian, R.

    2002-01-01

    This paper reports the results of an experiment that utilized JPL's area array consortium test vehicle design, containing a myriad of mixed-technology components with an OSP finish. The details of the reliability study are presented.

  17. A pragmatic decision model for inventory management with heterogeneous suppliers

    NASA Astrophysics Data System (ADS)

    Nakandala, Dilupa; Lau, Henry; Zhang, Jingjing; Gunasekaran, Angappa

    2018-05-01

    For enterprises, it is imperative that the trade-off between the cost of inventory and the risk implications is managed in the most efficient manner. To explore this, we use the common example of a wholesaler operating in an environment where suppliers demonstrate heterogeneous reliability. The wholesaler places partial orders with dual suppliers and uses lateral transshipments. While supplier reliability is a key concern in inventory management, reliable suppliers are more expensive, and investment in strategic approaches that improve supplier performance carries a high cost. Here we consider the operational strategy of dual sourcing with reliable and unreliable suppliers and model the total inventory cost in the likely scenario where the lead time of the unreliable supplier extends beyond the scheduling period. We then develop a Customized Integer Programming Optimization Model to determine the optimum size of partial orders with multiple suppliers. In addition to the objective of total cost optimization, this study takes into account the volatility of the cost associated with the uncertainty of the inventory system.

  18. A reliability as an independent variable (RAIV) methodology for optimizing test planning for liquid rocket engines

    NASA Astrophysics Data System (ADS)

    Strunz, Richard; Herrmann, Jeffrey W.

    2011-12-01

    The hot fire test strategy for liquid rocket engines has always been a concern of the space industry and agencies alike because no recognized standard exists. Previous hot fire test plans focused on the verification of performance requirements but did not explicitly include reliability as a dimensioning variable. The stakeholders are, however, concerned about a hot fire test strategy that balances reliability, schedule, and affordability. A multiple-criteria test planning model is presented that provides a framework for optimizing the hot fire test strategy with respect to stakeholder concerns. The Staged Combustion Rocket Engine Demonstrator, a program of the European Space Agency, is used as an example to provide a quantitative answer to the claim that a reduced-thrust-scale demonstrator is cost-beneficial for a subsequent flight engine development. Scalability aspects of major subsystems are considered in the prior information definition within the Bayesian framework. The model is also applied to assess the impact of an increase in the demonstrated reliability level on schedule and affordability.

  19. Arsenic removal from contaminated groundwater by membrane-integrated hybrid plant: optimization and control using Visual Basic platform.

    PubMed

    Chakrabortty, S; Sen, M; Pal, P

    2014-03-01

    A simulation software tool (ARRPA) has been developed on the Microsoft Visual Basic platform for the optimization and control of a novel membrane-integrated arsenic separation plant, in the absence of any such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treatment chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs of the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting a high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R² = 0.989) and Willmott d-index (0.989) indicate the capability of the software in analyzing the performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization, and displays the performance of the integrated plant visually on a graphical platform. Performance analysis of the whole system as well as the individual units is possible using the tool. The software, the first of its kind in its domain, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for the removal of arsenic from contaminated groundwater.

  20. Item response theory analysis of the life orientation test-revised: age and gender differential item functioning analyses.

    PubMed

    Steca, Patrizia; Monzani, Dario; Greco, Andrea; Chiesi, Francesca; Primi, Caterina

    2015-06-01

    This study is aimed at testing the measurement properties of the Life Orientation Test-Revised (LOT-R) for the assessment of dispositional optimism by employing item response theory (IRT) analyses. The LOT-R was administered to a large sample of 2,862 Italian adults. First, confirmatory factor analyses demonstrated the theoretical conceptualization of the construct measured by the LOT-R as a single bipolar dimension. Subsequently, IRT analyses for polytomous, ordered response category data were applied to investigate the items' properties. The equivalence of the items across gender and age was assessed by analyzing differential item functioning. Discrimination and severity parameters indicated that all items were able to distinguish people with different levels of optimism and adequately covered the spectrum of the latent trait. Additionally, the LOT-R appears to be gender invariant and, with minor exceptions, age invariant. Results provided evidence that the LOT-R is a reliable and valid measure of dispositional optimism. © The Author(s) 2014.

  1. Optimizing landslide susceptibility zonation: Effects of DEM spatial resolution and slope unit delineation on logistic regression models

    NASA Astrophysics Data System (ADS)

    Schlögel, R.; Marchesini, I.; Alvioli, M.; Reichenbach, P.; Rossi, M.; Malet, J.-P.

    2018-01-01

    We perform landslide susceptibility zonation with slope units using three digital elevation models (DEMs) of varying spatial resolution for the Ubaye Valley (South French Alps). We apply a recently developed algorithm that automates slope unit delineation, given a number of parameters, in order to simultaneously optimize the partitioning of the terrain and the performance of a logistic regression susceptibility model. The method allowed us to obtain optimal slope units for each available DEM spatial resolution. For each resolution, we studied the susceptibility model performance by analyzing in detail the relevance of the conditioning variables. The analysis is based on landslide morphology data, considering either the whole landslide or only the source area outline as input. The procedure allowed us to select the most useful information, in terms of DEM spatial resolution, thematic variables and landslide inventory, to obtain the most reliable slope unit based landslide susceptibility assessment.

  2. Determination of volatile marker compounds in raw ham using headspace-trap gas chromatography.

    PubMed

    Bosse Née Danz, Ramona; Wirth, Melanie; Konstanz, Annette; Becker, Thomas; Weiss, Jochen; Gibis, Monika

    2017-03-15

    A simple, reliable and automated method was developed and optimized for the qualification and quantification of aroma-relevant volatile marker compounds of North European raw ham using headspace-trap (HS-trap) gas chromatography-mass spectrometry (GC-MS) and GC-flame ionization detection (GC-FID). A total of 38 volatile compounds were detected with the HS-trap GC-MS method, amongst which the largest groups were ketones (12), alcohols (8), hydrocarbons (7), aldehydes (6) and esters (3). The HS-trap GC-FID method was optimized for the following parameters: thermostatting time and temperature, vial and desorption pressure, number of extraction cycles, and salt addition. A validation for 13 volatile marker compounds, with limits of detection in the ng/g range, was carried out. The optimized method can serve as an alternative to conventional headspace and solid-phase microextraction methods and allows users to determine volatile compounds in raw hams, making it of interest to industrial and academic meat scientists. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Optimization and real-time control for laser treatment of heterogeneous soft tissues.

    PubMed

    Feng, Yusheng; Fuentes, David; Hawkins, Andrea; Bass, Jon M; Rylander, Marissa Nichole

    2009-01-01

    Predicting the outcome of thermotherapies in cancer treatment requires an accurate characterization of the bioheat transfer processes in soft tissues. Due to the biological and structural complexity of tumor (soft tissue) composition and vasculature, it is often very difficult to obtain reliable tissue properties, which are among the key factors for accurate prediction of treatment outcome. Efficient algorithms that employ in vivo thermal measurements to determine heterogeneous thermal tissue properties, in conjunction with a detailed sensitivity analysis, can produce essential information for model development and optimal control. The goals of this paper are to present a general formulation of the bioheat transfer equation for heterogeneous soft tissues; to review models and algorithms developed for cell damage, heat shock proteins, and soft tissues with nanoparticle inclusions; and to demonstrate an overall computational strategy for developing a laser treatment framework with the ability to perform real-time robust calibration and optimal control. This computational strategy can be applied to other thermotherapies that use heat sources such as radio frequency or high-intensity focused ultrasound.
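
    As background, such formulations commonly build on the Pennes bioheat equation with spatially varying properties; a representative form (not necessarily the paper's exact model) is:

```latex
\rho(\mathbf{x})\,c(\mathbf{x})\,\frac{\partial T}{\partial t}
  = \nabla \cdot \bigl(k(\mathbf{x})\,\nabla T\bigr)
  - \omega_b(\mathbf{x})\,c_b\,\bigl(T - T_a\bigr)
  + Q_m(\mathbf{x}) + Q_{\text{laser}}(\mathbf{x}, t)
```

    where k is the (heterogeneous) thermal conductivity, ω_b the blood perfusion rate, c_b the specific heat of blood, T_a the arterial temperature, Q_m the metabolic heat generation and Q_laser the applied laser source.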

  4. Finite element modelling of chain-die forming for ultra-high strength steel

    NASA Astrophysics Data System (ADS)

    Majji, Raju; Xiang, Yang; Ding, Scott; Yang, Chunhui

    2017-10-01

    There has been high demand for weight reduction in automotive vehicles while maintaining passenger safety. A potential steel material to achieve this is ultra-high strength steel (UHSS). As a high strength material, it is difficult to form into desired profiles using traditional sheet metal forming processes such as cold roll forming. A potentially alternative solution to this problem is Chain-die Forming (CDF), developed recently. The basic principle of the CDF is to combine roll forming and bending processes. The main advantage of this process is the elongated deformation length, which significantly increases the effective roll radius. This study focuses on identifying issues with the CDF by using CAD modelling, motion analysis and finite element analysis (FEA) to devise solutions and construct a more reliable process in an optimal design sense. Previous attempts at finite element modelling and simulation of the CDF in the literature used relatively simple models, and that research was not sufficient for the optimal design of a typical CDF for UHSS. Therefore, two numerical models of the Chain-die Forming process are developed in this study: a) one having a set of rolls similar to roll forming but with a large radius, i.e., 20 meters; and b) one with die and punch segments similar to a typical CDF machine. As a case study, forming a 60° channel in a single pass was simulated using both models for comparison. The numerical results show that the CDF generates lower residual stress, lower strain and smaller springback in a single pass for the 60° UHSS channel. The design analysis procedure proposed in this study could greatly help mechanical designers devise a cost-effective and reliable CDF process for forming UHSS.

  5. Rapid determination of sugar level in snack products using infrared spectroscopy.

    PubMed

    Wang, Ting; Rodriguez-Saona, Luis E

    2012-08-01

    Real-time spectroscopic methods can provide a valuable window into food manufacturing to permit optimization of production rate, quality and safety. There is a need for cutting-edge sensor technology directed at improving the efficiency, throughput and reliability of critical processes. The aim of the research was to evaluate the feasibility of infrared systems combined with chemometric analysis to develop rapid methods for the determination of sugars in cereal products. Samples were ground and spectra were collected using a mid-infrared (MIR) spectrometer equipped with a triple-bounce ZnSe MIRacle attenuated total reflectance accessory or a Fourier transform near-infrared (NIR) system equipped with a diffuse reflection-integrating sphere. Sugar contents were determined using a reference HPLC method. Partial least squares regression (PLSR) was used to create cross-validated calibration models. The predictability of the models was evaluated on an independent set of samples and compared with reference techniques. MIR and NIR spectra showed characteristic absorption bands for sugars, and generated excellent PLSR models (sucrose: SEP < 1.7% and r > 0.96). Multivariate models accurately and precisely predicted sugar level in snacks, allowing for rapid analysis. This simple technique allows for reliable prediction of quality parameters, and its automation enables food manufacturers to take early corrective actions that will ultimately save time and money while establishing uniform quality. The U.S. snack food industry generates billions of dollars in revenue each year, and vibrational spectroscopic methods combined with pattern recognition analysis could permit optimization of the production rate, quality, and safety of many food products. This research showed that infrared spectroscopy is a powerful technique for near real-time (approximately 1 min) assessment of sugar content in various cereal products. © 2012 Institute of Food Technologists®
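    For illustration of the chemometric step described above, the sketch below fits a cross-validated PLSR calibration of spectra against HPLC reference values using scikit-learn; the spectra here are synthetic stand-ins, and SEP is computed simply as the standard deviation of the cross-validated residuals.

        # Hedged PLSR calibration sketch: spectra (X) vs HPLC sugar values (y).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)
        n_samples, n_channels = 60, 300
        sucrose = rng.uniform(5.0, 40.0, n_samples)      # % sucrose (reference values)
        band = rng.normal(size=n_channels)               # stand-in absorption band shape
        X = np.outer(sucrose, band) + rng.normal(scale=0.5, size=(n_samples, n_channels))

        pls = PLSRegression(n_components=5)
        y_cv = cross_val_predict(pls, X, sucrose, cv=10).ravel()
        sep = np.std(sucrose - y_cv)                     # standard error of prediction
        r = np.corrcoef(sucrose, y_cv)[0, 1]
        print(f"SEP = {sep:.2f} %, r = {r:.3f}")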

  6. Validity and reliability of four language mapping paradigms.

    PubMed

    Wilson, Stephen M; Bautista, Alexa; Yen, Melodie; Lauderdale, Stefanie; Eriksson, Dana K

    2017-01-01

    Language areas of the brain can be mapped in individual participants with functional MRI. We investigated the validity and reliability of four language mapping paradigms that may be appropriate for individuals with acquired aphasia: sentence completion, picture naming, naturalistic comprehension, and narrative comprehension. Five neurologically normal older adults were scanned on each of the four paradigms on four separate occasions. Validity was assessed in terms of whether activation patterns reflected the known typical organization of language regions, that is, lateralization to the left hemisphere, and involvement of the left inferior frontal gyrus and the left middle and/or superior temporal gyri. Reliability (test-retest reproducibility) was quantified in terms of the Dice coefficient of similarity, which measures overlap of activations across time points. We explored the impact of different absolute and relative voxelwise thresholds, a range of cluster size cutoffs, and limitation of analyses to a priori potential language regions. We found that the narrative comprehension and sentence completion paradigms offered the best balance of validity and reliability. However, even with optimal combinations of analysis parameters, there were many scans on which known features of typical language organization were not demonstrated, and test-retest reproducibility was only moderate for realistic parameter choices. These shortcomings in validity and reliability may be significant for many clinical or research applications that depend on identifying language regions in individual participants.
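    The Dice coefficient used above to quantify test-retest reproducibility is simple to state in code: twice the overlap of two thresholded activation masks divided by their total size. A minimal sketch with synthetic masks:

        # Dice similarity of two boolean activation maps: 2|A∩B| / (|A| + |B|).
        import numpy as np

        def dice(map_a, map_b):
            a, b = map_a.astype(bool), map_b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        rng = np.random.default_rng(1)
        session1 = rng.random((64, 64, 40)) > 0.8                  # thresholded map, scan 1
        session2 = session1 ^ (rng.random(session1.shape) > 0.95)  # partly overlapping, scan 2
        print(f"Dice = {dice(session1, session2):.2f}")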

  7. Automated assessment of early hypoxic brain edema in non-enhanced CT predicts outcome in patients after cardiac arrest.

    PubMed

    Hanning, Uta; Sporns, Peter Bernhard; Lebiedz, Pia; Niederstadt, Thomas; Zoubi, Tarek; Schmidt, Rene; Knecht, Stefan; Heindel, Walter; Kemmling, André

    2016-07-01

    Early prediction of potential neurological recovery in patients after cardiac arrest is challenging. Recent studies suggest that the densitometric gray-white matter ratio (GWR) determined from cranial computed tomography (CT) scans may be a reliable predictor of poor outcome. We evaluated an automated, rater-independent method to determine GWR in CT as an early objective imaging predictor of clinical outcome. We analyzed imaging data of 84 patients after cardiac arrest who underwent noncontrast CT within 24 h after arrest. To determine GWR in CT, we applied two methods: a recently published automated probabilistic gray-white matter segmentation algorithm (GWR_aut) and conventional manual measurements within gray and white matter regions of interest (GWR_man). Neurological outcome was graded by the cerebral performance category (CPC). As part of standard routine, CPC was assessed by the treating physician in the intensive care unit at admission and at discharge to the normal ward. The performance of GWR measures (automated and manual) in predicting the binary clinical endpoints of poor (CPC 3-5) and good (CPC 1-2) outcome was assessed by ROC analysis with increasing discrimination thresholds. Results of GWR_aut were compared to GWR_man of two raters. Of 84 patients, 55 (65%) showed a poor outcome. ROC curve analysis revealed reliable outcome prediction by GWR_aut (AUC 0.860) and GWR_man (AUC 0.707 and 0.699, respectively). The predictive power of GWR_aut was higher than that of GWR_man for each rater (p=0.019 and p=0.021, respectively), at an optimal cut-off of 1.084 to predict poor outcome (optimal criterion with 92.7% sensitivity, 72.4% specificity). Interrater reliability of GWR_man by intra-class correlation coefficient (ICC) was moderate (0.551). Automated quantification of GWR in CT may be used as an objective observer-independent imaging marker for outcome in patients after cardiac arrest. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
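    As a sketch of the ROC analysis used above, the snippet below finds a GWR cutoff that best separates poor from good outcome via the Youden index; the GWR values are synthetic, chosen only to mimic the reported direction of effect (lower GWR with poorer outcome), not the study data.

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(2)
        gwr_poor = rng.normal(1.05, 0.05, 55)    # 55 poor-outcome patients (synthetic)
        gwr_good = rng.normal(1.20, 0.06, 29)    # 29 good-outcome patients (synthetic)
        gwr = np.concatenate([gwr_poor, gwr_good])
        poor = np.concatenate([np.ones(55), np.zeros(29)])

        # Lower GWR predicts poor outcome, so score with the negated ratio.
        fpr, tpr, thresholds = roc_curve(poor, -gwr)
        auc = roc_auc_score(poor, -gwr)
        best = np.argmax(tpr - fpr)              # Youden index J = sens + spec - 1
        print(f"AUC = {auc:.3f}, optimal cutoff: GWR < {-thresholds[best]:.3f}")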

  8. On the design of high-rise buildings with a specified level of reliability

    NASA Astrophysics Data System (ADS)

    Dolganov, Andrey; Kagan, Pavel

    2018-03-01

    High-rise buildings have a specificity that significantly distinguishes them from traditional multi-storey buildings. Steel structures are advisable for high-rise buildings in earthquake-prone regions, since steel, due to its plasticity, provides damping of the kinetic energy of seismic impacts. These aspects should be taken into account when choosing a structural scheme of a high-rise building and designing its load-bearing structures. Currently, regulatory documents do not quantify the reliability of structures, although the problem of assigning an optimal level of reliability has existed for a long time. The article shows the possibility of designing metal structures of high-rise buildings with specified reliability. It is proposed to establish a reliability value of 0.99865 (3σ) for structures of buildings of a normal level of responsibility in calculations for the first group of limit states. For increased (high-rise construction) and reduced levels of responsibility, load-bearing capacity reliabilities of 0.99997 (4σ) and 0.97725 (2σ), respectively, are proposed. Cross-section utilization coefficients of a metal beam for the different reliability levels are given.
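    The quoted reliability levels are one-sided standard-normal probabilities for k-sigma safety margins, which is easy to verify:

        # k-sigma reliability levels as one-sided normal probabilities.
        from scipy.stats import norm

        for k in (2, 3, 4):
            print(f"{k} sigma -> reliability {norm.cdf(k):.5f}")
        # 2 sigma -> 0.97725, 3 sigma -> 0.99865, 4 sigma -> 0.99997 (rounded)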

  9. Optimal reinforcement of training datasets in semi-supervised landmark-based segmentation

    NASA Astrophysics Data System (ADS)

    Ibragimov, Bulat; Likar, Boštjan; Pernuš, Franjo; Vrtovec, Tomaž

    2015-03-01

    During the last couple of decades, the development of computerized image segmentation shifted from unsupervised to supervised methods, which made segmentation results more accurate and robust. However, the main disadvantage of supervised segmentation is the need for manual image annotation, which is time-consuming and subject to human error. To reduce the need for manual annotation, we propose a novel learning approach for training dataset reinforcement in the area of landmark-based segmentation, where newly detected landmarks are optimally combined with reference landmarks from the training dataset, thereby enriching the training process. The approach is formulated as a nonlinear optimization problem whose solution is a vector of weighting factors that measures how reliable the detected landmarks are. The detected landmarks that are found to be more reliable are included in the training procedure with higher weighting factors, whereas those found to be less reliable are included with lower weighting factors. The approach is integrated into the landmark-based game-theoretic segmentation framework and validated against the problem of lung field segmentation from chest radiographs.

  10. Topology and boundary shape optimization as an integrated design tool

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin Philip; Rodrigues, Helder Carrico

    1990-01-01

    The optimal topology of a two-dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via a common FEM mesh generator and CAD-type input-output facilities.

  11. Including metabolite concentrations into flux balance analysis: thermodynamic realizability as a constraint on flux distributions in metabolic networks

    PubMed Central

    Hoppe, Andreas; Hoffmann, Sabrina; Holzhütter, Hermann-Georg

    2007-01-01

    Background In recent years, constrained optimization – usually referred to as flux balance analysis (FBA) – has become a widely applied method for the computation of stationary fluxes in large-scale metabolic networks. The striking advantage of FBA compared to kinetic modeling is that it basically requires only knowledge of the stoichiometry of the network. On the other hand, the results of FBA are to a large degree hypothetical because the method relies on plausible but hardly provable optimality principles that are thought to govern metabolic flux distributions. Results To augment the reliability of FBA-based flux calculations we propose an additional side constraint which assures thermodynamic realizability, i.e. that the flux directions are consistent with the corresponding changes of Gibbs free energies. The latter depend on metabolite levels, for which plausible ranges can be inferred from experimental data. Computationally, our method results in the solution of a mixed integer linear optimization problem with a quadratic scoring function. An optimal flux distribution together with a metabolite profile is determined which assures thermodynamic realizability with minimal deviations of metabolite levels from their expected values. We applied our novel approach to two exemplary metabolic networks of different complexity, the metabolic core network of erythrocytes (30 reactions) and the metabolic network iJR904 of Escherichia coli (931 reactions). Our calculations show that increasing network complexity entails increasing sensitivity of the predicted flux distributions to variations of standard Gibbs free energy changes and metabolite concentration ranges. We demonstrate the usefulness of our method for assessing critical concentrations of external metabolites preventing attainment of a metabolic steady state. Conclusion Our method incorporates the thermodynamic link between flux directions and metabolite concentrations into a practical computational algorithm. The weakness of conventional FBA, relying on intuitive assumptions about the reversibility of biochemical reactions, is overcome. This enables the computation of reliable flux distributions even under extreme conditions of the network (e.g. enzyme inhibition, depletion of substrates or accumulation of end products) where metabolite concentrations may be drastically altered. PMID:17543097
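    For readers unfamiliar with FBA, the core computation (before the thermodynamic side constraint is added) is a linear program over steady-state fluxes. A toy sketch with an invented 3-reaction network; real models such as iJR904 have hundreds of reactions:

        # Toy FBA: maximize a target flux subject to S·v = 0 and flux bounds.
        import numpy as np
        from scipy.optimize import linprog

        # Metabolites A, B; reactions R1: uptake -> A, R2: A -> B, R3: B -> biomass
        S = np.array([[ 1, -1,  0],     # mass balance of A
                      [ 0,  1, -1]])    # mass balance of B
        c = np.array([0.0, 0.0, -1.0])  # linprog minimizes, so negate to maximize v3
        bounds = [(0, 10), (0, 10), (0, 10)]   # irreversibility and capacity bounds

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print("optimal fluxes v =", res.x)      # expected: [10, 10, 10]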

  12. Fuzzy probabilistic design of water distribution networks

    NASA Astrophysics Data System (ADS)

    Fu, Guangtao; Kapelan, Zoran

    2011-05-01

    The primary aim of this paper is to present a fuzzy probabilistic approach for optimal design and rehabilitation of water distribution systems, combining aleatoric and epistemic uncertainties in a unified framework. The randomness and imprecision in future water consumption are characterized using fuzzy random variables, whose realizations are not real numbers but fuzzy numbers, and the nodal head requirements are represented by fuzzy sets, reflecting the imprecision in customers' requirements. The optimal design problem is formulated as a two-objective optimization problem, with minimization of total design cost and maximization of system performance as objectives. The system performance is measured by the fuzzy random reliability, defined as the probability that the fuzzy head requirements are satisfied across all network nodes. The degree of satisfaction is represented by a necessity or belief measure in the sense of the Dempster-Shafer theory of evidence. An efficient algorithm is proposed, within a Monte Carlo procedure, to calculate the fuzzy random system reliability, and it is effectively combined with the nondominated sorting genetic algorithm II (NSGA-II) to derive the Pareto-optimal design solutions. The newly proposed methodology is demonstrated with two case studies: the New York tunnels network and the Hanoi network. The results from both cases indicate that the new methodology can effectively accommodate and handle various aleatoric and epistemic uncertainty sources arising from the design process and can provide optimal design solutions that are not only cost-effective but also have higher reliability to cope with severe future uncertainties.

  13. A reliable algorithm for optimal control synthesis

    NASA Technical Reports Server (NTRS)

    Vansteenwyk, Brett; Ly, Uy-Loi

    1992-01-01

    In recent years, powerful design tools for linear time-invariant multivariable control systems have been developed based on direct parameter optimization. In this report, an algorithm for reliable optimal control synthesis using parameter optimization is presented. Specifically, a robust numerical algorithm is developed for the evaluation of the H2-like cost functional and its gradients with respect to the controller design parameters. The method is specifically designed to handle defective degenerate systems and is based on the well-known Padé series approximation of the matrix exponential. Numerical test problems in control synthesis for simple mechanical systems and for a flexible structure with densely packed modes positively illustrate the reliability of this method when compared to a method based on diagonalization. Several types of cost functions have been considered: a cost function for robust control consisting of a linear combination of quadratic objectives for deterministic and random disturbances, and one representing an upper bound on the quadratic objective for worst-case initial conditions. Finally, a framework for multivariable control synthesis has been developed combining the concept of closed-loop transfer recovery with numerical parameter optimization. The procedure enables designers to synthesize not only observer-based controllers but also controllers of arbitrary order and structure. Numerical design solutions rely heavily on the robust algorithm due to the high order of the synthesis model and the presence of near-overlapping modes. The design approach is successfully applied to the design of a high-bandwidth control system for a rotorcraft.
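    The H2-type cost at the heart of such algorithms has a compact Gramian-based form; the sketch below evaluates it for an illustrative two-state system and cross-checks it against the time-domain integral (SciPy's expm itself uses a Padé approximation). The matrices are assumptions for illustration, not the report's synthesis models.

        # H2 cost of dx/dt = A x + B w, z = C x: J = trace(C X C^T), where the
        # controllability Gramian X solves A X + X A^T + B B^T = 0.
        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov, expm

        A = np.array([[0.0, 1.0], [-2.0, -0.5]])    # lightly damped oscillator
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])

        X = solve_continuous_lyapunov(A, -B @ B.T)  # controllability Gramian
        J = np.trace(C @ X @ C.T)
        print(f"H2 cost J = {J:.4f}")

        # Time-domain cross-check: J = integral over t of |C e^{At} B|^2.
        dt = 1e-2
        J_num = sum((C @ expm(A * t) @ B).item() ** 2 * dt
                    for t in np.arange(0.0, 40.0, dt))
        print(f"numerical check ~ {J_num:.4f}")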

  14. Numerical Optimization Algorithms and Software for Systems Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saunders, Michael

    2013-02-02

    The basic aims of this work are: to develop reliable algorithms for solving optimization problems involving large stoichiometric matrices; to investigate cyclic dependency between metabolic and macromolecular biosynthetic networks; and to quantify the significance of thermodynamic constraints on prokaryotic metabolism.

  15. Experimental characterization of vertical-axis wind turbine noise.

    PubMed

    Pearson, C E; Graham, W R

    2015-01-01

    Vertical-axis wind turbines are wind-energy generators suitable for use in urban environments. Their associated noise thus needs to be characterized and understood. As a first step, this work investigates the relative importance of harmonic and broadband contributions via model-scale wind-tunnel experiments. Cross-spectra from a pair of flush-mounted wall microphones exhibit both components, but further analysis shows that the broadband dominates at frequencies corresponding to the audible range in full-scale operation. This observation has detrimental implications for noise-prediction reliability and hence also for acoustic design optimization.
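    The cross-spectral estimate central to such measurements is readily reproduced; the sketch below uses Welch-averaged cross-spectra (scipy.signal.csd) on synthetic microphone signals in which a shared harmonic at an assumed blade-passing frequency sits on independent broadband noise.

        import numpy as np
        from scipy.signal import csd

        fs = 8192                             # sample rate [Hz]
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(3)
        bpf = 40.0                            # blade-passing frequency (illustrative)
        harmonic = np.sin(2 * np.pi * bpf * t)
        mic1 = harmonic + rng.normal(scale=1.0, size=t.size)   # coherent tone + noise
        mic2 = harmonic + rng.normal(scale=1.0, size=t.size)   # independent noise floor

        f, Pxy = csd(mic1, mic2, fs=fs, nperseg=4096)          # Welch cross-spectrum
        print(f"dominant coherent component at {f[np.argmax(np.abs(Pxy))]:.1f} Hz")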

  16. Estimation of the efficiency of the introduction of a porous layer into a silicon-on-sapphire structure substrate to enhance the reliability of devices under irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aleksandrov, P. A., E-mail: Alexandrov-PA@nrcki.ru; Baranova, E. K.; Budaragin, V. V.

    2016-08-15

    We investigate the efficiency of the introduction of a porous layer into the substrate of a silicon-on-sapphire structure by the implantation of He ions to enhance the radiation resistance of devices. The properties of the introduced layer and its parameters affecting the concentration of minority charge carriers generated by irradiation are analyzed. The reported results of the analysis and calculations can be used to optimize He-ion implantation conditions during the formation of a porous layer.

  17. Reliability centered maintenance : a case study of railway transit maintenance to achieve optimal performance.

    DOT National Transportation Integrated Search

    2010-12-01

    The purpose of this qualitative case study was to identify the types of obstacles and patterns experienced by a single heavy rail transit agency located in North America that embedded a Reliability Centered Maintenance (RCM) Process. The outcome of t...

  18. The technique of entropy optimization in motor current signature analysis and its application in the fault diagnosis of gear transmission

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoguang; Liang, Lin; Liu, Fei; Xu, Guanghua; Luo, Ailing; Zhang, Sicong

    2012-05-01

    Nowadays, Motor Current Signature Analysis (MCSA) is widely used in the fault diagnosis and condition monitoring of machine tools. However, because the current signal has a low SNR (signal-to-noise ratio) and the feature frequencies are often dense and overlapping, it is difficult to identify the feature frequencies of machine tools from the complex current spectrum with traditional signal processing methods such as the FFT. In studying MCSA, it was found that entropy, which is associated with the probability distribution of any random variable, is of importance for frequency identification and therefore plays an important role in signal processing. To address the difficulty of identifying the feature frequencies, an entropy optimization technique based on the motor current signal is presented in this paper for extracting the typical feature frequencies of machine tools while effectively suppressing disturbances. Simulated current signals were generated in MATLAB, and a measured current signal was obtained from a complex gearbox at an iron works in Luxembourg. For diagnosis, MCSA is combined with entropy optimization. Both simulated and experimental results show that this technique is efficient, accurate and reliable enough to extract the feature frequencies of the current signal, which provides a new strategy for the fault diagnosis and condition monitoring of machine tools.
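    A common entropy measure in this setting is the Shannon entropy of the normalized power spectrum: a spectrum concentrated in a few identifiable peaks has low entropy, while a noisy, smeared spectrum has high entropy. A minimal sketch on synthetic current-like signals (the paper's specific optimization is not reproduced here):

        import numpy as np

        def spectral_entropy(signal):
            # Shannon entropy (bits) of the normalized power spectrum.
            psd = np.abs(np.fft.rfft(signal)) ** 2
            p = psd / psd.sum()               # treat the spectrum as a distribution
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        fs = 2000.0
        t = np.arange(0, 2, 1 / fs)
        rng = np.random.default_rng(4)
        clean = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 137 * t)
        noisy = clean + rng.normal(scale=1.0, size=t.size)
        print(f"entropy: clean = {spectral_entropy(clean):.2f} bits, "
              f"noisy = {spectral_entropy(noisy):.2f} bits")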

  19. Development and validation of the Polish version of the Female Sexual Function Index in the Polish population of females.

    PubMed

    Nowosielski, Krzysztof; Wróbel, Beata; Sioma-Markowska, Urszula; Poręba, Ryszard

    2013-02-01

    Unlike male sexual function, which is relatively easy to assess, female sexual function is still a diagnostic challenge. Although numerous new measurements for female sexual dysfunction (FSD) have recently been developed, the Female Sexual Function Index (FSFI) remains the gold standard for screening. It has been validated in more than 30 countries. The FSFI has been used in several studies conducted in Poland, but it has never been standardized for Polish women. The aim of this study was to develop a Polish version of the FSFI (PL-FSFI). In total, 189 women aged 18-55 years were included in the study. Eighty-five were diagnosed with FSD as per the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM IV-TR) criteria; 104 women did not have FSD. All subjects completed the PL-FSFI at baseline (day 0), day 7, and day 28. Test-retest reliability was determined by Pearson's product-moment correlations. Internal consistency was tested using Cronbach's α coefficient. Construct validity was evaluated by principal component analysis using varimax rotation and factor analysis. Discriminant validity was assessed with between-groups analysis of variance. All domains of the PL-FSFI demonstrated satisfactory internal consistencies, with Cronbach's α values of >0.70 for the entire sample. The test-retest reliability demonstrated good-to-excellent agreement between the assessment points. Based on principal component analysis, a 5-factor model was established that explained 83.62% of the total variance. Domain intercorrelations of the PL-FSFI ranged from 0.37-0.77. The optimal PL-FSFI cutoff score was 27.50, with 87.1% sensitivity and 83.1% specificity. The PL-FSFI is a reliable questionnaire with good psychometric and discriminative validity. Therefore, it can be used as a tool for preliminary screening for FSD among Polish women. © 2012 International Society for Sexual Medicine.
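    Cronbach's α, the internal-consistency statistic reported above, can be computed directly from an items-by-respondents score matrix; the data below are synthetic.

        import numpy as np

        def cronbach_alpha(scores):
            # scores: (n_respondents, n_items) matrix of item scores.
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars / total_var)

        rng = np.random.default_rng(5)
        trait = rng.normal(size=(200, 1))                       # latent construct
        items = trait + rng.normal(scale=0.8, size=(200, 6))    # 6 correlated items
        print(f"alpha = {cronbach_alpha(items):.2f}")           # expect > 0.70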

  20. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    PubMed

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies, in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high-throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable, yet requires only minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.

  1. Integrated optimization of nonlinear R/C frames with reliability constraints

    NASA Technical Reports Server (NTRS)

    Soeiro, Alfredo; Hoit, Marc

    1989-01-01

    A structural optimization algorithm was developed that includes global displacements as decision variables. The algorithm was applied to planar reinforced concrete frames with nonlinear material behavior subjected to static loading. The flexural performance of the elements was evaluated as a function of the actual stress-strain diagrams of the materials. Formation of rotational hinges with strain hardening was allowed, and the equilibrium constraints were updated accordingly. The adequacy of the frames was guaranteed by imposing as constraints required reliability indices for the members, maximum global displacements for the structure and a maximum system probability of failure.

  2. Additional value of computer assisted semen analysis (CASA) compared to conventional motility assessments in pig artificial insemination.

    PubMed

    Broekhuijse, M L W J; Soštarić, E; Feitsma, H; Gadella, B M

    2011-11-01

    In order to obtain a more standardised semen motility evaluation, Varkens KI Nederland has introduced a computer assisted semen analysis (CASA) system in all its pig AI laboratories. The repeatability of CASA was enhanced by standardising: 1) an optimal sample temperature (39 °C); 2) an optimal dilution factor; 3) optimal mixing of semen and dilution buffer by using mechanical mixing; 4) the slide chamber depth; and, together with the previous points, 5) the optimal training of technicians working with the CASA system; and 6) the use of a standard operating procedure (SOP). Once laboratory technicians were trained in using this SOP, they achieved a coefficient of variation of < 5%, which was superior to the variation found when the SOP was not strictly used. Microscopic semen motility assessments by eye were subjective and not comparable to the data obtained by standardised CASA. CASA results are preferable, as accurate continuous motility data are generated rather than the discrete 10% motility increments produced when laboratory technicians estimate motility. The higher variability of sperm motility found with CASA and the continuous motility values allow better analysis of the relationship between semen motility characteristics and fertilising capacity. The benefits of standardised CASA for AI are discussed, both with respect to estimating the correct dilution factor of the ejaculate for the production of artificial insemination (AI) doses (critical for reducing the number of sperm per AI dose) and with respect to obtaining more reliable fertility data from these AI doses in return. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Multi-objective optimization of laser-scribed micro grooves on AZO conductive thin film using Data Envelopment Analysis

    NASA Astrophysics Data System (ADS)

    Kuo, Chung-Feng Jeffrey; Quang Vu, Huy; Gunawan, Dewantoro; Lan, Wei-Luen

    2012-09-01

    Laser scribing has been considered an effective approach for surface texturization of thin film solar cells. In this study, a systematic method for optimizing the multi-objective process parameters of a fiber laser system was proposed to achieve excellent quality characteristics, such as the minimum scribing line width, the flattest trough bottom, and the fewest processing edge surface bumps, for increasing the incident light absorption of thin film solar cells. First, the Taguchi method (TM) obtained useful statistical information through an orthogonal array with relatively few experiments. However, TM is only appropriate for optimizing single-objective problems and has to rely on engineering judgment for multi-objective problems, which can introduce some degree of uncertainty. The back-propagation neural network (BPNN) and data envelopment analysis (DEA) were therefore utilized to estimate the incomplete data and derive the optimal process parameters of the laser scribing system. In addition, the analysis of variance (ANOVA) method was applied to identify the significant factors with the greatest effects on the quality of the scribing process; in other words, by putting more emphasis on these controllable and influential factors, the quality characteristics of the scribed thin film could be effectively enhanced. The experiments were carried out on ZnO:Al (AZO) transparent conductive thin film with a thickness of 500 nm, and the results proved that the proposed approach yields better improvements than the TM, which is only able to improve one quality characteristic at the expense of the others. The results of confirmation experiments showed the reliability of the proposed method.

  4. Photometric method for determination of acidity constants through integral spectra analysis

    NASA Astrophysics Data System (ADS)

    Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich

    2015-04-01

    An express method for the determination of acidity constants of organic acids, based on the analysis of the integral transmittance vs. pH dependence, is developed. The integral value is registered as the photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method allows pKa to be obtained using only simple and low-cost instrumentation. The optical part of the experimental setup has been optimized by excluding the monochromator device. It thus takes only 10-15 min to obtain one pKa value, with an absolute error of less than 0.15 pH units. The application limits and reliability of the method have been tested for a series of organic acids of various nature.
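    Conceptually, the pKa falls out of such data as the inflection point of a two-state titration curve; a hedged sketch of that fitting step, with a synthetic curve in place of real photocurrent data:

        # Fit a Henderson-Hasselbalch sigmoid to a transmittance-vs-pH curve.
        import numpy as np
        from scipy.optimize import curve_fit

        def transmittance(ph, t_acid, t_base, pka):
            frac_base = 1.0 / (1.0 + 10 ** (pka - ph))   # fraction in basic form
            return t_acid + (t_base - t_acid) * frac_base

        rng = np.random.default_rng(6)
        ph = np.linspace(2.0, 8.0, 25)
        data = transmittance(ph, 0.30, 0.85, 4.76) + rng.normal(scale=0.01, size=ph.size)

        popt, _ = curve_fit(transmittance, ph, data, p0=(0.2, 0.9, 5.0))
        print(f"fitted pKa = {popt[2]:.2f}")              # should recover ~4.76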

  5. High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software

    NASA Astrophysics Data System (ADS)

    Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.

    2013-08-01

    GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high-pressure conditions. The package has been in extensive use at the high-pressure beamlines of the Advanced Photon Source (APS), Argonne National Laboratory, and the Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for the processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make the processing of high-pressure SCμXRD data easier and more reliable.

  6. Directional variance adjustment: bias reduction in covariance matrices based on factor analysis with an application to portfolio optimization.

    PubMed

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock markets we show that our proposed method leads to improved portfolio allocation.
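    The DVA algorithm itself is specific to the paper, but the factor-model starting point is easy to sketch: estimate betas against a common factor and assemble a structured covariance, in contrast to the noisy sample covariance. All data below are simulated.

        import numpy as np

        rng = np.random.default_rng(7)
        n_days, n_assets = 250, 50
        market = rng.normal(scale=0.01, size=n_days)             # common factor returns
        beta = rng.uniform(0.5, 1.5, n_assets)                   # true factor loadings
        returns = np.outer(market, beta) + rng.normal(scale=0.02, size=(n_days, n_assets))

        sample_cov = np.cov(returns, rowvar=False)               # noisy when n ~ T
        b_hat = returns.T @ market / (market @ market)           # OLS betas
        resid = returns - np.outer(market, b_hat)
        factor_cov = (np.outer(b_hat, b_hat) * market.var(ddof=1)
                      + np.diag(resid.var(axis=0, ddof=1)))      # structured estimate
        print("top eigenvalue, sample vs factor:",
              np.linalg.eigvalsh(sample_cov)[-1], np.linalg.eigvalsh(factor_cov)[-1])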

  7. TED analysis of the Si(113) surface structure

    NASA Astrophysics Data System (ADS)

    Suzuki, T.; Minoda, H.; Tanishiro, Y.; Yagi, K.

    1999-09-01

    We carried out a TED (transmission electron diffraction) analysis of the Si(113) surface structure. The TED patterns taken at room temperature showed reflections due to the 3×2 reconstructed structure. The TED pattern indicated that a glide plane parallel to the <332> direction suggested in some models is excluded. We calculated the R-factors (reliability factors) for six surface structure models proposed previously. All structure models with energy-optimized atomic positions have large R-factors. After revision of the atomic positions, the R-factors of all the structure models decreased below 0.3, and the revised version of Dabrowski's 3×2 model has the smallest R-factor of 0.17.

  8. Directional Variance Adjustment: Bias Reduction in Covariance Matrices Based on Factor Analysis with an Application to Portfolio Optimization

    PubMed Central

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W.; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock markets we show that our proposed method leads to improved portfolio allocation. PMID:23844016

  9. Combined coding and delay-throughput analysis for fading channels of mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Wang, C. C.; Yan, Tsun-Yee

    1986-01-01

    This paper presents an analysis of using punctured convolutional codes with Viterbi decoding to improve communications reliability. The punctured code rate is optimized so that the average delay is minimized. The coding gain in terms of the message delay is also defined. Since a punctured convolutional code with interleaving is still inadequate to combat the severe fading for short packets, the use of multiple copies of assignment and acknowledgment packets is suggested. The performance of this protocol in terms of the average end-to-end delay is analyzed. It is shown that a replication of three copies for both assignment packets and acknowledgment packets is optimum for the cases considered.

  10. A computational framework for prime implicants identification in noncoherent dynamic systems.

    PubMed

    Di Maio, Francesco; Baronchelli, Samuele; Zio, Enrico

    2015-01-01

    Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. For this, the system dynamics is here described by a time-dependent model that includes the dependencies with the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are i) accounting for discrete stochastic transition events and ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple-valued logic (MVL) to consider stochastic transitions at discretized times. Then, PIs are originally identified by a differential evolution (DE) algorithm that looks for the optimal MVL solution of a covering problem formulated for MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases. © 2014 Society for Risk Analysis.

  11. Automated Testability Decision Tool

    DTIC Science & Technology

    1991-09-01

    Vol. 16, 1968, pp. 538-558. Bertsekas, D. P., Constrained Optimization and Lagrange Multiplier Methods, Academic Press, New York. McLeavey, D. W. and McLeavey, J. A., "Parallel Optimization Methods in Standby Reliability," University of Connecticut, School of Business Administration, Bureau of Business

  12. Pre-implementation studies of a workforce planning tool for nurse staffing and human resource management in university hospitals.

    PubMed

    van Oostveen, Catharina J; Ubbink, Dirk T; Mens, Marian A; Pompe, Edwin A; Vermeulen, Hester

    2016-03-01

    To investigate the reliability, validity and feasibility of the RAFAELA workforce planning system (including the Oulu patient classification system - OPCq) before deciding on implementation in Dutch hospitals. The complexity of care, budgetary restraints and the demand for high-quality patient care have ignited the need for transparent hospital workforce planning. Nurses from 12 wards of two university hospitals were trained to test the reliability of the OPCq by investigating the absolute agreement of nursing care intensity (NCI) measurements among nurses. Validity was tested by assessing whether the optimal NCI/nurse ratio, as calculated by a regression analysis in RAFAELA, was realistic. System feasibility was investigated through a questionnaire among all nurses involved. Almost 67 000 NCI measurements were performed between December 2013 and June 2014. Agreement using the OPCq varied between 38% and 91%. For only 1 of the 12 wards was the calculated optimal NCI area judged valid. Although the majority of respondents were positive about its applicability and user-friendliness, RAFAELA was not accepted as a useful workforce planning system. Nurses' performance using the RAFAELA system did not warrant its implementation. Hospital managers should first focus on enlarging the readiness of nurses regarding the implementation of a workforce planning system. © 2015 John Wiley & Sons Ltd.

  13. Time frequency analysis of olfactory induced EEG-power change.

    PubMed

    Schriever, Valentin Alexander; Han, Pengfei; Weise, Stefanie; Hösel, Franziska; Pellegrino, Robert; Hummel, Thomas

    2017-01-01

    The objective of the present study was to investigate the usefulness of time-frequency analysis (TFA) of olfactory-induced EEG change with a low-cost, portable olfactometer in the clinical investigation of smell function. A total of 78 volunteers participated. The study was composed of three parts in which olfactory stimuli were presented using a custom-built olfactometer. Part I was designed to optimize the stimulus as well as the recording conditions. In Part II, EEG-power changes after olfactory/trigeminal stimulation were compared between healthy participants and patients with olfactory impairment. In Part III, the test-retest reliability of the method was evaluated in healthy subjects. Part I indicated that the most effective paradigm for stimulus presentation was a cued stimulus, with an interstimulus interval of 18-20 s and a stimulus duration of 1000 ms, with each stimulus quality presented 60 times in blocks of 20 stimuli each. In Part II we found that the central processing of olfactory stimuli analyzed by TFA differed significantly between healthy controls and patients, even when controlling for age. It was possible to reliably distinguish patients with olfactory impairment from healthy individuals at a high degree of accuracy (healthy controls vs anosmic patients: sensitivity 75%; specificity 89%). In addition, Part III showed good test-retest reliability of TFA of chemosensory-induced EEG-power changes. Central processing of olfactory stimuli analyzed by TFA reliably distinguishes patients with olfactory impairment from healthy individuals at a high degree of accuracy. Importantly, this can be achieved with a simple olfactometer.
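    As an illustration of the TFA step, the sketch below computes a spectrogram of a synthetic EEG trace containing a stimulus-locked alpha-band burst and locates the post-stimulus power change; real pipelines average time-frequency power over many trials.

        import numpy as np
        from scipy.signal import spectrogram

        fs = 250.0                                   # EEG sample rate [Hz]
        t = np.arange(0, 4, 1 / fs)
        rng = np.random.default_rng(8)
        eeg = rng.normal(scale=1.0, size=t.size)     # background activity
        burst = (t > 1.5) & (t < 2.5)                # stimulus-locked interval
        eeg[burst] += 2.0 * np.sin(2 * np.pi * 10 * t[burst])   # alpha-band burst

        f, tt, Sxx = spectrogram(eeg, fs=fs, nperseg=128, noverlap=96)
        alpha = Sxx[(f >= 8) & (f <= 12)].mean(axis=0)
        print(f"peak alpha power at t = {tt[np.argmax(alpha)]:.2f} s")   # ~2 s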

  14. Stability, Nonlinearity and Reliability of Electrostatically Actuated MEMS Devices

    PubMed Central

    Zhang, Wen-Ming; Meng, Guang; Chen, Di

    2007-01-01

    Electrostatic micro-electro-mechanical systems (MEMS) are a special branch with a wide range of applications in sensing and actuating devices in MEMS. This paper provides a detailed survey and analysis of the electrostatic force of importance in MEMS: its physical model, scaling effect, stability, nonlinearity and reliability. It is necessary to understand the effects of electrostatic forces in MEMS; then many phenomena of practical importance, such as pull-in instability, the effects of effective stiffness, dielectric charging, stress gradient and temperature on the pull-in voltage, nonlinear dynamic effects and reliability due to electrostatic forces, can be explained scientifically, and consequently the great potential of MEMS technology can be explored effectively and utilized optimally. A simplified parallel-plate capacitor model is proposed to investigate the resonance response, inherent nonlinearity, stiffness-softening effect and coupled nonlinear effect of typical electrostatically actuated MEMS devices. Many failure modes and mechanisms, and the various methods and techniques used to analyze and reduce the failures, including materials selection, reasonable design and extending the controllable travel range, are discussed for electrostatically actuated MEMS devices. Numerical simulations and discussions indicate that the effects of instability, nonlinear characteristics and reliability subjected to electrostatic forces cannot be ignored and are in need of further investigation.
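    The pull-in instability mentioned above has a well-known closed-form result for the simple parallel-plate model: the plate becomes unstable at one third of the gap, at a voltage V_pi = sqrt(8 k d^3 / (27 ε0 A)). A quick evaluation with illustrative device parameters:

        import numpy as np

        eps0 = 8.854e-12        # vacuum permittivity [F/m]
        k = 1.0                 # suspension stiffness [N/m] (illustrative)
        d = 2e-6                # initial gap [m]
        A = (100e-6) ** 2       # electrode area [m^2]

        v_pullin = np.sqrt(8 * k * d**3 / (27 * eps0 * A))
        print(f"pull-in voltage ~ {v_pullin:.2f} V at deflection d/3 = {d / 3:.2e} m")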

  15. RAMI Analysis for Designing and Optimizing Tokamak Cooling Water System (TCWS) for the ITER's Fusion Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrada, Juan J; Reiersen, Wayne T

    U.S.-ITER is responsible for the design, engineering, and procurement of the Tokamak Cooling Water System (TCWS). TCWS is designed to provide cooling and baking for client systems that include the first wall/blanket, vacuum vessel, divertor, and neutral beam injector. Additional operations that support these primary functions include chemical control of water provided to client systems, draining and drying for maintenance, and leak detection/localization. TCWS interfaces with 27 systems including the secondary cooling system, which rejects this heat to the environment. TCWS transfers heat generated in the Tokamak during nominal pulsed operation - 850 MW at up to 150 °C and 4.2 MPa water pressure. Impurities are diffused from in-vessel components and the vacuum vessel by water baking at 200-240 °C at up to 4.4 MPa. TCWS is complex because it serves vital functions for four primary clients whose performance is critical to ITER's success and interfaces with more than 20 additional ITER systems. Conceptual design of this one-of-a-kind cooling system has been completed; however, several issues remain that must be resolved before moving to the next stage of the design process. The 2004 baseline design indicated cooling loops that have no fault tolerance for component failures. During plasma operation, each cooling loop relies on a single pump, a single pressurizer, and one heat exchanger. Consequently, failure of any of these would render TCWS inoperable, resulting in plasma shutdown. The application of reliability, availability, maintainability, and inspectability (RAMI) tools during the different stages of TCWS design is crucial for optimization purposes and for maintaining compliance with project requirements. RAMI analysis will indicate appropriate equipment redundancy that provides graceful degradation in the event of an equipment failure. This analysis helps demonstrate that using proven, commercially available equipment is better than using custom-designed equipment with no field experience and lowers specific costs while providing higher reliability. This paper presents a brief description of the TCWS conceptual design and the application of RAMI tools to optimize the design at different stages during the project.
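    A toy version of the availability side of such a RAMI argument: compare the steady-state availability of a series loop (pump, pressurizer, heat exchanger) with and without a redundant pump. The failure and repair times below are illustrative assumptions, not ITER figures.

        # Component availability A = MTBF / (MTBF + MTTR); series and 1-out-of-2 logic.
        mtbf, mttr = 8000.0, 48.0              # hours between failures, hours to repair
        a = mtbf / (mtbf + mttr)               # single-component availability

        loop_single = a ** 3                   # pump, pressurizer, heat exchanger in series
        pumps_redundant = 1 - (1 - a) ** 2     # 1-out-of-2 pumps, perfect switching
        loop_redundant = pumps_redundant * a ** 2

        print(f"single-pump loop availability:    {loop_single:.5f}")
        print(f"redundant-pump loop availability: {loop_redundant:.5f}")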

  16. Optimal clinical time for reliable measurement of transcutaneous CO2 with ear probes: counterbalancing overshoot and the vasodilatation effect.

    PubMed

    Domingo, Christian; Canturri, Elisa; Moreno, Amalia; Espuelas, Humildad; Vigil, Laura; Luján, Manel

    2010-01-01

    To determine the optimal clinical reading time for the transcutaneous measurement of oxygen saturation (SpO2) and transcutaneous CO2 (TcPCO2) in awake, spontaneously breathing individuals, considering the overshoot phenomenon (transient overestimation of arterial PaCO2). EXPERIMENTAL SECTION: Observational study of 91 individuals (75 men) undergoing forced spirometry, measurement of SpO2 and TcPCO2 with the SenTec monitor every two minutes until minute 20, and arterial blood gas (ABG) analysis. Overshoot severity: (a) mild (0.1-1.9 mm Hg); (b) moderate (2-4.9 mm Hg); (c) severe (>5 mm Hg). The mean difference was calculated for SpO2 and TcPCO2 and the arterial values of PaCO2 and SpO2. The intraclass correlation coefficient (ICC) between monitor readings and blood values was calculated as a measure of agreement. The mean age was 63.1 ± 11.8 years. Spirometric values: FVC: 75.4 ± 6.2%; FEV1: 72.9 ± 23.9%; FEV1/FVC: 70 ± 15.5%. ABG: PaO2: 82.6 ± 13.2; PaCO2: 39.9.1 ± 4.8 mmHg; SaO2: 95.3 ± 4.4%. Overshoot analysis: overshoot was mild in 33 (36.3%) patients, moderate in 20 (22%) and severe in nine (10%); no overshoot was observed in 29 (31%) patients. The lowest mean difference between arterial blood gas and TcPCO2 was -0.57 mmHg, at minute 10, although the highest ICC was obtained at minutes 12 and 14 (>0.8). The overshoot lost its influence after minute 12. For SpO2, measurements were reliable at minute 2. The optimal clinical reading time recommended for ear lobe TcPCO2 measurement ranges between minutes 12 and 14. The SpO2 measurement can be performed at minute 2.

  17. Optimal Clinical Time for Reliable Measurement of Transcutaneous CO2 with Ear Probes: Counterbalancing Overshoot and the Vasodilatation Effect

    PubMed Central

    Domingo, Christian; Canturri, Elisa; Moreno, Amalia; Espuelas, Humildad; Vigil, Laura; Luján, Manel

    2010-01-01

    OBJECTIVES: To determine the optimal clinical reading time for the transcutaneous measurement of oxygen saturation (SpO2) and transcutaneous CO2 (TcPCO2) in awake, spontaneously breathing individuals, considering the overshoot phenomenon (transient overestimation of arterial PaCO2). EXPERIMENTAL SECTION: Observational study of 91 individuals (75 men) undergoing forced spirometry, measurement of SpO2 and TcPCO2 with the SenTec monitor every two minutes until minute 20, and arterial blood gas (ABG) analysis. Overshoot severity: (a) mild (0.1–1.9 mm Hg); (b) moderate (2–4.9 mm Hg); (c) severe (>5 mm Hg). The mean difference was calculated for SpO2 and TcPCO2 and the arterial values of PaCO2 and SpO2. The intraclass correlation coefficient (ICC) between monitor readings and blood values was calculated as a measure of agreement. RESULTS: The mean age was 63.1 ± 11.8 years. Spirometric values: FVC: 75.4 ± 6.2%; FEV1: 72.9 ± 23.9%; FEV1/FVC: 70 ± 15.5%. ABG: PaO2: 82.6 ± 13.2; PaCO2: 39.9.1 ± 4.8 mmHg; SaO2: 95.3 ± 4.4%. Overshoot analysis: overshoot was mild in 33 (36.3%) patients, moderate in 20 (22%) and severe in nine (10%); no overshoot was observed in 29 (31%) patients. The lowest mean difference between arterial blood gas and TcPCO2 was −0.57 mmHg, at minute 10, although the highest ICC was obtained at minutes 12 and 14 (>0.8). The overshoot lost its influence after minute 12. For SpO2, measurements were reliable at minute 2. CONCLUSIONS: The optimal clinical reading time recommended for ear lobe TcPCO2 measurement ranges between minutes 12 and 14. The SpO2 measurement can be performed at minute 2. PMID:22315552

  18. A perspective on future directions in aerospace propulsion system simulation

    NASA Technical Reports Server (NTRS)

    Miller, Brent A.; Szuch, John R.; Gaugler, Raymond E.; Wood, Jerry R.

    1989-01-01

    The design and development of aircraft engines is a lengthy and costly process using today's methodology. This is due, in large measure, to the fact that present methods rely heavily on experimental testing to verify the operability, performance, and structural integrity of components and systems. The potential exists for achieving significant speedups in the propulsion development process through increased use of computational techniques for simulation, analysis, and optimization. This paper outlines the concept and technology requirements for a Numerical Propulsion Simulation System (NPSS) that would provide capabilities to do interactive, multidisciplinary simulations of complete propulsion systems. By combining high-performance computing hardware and software with state-of-the-art propulsion system models, the NPSS will permit the rapid calculation, assessment, and optimization of subcomponent, component, and system performance, durability, reliability and weight before committing to building hardware.

  19. Local Debonding and Fiber Breakage in Composite Materials Modeled Accurately

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2001-01-01

    A prerequisite for full utilization of composite materials in aerospace components is accurate design and life prediction tools that enable the assessment of component performance and reliability. Such tools assist both structural analysts, who design and optimize structures composed of composite materials, and materials scientists who design and optimize the composite materials themselves. NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package (http://www.grc.nasa.gov/WWW/LPB/mac) addresses this need for composite design and life prediction tools by providing a widely applicable and accurate approach to modeling composite materials. Furthermore, MAC/GMC serves as a platform for incorporating new local models and capabilities that are under development at NASA, thus enabling these new capabilities to progress rapidly to a stage in which they can be employed by the code's end users.

  20. Solar-Diesel Hybrid Power System Optimization and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Jacobus, Headley Stewart

    As of 2008, 1.46 billion people, or 22 percent of the world's population, were without electricity. Many of these people live in remote areas where decentralized generation is the only method of electrification. Most mini-grids are powered by diesel generators, but new hybrid power systems are becoming a reliable method of incorporating renewable energy while also reducing total system cost. This thesis quantifies the measurable operational costs for an experimental hybrid power system in Sierra Leone. Two software programs, Hybrid2 and HOMER, are used during the system design and subsequent analysis. Experimental data from the installed system are used to validate the two programs and to quantify the savings created by each component within the hybrid system. This thesis bridges the gap between design optimization studies that frequently lack subsequent validation and experimental hybrid system performance studies.

  1. Activating representations in permanent memory: different benefits for pictures and words.

    PubMed

    Seifert, L S

    1997-09-01

    Previous research has suggested that pictures have privileged access to semantic memory (W. R. Glaser, 1992), but J. Theios and P. C. Amrhein (1989b) argued that prior studies inappropriately used large pictures and small words. In Experiment 1, participants categorized pictures reliably faster than words, even when both types of items were of optimal perceptual size. In Experiment 2, a poststimulus flashmask and judgments about internal features did not eliminate picture superiority, indicating that it was not due to differences in early visual processing or analysis of visible features. In Experiment 3, when participants made judgments about whether items were related, latencies were reliably faster for categorically related pictures than for words, but there was no picture advantage for noncategorically associated items. Results indicate that pictures have privileged access to semantic memory for categories, but that neither pictures nor words seem to have privileged access to noncategorical associations.

  2. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer a valuable reference for risk computation in building construction projects.
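    The information-entropy step referenced above can be sketched compactly: criteria whose scores vary more across candidate schemes carry more information and receive larger weights. The score matrix below is invented for illustration.

        import numpy as np

        # rows: candidate schemes; columns: cost, progress, quality, safety scores
        scores = np.array([[0.70, 0.80, 0.90, 0.60],
                           [0.85, 0.60, 0.75, 0.80],
                           [0.60, 0.90, 0.80, 0.70]])

        p = scores / scores.sum(axis=0)                 # normalize each criterion
        k = 1.0 / np.log(scores.shape[0])
        entropy = -k * (p * np.log(p)).sum(axis=0)      # entropy per criterion
        weights = (1 - entropy) / (1 - entropy).sum()   # more spread -> more weight
        synthesis = scores @ weights                    # weighted score per scheme
        print("criterion weights:", np.round(weights, 3))
        print("best scheme index:", int(np.argmax(synthesis)))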

  3. A Simulation Based Approach to Optimize Berth Throughput Under Uncertainty at Marine Container Terminals

    NASA Technical Reports Server (NTRS)

    Golias, Mihalis M.

    2011-01-01

    Berth scheduling is a critical function at marine container terminals, and determining the best berth schedule depends on several factors including the type and function of the port, the size of the port, its location, nearby competition, and the type of contractual agreement between the terminal and the carriers. In this paper we formulate the berth scheduling problem as a bi-objective mixed-integer problem with the objectives of maximizing customer satisfaction and the reliability of the berth schedule, under the assumption that vessel handling times are stochastic parameters following a discrete and known probability distribution. A combination of an exact algorithm, a Genetic Algorithm-based heuristic and a simulation-based post-Pareto analysis is proposed as the solution approach to the resulting problem. Based on a number of experiments it is concluded that the proposed berth scheduling policy outperforms the berth scheduling policy where reliability is not considered.

  4. Design and control strategy for a hybrid green energy system for mobile telecommunication sites

    NASA Astrophysics Data System (ADS)

    Okundamiya, Michael S.; Emagbetere, Joy O.; Ogujor, Emmanuel A.

    2014-07-01

    The rising energy costs and carbon footprint of operating mobile telecommunication sites in the emerging world have increased research interest in green technology. The intermittent nature of most green energy sources creates the problem of designing the optimum configuration for a given location. This study presents the design analysis and control strategy for cost-effective and reliable operation of a hybrid green energy system (HGES) for GSM base transceiver station (BTS) sites in isolated regions. The design constrains the generation and distribution of power to reliably satisfy the energy demand while ensuring safe operation of the system. The overall process control applies a genetic algorithm-based technique for optimal techno-economic sizing of the system's components. The process simulation utilized meteorological data for three locations (Abuja, Benin City and Sokoto) with varying climatic conditions in Nigeria. Simulation results for green GSM BTS sites are presented, discussed and compared with existing approaches.

  5. Determination of five active compounds in Artemisia princeps and A. capillaris based on UPLC-DAD and discrimination of two species with multivariate analysis.

    PubMed

    Yang, Heejung; Lee, Dong Young; Jeon, Minji; Suh, Youngbae; Sung, Sang Hyun

    2014-05-01

    Five active compounds, chlorogenic acid, 3,5-di-O-caffeoylquinic acid, 4,5-di-O-caffeoylquinic acid, jaceosidin, and eupatilin, in Artemisia princeps (Compositae) were simultaneously determined by ultra-performance liquid chromatography coupled to a diode array detector. The morphological resemblance between A. princeps and A. capillaris makes it difficult to identify the species properly, which occasionally leads to misuse or misapplication in Korean traditional medicine. In this study, discrimination between A. princeps and A. capillaris was achieved with the developed and validated method, which revealed a definite difference between the two species. The most reliable markers contributing to the discrimination of the two species were also identified using multivariate analysis methods, namely principal component analysis and partial least squares discriminant analysis.
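    The discrimination step is conceptually simple: each sample becomes a vector of quantified peak areas, and an unsupervised projection separates the species. A minimal sketch of that idea, with simulated peak areas standing in for the UPLC-DAD data (values and group means are illustrative only):

    ```python
    # PCA on a samples x compounds matrix of peak areas; the two species
    # separate along the first principal component when their compound
    # profiles differ. Synthetic data, not the study's measurements.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    princeps = rng.normal([5.0, 2.0, 1.5, 0.8, 1.2], 0.2, size=(10, 5))
    capillaris = rng.normal([1.0, 3.5, 3.0, 0.1, 0.2], 0.2, size=(10, 5))
    X = np.vstack([princeps, capillaris])

    scores = PCA(n_components=2).fit_transform(X)
    # Group means on PC1 have opposite signs, i.e. the species separate.
    print(scores[:10, 0].mean(), scores[10:, 0].mean())
    ```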

  6. Quality Assurance and T&E of Inertial Systems for RLV Mission

    NASA Astrophysics Data System (ADS)

    Sathiamurthi, S.; Thakur, Nayana; Hari, K.; Peter, Pilmy; Biju, V. S.; Mani, K. S.

    2017-12-01

    This work describes the quality assurance and Test and Evaluation (T&E) activities carried out for the inertial systems flown successfully in India's first reusable launch vehicle technology demonstrator hypersonic experiment mission. As part of reliability analysis, failure mode effect and criticality analysis and derating analysis were carried out in the initial design phase; the findings were presented to design review forums and the recommendations were implemented. The T&E plan was meticulously worked out and presented to the respective forums for review and implementation. Test data analysis, health parameter plotting and test report generation were automated; these automations significantly reduced the time required for these activities and helped avoid manual errors. Further, the T&E cycle was optimized without compromising on quality. These specific measures helped achieve zero-defect delivery of inertial systems for the RLV application.

  7. Development and testing of the cancer multidisciplinary team meeting observational tool (MDT-MOT)

    PubMed Central

    Harris, Jenny; Taylor, Cath; Sevdalis, Nick; Jalil, Rozh; Green, James S.A.

    2016-01-01

    Objective: To develop a tool for independent observational assessment of cancer multidisciplinary team meetings (MDMs), test criterion validity and inter-rater reliability/agreement, and describe performance. Design: Clinicians and experts in teamwork used a mixed-methods approach to develop and refine the tool. Study 1 observers rated pre-determined optimal/sub-optimal MDM film excerpts and Study 2 observers independently rated video-recordings of 10 MDMs. Setting: Study 2 included 10 cancer MDMs in England. Participants: Testing was undertaken by 13 health service staff and a clinical and non-clinical observer. Intervention: None. Main Outcome Measures: Tool development, validity, reliability/agreement and variability in MDT performance. Results: Study 1: Observers were able to discriminate between optimal and sub-optimal MDM performance (P ≤ 0.05). Study 2: Inter-rater reliability was good for 3/10 domains. Percentage of absolute agreement was high (≥80%) for 4/10 domains and percentage agreement within 1 point was high for 9/10 domains. Four MDTs performed well (scored 3+ in at least 8/10 domains), 5 MDTs performed well in 6–7 domains and 1 MDT performed well in only 4 domains. Leadership and chairing of the meeting, the organization and administration of the meeting, and clinical decision-making processes all varied significantly between MDMs (P ≤ 0.01). Conclusions: MDT-MOT demonstrated good criterion validity. Agreement between clinical and non-clinical observers (within one point on the scale) was high but this was inconsistent with reliability coefficients and warrants further investigation. If further validated, MDT-MOT might provide a useful mechanism for the routine assessment of MDMs by the local workforce to drive improvements in MDT performance. PMID:27084499

  8. Development and testing of the cancer multidisciplinary team meeting observational tool (MDT-MOT).

    PubMed

    Harris, Jenny; Taylor, Cath; Sevdalis, Nick; Jalil, Rozh; Green, James S A

    2016-06-01

    To develop a tool for independent observational assessment of cancer multidisciplinary team meetings (MDMs), test criterion validity and inter-rater reliability/agreement, and describe performance. Clinicians and experts in teamwork used a mixed-methods approach to develop and refine the tool. Study 1 observers rated pre-determined optimal/sub-optimal MDM film excerpts and Study 2 observers independently rated video-recordings of 10 MDMs. Study 2 included 10 cancer MDMs in England. Testing was undertaken by 13 health service staff and a clinical and non-clinical observer. There was no intervention. Outcome measures were tool development, validity, reliability/agreement and variability in MDT performance. Study 1: Observers were able to discriminate between optimal and sub-optimal MDM performance (P ≤ 0.05). Study 2: Inter-rater reliability was good for 3/10 domains. Percentage of absolute agreement was high (≥80%) for 4/10 domains and percentage agreement within 1 point was high for 9/10 domains. Four MDTs performed well (scored 3+ in at least 8/10 domains), 5 MDTs performed well in 6-7 domains and 1 MDT performed well in only 4 domains. Leadership and chairing of the meeting, the organization and administration of the meeting, and clinical decision-making processes all varied significantly between MDMs (P ≤ 0.01). MDT-MOT demonstrated good criterion validity. Agreement between clinical and non-clinical observers (within one point on the scale) was high but this was inconsistent with reliability coefficients and warrants further investigation. If further validated, MDT-MOT might provide a useful mechanism for the routine assessment of MDMs by the local workforce to drive improvements in MDT performance. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  9. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    NASA Astrophysics Data System (ADS)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Here, improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space are presented, together with a demo application, and progressive automatic model generation is discussed. GEOGINE can serve as an efficient computational kernel for fast, reliable application development and delivery, mainly in advanced biomedical engineering, biometrics, intelligent computing, target recognition, content-based image retrieval and data mining. An ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of world objects. Under this approach, "n-D Tensor Calculus" can be considered a "Formal Language" for reliably computing optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. Any highly sophisticated application needing effective, robust capture and parameterization of geometric/colour invariant attributes for reliable automated object learning and discrimination can therefore benefit from the GEOGINE progressive automated model generation computational kernel. The main operational advantages over previous, similar approaches are: 1) progressive automated invariant model generation, 2) an invariant minimal complete description set for computational efficiency, and 3) arbitrary model precision for robust object description and identification.

  10. Determination of the optimal number of components in independent components analysis.

    PubMed

    Kassouf, Amine; Jouan-Rimbaud Bouveresse, Delphine; Rutledge, Douglas N

    2018-03-01

    Independent components analysis (ICA) may be considered one of the most established blind source separation techniques for the treatment of complex data sets in analytical chemistry. As with other similar methods, the determination of the optimal number of latent variables, in this case independent components (ICs), is a crucial step before any modeling. Therefore, validation methods are required in order to decide on the optimal number of ICs to be used in the computation of the final model. In this paper, three new validation methods are formally presented. The first, called Random_ICA, is a generalization of the ICA_by_blocks method. Its specificity resides in the random way of splitting the initial data matrix into two blocks and then repeating this procedure several times, giving a broader perspective for the selection of the optimal number of ICs. The second method, called KMO_ICA_Residuals, is based on the computation of the Kaiser-Meyer-Olkin (KMO) index of the transposed residual matrices obtained after progressive extraction of ICs. The third method, called ICA_corr_y, helps to select the optimal number of ICs by computing the correlations between calculated proportions and known physico-chemical information about samples, generally concentrations, or between a source signal known to be present in the mixture and the signals extracted by ICA. These three methods were tested on varied simulated and experimental data sets and compared, when necessary, to ICA_by_blocks. Results were consistent with expectations, supporting the reliability of the three proposed methods. Copyright © 2017 Elsevier B.V. All rights reserved.
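    Of the three methods, the split-half idea behind Random_ICA is the easiest to illustrate: if components extracted from two random halves of the samples still match well, the chosen number of ICs is not yet too large. Below is a minimal sketch in that spirit on synthetic mixtures; the max-correlation pairing of mixing-matrix columns is a simplification, not the authors' exact procedure.

    ```python
    # Split-half stability check for the number of ICs, on synthetic data.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(1)
    S = np.c_[np.sin(np.linspace(0, 40, 400)), rng.laplace(size=400)]  # 2 sources
    A = rng.normal(size=(2, 6))
    X = S @ A + 0.01 * rng.normal(size=(400, 6))  # 400 samples x 6 channels

    def split_half_similarity(X, n_ics, rng):
        idx = rng.permutation(len(X))
        half = len(X) // 2
        # Mixing columns live in variable space, so they stay comparable
        # even though the two halves contain different samples.
        M1 = FastICA(n_components=n_ics, random_state=0).fit(X[idx[:half]]).mixing_
        M2 = FastICA(n_components=n_ics, random_state=0).fit(X[idx[half:]]).mixing_
        C = np.abs(np.corrcoef(M1.T, M2.T)[:n_ics, n_ics:])
        return C.max(axis=1).mean()  # mean best-match correlation

    for k in (1, 2, 3, 4):
        print(k, round(split_half_similarity(X, k, rng), 3))
    # Similarity typically stays high up to the true number of sources (2)
    # and degrades for larger k.
    ```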

  11. Optimization of mass spectrometry acquisition parameters for determination of polycarbonate additives, degradation products, and colorants migrating from food contact materials to chocolate.

    PubMed

    Bignardi, Chiara; Cavazza, Antonella; Laganà, Carmen; Salvadeo, Paola; Corradini, Claudio

    2018-01-01

    Interest in "substances of emerging concern" released from objects intended to come into contact with food has been growing recently. Such substances can be found in traces in simulants and in food products put in contact with plastic materials. In this context, it is important to set up analytical systems characterized by high sensitivity and to improve detection parameters to enhance signals. This work was aimed at optimizing a method based on UHPLC coupled to high-resolution mass spectrometry to quantify the most common plastic additives and to detect polymer degradation products and coloring agents migrating from re-usable plastic containers. The optimization of mass spectrometric parameter settings for quantitative analysis of additives was achieved by a chemometric approach, using full factorial and d-optimal experimental designs, allowing possible interactions between the investigated parameters to be evaluated. Results showed that the optimized method had improved sensitivity with respect to existing methods and was successfully applied to the analysis of a complex model food system, chocolate put in contact with 14 polycarbonate tableware samples. A new procedure for sample pre-treatment was developed and validated, showing high reliability. Results reported, for the first time, the presence of several molecules migrating to chocolate, in particular plastic additives such as Cyasorb UV5411, Tinuvin 234, Uvitex OB, and oligomers, whose amount was found to be correlated with the age and degree of damage of the containers. Copyright © 2017 John Wiley & Sons, Ltd.
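    The full-factorial part of such a design is mechanical to set up: enumerate every combination of factor levels, measure the response for each run, and pick the best setting. A minimal sketch follows; the factor names, levels and the response function are hypothetical placeholders, not the published method.

    ```python
    # Full-factorial enumeration of MS acquisition settings (illustrative).
    from itertools import product

    factors = {
        "spray_voltage_kV": [2.5, 3.0, 3.5],
        "capillary_temp_C": [275, 300, 325],
        "sheath_gas": [35, 45],
    }

    def observed_signal(run):
        # Placeholder response; in practice this is the measured peak intensity.
        return (-(run["spray_voltage_kV"] - 3.0) ** 2
                - ((run["capillary_temp_C"] - 300) / 50) ** 2
                - ((run["sheath_gas"] - 45) / 10) ** 2)

    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    best = max(runs, key=observed_signal)
    print(len(runs), "runs; best settings:", best)  # 3*3*2 = 18 runs
    ```

    A d-optimal design would instead select an information-maximizing subset of these runs, which matters when the full factorial is too expensive to execute.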

  12. Prediction of chemical biodegradability using support vector classifier optimized with differential evolution.

    PubMed

    Cao, Qi; Leung, K M

    2014-09-22

    Reliable computer models for the prediction of chemical biodegradability from molecular descriptors and fingerprints are very important for making health and environmental decisions. Coupling of the differential evolution (DE) algorithm with the support vector classifier (SVC) in order to optimize the main parameters of the classifier resulted in an improved classifier called the DE-SVC, which is introduced in this paper for use in chemical biodegradability studies. The DE-SVC was applied to predict the biodegradation of chemicals on the basis of extensive sample data sets and known structural features of molecules. Our optimization experiments showed that DE can efficiently find the proper parameters of the SVC. The resulting classifier possesses strong robustness and reliability compared with grid search, genetic algorithm, and particle swarm optimization methods. The classification experiments conducted here showed that the DE-SVC exhibits better classification performance than models previously used for such studies. It is a more effective and efficient prediction model for chemical biodegradability.
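    The DE-plus-SVC coupling is easy to prototype with standard tools. The sketch below uses scipy's differential evolution to tune an SVC's C and gamma against cross-validated accuracy; the dataset, bounds and DE settings are stand-ins, not the paper's descriptors, fingerprints or configuration.

    ```python
    # Differential evolution over (log10 C, log10 gamma) for an SVC.
    import numpy as np
    from scipy.optimize import differential_evolution
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=20, random_state=0)

    def negative_cv_accuracy(params):
        log_C, log_gamma = params
        clf = SVC(C=10.0 ** log_C, gamma=10.0 ** log_gamma)
        return -cross_val_score(clf, X, y, cv=5).mean()  # DE minimizes

    result = differential_evolution(
        negative_cv_accuracy,
        bounds=[(-2, 3), (-4, 1)],   # C in [1e-2, 1e3], gamma in [1e-4, 1e1]
        maxiter=20, popsize=10, seed=0, polish=False,
    )
    print("best C=%.3g gamma=%.3g acc=%.3f"
          % (10 ** result.x[0], 10 ** result.x[1], -result.fun))
    ```

    Searching in log space is the usual choice for SVC hyperparameters because useful values of C and gamma span several orders of magnitude.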

  13. Spur, helical, and spiral bevel transmission life modeling

    NASA Technical Reports Server (NTRS)

    Savage, Michael; Rubadeux, Kelly L.; Coe, Harold H.; Coy, John J.

    1994-01-01

    A computer program, TLIFE, which estimates the life, dynamic capacity, and reliability of aircraft transmissions, is presented. The program enables comparisons of transmission service life at the design stage for optimization. A variety of transmissions may be analyzed, including spur, helical, and spiral bevel reductions as well as series combinations of these reductions. The basic spur and helical reductions include single-mesh, compound, and parallel-path arrangements plus reverted star and planetary gear trains. A variety of straddle and overhung bearing configurations on the gear shafts are possible, as is the use of a ring gear for the output. The spiral bevel reductions include single and dual input drives with arbitrary shaft angles. The program is written in FORTRAN 77 and has been executed both in the personal computer DOS environment and on UNIX workstations. The analysis may be performed in either the SI metric or the English inch system of units. The reliability and life analysis is based on the two-parameter Weibull distribution lives of the component gears and bearings. The program output file describes the overall transmission and each constituent transmission, its components, and their locations, capacities, and loads. Primary output is the dynamic capacity and the 90-percent-reliability and mean lives of the unit transmissions and the overall system, which can be used to estimate service overhaul frequency requirements. Two examples are presented to illustrate the information available for single-element and series transmissions.
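    The backbone of this kind of life model is the combination rule for two-parameter Weibull component lives: in a strict series system the component reliabilities multiply, and the system 90-percent-reliability (L10) life is the time at which that product drops to 0.90. A minimal sketch with illustrative component parameters (not TLIFE output):

    ```python
    # Series-system L10 life from two-parameter Weibull component lives.
    import numpy as np
    from scipy.optimize import brentq

    components = [  # (eta = characteristic life in hours, beta = Weibull slope)
        (9000.0, 1.5),   # input bearing
        (14000.0, 1.2),  # output bearing
        (30000.0, 2.5),  # gear mesh
    ]

    def system_reliability(t):
        # R_sys(t) = prod_i exp(-(t / eta_i) ** beta_i) for a series system
        return np.prod([np.exp(-(t / eta) ** beta) for eta, beta in components])

    # L10 life: the time at which system reliability falls to 0.90.
    l10 = brentq(lambda t: system_reliability(t) - 0.90, 1.0, 1e5)
    print("system L10 ~ %.0f hours" % l10)
    ```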

  14. Brain networks for confidence weighting and hierarchical inference during probabilistic learning.

    PubMed

    Meyniel, Florent; Dehaene, Stanislas

    2017-05-09

    Learning is difficult when the world fluctuates randomly and ceaselessly. Classical learning algorithms, such as the delta rule with constant learning rate, are not optimal. Mathematically, the optimal learning rule requires weighting prior knowledge and incoming evidence according to their respective reliabilities. This "confidence weighting" implies the maintenance of an accurate estimate of the reliability of what has been learned. Here, using fMRI and an ideal-observer analysis, we demonstrate that the brain's learning algorithm relies on confidence weighting. While in the fMRI scanner, human adults attempted to learn the transition probabilities underlying an auditory or visual sequence, and reported their confidence in those estimates. They knew that these transition probabilities could change simultaneously at unpredicted moments, and therefore that the learning problem was inherently hierarchical. Subjective confidence reports tightly followed the predictions derived from the ideal observer. In particular, subjects managed to attach distinct levels of confidence to each learned transition probability, as required by Bayes-optimal inference. Distinct brain areas tracked the likelihood of new observations given current predictions, and the confidence in those predictions. Both signals were combined in the right inferior frontal gyrus, where they operated in agreement with the confidence-weighting model. This brain region also presented signatures of a hierarchical process that disentangles distinct sources of uncertainty. Together, our results provide evidence that the sense of confidence is an essential ingredient of probabilistic learning in the human brain, and that the right inferior frontal gyrus hosts a confidence-based statistical learning algorithm for auditory and visual sequences.
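    For Gaussian beliefs, the confidence-weighting rule described here reduces to the textbook precision-weighted update: prior and incoming evidence are combined in proportion to their inverse variances, so the less reliable source gets less weight. A minimal illustration with arbitrary numbers (not the paper's hierarchical model):

    ```python
    # Precision-weighted (Bayes-optimal) fusion of two Gaussian estimates.
    def precision_weighted_update(mu_prior, var_prior, mu_obs, var_obs):
        w_prior, w_obs = 1.0 / var_prior, 1.0 / var_obs
        mu_post = (w_prior * mu_prior + w_obs * mu_obs) / (w_prior + w_obs)
        var_post = 1.0 / (w_prior + w_obs)  # posterior is always sharper
        return mu_post, var_post

    # A reliable prior (low variance) barely moves; an unreliable one moves a lot.
    print(precision_weighted_update(0.70, 0.01, 0.30, 0.10))  # stays near 0.70
    print(precision_weighted_update(0.70, 0.10, 0.30, 0.01))  # jumps near 0.30
    ```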

  15. Brain networks for confidence weighting and hierarchical inference during probabilistic learning

    PubMed Central

    Meyniel, Florent; Dehaene, Stanislas

    2017-01-01

    Learning is difficult when the world fluctuates randomly and ceaselessly. Classical learning algorithms, such as the delta rule with constant learning rate, are not optimal. Mathematically, the optimal learning rule requires weighting prior knowledge and incoming evidence according to their respective reliabilities. This “confidence weighting” implies the maintenance of an accurate estimate of the reliability of what has been learned. Here, using fMRI and an ideal-observer analysis, we demonstrate that the brain’s learning algorithm relies on confidence weighting. While in the fMRI scanner, human adults attempted to learn the transition probabilities underlying an auditory or visual sequence, and reported their confidence in those estimates. They knew that these transition probabilities could change simultaneously at unpredicted moments, and therefore that the learning problem was inherently hierarchical. Subjective confidence reports tightly followed the predictions derived from the ideal observer. In particular, subjects managed to attach distinct levels of confidence to each learned transition probability, as required by Bayes-optimal inference. Distinct brain areas tracked the likelihood of new observations given current predictions, and the confidence in those predictions. Both signals were combined in the right inferior frontal gyrus, where they operated in agreement with the confidence-weighting model. This brain region also presented signatures of a hierarchical process that disentangles distinct sources of uncertainty. Together, our results provide evidence that the sense of confidence is an essential ingredient of probabilistic learning in the human brain, and that the right inferior frontal gyrus hosts a confidence-based statistical learning algorithm for auditory and visual sequences. PMID:28439014

  16. Meta-Heuristics in Short Scale Construction: Ant Colony Optimization and Genetic Algorithm.

    PubMed

    Schroeders, Ulrich; Wilhelm, Oliver; Olaru, Gabriel

    2016-01-01

    The advent of large-scale assessment, but also the more frequent use of longitudinal and multivariate approaches to measurement in psychological, educational, and sociological research, caused an increased demand for psychometrically sound short scales. Shortening scales economizes on valuable administration time, but might result in inadequate measures because reducing an item set could: a) change the internal structure of the measure, b) result in poorer reliability and measurement precision, c) deliver measures that cannot effectively discriminate between persons on the intended ability spectrum, and d) reduce test-criterion relations. Different approaches to abbreviate measures fare differently with respect to the above-mentioned problems. Therefore, we compare the quality and efficiency of three item selection strategies to derive short scales from an existing long version: a Stepwise COnfirmatory Factor Analytical approach (SCOFA) that maximizes factor loadings and two metaheuristics, specifically an Ant Colony Optimization (ACO) with a tailored user-defined optimization function and a Genetic Algorithm (GA) with an unspecific cost-reduction function. SCOFA compiled short versions were highly reliable, but had poor validity. In contrast, both metaheuristics outperformed SCOFA and produced efficient and psychometrically sound short versions (unidimensional, reliable, sensitive, and valid). We discuss under which circumstances ACO and GA produce equivalent results and provide recommendations for conditions in which it is advisable to use a metaheuristic with an unspecific out-of-the-box optimization function.
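    To make the evolutionary-selection idea concrete, here is a deliberately stripped-down item selection on simulated data: candidate short scales are fixed-size item subsets, fitness is Cronbach's alpha (a crude stand-in for the paper's cost functions), and offspring arise by swapping one selected item for an unselected one. This is mutation-only, so a degenerate GA, not the authors' implementation.

    ```python
    # Toy evolutionary short-scale selection maximizing Cronbach's alpha.
    import numpy as np

    rng = np.random.default_rng(0)
    n_persons, n_items, scale_len = 500, 30, 8
    ability = rng.normal(size=(n_persons, 1))
    loadings = rng.uniform(0.2, 0.9, size=(1, n_items))
    X = ability * loadings + rng.normal(scale=0.7, size=(n_persons, n_items))

    def cronbach_alpha(data):
        k = data.shape[1]
        return k / (k - 1) * (1 - data.var(axis=0, ddof=1).sum()
                              / data.sum(axis=1).var(ddof=1))

    def mutate(subset):
        child = set(subset)
        child.remove(rng.choice(sorted(child)))              # drop one item
        child.add(rng.choice(sorted(set(range(n_items)) - child)))  # add another
        return child

    pop = [set(rng.choice(n_items, scale_len, replace=False)) for _ in range(30)]
    for _ in range(50):
        pop += [mutate(s) for s in pop]                      # offspring
        pop = sorted(pop, key=lambda s: cronbach_alpha(X[:, sorted(s)]),
                     reverse=True)[:30]                      # survivors

    best = pop[0]
    print(sorted(best), round(cronbach_alpha(X[:, sorted(best)]), 3))
    ```

    An alpha-only fitness reproduces exactly the pathology the paper warns about: it favors homogeneous items and can sacrifice validity, which is why the authors' cost functions also weigh dimensionality and criterion relations.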

  17. A novel rapid and reproducible flow cytometric method for optimization of transfection efficiency in cells

    PubMed Central

    Homann, Stefanie; Hofmann, Christian; Gorin, Aleksandr M.; Nguyen, Huy Cong Xuan; Huynh, Diana; Hamid, Phillip; Maithel, Neil; Yacoubian, Vahe; Mu, Wenli; Kossyvakis, Athanasios; Sen Roy, Shubhendu; Yang, Otto Orlean

    2017-01-01

    Transfection is one of the most frequently used techniques in molecular biology that is also applicable to gene therapy studies in humans. One of the biggest challenges in investigating protein function and interaction in gene therapy studies is having reliable monospecific detection reagents, particularly antibodies, for all human gene products. Thus, a reliable method that can optimize transfection efficiency based not only on expression of the target protein of interest but also on the uptake of the nucleic acid plasmid can be an important tool in molecular biology. Here, we present a simple, rapid and robust flow cytometric method that can be used to optimize transfection efficiency at the single-cell level while overcoming limitations of prior established methods for quantifying transfection efficiency. By using optimized ratios of transfection reagent and a nucleic acid (DNA or RNA) vector directly labeled with a fluorochrome, this method can simultaneously quantify the cellular toxicity of different transfection reagents, the amount of nucleic acid plasmid that cells have taken up during transfection and the amount of the encoded expressed protein. Finally, we demonstrate that this method is reproducible, can be standardized, and can reliably and rapidly quantify transfection efficiency, reducing assay costs and increasing throughput while improving data robustness. PMID:28863132

  18. Meta-Heuristics in Short Scale Construction: Ant Colony Optimization and Genetic Algorithm

    PubMed Central

    Schroeders, Ulrich; Wilhelm, Oliver; Olaru, Gabriel

    2016-01-01

    The advent of large-scale assessment, but also the more frequent use of longitudinal and multivariate approaches to measurement in psychological, educational, and sociological research, caused an increased demand for psychometrically sound short scales. Shortening scales economizes on valuable administration time, but might result in inadequate measures because reducing an item set could: a) change the internal structure of the measure, b) result in poorer reliability and measurement precision, c) deliver measures that cannot effectively discriminate between persons on the intended ability spectrum, and d) reduce test-criterion relations. Different approaches to abbreviate measures fare differently with respect to the above-mentioned problems. Therefore, we compare the quality and efficiency of three item selection strategies to derive short scales from an existing long version: a Stepwise COnfirmatory Factor Analytical approach (SCOFA) that maximizes factor loadings and two metaheuristics, specifically an Ant Colony Optimization (ACO) with a tailored user-defined optimization function and a Genetic Algorithm (GA) with an unspecific cost-reduction function. SCOFA compiled short versions were highly reliable, but had poor validity. In contrast, both metaheuristics outperformed SCOFA and produced efficient and psychometrically sound short versions (unidimensional, reliable, sensitive, and valid). We discuss under which circumstances ACO and GA produce equivalent results and provide recommendations for conditions in which it is advisable to use a metaheuristic with an unspecific out-of-the-box optimization function. PMID:27893845

  19. THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.

    PubMed

    Theobald, Douglas L; Wuttke, Deborah S

    2006-09-01

    THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.

  20. Single-case research design in pediatric psychology: considerations regarding data analysis.

    PubMed

    Cohen, Lindsey L; Feinstein, Amanda; Masuda, Akihiko; Vowles, Kevin E

    2014-03-01

    Single-case research allows for an examination of behavior and can demonstrate the functional relation between intervention and outcome in pediatric psychology. This review highlights key assumptions, methodological and design considerations, and options for data analysis. Single-case methodology and guidelines are reviewed with an in-depth focus on visual and statistical analyses. Guidelines allow for the careful evaluation of design quality and visual analysis. A number of statistical techniques have been introduced to supplement visual analysis, but to date, there is no consensus on their recommended use in single-case research design. Single-case methodology is invaluable for advancing pediatric psychology science and practice, and guidelines have been introduced to enhance the consistency, validity, and reliability of these studies. Experts generally agree that visual inspection is the optimal method of analysis in single-case design; however, statistical approaches are becoming increasingly evaluated and used to augment data interpretation.

  1. Multi-constituent determination and fingerprint analysis of Scutellaria indica L. using ultra high performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry.

    PubMed

    Liang, Xianrui; Zhao, Cui; Su, Weike

    2015-11-01

    An ultra-performance liquid chromatography method coupled with quadrupole time-of-flight mass spectrometry, integrating multi-constituent determination and fingerprint analysis, has been established for quality assessment and control of Scutellaria indica L. The optimized method is fast and efficient, allowing multi-constituent determination and fingerprint analysis in a single chromatographic run within 11 min. In total, 36 compounds were detected, and 23 of them were unequivocally identified or tentatively assigned. The established fingerprint method was applied to the analysis of ten S. indica samples from different geographic locations, and quality assessment was achieved using principal component analysis. The proposed method is useful and reliable for the characterization of multi-constituents in a complex chemical system and the overall quality assessment of S. indica. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Joint mobilization forces and therapist reliability in subjects with knee osteoarthritis

    PubMed Central

    Tragord, Bradley S; Gill, Norman W; Silvernail, Jason L; Teyhen, Deydre S; Allison, Stephen C

    2013-01-01

    Objectives: This study determined biomechanical force parameters and reliability among clinicians performing knee joint mobilizations. Methods: Sixteen subjects with knee osteoarthritis and six therapists participated in the study. Forces were recorded using a capacitive-based pressure mat for three techniques at two grades of mobilization, each with two trials of 15 seconds. Dosage (force–time integral), amplitude, and frequency were also calculated. Analysis of variance was used to analyze grade differences, intraclass correlation coefficients determined reliability, and correlations assessed force associations with subject and rater variables. Results: Grade IV mobilizations produced higher mean forces (P<0.001) and higher dosage (P<0.001), while grade III produced higher maximum forces (P = 0.001). Grade III forces (Newtons) by technique (mean, maximum) were: extension 48, 81; flexion 41, 68; and medial glide 21, 34. Grade IV forces (Newtons) by technique (mean, maximum) were: extension 58, 78; flexion 44, 60; and medial glide 22, 30. Frequency (Hertz) ranged between 0.9–1.1 (grade III) and 1.4–1.6 (grade IV). Intra-clinician reliability was excellent (>0.90). Inter-clinician reliability was moderate for force and dosage, and poor for amplitude and frequency. Discussion: Force measurements were consistent with previously reported ranges and clinical constructs. Grade III and grade IV mobilizations can be distinguished from each other with differences for force and frequency being small, and dosage and amplitude being large. Intra-clinician reliability was excellent for all biomechanical parameters and inter-clinician reliability for dosage, the main variable of clinical interest, was moderate. This study quantified the applied forces among multiple clinicians, which may help determine optimal dosage and standardize care. PMID:24421632

  3. Application of human factors engineering (HFE) to the design of a naloxone auto-injector for the treatment of opioid emergencies.

    PubMed

    Raffa, Robert B; Taylor, Robert; Pergolizzi, Joseph V; Nalamachu, Srinivas; Edwards, Eric S; Edwards, Evan T

    2017-02-01

    The increased use of opioids for the chronic treatment of pain and the resulting epidemic of opioid overdoses have created a major public health challenge. Parenteral naloxone has been used since the 1970s to treat opioid overdose. Recently, a novel naloxone auto-injector device (EVZIO, kaleo, Inc., Richmond, VA) was approved by the Food and Drug Administration. In this article, we review the Human Factors Engineering (HFE) process used in the development and testing of this novel naloxone auto-injector, currently used in nonmedical settings for the emergency treatment of known or suspected opioid overdose. HFE methods were employed throughout the product development process, including formative and summative studies, in order to optimize the auto-injector's user interface, mitigate use-related hazards and increase reliability during an opioid emergency use scenario. HFE was also used to optimize the product's design and user interface in order to reduce or prevent user confusion and misuse. The naloxone auto-injector went through a rigorous HFE process that included perceptual, cognitive, and physical action analysis; formative usability evaluations; use error analysis and summative design validation studies. Applying HFE resulted in a product that is safe, fast and easy to use, and predictably reliable in delivering a potentially life-saving dose of naloxone during an opioid overdose emergency. The naloxone auto-injector may be considered a universal precaution option for at-risk patients prescribed opioids or those at increased risk of an opioid overdose emergency.

  4. Adaptive particle swarm optimization for optimal orbital elements of binary stars

    NASA Astrophysics Data System (ADS)

    Attia, Abdel-Fattah

    2016-12-01

    The paper presents an adaptive particle swarm optimization (APSO) as an alternative method to determine the optimal orbital elements of the star η Bootis of MK type G0 IV. The proposed algorithm transforms the problem of finding periodic orbits into one of detecting the global minimizers of a cost function, to obtain the best fit of the Keplerian and phase curves. The experimental results demonstrate that the proposed APSO approach is generally more accurate than standard particle swarm optimization (PSO) and other published optimization algorithms in terms of solution accuracy, convergence speed and algorithm reliability.
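    For orientation, here is a bare-bones (non-adaptive) PSO of the kind the paper builds on, shown on a generic least-squares curve fit rather than the actual Keplerian/phase-curve objective. The inertia and acceleration constants are common textbook values, not the APSO adaptation rule.

    ```python
    # Minimal particle swarm optimizer on a least-squares objective.
    import numpy as np

    def pso(objective, bounds, n_particles=30, iters=200,
            w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([objective(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[pbest_f.argmin()].copy()
        return gbest, pbest_f.min()

    # Hypothetical two-parameter fit standing in for the orbital problem.
    t = np.linspace(0, 1, 50)
    data = 2.0 * np.sin(2 * np.pi * t) + 0.5
    best, err = pso(lambda p: ((p[0] * np.sin(2 * np.pi * t) + p[1] - data) ** 2).sum(),
                    bounds=[(0, 5), (-2, 2)])
    print(best, err)  # recovers amplitude ~2.0 and offset ~0.5
    ```

    The APSO variant in the paper adapts parameters such as the inertia weight during the run, which is what improves convergence over this fixed-parameter baseline.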

  5. Genome-Wide Identification and Evaluation of Reference Genes for Quantitative RT-PCR Analysis during Tomato Fruit Development.

    PubMed

    Cheng, Yuan; Bian, Wuying; Pang, Xin; Yu, Jiahong; Ahammed, Golam J; Zhou, Guozhi; Wang, Rongqing; Ruan, Meiying; Li, Zhimiao; Ye, Qingjing; Yao, Zhuping; Yang, Yuejian; Wan, Hongjian

    2017-01-01

    Gene expression analysis in tomato fruit has drawn increasing attention. Quantitative real-time PCR (qPCR) is a routine technique for gene expression analysis, and the reliability of qPCR results largely depends on the choice of appropriate reference genes (RGs). Although tomato is a model for fruit biology, few RGs for qPCR analysis in tomato fruit have yet been developed. In this study, we initially identified the 38 most stably expressed genes based on a tomato transcriptome data set, and their expression stabilities were further determined in a set of tomato fruit samples from four developmental stages (immature, mature green, breaker, mature red) using qPCR analysis. Two statistical algorithms, geNorm and NormFinder, concordantly confirmed the suitability of the identified putative RGs. Notably, SlFRG05 (Solyc01g104170), SlFRG12 (Solyc04g009770), SlFRG16 (Solyc10g081190), SlFRG27 (Solyc06g007510), and SlFRG37 (Solyc11g005330) proved to be suitable RGs for tomato fruit development studies. Further analysis using geNorm indicates that the combined use of SlFRG03 (Solyc02g063070) and SlFRG27 would provide more reliable normalization in qPCR experiments. The RGs identified in this study will be beneficial for future qPCR analyses in tomato fruit development studies, as well as for identifying optimal normalization controls in other plant species.
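    The geNorm ranking rests on a simple statistic: a gene's M value is the mean standard deviation of its pairwise log2 expression ratios with every other candidate, so stably expressed genes get low M. A small sketch on simulated expression values (not the study's data):

    ```python
    # geNorm-style stability measure M for candidate reference genes.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_genes = 16, 6
    # Relative expression levels; gene 0 is made deliberately unstable.
    expr = rng.lognormal(mean=2.0, sigma=0.05, size=(n_samples, n_genes))
    expr[:, 0] *= rng.lognormal(0.0, 0.6, size=n_samples)

    log_expr = np.log2(expr)

    def genorm_m(log_expr):
        n = log_expr.shape[1]
        m = np.empty(n)
        for j in range(n):
            ratios = log_expr[:, [j]] - log_expr   # log2 ratios vs all genes
            sds = ratios.std(axis=0, ddof=1)
            m[j] = np.delete(sds, j).mean()        # exclude the self-ratio
        return m

    print(np.round(genorm_m(log_expr), 3))  # gene 0 shows the largest M
    ```

    geNorm proper then iteratively removes the worst gene and recomputes M, which is how pairs such as SlFRG03 plus SlFRG27 emerge as the recommended combination.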

  6. Genome-Wide Identification and Evaluation of Reference Genes for Quantitative RT-PCR Analysis during Tomato Fruit Development

    PubMed Central

    Cheng, Yuan; Bian, Wuying; Pang, Xin; Yu, Jiahong; Ahammed, Golam J.; Zhou, Guozhi; Wang, Rongqing; Ruan, Meiying; Li, Zhimiao; Ye, Qingjing; Yao, Zhuping; Yang, Yuejian; Wan, Hongjian

    2017-01-01

    Gene expression analysis in tomato fruit has drawn increasing attention. Quantitative real-time PCR (qPCR) is a routine technique for gene expression analysis, and the reliability of qPCR results largely depends on the choice of appropriate reference genes (RGs). Although tomato is a model for fruit biology, few RGs for qPCR analysis in tomato fruit have yet been developed. In this study, we initially identified the 38 most stably expressed genes based on a tomato transcriptome data set, and their expression stabilities were further determined in a set of tomato fruit samples from four developmental stages (immature, mature green, breaker, mature red) using qPCR analysis. Two statistical algorithms, geNorm and NormFinder, concordantly confirmed the suitability of the identified putative RGs. Notably, SlFRG05 (Solyc01g104170), SlFRG12 (Solyc04g009770), SlFRG16 (Solyc10g081190), SlFRG27 (Solyc06g007510), and SlFRG37 (Solyc11g005330) proved to be suitable RGs for tomato fruit development studies. Further analysis using geNorm indicates that the combined use of SlFRG03 (Solyc02g063070) and SlFRG27 would provide more reliable normalization in qPCR experiments. The RGs identified in this study will be beneficial for future qPCR analyses in tomato fruit development studies, as well as for identifying optimal normalization controls in other plant species. PMID:28900431

  7. Transonic Wing Shape Optimization Using a Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A method for aerodynamic shape optimization based on a genetic algorithm approach is demonstrated. The algorithm is coupled with a transonic full potential flow solver and is used to optimize the flow about transonic wings including multi-objective solutions that lead to the generation of pareto fronts. The results indicate that the genetic algorithm is easy to implement, flexible in application and extremely reliable.

  8. Coupled Solid Rocket Motor Ballistics and Trajectory Modeling for Higher Fidelity Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Ables, Brett

    2014-01-01

    Multi-stage launch vehicles with solid rocket motors (SRMs) face design optimization challenges, especially when the mission scope changes frequently. Significant performance benefits can be realized if the solid rocket motors are optimized to the changing requirements. While SRMs represent a fixed performance at launch, rapid design iterations enable flexibility at design time, yielding significant performance gains. The streamlining and integration of SRM design and analysis can be achieved with improved analysis tools. While powerful and versatile, the Solid Performance Program (SPP) is not conducive to rapid design iteration. Performing a design iteration with SPP and a trajectory solver is a labor intensive process. To enable a better workflow, SPP, the Program to Optimize Simulated Trajectories (POST), and the interfaces between them have been improved and automated, and a graphical user interface (GUI) has been developed. The GUI enables real-time visual feedback of grain and nozzle design inputs, enforces parameter dependencies, removes redundancies, and simplifies manipulation of SPP and POST's numerous options. Automating the analysis also simplifies batch analyses and trade studies. Finally, the GUI provides post-processing, visualization, and comparison of results. Wrapping legacy high-fidelity analysis codes with modern software provides the improved interface necessary to enable rapid coupled SRM ballistics and vehicle trajectory analysis. Low cost trade studies demonstrate the sensitivities of flight performance metrics to propulsion characteristics. Incorporating high fidelity analysis from SPP into vehicle design reduces performance margins and improves reliability. By flying an SRM designed with the same assumptions as the rest of the vehicle, accurate comparisons can be made between competing architectures. In summary, this flexible workflow is a critical component to designing a versatile launch vehicle model that can accommodate a volatile mission scope.

  9. A Taguchi approach on optimal process control parameters for HDPE pipe extrusion process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, R. Umamaheswara; Rao, P. Srinivasa

    2017-06-01

    High-density polyethylene (HDPE) pipes find versatile applicability for the transportation of water, sewage and slurry, and hence undergo tremendous pressure from the fluid carried. The present work entails the optimization of the withstanding pressure of HDPE pipes using the Taguchi technique. The traditional heuristic methodology stresses a trial-and-error approach and relies heavily upon the accumulated experience of the process engineers for determining the process control parameters, which results in less-than-optimal settings. Hence there arose a need to determine optimal process control parameters for the pipe extrusion process that can ensure robust pipe quality and process reliability. In the proposed optimization strategy, designed experiments (DoE) are conducted in which different control parameter combinations are analyzed by considering multiple setting levels of each control parameter. The concept of the signal-to-noise ratio (S/N ratio) is applied, and the optimum values of the process control parameters are obtained as: a pushing zone temperature of 166 °C, a dimmer speed of 8 rpm, and a die head temperature of 192 °C. A confirmation run was also conducted to verify the analysis; its values agreed with the main experimental findings, and the withstanding pressure showed a significant improvement from 0.60 to 1.004 MPa.
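    Since withstanding pressure is a larger-the-better response, the relevant Taguchi statistic is SN = -10·log10(mean(1/y²)), computed per trial condition so the level with the highest SN can be picked for each control factor. A sketch with made-up replicate pressures (not the paper's data):

    ```python
    # Larger-the-better Taguchi signal-to-noise ratio per trial condition.
    import numpy as np

    def sn_larger_is_better(y):
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y ** 2))

    trials = {
        "pushing_zone_160C": [0.58, 0.61, 0.60],
        "pushing_zone_166C": [0.97, 1.00, 1.02],  # higher and more consistent
        "pushing_zone_172C": [0.80, 0.95, 0.70],
    }
    for name, pressures in trials.items():
        print(name, round(sn_larger_is_better(pressures), 2), "dB")
    ```

    The S/N ratio rewards both a high mean and low variability, which is why it is preferred over the raw mean when robustness is the goal.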

  10. Global optimization for motion estimation with applications to ultrasound videos of carotid artery plaques

    NASA Astrophysics Data System (ADS)

    Murillo, Sergio; Pattichis, Marios; Soliz, Peter; Barriga, Simon; Loizou, C. P.; Pattichis, C. S.

    2010-03-01

    Motion estimation from digital video is an ill-posed problem that requires a regularization approach. Regularization introduces a smoothness constraint that can reduce the resolution of the velocity estimates. The problem is further complicated for ultrasound (US) videos, where speckle noise levels can be significant. Motion estimation using optical flow models requires the tuning of several parameters to satisfy the optical flow constraint as well as the level of imposed smoothness. Furthermore, except in simulations or mostly unrealistic cases, there is no ground truth for validating the velocity estimates. This problem is present in all real video sequences used as input to motion estimation algorithms, and it is an open problem in biomedical applications such as motion analysis of US videos of carotid artery (CA) plaques. In this paper, we study the problem of obtaining reliable ultrasound video motion estimates for atherosclerotic plaques for use in clinical diagnosis. A global optimization framework for motion parameter optimization is presented. This framework uses actual carotid artery motions to provide optimal parameter values for a variety of motions and is tested on ten different US videos using two different motion estimation techniques.

  11. miRNA purification with an optimized PDMS microdevice: Toward the direct purification of low abundant circulating biomarkers.

    PubMed

    Santini, G C; Potrich, C; Lunelli, L; Vanzetti, L; Marasso, S L; Cocuzza, M; Pirri, F C; Pederzolli, C

    2017-10-01

    A reliable clinical assay based on circulating microRNAs (miRNAs) as biomarkers is much needed. Microdevices offer an attractive solution as a fast and inexpensive way of concentrating these biomarkers from a low sample volume. A previously developed polydimethylsiloxane (PDMS) microdevice able to purify and detect circulating miRNAs was optimized here. The optimization of the morphological and chemical surface properties by nanopatterning and functionalization is presented. Surfaces were first characterized by atomic force microscopy (AFM), X-ray photoelectron spectroscopy (XPS), the fluorescamine assay and the s-SDTB (sulphosuccinimidyl-4-o-(4,4-dimethoxytrityl) butyrate) assay, and subsequently tested for their capacity to adsorb a fluorescent miRNA. From our analysis, modification of the surface charge with 0.1% APTMS ((3-Aminopropyl)trimethoxysilane) and 0.9% PEG-s (2-[Methoxy-(polyethyleneoxy)propyl]trimethoxysilane), performed at 60 °C for 10 min, was identified as the best purification condition. Our optimized microdevice, integrated with real-time PCR detection, was demonstrated to selectively purify both synthetic and natural circulating miRNAs with a sensitivity of 0.01 pM. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Intelligent Tires Based on Measurement of Tire Deformation

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Ryosuke; Todoroki, Akira

    From a traffic safety point-of-view, there is an urgent need for intelligent tires as a warning system for road conditions, for optimized braking control on poor road surfaces and as a tire fault detection system. Intelligent tires, equipped with sensors for monitoring applied strain, are effective in improving reliability and control systems such as anti-lock braking systems (ABSs). In previous studies, we developed a direct tire deformation or strain measurement system with sufficiently low stiffness and high elongation for practical use, and a wireless communication system between tires and vehicle that operates without a battery. The present study investigates the application of strain data for an optimized braking control and road condition warning system. The relationships between strain sensor outputs and tire mechanical parameters, including braking torque, effective radius and contact patch length, are calculated using finite element analysis. Finally, we suggested the possibility of optimized braking control and road condition warning systems. Optimized braking control can be achieved by keeping the slip ratio constant. The road condition warning would be actuated if the recorded friction coefficient at a certain slip ratio is lower than a ‘safe’ reference value.
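    The closing claim, braking control by holding the slip ratio constant, rests on the standard definition slip = (v_vehicle − ω·r_eff)/v_vehicle, with the effective rolling radius r_eff recoverable from the strain-derived contact patch. A back-of-the-envelope sketch with illustrative values and threshold, not the authors' controller:

    ```python
    # Slip-ratio bookkeeping for a simple constant-slip braking rule.
    def slip_ratio(v_vehicle, omega_wheel, r_eff):
        return (v_vehicle - omega_wheel * r_eff) / v_vehicle

    v, r_eff = 25.0, 0.31             # m/s, m (r_eff would come from strain data)
    for omega in (80.0, 72.0, 64.0):  # rad/s, wheel slowing under braking
        s = slip_ratio(v, omega, r_eff)
        print(f"omega={omega:5.1f}  slip={s:+.2f}",
              "reduce brake torque" if s > 0.15 else "hold")
    ```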

  13. Intelligent tires for improved tire safety using wireless strain measurement

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Ryosuke; Todoroki, Akira

    2008-03-01

    From a traffic safety point-of-view, there is an urgent need for intelligent tires as a warning system for road conditions, for optimized braking control on poor road surfaces and as a tire fault detection system. Intelligent tires, equipped with sensors for monitoring applied strain, are effective in improving reliability and control systems such as anti-lock braking systems (ABSs). In previous studies, we developed a direct tire deformation or strain measurement system with sufficiently low stiffness and high elongation for practical use, and a wireless communication system between tires and vehicle that operates without a battery. The present study investigates the application of strain data for an optimized braking control and road condition warning system. The relationships between strain sensor outputs and tire mechanical parameters, including braking torque, effective radius and contact patch length, are calculated using finite element analysis. Finally, we suggested the possibility of optimized braking control and road condition warning systems. Optimized braking control can be achieved by keeping the slip ratio constant. The road condition warning would be actuated if the recorded friction coefficient at a certain slip ratio is lower than a 'safe' reference value.

  14. Intelligibility Evaluation of Pathological Speech through Multigranularity Feature Extraction and Optimization.

    PubMed

    Fang, Chunying; Li, Haifeng; Ma, Lin; Zhang, Mancai

    2017-01-01

    Pathological speech usually refers to speech distortion resulting from illness or other biological insults. The assessment of pathological speech plays an important role in assisting experts, while automatic evaluation of speech intelligibility is difficult because such speech is usually nonstationary and mutational. In this paper, we carry out an independent treatment of feature extraction and reduction, describing a multigranularity combined feature scheme that is optimized by a hierarchical visual method. A novel method of generating the feature set based on the S-transform and chaotic analysis is proposed: basic acoustic features (BAFS, 430), local spectral characteristics (MSCC, 84 Mel S-transform cepstrum coefficients), and chaotic features (12). Finally, radar charts and the F-score are used to optimize the features through hierarchical visual fusion. The feature set could be reduced from 526 to 96 dimensions on the NKI-CCRT corpus and to 104 dimensions on the SVD corpus. The experimental results show that the new features, classified with a support vector machine (SVM), give the best performance, with a recognition rate of 84.4% on the NKI-CCRT corpus and 78.7% on the SVD corpus. The proposed method thus proves effective and reliable for pathological speech intelligibility evaluation.
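    The F-score half of the feature-reduction step is easy to state in simplified two-class form: features whose class means separate well relative to within-class variance score high and are kept. The data below are random stand-ins for the acoustic/chaotic feature sets, not NKI-CCRT or SVD features.

    ```python
    # Two-class F-score feature ranking on simulated features.
    import numpy as np

    def f_scores(X, y):
        pos, neg = X[y == 1], X[y == 0]
        overall = X.mean(axis=0)
        between = (pos.mean(axis=0) - overall) ** 2 + (neg.mean(axis=0) - overall) ** 2
        within = pos.var(axis=0, ddof=1) + neg.var(axis=0, ddof=1)
        return between / within

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=200)
    X = rng.normal(size=(200, 10))
    X[:, 3] += 1.5 * y            # plant one genuinely discriminative feature
    scores = f_scores(X, y)
    keep = np.argsort(scores)[::-1][:4]
    print("top features:", keep)  # feature 3 ranks first
    ```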

  15. Validation of the UNESP-Botucatu unidimensional composite pain scale for assessing postoperative pain in cattle.

    PubMed

    de Oliveira, Flávia Augusta; Luna, Stelio Pacca Loureiro; do Amaral, Jackson Barros; Rodrigues, Karoline Alves; Sant'Anna, Aline Cristina; Daolio, Milena; Brondani, Juliana Tabarelli

    2014-09-06

    The recognition and measurement of pain in cattle are important in determining the necessity for and efficacy of analgesic intervention. The aim of this study was to record behaviour and determine the validity and reliability of an instrument to assess acute pain in 40 cattle subjected to orchiectomy after sedation with xylazine and local anaesthesia. The animals were filmed before and after orchiectomy to record behaviour. The pain scale was based on previous studies, on a pilot study and on analysis of the camera footage. Three blinded observers and a local observer assessed the edited films obtained during the preoperative and postoperative periods, before and after rescue analgesia and 24 hours after surgery. Re-evaluation was performed one month after the first analysis. Criterion validity (agreement) and item-total correlation using Spearman's coefficient were employed to refine the scale. Based on factor analysis, a unidimensional scale was adopted. The internal consistency of the data was excellent after refinement (Cronbach's α coefficient = 0.866). There was a high correlation (p < 0.001) between the proposed scale and the visual analogue, simple descriptive and numerical rating scales. The construct validity and responsiveness were confirmed by the increase and decrease in pain scores after surgery and rescue analgesia, respectively (p < 0.001). Inter- and intra-observer reliability ranged from moderate to very good. The optimal cut-off point for rescue analgesia was > 4, and analysis of the area under the curve (AUC = 0.963) showed excellent discriminatory ability. The UNESP-Botucatu unidimensional pain scale for assessing acute postoperative pain in cattle is a valid, reliable and responsive instrument with excellent internal consistency and discriminatory ability. The cut-off point for rescue analgesia provides an additional tool for guiding analgesic therapy.
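    An optimal cut-off such as "> 4" is the kind of threshold obtained by sweeping the ROC curve and maximizing Youden's J (sensitivity + specificity − 1). A sketch on fabricated scores and labels, not the study's data:

    ```python
    # Youden-index cut-off selection from an ROC sweep.
    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)
    pain = np.r_[np.zeros(40, dtype=int), np.ones(40, dtype=int)]  # reference labels
    score = np.r_[rng.normal(2.5, 1.2, 40), rng.normal(6.5, 1.5, 40)].clip(0, 10)

    fpr, tpr, thresholds = roc_curve(pain, score)
    j = tpr - fpr                     # Youden's J at each candidate threshold
    best = thresholds[j.argmax()]
    print("AUC = %.3f, rescue analgesia above %.1f"
          % (roc_auc_score(pain, score), best))
    ```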

  16. Comparison between three option, four option and five option multiple choice question tests for quality parameters: A randomized study.

    PubMed

    Vegada, Bhavisha; Shukla, Apexa; Khilnani, Ajeetkumar; Charan, Jaykaran; Desai, Chetna

    2016-01-01

    Most academic teachers use four or five options per item in multiple choice question (MCQ) tests for formative and summative assessment. The optimal number of options per MCQ item is a matter of considerable debate among academic teachers in various educational fields, and there is a scarcity of published literature regarding the optimum number in medical education. The aim was to compare three-option, four-option, and five-option MCQ tests on the quality parameters reliability, validity, item analysis, distracter analysis, and time analysis. Participants were 3rd-semester M.B.B.S. students, divided randomly into three groups; each group randomly received one test set of three-option, four-option, or five-option items. Following the marking of the tests, the participants' option selections were analyzed, and the mean marks, mean time, validity, reliability, facility value, discrimination index, point-biserial value, and distracter functioning of the three option formats were compared. Students scored higher (P = 0.000) and took less time (P = 0.009) to complete the three-option test compared with the four-option and five-option groups. Facility value was higher (P = 0.004) in the three-option group than in the four-option and five-option groups. There was no significant difference between the three groups in validity, reliability, or item discrimination. Nonfunctioning distracters were more frequent in the four-option and five-option groups than in the three-option group. Assessment based on three-option MCQs can thus be preferred over four-option and five-option MCQs.
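    Two of the compared quality parameters, facility value and the discrimination index, are simple to compute from a 0/1 response matrix. A sketch on simulated responses using the common 27% upper/lower grouping (illustrative, not the study's scoring pipeline):

    ```python
    # Classical item analysis: facility value and discrimination index.
    import numpy as np

    rng = np.random.default_rng(0)
    # 90 examinees x 20 items; per-item correctness probabilities vary.
    responses = (rng.random((90, 20)) < rng.uniform(0.3, 0.9, 20)).astype(int)
    totals = responses.sum(axis=1)

    def item_analysis(responses, totals, frac=0.27):
        n = max(1, int(len(totals) * frac))
        order = np.argsort(totals)
        lower, upper = responses[order[:n]], responses[order[-n:]]
        facility = responses.mean(axis=0)                    # proportion correct
        discrimination = upper.mean(axis=0) - lower.mean(axis=0)
        return facility, discrimination

    fv, di = item_analysis(responses, totals)
    print("item 0: facility %.2f, discrimination %.2f" % (fv[0], di[0]))
    ```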

  17. Use of a quality trait index to increase the reliability of phenotypic evaluations in broccoli

    USDA-ARS?s Scientific Manuscript database

    Selection of superior broccoli hybrids involves multiple considerations, including optimization of head quality traits. Quality assessment of broccoli heads is often confounded by relatively subjective human preferences for optimal appearance of heads. To assist the selection process, we assessed fi...

  18. System Statement of Tasks of Calculating and Providing the Reliability of Heating Cogeneration Plants in Power Systems

    NASA Astrophysics Data System (ADS)

    Biryuk, V. V.; Tsapkova, A. B.; Larin, E. A.; Livshiz, M. Y.; Sheludko, L. P.

    2018-01-01

    A set of mathematical models for calculating the reliability indexes (RI) of structurally complex multifunctional combined installations in heat and power supply systems was developed. Reliability of energy supply is considered a required condition for the creation and operation of heat and power supply systems. The optimal value of the power supply system coefficient F is based on an economic assessment of the consumers' losses caused by the under-supply of electric power and of the additional system expenses for the creation and operation of an emergency capacity reserve. Rationing of the RI of industrial heat supply is based on the concept of a technological safety margin for process operations, while rationed RI values for the heat supply of communal consumers are based on the air temperature level inside the heated premises. The model set allows a number of practical tasks of ensuring reliable heat supply to consumers to be solved. A probabilistic model is developed for calculating the reliability indexes of combined multipurpose heat and power plants in heat and power supply systems. The complex of models and calculation programs can be used to solve a wide range of specific tasks of optimizing the schemes and parameters of combined heat and power plants and systems, as well as determining the efficiency of various redundancy methods to ensure the specified reliability of power supply.

  19. A Practical Solution to Optimizing the Reliability of Teaching Observation Measures under Budget Constraints

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Liu, Xiang; Mashburn, Andrew J.

    2014-01-01

    Researchers often use generalizability theory to estimate relative error variance and reliability in teaching observation measures. They also use it to plan future studies and design the best possible measurement procedures. However, designing the best possible measurement procedure comes at a cost, and researchers must stay within their budget…

  20. Functionalization of MEMS cantilever beams for interconnect reliability investigation: development practice

    NASA Astrophysics Data System (ADS)

    Bieniek, T.; Janczyk, G.; Dobrowolski, R.; Wojciechowska, K.; Malinowska, A.; Panas, A.; Nieprzecki, M.; Kłos, H.

    2016-11-01

    This paper covers research results on the development of cantilever beam test structures for the investigation of interconnect reliability and robustness. The results presented include the design, modelling, simulation, optimization and, finally, fabrication stages, performed on 4-inch Si wafers in the ITE microfabrication facility. The paper also covers experimental results from the characterization of the test structures.
