Sample records for multiple-objective reliability problems

  1. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability-based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method has the capability to take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. Multiple design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while retaining the capability of handling rank-ordered design objectives or goals. For simulation purposes, the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem in sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using: (i) the linear extended interior penalty function method; and (ii) Powell's conjugate directions method. Both single- and multi-objective numerical test cases are included, demonstrating the design tool's capabilities as applied to this design problem.
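
    A minimal sketch of the sequential unconstrained minimization mode described above, assuming stand-in objective terms and a hypothetical minimum-thickness constraint (none of these come from the report's actual cover-plate model): a weighted composite objective plus a linear extended interior penalty, minimized with Powell's conjugate directions method while the penalty multiplier shrinks (SUMT).

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def composite_objective(x, w=(1.0, 0.5, 0.25)):
        # Weight-ordered goals (stand-ins): structural weight, stress, deflection.
        t, rad = np.exp(x)                 # log-parameterization keeps dimensions positive
        weight = t * rad**2
        stress = 1.0 / (t * rad)
        deflection = 1.0 / (t * rad**3)
        return w[0] * weight + w[1] * stress + w[2] * deflection

    def extended_interior_penalty(g, eps=1e-2):
        # Linear extended interior penalty: 1/g inside the feasible region,
        # extended linearly past the transition value eps so the function stays
        # defined (and steeply increasing) for near-infeasible points.
        return 1.0 / g if g > eps else (2 * eps - g) / eps**2

    def penalized(x, r):
        t, _ = np.exp(x)
        g = t - 0.1                        # hypothetical minimum-thickness constraint
        return composite_objective(x) + r * extended_interior_penalty(g)

    # SUMT: re-minimize while shrinking the penalty multiplier r.
    x = np.zeros(2)
    for r in (1e-1, 1e-2, 1e-3):
        x = minimize(penalized, x, args=(r,), method="Powell").x
    print(np.exp(x), composite_objective(x))
    ```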

  2. Computationally efficient stochastic optimization using multiple realizations

    NASA Astrophysics Data System (ADS)

    Bayer, P.; Bürger, C. M.; Finkel, M.

    2008-02-01

    The presented study is concerned with computationally efficient methods for solving stochastic optimization problems involving multiple equally probable realizations of uncertain parameters. A new and straightforward technique is introduced that is based on dynamically ordering the stack of realizations during the search procedure. The rationale is that a small number of critical realizations govern the output of a reliability-based objective function. Using a problem typical of designing a water supply well field, several variants of this "stack ordering" approach are tested. The results are statistically assessed in terms of optimality and nominal reliability. This study demonstrates that simple ordering of a given set of 500 realizations, while applying an evolutionary search algorithm, can save about half of the model runs without compromising the optimization procedure. More advanced variants of stack ordering can, if properly configured, save more than 97% of the computational effort that would be required if the entire set of realizations were considered. The findings herein are promising for similar problems of water management and reliability-based design in general, and particularly for non-convex problems that require heuristic search techniques.
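
    The "stack ordering" idea can be sketched as follows; the one-line evaluate function is an assumed stand-in for an expensive groundwater model run, and the promotion rule is only one plausible variant of the ordering schemes tested in the paper. Realizations that fail a candidate are promoted, so later candidates hit the critical ones first and can be rejected early.

    ```python
    import random

    def evaluate(design, realization):
        # Stand-in for one expensive model run: does the design meet its target
        # under this realization of the uncertain parameter field?
        return design > realization

    def stack_ordered_check(design, stack, reliability=0.95):
        # Failures allowed while still meeting the nominal reliability target.
        allowed = int((1.0 - reliability) * len(stack))
        failures = runs = 0
        for realization in list(stack):
            runs += 1
            if not evaluate(design, realization):
                failures += 1
                stack.remove(realization)
                stack.insert(0, realization)    # promote the critical realization
                if failures > allowed:
                    return False, runs          # early exit saves the remaining runs
        return True, runs

    random.seed(1)
    stack = [random.gauss(0.0, 1.0) for _ in range(500)]
    for design in (1.0, 1.2):
        feasible, runs = stack_ordered_check(design, stack)
        print(design, feasible, "after", runs, "of", len(stack), "model runs")
    ```

    The second candidate typically terminates after far fewer runs than the first, because the realizations that broke the first candidate now sit at the top of the stack.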

  3. The Role of Age and Setting in Adolescents' First Drinking Experience for Predicting College Problem Drinking

    ERIC Educational Resources Information Center

    Yaeger, Jeffrey P.; Moreno, Megan A.

    2017-01-01

    Objective: The purpose of this study was to determine the reliability of longitudinally reporting age at first drink (AFD), and to test AFD and setting of first drink (SFD) as predictors of collegiate problem drinking. Participants: 338 first-year college students were interviewed multiple times during their first academic year, from May 2011…

  4. Uncertainty-Based Multi-Objective Optimization of Groundwater Remediation Design

    NASA Astrophysics Data System (ADS)

    Singh, A.; Minsker, B.

    2003-12-01

    Management of groundwater contamination is a cost-intensive undertaking filled with conflicting objectives and substantial uncertainty. A critical source of this uncertainty in groundwater remediation design problems is the hydraulic conductivity values of the aquifer, on which the predictions of contaminant flow and transport depend. For a remediation solution to be reliable in practice, it is important that it be robust to the potential error in the model predictions. This work focuses on incorporating such uncertainty within a multi-objective optimization framework, to obtain solutions that are both reliable and Pareto-optimal. Previous research has shown that small amounts of sampling within a single-objective genetic algorithm can produce highly reliable solutions. With multiple objectives, however, the noise can interfere with the basic operations of a multi-objective solver, such as determining non-domination of individuals, diversity preservation, and elitism. This work proposes several approaches to improve the performance of noisy multi-objective solvers. These include a simple averaging approach, taking samples across the population (which we call extended averaging), and a stochastic optimization approach. All the approaches are tested on standard multi-objective benchmark problems and a hypothetical groundwater remediation case study; the best-performing approach is then tested on a field-scale case at Umatilla Army Depot.
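
    The simple averaging approach lends itself to a short sketch; the quadratic test objectives and noise level are assumptions for illustration, not the study's remediation model. Averaging repeated noisy evaluations before a domination comparison sharply reduces wrong rankings.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_objectives(x):
        # Stand-ins for two remediation objectives (e.g., cost and residual mass)
        # whose evaluation is corrupted by sampling noise.
        return np.array([x**2, (x - 2.0) ** 2]) + rng.normal(0.0, 0.1, size=2)

    def averaged_objectives(x, samples=10):
        # Simple averaging: mean of repeated noisy evaluations per individual.
        return np.mean([noisy_objectives(x) for _ in range(samples)], axis=0)

    def dominates(fa, fb):
        return bool(np.all(fa <= fb) and np.any(fa < fb))

    # x = 1.9 truly dominates x = 2.5 on both objectives; single noisy samples
    # sometimes get the comparison wrong, averaged samples rarely do.
    single_wrong = sum(not dominates(noisy_objectives(1.9), noisy_objectives(2.5))
                       for _ in range(1000))
    avg_wrong = sum(not dominates(averaged_objectives(1.9), averaged_objectives(2.5))
                    for _ in range(1000))
    print(single_wrong, avg_wrong)
    ```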

  5. Development of a Rubric to Improve Critical Thinking

    ERIC Educational Resources Information Center

    Hildenbrand, Kasee J.; Schultz, Judy A.

    2012-01-01

    Context: Health care professionals, including athletic trainers, are confronted daily with multiple complex problems that require critical thinking. Objective: This research attempts to develop a reliable process to assess students' critical thinking in a variety of athletic training and kinesiology courses. Design: Our first step was to create a…

  6. Confidence-Based Data Association and Discriminative Deep Appearance Learning for Robust Online Multi-Object Tracking.

    PubMed

    Bae, Seung-Hwan; Yoon, Kuk-Jin

    2018-03-01

    Online multi-object tracking aims at estimating the tracks of multiple objects instantly with each incoming frame and the information provided up to the moment. It remains a difficult problem in complex scenes because of the large ambiguity in associating multiple objects across consecutive frames and the low discriminability between object appearances. In this paper, we propose a robust online multi-object tracking method that can handle these difficulties effectively. We first define the tracklet confidence using the detectability and continuity of a tracklet, and decompose a multi-object tracking problem into small subproblems based on the tracklet confidence. We then solve the online multi-object tracking problem by associating tracklets and detections in different ways according to their confidence values. Based on this strategy, tracklets sequentially grow with online-provided detections, and fragmented tracklets are linked up with others without any iterative and expensive association steps. For more reliable association between tracklets and detections, we also propose a deep appearance learning method to learn a discriminative appearance model from large training datasets, since conventional appearance learning methods do not provide a representation rich enough to distinguish multiple objects with large appearance variations. In addition, we combine online transfer learning to improve appearance discriminability by adapting the pre-trained deep model during online tracking. Experiments with challenging public datasets show distinct performance improvement over other state-of-the-art batch and online tracking methods, and demonstrate the effectiveness and usefulness of the proposed methods for online multi-object tracking.
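
    The abstract does not give the confidence formula itself, so the following is a purely hypothetical tracklet-confidence score built from the two ingredients named above, detectability and continuity, to illustrate how confidence can route tracklets to different association strategies.

    ```python
    import math
    from dataclasses import dataclass

    @dataclass
    class Tracklet:
        frames: list        # frame indices with associated detections
        affinities: list    # per-detection association scores in [0, 1]

    def tracklet_confidence(t: Tracklet, beta: float = 0.1) -> float:
        # Hypothetical score: high when detections are strong (detectability)
        # and the track is long with few missed frames (continuity).
        span = t.frames[-1] - t.frames[0] + 1
        gaps = span - len(t.frames)
        detectability = sum(t.affinities) / len(t.affinities)
        continuity = math.exp(-beta * gaps) * (1.0 - math.exp(-beta * span))
        return detectability * continuity

    t = Tracklet(frames=[3, 4, 5, 7, 8], affinities=[0.9, 0.8, 0.95, 0.7, 0.85])
    # High-confidence tracklets would be grown locally with new detections;
    # low-confidence (fragmented) ones linked globally to other tracklets.
    score = tracklet_confidence(t)
    print(round(score, 3), "grow locally" if score > 0.5 else "link globally")
    ```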

  7. The validity of three tests of temperament in guppies (Poecilia reticulata).

    PubMed

    Burns, James G

    2008-11-01

    Differences in temperament (consistent differences among individuals in behavior) can have important effects on fitness-related activities such as dispersal and competition. However, evolutionary ecologists have put limited effort into validating their tests of temperament. This article attempts to validate three standard tests of temperament in guppies: the open-field test, emergence test, and novel-object test. Through multiple reliability trials, and comparison of results between different types of test, this study establishes the confidence that can be placed in these temperament tests. The open-field test is shown to be a good test of boldness and exploratory behavior; the open-field test was reliable when tested in multiple ways. There were problems with the emergence test and novel-object test, which leads one to conclude that the protocols used in this study should not be considered valid tests for this species. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  8. Battling Arrow's Paradox to Discover Robust Water Management Alternatives

    NASA Astrophysics Data System (ADS)

    Kasprzyk, J. R.; Reed, P. M.; Hadka, D.

    2013-12-01

    This study explores whether or not Arrow's Impossibility Theorem, a result from social choice theory, affects the formulation of water resources systems planning problems. The theorem concerns constructing an aggregation function for voters choosing among three or more alternatives for society. The Impossibility Theorem is also called Arrow's Paradox because any aggregation rule satisfying the theorem's fairness conditions reduces to a dictatorship, in which a single individual's preference dictates the group decision. In the context of water resources planning, our study is motivated by recent theoretical work that has generalized the insights from Arrow's Paradox to the design of complex engineered systems. In this framing of the paradox, states of society are equivalent to water planning or design alternatives, and the voters are equivalent to multiple planning objectives (e.g., minimizing cost or maximizing performance). Seen from this point of view, multi-objective water planning problems are functionally equivalent to the social choice problem described above. Traditional solutions to such multi-objective problems aggregate multiple performance measures into a single mathematical objective. The theorem implies that, using such an aggregation, a subset of performance concerns will inadvertently dictate the overall design evaluations in unpredictable ways. We suggest that instead of aggregation, an explicit many-objective approach to water planning can help overcome the challenges posed by Arrow's Paradox. Many-objective planning explicitly disaggregates measures of performance while supporting the discovery of planning tradeoffs, employing multiobjective evolutionary algorithms (MOEAs) to find solutions. Using MOEA-based search to address Arrow's Paradox requires that the MOEAs perform robustly with increasing problem complexity, such as additional objectives and/or decisions. This study uses comprehensive diagnostic evaluation of MOEA search performance across multiple problem formulations (both aggregated and many-objective) to show whether or not aggregating performance measures biases decision making. We explore this hypothesis using an urban water portfolio management case study in the Lower Rio Grande Valley (LRGV). The diagnostic analysis shows that modern self-adaptive MOEA search is efficient, effective, and reliable for the more complex many-objective LRGV planning formulations. Results indicate that although many classical water systems planning frameworks seek to account for multiple objectives, the common practice of reducing the problem to one or more highly aggregated performance measures can severely and negatively bias planning decisions.
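
    The core hazard of aggregation can be shown with three assumed plans: C is Pareto-optimal but sits on a non-convex part of the tradeoff, so no weighted-sum aggregation ever selects it, whereas a dominance-based (many-objective) comparison retains it.

    ```python
    import numpy as np

    # Hypothetical portfolios scored on two objectives to minimize
    # (e.g., cost and expected shortfall). C lies on a non-convex region.
    plans = {"A": (1.0, 9.0), "B": (9.0, 1.0), "C": (5.4, 5.4)}

    def dominates(fa, fb):
        fa, fb = np.asarray(fa), np.asarray(fb)
        return bool(np.all(fa <= fb) and np.any(fa < fb))

    # C is Pareto-optimal: no other plan beats it on both objectives.
    print(any(dominates(f, plans["C"]) for k, f in plans.items() if k != "C"))

    # Yet no weighted-sum aggregation ever picks C, for any weight:
    for w in np.linspace(0.0, 1.0, 11):
        scores = {k: w * f[0] + (1.0 - w) * f[1] for k, f in plans.items()}
        assert min(scores, key=scores.get) != "C"
    print("C is never selected by any weighted sum")
    ```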

  9. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and the roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations needed to reach an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems, including: low sonic-boom bodies; minimum drag on supersonic fore-bodies; minimum drag on supersonic aeroelastic fore-bodies; minimum drag on HSCT aeroelastic wings; the FLOPS preliminary design code; another preliminary aircraft design study with vortex lattice aerodynamics; and complete HSR aircraft aerodynamics. In every case, SA provided a simple, robust, and reliable optimization method which found optimal designs in roughly 100 objective function evaluations. Perhaps most importantly, from this academic/industrial project, technology has been successfully transferred; this method is the method of choice for optimization problems at Northrop Grumman.
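
    A minimal simulated annealing loop of the kind adapted here, on an assumed one-dimensional rough objective, with the evaluation budget kept near the reported order of 100; all parameters are illustrative.

    ```python
    import math, random

    def objective(x):
        # Stand-in for a rough, multi-minima aerodynamic cost function.
        return x**2 + 2.0 * math.sin(5.0 * x)

    def simulated_annealing(x0, t0=5.0, cooling=0.95, evals=100):
        random.seed(0)
        x, fx = x0, objective(x0)
        best_x, best_f = x, fx
        t = t0
        for _ in range(evals):
            cand = x + random.gauss(0.0, 0.5)
            fc = objective(cand)
            # Accept uphill moves with probability exp(-df/t) to escape local minima.
            if fc < fx or random.random() < math.exp((fx - fc) / t):
                x, fx = cand, fc
                if fx < best_f:
                    best_x, best_f = x, fx
            t *= cooling              # cool the temperature each evaluation
        return best_x, best_f

    print(simulated_annealing(3.0))
    ```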

  10. Coordinative Voltage Control Strategy with Multiple Resources for Distribution Systems of High PV Penetration: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Xiangqi; Zhang, Yingchen

    This paper presents an optimal voltage control methodology with coordination among different voltage-regulating resources, including controllable loads, distributed energy resources such as energy storage and photovoltaics (PV), and utility voltage-regulating devices such as voltage regulators and capacitors. The proposed methodology could effectively tackle the overvoltage and voltage regulation device distortion problems brought by high penetrations of PV to improve grid operation reliability. A voltage-load sensitivity matrix and voltage-regulator sensitivity matrix are used to deploy the resources along the feeder to achieve the control objectives. Mixed-integer nonlinear programming is used to solve the formulated optimization control problem. The methodology has been tested on the IEEE 123-feeder test system, and the results demonstrate that the proposed approach could actively tackle the voltage problem brought about by high penetrations of PV and improve the reliability of distribution system operation.

  11. A set partitioning reformulation for the multiple-choice multidimensional knapsack problem

    NASA Astrophysics Data System (ADS)

    Voß, Stefan; Lalla-Ruiz, Eduardo

    2016-05-01

    The Multiple-choice Multidimensional Knapsack Problem (MMKP) is a well-known NP-hard combinatorial optimization problem that has received a lot of attention from the research community, as it can be easily translated to several real-world problems arising in areas such as resource allocation, reliability engineering, cognitive radio networks, cloud computing, etc. In this regard, an exact model that is able to provide high-quality feasible solutions for solving it, or for being partially included in algorithmic schemes, is desirable. The MMKP basically consists of finding a subset of objects that maximizes the total profit while observing some capacity restrictions. In this article, a reformulation of the MMKP as a set partitioning problem is proposed to allow for new insights into modelling the MMKP. The computational experimentation provides new insights into the problem itself and shows that the new model is able to improve on the best known results for some of the most common benchmark instances.
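
    For reference, the canonical MMKP formulation (standard in the literature and consistent with the description above): choose exactly one object j from each of n classes N_i to maximize profit under m resource capacities.

    ```latex
    \max \sum_{i=1}^{n} \sum_{j \in N_i} p_{ij}\, x_{ij}
    \quad \text{s.t.} \quad
    \sum_{i=1}^{n} \sum_{j \in N_i} w^{k}_{ij}\, x_{ij} \le c_k \quad (k = 1,\dots,m), \qquad
    \sum_{j \in N_i} x_{ij} = 1 \quad (i = 1,\dots,n), \qquad
    x_{ij} \in \{0,1\}.
    ```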

  12. Wind farm optimization using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Ituarte-Villarreal, Carlos M.

    In recent years, the wind power industry has focused its efforts on solving the Wind Farm Layout Optimization (WFLO) problem. Wind resource assessment is a pivotal step in optimizing wind-farm design and siting and in determining whether a project is economically feasible. In the present work, three (3) different optimization methods are proposed for the solution of the WFLO problem: (i) a modified Viral System Algorithm applied to optimizing the location of the components in a wind farm to maximize the energy output for a stated wind environment of the site. The optimization problem is formulated as the minimization of energy cost per unit produced and applies a penalization for lack of system reliability. The viral system algorithm utilized in this research solves three (3) well-known problems in the wind-energy literature; (ii) a new multiple-objective evolutionary algorithm to obtain optimal placement of wind turbines while considering the power output, cost, and reliability of the system. The algorithm presented is based on evolutionary computation, and the objective functions considered are the maximization of power output, the minimization of wind farm cost, and the maximization of system reliability. The final solution to this multiple-objective problem is presented as a set of Pareto solutions; and (iii) a hybrid viral-based optimization algorithm adapted to find the proper component configuration for a wind farm, with the introduction of the universal generating function (UGF) analytical approach to discretize the different operating or mechanical levels of the wind turbines in addition to the various wind speed states. The proposed methodology considers the specific probability functions of the wind resource to describe its behavior and to account for the stochastic behavior of the renewable energy components, aiming to increase their power output and the reliability of these systems. The developed heuristic considers a variable number of system components and wind turbines with different operating characteristics and sizes, yielding a more heterogeneous model that can deal with changes in the layout and in the power generation requirements over time. Moreover, the approach evaluates the impact of the wind-wake effect of the wind turbines upon one another to describe and evaluate the reduction in the system's power production capacity depending on the layout distribution of the wind turbines.

  13. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2006-01-01

    Fabricating and operating complex systems involves dealing with uncertainty in the relevant variables. In the case of aircraft, flow conditions are subject to change during operation. Efficiency and engine noise may differ from their expected values because of manufacturing tolerances and normal wear and tear. Engine components may have a shorter life than expected because of manufacturing tolerances. In spite of the important effect of operating and manufacturing uncertainty on the performance and expected life of the component or system, traditional aerodynamic shape optimization has focused on obtaining the best design given a set of deterministic flow conditions. Clearly it is important both to maintain near-optimal performance levels at off-design operating conditions and to ensure that performance does not degrade appreciably when the component shape differs from the optimal shape due to manufacturing tolerances and normal wear and tear. These requirements naturally lead to the idea of robust optimal design, wherein the concept of robustness to various perturbations is built into the design optimization procedure. The basic ideas involved in robust optimal design will be included in this lecture. The imposition of the additional requirement of robustness results in a multiple-objective optimization problem requiring appropriate solution procedures. Typically the costs associated with multiple-objective optimization are substantial. Therefore efficient multiple-objective optimization procedures are crucial to the rapid deployment of the principles of robust design in industry. Hence the companion set of lecture notes (Single- and Multiple-Objective Optimization with Differential Evolution and Neural Networks) deals with methodology for solving multiple-objective optimization problems efficiently, reliably, and with little user intervention. Applications of the methodologies presented in the companion lecture to robust design will be included here. The evolutionary method (DE) is first used to solve a relatively difficult problem in extended surface heat transfer, wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil, the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and maximization of the trailing edge wedge angle with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.

  14. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    NASA Astrophysics Data System (ADS)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented under the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain, which limits the application of such approximation surrogates. In our study we develop a surrogate model based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem with two conflicting objectives. Hydraulic conductivity and aquifer recharge are treated as uncertain. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters, to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. The two conflicting objectives are maximizing total pumping from beneficial wells and minimizing total pumping from barrier wells for hydraulic control of saltwater intrusion. The salinity levels resulting at strategic locations due to this pumping are predicted using the ensemble surrogates and are constrained to be within pre-specified levels. Different realizations of the concentration values are obtained from the ensemble predictions corresponding to each candidate pumping solution. Reliability is incorporated as the percentage of surrogate models that satisfy the imposed constraints. The methodology was applied to a realistic coastal aquifer system in the Burdekin delta area in Australia. It was found that all optimal solutions corresponding to a reliability level of 0.99 satisfy all the constraints, and that constraint violation increases as the reliability level decreases. Thus, ensemble surrogate model based simulation-optimization was found to be useful in deriving multi-objective optimal pumping strategies for coastal aquifers under parameter uncertainty.
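
    The reliability measure described above reduces to a simple fraction over the ensemble; in this sketch the trained genetic-programming surrogates are replaced by assumed random-coefficient linear models, and the salinity limit is invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Stand-ins for an ensemble of 20 trained surrogates, each mapping
    # 3 pumping rates to a predicted salinity level at a control location.
    ensemble = rng.normal(1.0, 0.15, size=(20, 3))

    def reliability(pumping, salinity_limit=10.0):
        # Reliability of a candidate = fraction of ensemble surrogates whose
        # predicted salinity satisfies the pre-specified constraint.
        predictions = ensemble @ pumping
        return float(np.mean(predictions <= salinity_limit))

    candidate = np.array([2.0, 3.0, 4.0])
    r = reliability(candidate)
    print(r, "accept" if r >= 0.99 else "reject")  # e.g., require 0.99 reliability
    ```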

  15. Inter-examiner classification reliability of Mechanical Diagnosis and Therapy for extremity problems - Systematic review.

    PubMed

    Takasaki, Hiroshi; Okuyama, Kousuke; Rosedale, Richard

    2017-02-01

    Mechanical Diagnosis and Therapy (MDT) is used in the treatment of extremity problems. Classifying clinical problems is one method of providing effective treatment to a target population. Classification reliability is a key factor in determining the precise clinical problem and directing an appropriate intervention. The objective was to explore inter-examiner reliability of the MDT classification for extremity problems in three reliability designs: 1) vignette reliability, using surveys with patient vignettes; 2) concurrent reliability, where multiple assessors decide a classification by observing someone's assessment; and 3) successive reliability, where multiple assessors independently assess the same patient at different times. The design was a systematic review with data synthesis in a quantitative format. Agreement of MDT subgroups was examined using the Kappa value, with the operational definition of acceptable reliability set at ≥ 0.6. The level of evidence was determined considering the methodological quality of the studies. Six studies were included and all met the criteria for high quality. Kappa values for the vignette reliability design (five studies) were ≥ 0.7. There were data from two cohorts in one study for the concurrent reliability design, with Kappa values ranging from 0.45 to 1.0. Kappa values for the successive reliability design (data from three cohorts in one study) were < 0.6. The current review found strong evidence of acceptable inter-examiner reliability of MDT classification for extremity problems in the vignette reliability design, limited evidence of acceptable reliability in the concurrent reliability design, and unacceptable reliability in the successive reliability design. Copyright © 2017 Elsevier Ltd. All rights reserved.
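
    The Kappa agreement statistic used throughout the review can be illustrated on invented MDT subgroup labels for ten extremity patients; the ≥ 0.6 acceptability threshold is the review's operational definition, while the labels below are hypothetical.

    ```python
    from collections import Counter

    def cohen_kappa(rater_a, rater_b):
        # Chance-corrected agreement between two examiners' classifications.
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        ca, cb = Counter(rater_a), Counter(rater_b)
        expected = sum(ca[k] * cb[k] for k in ca) / n**2
        return (observed - expected) / (1.0 - expected)

    # Hypothetical MDT subgroup labels assigned to ten patients.
    a = ["derangement", "dysfunction", "derangement", "other", "derangement",
         "dysfunction", "derangement", "other", "derangement", "dysfunction"]
    b = ["derangement", "dysfunction", "derangement", "derangement", "derangement",
         "dysfunction", "other", "other", "derangement", "dysfunction"]
    print(round(cohen_kappa(a, b), 2))   # 0.68 here: above the 0.6 threshold
    ```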

  16. A tabu search evolutionary algorithm for multiobjective optimization: Application to a bi-criterion aircraft structural reliability problem

    NASA Astrophysics Data System (ADS)

    Long, Kim Chenming

    Real-world engineering optimization problems often require the consideration of multiple conflicting and noncommensurate objectives, subject to nonconvex constraint regions in a high-dimensional decision space. Further challenges occur for combinatorial multiobjective problems in which the decision variables are not continuous. Traditional multiobjective optimization methods of operations research, such as weighting and epsilon constraint methods, are ill-suited to solving these complex, multiobjective problems. This has given rise to the application of a wide range of metaheuristic optimization algorithms, such as evolutionary, particle swarm, simulated annealing, and ant colony methods, to multiobjective optimization. Several multiobjective evolutionary algorithms have been developed, including the strength Pareto evolutionary algorithm (SPEA) and the non-dominated sorting genetic algorithm (NSGA), for determining the Pareto-optimal set of non-dominated solutions. Although numerous researchers have developed a wide range of multiobjective optimization algorithms, there is a continuing need to construct computationally efficient algorithms with an improved ability to converge to globally non-dominated solutions along the Pareto-optimal front for complex, large-scale, multiobjective engineering optimization problems. This is particularly important when the multiple objective functions and constraints of the real-world system cannot be expressed in explicit mathematical representations. This research presents a novel metaheuristic evolutionary algorithm for complex multiobjective optimization problems, which combines the metaheuristic tabu search algorithm with the evolutionary algorithm (TSEA), as embodied in genetic algorithms. TSEA is successfully applied to bicriteria (i.e., structural reliability and retrofit cost) optimization of the aircraft tail structure fatigue life, which increases its reliability by prolonging fatigue life. A comparison for this application of the proposed algorithm, TSEA, with several state-of-the-art multiobjective optimization algorithms reveals that TSEA outperforms these algorithms by providing retrofit solutions with greater reliability for the same costs (i.e., closer to the Pareto-optimal front) after the algorithms are executed for the same number of generations. This research also demonstrates that TSEA competes with and, in some situations, outperforms state-of-the-art multiobjective optimization algorithms such as NSGA II and SPEA 2 when applied to classic bicriteria test problems in the technical literature and other complex, sizable real-world applications. The successful implementation of TSEA contributes to the safety of aeronautical structures by providing a systematic way to guide aircraft structural retrofitting efforts, as well as a potentially useful algorithm for a wide range of multiobjective optimization problems in engineering and other fields.

  17. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  18. Multi-Disciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  19. Multi-objective optimization of GENIE Earth system models.

    PubMed

    Price, Andrew R; Myerscough, Richard J; Voutchkov, Ivan I; Marsh, Robert; Cox, Simon J

    2009-07-13

    The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.

  20. Individual strategy ratings improve the control for task difficulty effects in arithmetic problem solving paradigms.

    PubMed

    Tschentscher, Nadja; Hauk, Olaf

    2015-01-01

    Mental arithmetic is a powerful paradigm to study problem solving using neuroimaging methods. However, the evaluation of task complexity varies significantly across neuroimaging studies. Most studies have parameterized task complexity by objective features such as number size. Only a few studies have used subjective rating procedures. In fMRI, we provided evidence that strategy self-reports control better for task complexity across arithmetic conditions than objective features (Tschentscher and Hauk, 2014). Here, we analyzed the relative predictive value of self-reported strategies and objective features for performance in addition and multiplication tasks, using a paradigm designed for neuroimaging research. We found that strategy ratings were superior to objective features as predictors of performance. In a Principal Component Analysis on reaction times, the first component explained over 90 percent of the variance, and its factor loadings closely reflected the percentages of self-reported strategies. In multiple regression analyses on reaction times, self-reported strategies performed equally well as or better than objective features, depending on the operation type. A Receiver Operating Characteristic (ROC) analysis confirmed this result: reaction times classified task complexity better when it was defined by individual ratings. This suggests that participants' strategy ratings are reliable predictors of arithmetic complexity and should be taken into account in neuroimaging research.

  1. Individual strategy ratings improve the control for task difficulty effects in arithmetic problem solving paradigms

    PubMed Central

    Tschentscher, Nadja; Hauk, Olaf

    2015-01-01

    Mental arithmetic is a powerful paradigm to study problem solving using neuroimaging methods. However, the evaluation of task complexity varies significantly across neuroimaging studies. Most studies have parameterized task complexity by objective features such as number size. Only a few studies have used subjective rating procedures. In fMRI, we provided evidence that strategy self-reports control better for task complexity across arithmetic conditions than objective features (Tschentscher and Hauk, 2014). Here, we analyzed the relative predictive value of self-reported strategies and objective features for performance in addition and multiplication tasks, using a paradigm designed for neuroimaging research. We found that strategy ratings were superior to objective features as predictors of performance. In a Principal Component Analysis on reaction times, the first component explained over 90 percent of the variance, and its factor loadings closely reflected the percentages of self-reported strategies. In multiple regression analyses on reaction times, self-reported strategies performed equally well as or better than objective features, depending on the operation type. A Receiver Operating Characteristic (ROC) analysis confirmed this result: reaction times classified task complexity better when it was defined by individual ratings. This suggests that participants' strategy ratings are reliable predictors of arithmetic complexity and should be taken into account in neuroimaging research. PMID:26321997

  2. Multicomponent pre-stack seismic waveform inversion in transversely isotropic media using a non-dominated sorting genetic algorithm

    NASA Astrophysics Data System (ADS)

    Padhi, Amit; Mallick, Subhashis

    2014-03-01

    Inversion of band- and offset-limited single-component (P wave) seismic data does not provide robust estimates of subsurface elastic parameters and density. Multicomponent seismic data can, in principle, circumvent this limitation, but it adds to the complexity of the inversion algorithm because it requires simultaneous optimization of multiple objective functions, one for each data component. In seismology, these multiple objectives are typically handled by constructing a single objective given as a weighted sum of the objectives of the individual data components, sometimes with additional regularization terms reflecting their interdependence, followed by a single-objective optimization. Multi-objective problems, including multicomponent seismic inversion, are however non-linear. They have non-unique solutions, known as the Pareto-optimal solutions. Therefore, casting such problems as a single-objective optimization provides one solution out of the entire Pareto-optimal set, which, in turn, may be biased by the choice of the weights. To handle multiple objectives, it is thus appropriate to treat the objective as a vector and simultaneously optimize each of its components so that the entire Pareto-optimal set of solutions can be estimated. This paper proposes such a novel multi-objective methodology using a non-dominated sorting genetic algorithm for waveform inversion of multicomponent seismic data. The applicability of the method is demonstrated using synthetic data generated from multilayer models based on a real well log. We document that the proposed method can reliably extract subsurface elastic parameters and density from multicomponent seismic data both when the subsurface is considered isotropic and when it is transversely isotropic with a vertical symmetry axis. We also compute approximate uncertainty values in the derived parameters. Although we restrict our inversion applications to horizontally stratified models, we outline a practical procedure for extending the method to approximately include local dips for each source-receiver offset pair. Finally, the applicability of the proposed method is not limited to seismic inversion: it could be used to invert different data types requiring not only multiple objectives but also multiple physics to describe them.

  3. Comparison of Difficulties and Reliabilities of Math-Completion and Multiple-Choice Item Formats.

    ERIC Educational Resources Information Center

    Oosterhof, Albert C.; Coats, Pamela K.

    Instructors who develop classroom examinations that require students to provide a numerical response to a mathematical problem are often very concerned about the appropriateness of the multiple-choice format. The present study augments previous research relevant to this concern by comparing the difficulty and reliability of multiple-choice and…

  4. Improving the Performance of Highly Constrained Water Resource Systems using Multiobjective Evolutionary Algorithms and RiverWare

    NASA Astrophysics Data System (ADS)

    Smith, R.; Kasprzyk, J. R.; Zagona, E. A.

    2015-12-01

    Instead of building new infrastructure to increase their supply reliability, water resource managers are often tasked with better management of current systems. The managers often have existing simulation models that aid their planning, and lack methods for efficiently generating and evaluating planning alternatives. This presentation discusses how multiobjective evolutionary algorithm (MOEA) decision support can be used with the sophisticated water infrastructure model, RiverWare, in highly constrained water planning environments. We first discuss a study that performed a many-objective tradeoff analysis of water supply in the Tarrant Regional Water District (TRWD) in Texas. RiverWare is combined with the Borg MOEA to solve a seven objective problem that includes systemwide performance objectives and individual reservoir storage reliability. Decisions within the formulation balance supply in multiple reservoirs and control pumping between the eastern and western parts of the system. The RiverWare simulation model is forced by two stochastic hydrology scenarios to inform how management changes in wet versus dry conditions. The second part of the presentation suggests how a broader set of RiverWare-MOEA studies can inform tradeoffs in other systems, especially in political situations where multiple actors are in conflict over finite water resources. By incorporating quantitative representations of diverse parties' objectives during the search for solutions, MOEAs may provide support for negotiations and lead to more widely beneficial water management outcomes.

  5. Adults' strategies for simple addition and multiplication: verbal self-reports and the operand recognition paradigm.

    PubMed

    Metcalfe, Arron W S; Campbell, Jamie I D

    2011-05-01

    Accurate measurement of cognitive strategies is important in diverse areas of psychological research. Strategy self-reports are a common measure, but C. Thevenot, M. Fanget, and M. Fayol (2007) proposed a more objective method to distinguish different strategies in the context of mental arithmetic. In their operand recognition paradigm, speed of recognition memory for problem operands after solving a problem indexes strategy (e.g., direct memory retrieval vs. a procedural strategy). Here, in 2 experiments, operand recognition time was the same following simple addition or multiplication, but, consistent with a wide variety of previous research, strategy reports indicated much greater use of procedures (e.g., counting) for addition than multiplication. Operation, problem size (e.g., 2 + 3 vs. 8 + 9), and operand format (digits vs. words) had interactive effects on reported procedure use that were not reflected in recognition performance. Regression analyses suggested that recognition time was influenced at least as much by the relative difficulty of the preceding problem as by the strategy used. The findings indicate that the operand recognition paradigm is not a reliable substitute for strategy reports and highlight the potential impact of difficulty-related carryover effects in sequential cognitive tasks.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  7. A Corticothalamic Circuit Model for Sound Identification in Complex Scenes

    PubMed Central

    Otazu, Gonzalo H.; Leibold, Christian

    2011-01-01

    The identification of the sound sources present in the environment is essential for the survival of many animals. However, these sounds are not presented in isolation, as natural scenes consist of a superposition of sounds originating from multiple sources. The identification of a source under these circumstances is a complex computational problem that is readily solved by most animals. We present a model of the thalamocortical circuit that performs level-invariant recognition of auditory objects in complex auditory scenes. The circuit identifies the objects present from a large dictionary of possible elements and operates reliably for real sound signals with multiple concurrently active sources. The key model assumption is that the activities of some cortical neurons encode the difference between the observed signal and an internal estimate. Reanalysis of awake auditory cortex recordings revealed neurons with patterns of activity corresponding to such an error signal. PMID:21931668

  8. Development and Validation of a POSIT-Short Form: Screening for Problem Behaviors among Adolescents at Risk for Substance Use.

    ERIC Educational Resources Information Center

    Danseco, Evangeline R.; Marques, Paul R.

    2002-01-01

    The Problem-Oriented Screening Instrument for Teenagers (POSIT) screens for multiple problems among adolescents at risk for substance use. A shortened version of the POSIT was developed, using factor analysis, and correlational and reliability analyses. The POSIT-SF shows potential for a reliable and cost-efficient screen for youth with substance…

  9. Reliability and Validity of a Procedure to Measure Diagnostic Reasoning and Problem-Solving Skills Taught in Predoctoral Orthodontic Education.

    ERIC Educational Resources Information Center

    Albanese, Mark A.; Jacobs, Richard M.

    1990-01-01

    The reliability and validity of a procedure to measure diagnostic-reasoning and problem-solving skills taught in predoctoral orthodontic education were studied using 68 second year dental students. The procedure includes stimulus material and 33 multiple-choice items. It is a feasible way of assessing problem-solving skills in dentistry education…

  10. Experimental and evaluated photoneutron cross sections for 197Au

    NASA Astrophysics Data System (ADS)

    Varlamov, V.; Ishkhanov, B.; Orlin, V.

    2017-10-01

    There is a serious, well-known problem of noticeable disagreements between the partial photoneutron cross sections obtained in various experiments. Such data were mainly determined using quasimonoenergetic annihilation photon beams and the method of neutron multiplicity sorting at Lawrence Livermore National Laboratory (USA) and the Centre d'Etudes Nucleaires de Saclay (France). Analysis of the experimental cross sections employing new objective physical criteria of data reliability has shown that many of them are not reliable. The IAEA Coordinated Research Project (CRP) on photonuclear data evaluation was approved. The experimental and previously evaluated cross sections of the partial photoneutron reactions (γ,1n) and (γ,2n) on 197Au were analyzed using the new data reliability criteria. The data evaluated using the new experimental-theoretical method differ noticeably from both the experimental data and the data previously evaluated using nuclear modeling codes such as GNASH, GUNF, and ALICE-F. These discrepancies need to be resolved.

  11. Multiagent Flight Control in Dynamic Environments with Cooperative Coevolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Knudson, Matthew D.; Colby, Mitchell; Tumer, Kagan

    2014-01-01

    Dynamic flight environments, in which objectives and environmental features change with respect to time, pose a difficult problem with regard to planning optimal flight paths. Path planning methods are typically computationally expensive and are often difficult to implement in real time if system objectives are changed. This computational problem is compounded when multiple agents are present in the system, as the state and action space grows exponentially. In this work, we use cooperative coevolutionary algorithms to develop policies which control agent motion in a dynamic multiagent unmanned aerial system environment in which goals and perceptions change, while ensuring safety constraints are not violated. Rather than replanning new paths when the environment changes, we develop a policy which can map the new environmental features to a trajectory for the agent, ensuring safe and reliable operation while providing 92% of the theoretically optimal performance.

  12. Multiagent Flight Control in Dynamic Environments with Cooperative Coevolutionary Algorithms

    NASA Technical Reports Server (NTRS)

    Colby, Mitchell; Knudson, Matthew D.; Tumer, Kagan

    2014-01-01

    Dynamic environments, in which objectives and environmental features change with respect to time, pose a difficult problem with regard to planning optimal paths through these environments. Path planning methods are typically computationally expensive and are often difficult to implement in real time if system objectives are changed. This computational problem is compounded when multiple agents are present in the system, as the state and action space grows exponentially with the number of agents in the system. In this work, we use cooperative coevolutionary algorithms to develop policies which control agent motion in a dynamic multiagent unmanned aerial system environment in which goals and perceptions change, while ensuring safety constraints are not violated. Rather than replanning new paths when the environment changes, we develop a policy which can map the new environmental features to a trajectory for the agent, ensuring safe and reliable operation while providing 92% of the theoretically optimal performance.

  13. Radar based autonomous sensor module

    NASA Astrophysics Data System (ADS)

    Styles, Tim

    2016-10-01

    Most surveillance systems combine camera sensors with other detection sensors that trigger an alert to a human operator when an object is detected. The detection sensors typically require careful installation and configuration for each application and there is a significant burden on the operator to react to each alert by viewing camera video feeds. A demonstration system known as Sensing for Asset Protection with Integrated Electronic Networked Technology (SAPIENT) has been developed to address these issues using Autonomous Sensor Modules (ASM) and a central High Level Decision Making Module (HLDMM) that can fuse the detections from multiple sensors. This paper describes the 24 GHz radar based ASM, which provides an all-weather, low power and license exempt solution to the problem of wide area surveillance. The radar module autonomously configures itself in response to tasks provided by the HLDMM, steering the transmit beam and setting range resolution and power levels for optimum performance. The results show the detection and classification performance for pedestrians and vehicles in an area of interest, which can be modified by the HLDMM without physical adjustment. The module uses range-Doppler processing for reliable detection of moving objects and combines Radar Cross Section and micro-Doppler characteristics for object classification. Objects are classified as pedestrian or vehicle, with vehicle sub classes based on size. Detections are reported only if the object is detected in a task coverage area and it is classified as an object of interest. The system was shown in a perimeter protection scenario using multiple radar ASMs, laser scanners, thermal cameras and visible band cameras. This combination of sensors enabled the HLDMM to generate reliable alerts with improved discrimination of objects and behaviours of interest.

  14. Calculation of Pareto-optimal solutions to multiple-objective problems using threshold-of-acceptability constraints

    NASA Technical Reports Server (NTRS)

    Giesy, D. P.

    1978-01-01

    A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage to both limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
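
    A sketch of this series-of-single-objective-problems scheme under assumed quadratic objectives: minimize one objective subject to a threshold-of-acceptability constraint on the other, then sweep the threshold to trace Pareto-optimal points. The objectives and thresholds are illustrative, not from the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Two hypothetical objectives to minimize over a scalar design variable.
    f1 = lambda x: x[0] ** 2
    f2 = lambda x: (x[0] - 2.0) ** 2

    # Each single-objective subproblem: minimize f1 subject to the
    # threshold-of-acceptability constraint f2(x) <= tau.
    pareto = []
    for tau in np.linspace(0.25, 4.0, 6):
        cons = {"type": "ineq", "fun": lambda x, t=tau: t - f2(x)}
        res = minimize(f1, x0=[1.0], constraints=[cons], method="SLSQP")
        pareto.append((round(res.x[0], 3), round(f1(res.x), 3), round(tau, 2)))
    print(pareto)   # each entry is one Pareto-optimal (x, f1, threshold) triple
    ```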

  15. Bi-Objective Modelling for Hazardous Materials Road–Rail Multimodal Routing Problem with Railway Schedule-Based Space–Time Constraints

    PubMed Central

    Sun, Yan; Lang, Maoxiang; Wang, Danzhu

    2016-01-01

    The transportation of hazardous materials is always accompanied by considerable risk that will impact public and environment security. As an efficient and reliable transportation organization, a multimodal service should participate in the transportation of hazardous materials. In this study, we focus on transporting hazardous materials through the multimodal service network and explore the hazardous materials multimodal routing problem from the operational level of network planning. To formulate this problem more practicably, minimizing the total generalized costs of transporting the hazardous materials and the social risk along the planned routes are set as the optimization objectives. Meanwhile, the following formulation characteristics will be comprehensively modelled: (1) specific customer demands; (2) multiple hazardous material flows; (3) capacitated schedule-based rail service and uncapacitated time-flexible road service; and (4) environmental risk constraint. A bi-objective mixed integer nonlinear programming model is first built to formulate the routing problem that combines the formulation characteristics above. Then linear reformations are developed to linearize and improve the initial model so that it can be effectively solved by exact solution algorithms on standard mathematical programming software. By utilizing the normalized weighted sum method, we can generate the Pareto solutions to the bi-objective optimization problem for a specific case. Finally, a large-scale empirical case study from the Beijing–Tianjin–Hebei Region in China is presented to demonstrate the feasibility of the proposed methods in dealing with the practical problem. Various scenarios are also discussed in the case study. PMID:27483294
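
    The normalized weighted sum step can be sketched as follows; the smooth stand-in objectives and the assumed ideal and nadir values replace the paper's mixed-integer routing model, which would instead be solved with exact methods on mathematical programming software.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Stand-ins for the two routing objectives: generalized cost and social risk.
    cost = lambda x: (x[0] - 1.0) ** 2 + 1.0
    risk = lambda x: (x[0] + 1.0) ** 2 + 2.0

    # Assumed ideal (min) and nadir (max) values used to normalize each
    # objective, so the weights are not distorted by differing magnitudes.
    c_min, c_max = 1.0, 5.0
    r_min, r_max = 2.0, 6.0

    def scalarized(x, w):
        cn = (cost(x) - c_min) / (c_max - c_min)
        rn = (risk(x) - r_min) / (r_max - r_min)
        return w * cn + (1.0 - w) * rn

    for w in (0.2, 0.5, 0.8):   # each weight yields one Pareto solution
        res = minimize(scalarized, x0=[0.0], args=(w,), method="Nelder-Mead")
        print(w, round(res.x[0], 3), round(cost(res.x), 2), round(risk(res.x), 2))
    ```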

  16. Thinking in Terms of Sensors: Personification of Self as an Object in Physics Problem Solving

    ERIC Educational Resources Information Center

    Tabor-Morris, A. E.

    2015-01-01

    How can physics teachers help students develop consistent problem solving techniques for both simple and complicated physics problems, such as those that encompass objects undergoing multiple forces (mechanical or electrical) as individually portrayed in free-body diagrams and/or phenomenon involving multiple objects, such as Doppler effect…

  17. Reliable design of a closed loop supply chain network under uncertainty: An interval fuzzy possibilistic chance-constrained model

    NASA Astrophysics Data System (ADS)

    Vahdani, Behnam; Tavakkoli-Moghaddam, Reza; Jolai, Fariborz; Baboli, Arman

    2013-06-01

    This article seeks to offer a systematic approach to establishing a reliable network of facilities in closed loop supply chains (CLSCs) under uncertainty. Facilities located by the proposed approach concurrently satisfy both traditional objective functions and reliability considerations in CLSC network design. To attack this problem, a novel mathematical model is developed that integrates the network design decisions in both the forward and reverse supply chain networks. The model also utilizes an effective reliability approach to find a robust network design. To make the results of this article more realistic, a CLSC for a case study in the iron and steel industry has been explored. The considered CLSC is multi-echelon, multi-facility, multi-product and multi-supplier. Furthermore, multiple facilities exist in the reverse logistics network, leading to high complexity. Since the collection centres play an important role in this network, the reliability of these facilities is taken into consideration. To solve the proposed model, a novel interactive hybrid solution methodology is developed by combining a number of efficient solution approaches from the recent literature. The proposed solution methodology is a bi-objective interval fuzzy possibilistic chance-constrained mixed-integer linear programming (BOIFPCCMILP) approach. Finally, computational experiments are provided to demonstrate the applicability and suitability of the proposed model in a supply chain environment and to help decision makers facilitate their analyses.

  18. The Essential Properties of Yoga Questionnaire (EPYQ): Psychometric Properties.

    PubMed

    Park, Crystal L; Elwy, A Rani; Maiya, Meghan; Sarkin, Andrew J; Riley, Kristen E; Eisen, Susan V; Gutierrez, Ian; Finkelstein-Fox, Lucy; Lee, Sharon Y; Casteel, Danielle; Braun, Tosca; Groessl, Erik J

    2018-03-02

    Yoga interventions are heterogeneous and vary along multiple dimensions. These dimensions may affect mental and physical health outcomes in different ways or through different mechanisms. However, most studies of the effects of yoga on health do not adequately describe or quantify the components of the interventions being implemented. This lack of detail prevents researchers from making comparisons across studies and limits our understanding of the relative effects of different aspects of yoga interventions. To address this problem, we developed the Essential Properties of Yoga Questionnaire (EPYQ), which allows researchers to objectively characterize their interventions. We present here the reliability and validity data from the final phases of this measure-development project. Analyses identified fourteen key dimensions of yoga interventions measured by the EPYQ: acceptance/compassion, bandhas, body awareness, breathwork, instructor mention of health benefits, individual attention, meditation and mindfulness, mental and emotional awareness, physicality, active postures, restorative postures, social aspects, spirituality, and yoga philosophy. The EPYQ demonstrated good reliability, as assessed by internal consistency and test-retest reliability analysis, and evidence suggests that the EPYQ is a valid measure of multiple dimensions of yoga. The measure is ready for use by clinicians and researchers. Results indicate that, currently, trained objective raters should score interventions to avoid reference frame errors and potential rating bias, but alternative approaches may be developed. The EPYQ will allow researchers to link specific yoga dimensions to identifiable health outcomes and optimize the design of yoga interventions for specific conditions.
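    As a concrete illustration of the two reliability checks reported above, the following sketch computes Cronbach's alpha (internal consistency) and a test-retest correlation on simulated rating data. The rater counts, item counts, and noise levels are placeholders, not EPYQ data.

    ```python
    # Minimal sketch: internal consistency and test-retest reliability
    # on simulated parallel-item ratings (placeholder data).
    import numpy as np

    rng = np.random.default_rng(0)
    true_score = rng.normal(size=(50, 1))                        # latent trait per rater
    items = true_score + rng.normal(scale=0.8, size=(50, 10))    # 10 parallel items
    retest = items.sum(axis=1) + rng.normal(scale=2.0, size=50)  # noisy second administration

    def cronbach_alpha(x):
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        k = x.shape[1]
        item_var = x.var(axis=0, ddof=1).sum()
        total_var = x.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
    print(f"test-retest r    = {np.corrcoef(items.sum(axis=1), retest)[0, 1]:.2f}")
    ```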

  19. On designing for quality

    NASA Technical Reports Server (NTRS)

    Vajingortin, L. D.; Roisman, W. P.

    1991-01-01

    The problem of ensuring the required quality of products and/or technological processes often becomes more difficult because there is no general theory for determining the optimal sets of values of the primary factors, i.e., of the output parameters of the parts and units comprising an object, that ensure the correspondence of the object's parameters to the quality requirements. This is the main reason for the long time taken to finish complex, vital articles. To create this theory, one has to overcome a number of difficulties and to solve the following tasks: the creation of reliable and stable mathematical models showing the influence of the primary factors on the output parameters; the development of a new technique for assigning tolerances to primary factors with regard to economic, technological, and other criteria, the technique being based on the solution of the main problem; and the well-reasoned assignment of nominal values for primary factors, which serve as the basis for setting tolerances. Each of the above tasks is of independent importance. An attempt is made to give solutions for this problem. In its mathematically formalized aspect, this quality-assurance problem is called the multiple inverse problem.

  20. A Multiple Objective Test Assembly Approach for Exposure Control Problems in Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.; Verschoor, Angela J.; Eggen, Theo J. H. M.

    2010-01-01

    Overexposure and underexposure of items in the bank are serious problems in operational computerized adaptive testing (CAT) systems. These exposure problems might result in item compromise or point to a waste of investments. The exposure control problem can be viewed as a test assembly problem with multiple objectives. Information in the test has…

  1. A SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation and analysis of multiple objectives are very important in designing environmentally benign processes. They require a systematic procedure for solving multi-objective decision-making problems due to the complex nature of the problems and the need for complex assessment....

  2. Bilevel formulation of a policy design problem considering multiple objectives and incomplete preferences

    NASA Astrophysics Data System (ADS)

    Hawthorne, Bryant; Panchal, Jitesh H.

    2014-07-01

    A bilevel optimization formulation of policy design problems considering multiple objectives and incomplete preferences of the stakeholders is presented. The formulation is presented for Feed-in-Tariff (FIT) policy design for decentralized energy infrastructure. The upper-level problem is the policy designer's problem and the lower-level problem is a Nash equilibrium problem resulting from market interactions. The policy designer has two objectives: maximizing the quantity of energy generated and minimizing policy cost. The stakeholders decide on quantities while maximizing net present value and minimizing capital investment. The Nash equilibrium problem in the presence of incomplete preferences is formulated as a stochastic linear complementarity problem and solved using expected value formulation, expected residual minimization formulation, and the Monte Carlo technique. The primary contributions in this article are the mathematical formulation of the FIT policy, the extension of computational policy design problems to multiple objectives, and the consideration of incomplete preferences of stakeholders for policy design problems.

  3. Multiple utility constrained multi-objective programs using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Abbasian, Pooneh; Mahdavi-Amiri, Nezam; Fazlollahtabar, Hamed

    2018-03-01

    A utility function is an important tool for representing a DM's preference. We adjoin utility functions to multi-objective optimization problems. In current studies, usually one utility function is used for each objective function. Situations may arise, however, in which a goal has multiple utility functions. Here, we consider a constrained multi-objective problem in which each objective has multiple utility functions. We induce the probability of the utilities for each objective function using Bayesian theory. Illustrative examples considering dependence and independence of variables are worked through to demonstrate the usefulness of the proposed model.
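    The following sketch illustrates the basic idea of attaching several utility functions to one objective and weighting them by probabilities, as the abstract describes. The utility shapes and the probability weights are assumed placeholders; the paper derives the weights with Bayesian theory, which is not reproduced here.

    ```python
    # Minimal sketch: probability-weighted mixture of multiple candidate
    # utility functions for a single objective value f(x). All numbers
    # and utility shapes are assumed for illustration.
    import numpy as np

    utilities = [
        lambda f: 1.0 - np.exp(-f / 10.0),   # risk-averse shape (assumed)
        lambda f: f / 100.0,                 # linear shape (assumed)
    ]
    posterior = np.array([0.7, 0.3])  # stand-in for Bayesian-updated weights

    def expected_utility(f_value):
        # Probability-weighted mixture of the candidate utilities.
        return float(sum(p * u(f_value) for p, u in zip(posterior, utilities)))

    print(f"expected utility at f = 40: {expected_utility(40.0):.3f}")
    ```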

  4. Reliability and Validity of a Procedure To Measure Diagnostic Reasoning and Problem-Solving Skills Taught in Predoctoral Orthodontic Education.

    ERIC Educational Resources Information Center

    Albanese, Mark A.; Jacobs, Richard M.

    Preliminary psychometric data assessing the reliability and validity of a method used to measure the diagnostic reasoning and problem-solving skills of predoctoral students in orthodontia are described. The measurement approach consisted of sets of patient demographic data and dental photos and x-rays, accompanied by a set of 33 multiple-choice…

  5. Validation of the MedUseQ: A Self-Administered Screener for Older Adults to Assess Medication Use Problems.

    PubMed

    Berman, Rebecca L; Iris, Madelyn; Conrad, Kendon J; Robinson, Carrie

    2018-01-01

    Older adults taking multiple prescription and nonprescription drugs are at risk for medication use problems, yet there are few brief, self-administered screening tools designed specifically for them. The study objective was to develop and validate a patient-centered screener for community-dwelling older adults. In phase 1, a convenience sample of 57 stakeholders (older adults, pharmacists, nurses, and physicians) participated in concept mapping, using Concept System® Global MAX™, to identify items for a questionnaire. In phase 2, a 40-item questionnaire was tested with a convenience sample of 377 adults and a 24-item version was tested with 306 older adults, aged 55 and older, using Rasch methodology. In phase 3, stakeholder focus groups provided feedback on the format of questionnaire materials and recommended strategies for addressing problems. The concept map contained 72 statements organized into 6 conceptual clusters or domains. The 24-item screener was unidimensional. Cronbach's alpha was .87, person reliability was acceptable (.74), and item reliability was high (.96). The MedUseQ is a validated, patient-centered tool targeting older adults that can be used to assess a wide range of medication use problems in clinical and community settings and to identify areas for education, intervention, or further assessment.

  6. Thinking in terms of sensors: personification of self as an object in physics problem solving

    NASA Astrophysics Data System (ADS)

    Tabor-Morris, A. E.

    2015-03-01

    How can physics teachers help students develop consistent problem solving techniques for both simple and complicated physics problems, such as those that encompass objects undergoing multiple forces (mechanical or electrical) as individually portrayed in free-body diagrams and/or phenomena involving multiple objects, such as Doppler effect reflection applications in echoes and ultrasonic cardiac monitoring for sound, or police radar for light? These problems can confuse novice physics students, and to sort out problem parts, the suggestion is made here to guide the student to personify self as the object in question, that is, to imagine oneself as the object undergoing outside influences such as forces and then qualify and quantify those for the problem at hand. This personification does NOT, according to the three traditional definitions of the term (animism, anthropomorphism and teleology), empower the object to act, but only to detect its environment. By having students use their imagination to put themselves in the place of the object, they can ‘sense’ the influences the object is experiencing and analyze these individually, hopefully reducing the student’s feeling of being overwhelmed with information, and also imbuing the student with a sense of having experienced the situation. This can be especially useful in problems that involve both multiple forces AND multiple objects (for example, Atwood’s machine), since objects acted upon need to be considered separately and consecutively, with the idea that one cannot be two objects at once. This personification technique, documented to have been used by both Einstein and Feynman, is recommended here for secondary-school teen and university-level adult learners, with discussions of specific physics and astronomy classroom strategies.

  7. Single and multiple objective biomass-to-biofuel supply chain optimization considering environmental impacts

    NASA Astrophysics Data System (ADS)

    Valles Sosa, Claudia Evangelina

    Bioenergy has become an important alternative source of energy to alleviate the reliance on petroleum. Bioenergy mitigates climate change by reducing greenhouse gas emissions, while providing energy security and enhancing rural development. The Energy Independence and Security Act mandates the use of 21 billion gallons of advanced biofuels, including 16 billion gallons of cellulosic biofuels, by the year 2022. It is clear that biomass can make a substantial contribution to supplying future energy demand in a sustainable way. However, the supply of sustainable energy is one of the main challenges that mankind will face over the coming decades. Many logistical challenges must be overcome to provide an efficient and reliable supply of quality feedstock to biorefineries: approximately 700 million tons of biomass will need to be sustainably delivered to biorefineries annually to meet the projected use of biofuels by 2022. Approaching this complex logistic problem as a multi-commodity network flow structure, the present work proposes the use of a genetic algorithm for the single-objective problem of maximizing profit, and the use of a Multiple Objective Evolutionary Algorithm to simultaneously maximize profit while minimizing global warming potential. Most transportation optimization problems in the literature consider the maximization of profit or the minimization of total travel time as the objectives to be optimized. In this work, however, we take a more conscious and sustainable approach to this logistic problem. Planners are increasingly expected to adopt a multi-disciplinary approach, especially given the rising importance of environmental stewardship: the role of a transportation planner and designer is shifting from simple economic analysis to promoting sustainability through the integration of environmental objectives. To respond to these new challenges, a Modified Multiple Objective Evolutionary Algorithm is presented for the design optimization of a biomass-to-biorefinery logistic system that simultaneously maximizes total profit and minimizes three environmental impacts. Sustainability balances economic, social and environmental goals and objectives. Several works in the literature have considered economic and environmental objectives for the presented supply chain problem; however, there is a lack of research on the social aspect of a sustainable logistics system. This work proposes a methodology to integrate social assessment, based on employment creation. Finally, most assessment methodologies in the literature contemplate only deterministic values, whereas in realistic situations the supply chain is subject to uncertainty. In this work, Value-at-Risk, an advanced risk measure commonly used in portfolio optimization, is included to account for uncertainties in biofuel prices, among others.

  8. SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES

    EPA Science Inventory

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...

  9. Multiple objective optimization in reliability demonstration test

    DOE PAGES

    Lu, Lu; Anderson-Cook, Christine Michaela; Li, Mingyang

    2016-10-01

    Reliability demonstration tests are usually performed in product design or validation processes to demonstrate whether a product meets specified requirements on reliability. For binomial demonstration tests, the zero-failure test has been most commonly used due to its simplicity and use of minimum sample size to achieve an acceptable consumer's risk level. However, this test can often result in unacceptably high risk for producers as well as a low probability of passing the test even when the product has good reliability. This paper explicitly explores the interrelationship between multiple objectives that are commonly of interest when planning a demonstration test and proposes structured decision-making procedures using a Pareto front approach for selecting an optimal test plan based on simultaneously balancing multiple criteria. Different strategies are suggested for scenarios with different user priorities and graphical tools are developed to help quantify the trade-offs between choices and to facilitate informed decision making. As a result, potential impacts of some subjective user inputs on the final decision are studied to offer insights and useful guidance for general applications.
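    The trade-off described above is easy to reproduce for the zero-failure binomial test. The sketch below picks the smallest sample size that holds the consumer's risk at a chosen level and then shows how often a genuinely reliable product would still fail the test; the numeric targets are illustrative.

    ```python
    # Minimal sketch: zero-failure binomial demonstration test.
    # The test passes only if all n units survive, so P(pass | R) = R**n.
    import math

    R_L = 0.90      # reliability level to be demonstrated (illustrative)
    beta = 0.05     # allowed consumer's risk (illustrative)

    # Require R_L**n <= beta, i.e. n >= ln(beta) / ln(R_L).
    n = math.ceil(math.log(beta) / math.log(R_L))
    print(f"sample size for zero-failure test: n = {n}")

    # Even a genuinely good product can fail such a test fairly often:
    for R_true in (0.95, 0.98, 0.99):
        print(f"P(pass | R = {R_true}) = {R_true ** n:.3f}")
    ```

    With these inputs the test needs 29 units, and a product with true reliability 0.95 passes only about a quarter of the time, which is exactly the producer's-risk problem the paper addresses.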

  10. Optimal clustering of MGs based on droop controller for improving reliability using a hybrid of harmony search and genetic algorithms.

    PubMed

    Abedini, Mohammad; Moradi, Mohammad H; Hosseinian, S M

    2016-03-01

    This paper proposes a novel method to address reliability and technical problems of microgrids (MGs) by designing a number of self-adequate autonomous sub-MGs through MG clustering. In doing so, a multi-objective optimization problem is developed in which power loss reduction, voltage profile improvement and reliability enhancement are the objective functions. To solve the optimization problem a hybrid algorithm, named HS-GA, is provided, based on genetic and harmony search algorithms, and a load flow method is given to model different types of DGs as droop controllers. The performance of the proposed method is evaluated in two case studies. The results provide support for the performance of the proposed method.

  11. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and analyzes designs that determine stochastic properties of MEMS. For CMCs this includes a brief literature survey of lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin-film (bulge test) specimens. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
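    The Weibull size effect mentioned above has a simple quantitative form: for a two-parameter Weibull strength model, failure probability grows with the stressed volume, so larger specimens are weaker on average. The sketch below illustrates this scaling with assumed, dimensionless parameter values rather than MEMS data.

    ```python
    # Minimal sketch: Weibull size effect for brittle strength.
    # P_f = 1 - exp(-(V/V0) * (sigma/sigma0)^m)  (two-parameter Weibull)
    import math

    def failure_probability(sigma, sigma_0, m, v, v_0=1.0):
        """Failure probability of a specimen of volume v under stress sigma."""
        return 1.0 - math.exp(-(v / v_0) * (sigma / sigma_0) ** m)

    m = 10.0        # Weibull modulus (assumed)
    sigma_0 = 1.0   # characteristic strength, scaled units (assumed)

    # Increasing the stressed volume shifts the median strength downward:
    for v in (1.0, 2.0, 4.0):
        # strength at 50% failure probability: sigma0 * (ln 2 * V0/V)^(1/m)
        sigma_50 = sigma_0 * (math.log(2.0) / v) ** (1.0 / m)
        print(f"V = {v:.0f}: median strength = {sigma_50:.3f}")
    ```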

  12. Multiple-energy Techniques in Industrial Computerized Tomography

    DOE R&D Accomplishments Database

    Schneberk, D.; Martz, H.; Azevedo, S.

    1990-08-01

    Considerable effort is being applied to develop multiple-energy industrial CT techniques for materials characterization. Multiple-energy CT can provide reliable estimates of effective Z (Z_eff), weight fraction, and rigorous calculations of absolute density, all at the spatial resolution of the scanner. Currently, a wide variety of techniques exist for CT scanners, but each has certain problems and limitations. Ultimately, the best multi-energy CT technique would combine the qualities of accuracy, reliability, and wide range of application, and would require the smallest number of additional measurements. We have developed techniques for calculating material properties of industrial objects that differ somewhat from currently used methods. In this paper, we present our methods for calculating Z_eff, weight fraction, and density. We begin with the simplest case -- methods for multiple-energy CT using isotopic sources -- and proceed to multiple-energy work with x-ray machine sources. The methods discussed here are illustrated on CT scans of PBX-9502 high explosives, a lexan-aluminum phantom, and a cylinder of glass beads used in a preliminary study to determine whether CT can resolve three phases: air, water, and a high-Z oil. In the CT project at LLNL, we have constructed several CT scanners of varying scanning geometries using γ- and x-ray sources. In our research, we employed two of these scanners: pencil-beam CAT for CT data using isotopic sources and video-CAT equipped with an IRT micro-focal x-ray machine source.

  13. Reliability of DSM-IV Symptom Ratings of ADHD: Implications for DSM-V

    ERIC Educational Resources Information Center

    Solanto, Mary V.; Alvir, Jose

    2009-01-01

    Objective: The objective of this study was to examine the intrarater reliability of "DSM-IV" ADHD symptoms. Method: Two-hundred-two children referred for attention problems and 49 comparison children (all 7-12 years) were rated by parents and teachers on the identical "DSM-IV" items presented in two different formats, the…

  14. Feasibility, Test-Retest Reliability, and Interrater Reliability of the Modified Ashworth Scale and Modified Tardieu Scale in Persons with Profound Intellectual and Multiple Disabilities

    ERIC Educational Resources Information Center

    Waninge, A.; Rook, R. A.; Dijkhuizen, A.; Gielen, E.; van der Schans, C. P.

    2011-01-01

    Caregivers of persons with profound intellectual and multiple disabilities (PIMD) often describe the quality of the daily movements of these persons in terms of flexibility or stiffness. Objective outcome measures for flexibility and stiffness are muscle tone or level of spasticity. Two instruments used to grade muscle tone and spasticity are the…

  15. Using a derivative-free optimization method for multiple solutions of inverse transport problems

    DOE PAGES

    Armstrong, Jerawan C.; Favorite, Jeffrey A.

    2016-01-14

    Identifying unknown components of an object that emits radiation is an important problem for national and global security. Radiation signatures measured from an object of interest can be used to infer object parameter values that are not known. This problem is called an inverse transport problem. An inverse transport problem may have multiple solutions and the most widely used approach for its solution is an iterative optimization method. This paper proposes a stochastic derivative-free global optimization algorithm to find multiple solutions of inverse transport problems. The algorithm is an extension of a multilevel single linkage (MLSL) method where a mesh adaptive direct search (MADS) algorithm is incorporated into the local phase. Furthermore, numerical test cases using uncollided fluxes of discrete gamma-ray lines are presented to show the performance of this new algorithm.
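    A minimal sketch of the MLSL idea described above: sample the domain, then start a local search only from sample points that have no better sample within a critical linkage radius, so each basin of attraction is searched roughly once. Here a Nelder-Mead step stands in for the MADS local phase, and a toy multimodal objective replaces the transport model; the radius and sample size are assumed.

    ```python
    # Minimal sketch of a multilevel single linkage (MLSL) multistart,
    # with a derivative-free local search standing in for MADS.
    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        # Toy 1-D multimodal function with several local minima.
        return np.sin(3 * x[0]) + 0.1 * x[0] ** 2

    rng = np.random.default_rng(1)
    samples = rng.uniform(-4, 4, size=(60, 1))
    values = np.array([objective(s) for s in samples])
    radius = 0.8  # critical linkage radius (assumed)

    solutions = []
    for x, fx in zip(samples, values):
        dists = np.abs(samples[:, 0] - x[0])
        # Skip if a strictly better sample lies within the critical radius.
        if np.any((dists < radius) & (values < fx)):
            continue
        res = minimize(objective, x, method="Nelder-Mead")
        solutions.append(round(float(res.x[0]), 2))

    print("distinct local solutions:", sorted(set(solutions)))
    ```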

  16. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with conventional multivariate statistical approaches to classifying data of multiple types is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods have no mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
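    As a concrete illustration of weighting sources by reliability, the sketch below combines per-source class posteriors with a linear opinion pool, one simple consensus-theoretic rule. The posteriors and reliability weights are invented placeholders, not results from the paper.

    ```python
    # Minimal sketch: reliability-weighted consensus (linear opinion pool)
    # over class posteriors from several data sources. Numbers are placeholders.
    import numpy as np

    # Posterior class probabilities from three sources for one pixel.
    posteriors = np.array([
        [0.6, 0.3, 0.1],   # source 1
        [0.5, 0.4, 0.1],   # source 2 (judged less reliable)
        [0.2, 0.7, 0.1],   # source 3
    ])
    weights = np.array([0.5, 0.2, 0.3])  # reliability weights, summing to 1

    # Linear opinion pool: weighted average of the source posteriors.
    consensus = weights @ posteriors
    print("consensus posteriors:", consensus, "-> class", int(consensus.argmax()))
    ```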

  17. Construction of Valid and Reliable Test for Assessment of Students

    ERIC Educational Resources Information Center

    Osadebe, P. U.

    2015-01-01

    The study was carried out to construct a valid and reliable test in Economics for secondary school students. Two research questions were drawn to guide the establishment of validity and reliability for the Economics Achievement Test (EAT). It is a multiple choice objective test of five options with 100 items. A sample of 1000 students was randomly…

  18. Tracking multiple objects is limited only by object spacing, not by speed, time, or capacity.

    PubMed

    Franconeri, S L; Jonathan, S V; Scimeca, J M

    2010-07-01

    In dealing with a dynamic world, people have the ability to maintain selective attention on a subset of moving objects in the environment. Performance in such multiple-object tracking is limited by three primary factors: the number of objects that one can track, the speed at which one can track them, and how close together they can be. We argue that this last limit, of object spacing, is the root cause of all performance constraints in multiple-object tracking. In two experiments, we found that as long as the distribution of object spacing is held constant, tracking performance is unaffected by large changes in object speed and tracking time. These results suggest that barring object-spacing constraints, people could reliably track an unlimited number of objects as fast as they could track a single object.

  19. Confronting Decision Cliffs: Diagnostic Assessment of Multi-Objective Evolutionary Algorithms' Performance for Addressing Uncertain Environmental Thresholds

    NASA Astrophysics Data System (ADS)

    Ward, V. L.; Singh, R.; Reed, P. M.; Keller, K.

    2014-12-01

    As water resources problems typically involve several stakeholders with conflicting objectives, multi-objective evolutionary algorithms (MOEAs) are now key tools for understanding management tradeoffs. Given the growing complexity of water planning problems, it is important to establish if an algorithm can consistently perform well on a given class of problems. This knowledge allows the decision analyst to focus on eliciting and evaluating appropriate problem formulations. This study proposes a multi-objective adaptation of the classic environmental economics "Lake Problem" as a computationally simple but mathematically challenging MOEA benchmarking problem. The lake problem abstracts a fictional town on a lake which hopes to maximize its economic benefit without degrading the lake's water quality to a eutrophic (polluted) state through excessive phosphorus loading. The problem poses the challenge of maintaining economic activity while confronting the uncertainty of potentially crossing a nonlinear and potentially irreversible pollution threshold beyond which the lake is eutrophic. Objectives for optimization are maximizing economic benefit from lake pollution, maximizing water quality, maximizing the reliability of remaining below the environmental threshold, and minimizing the probability that the town will have to drastically change pollution policies in any given year. The multi-objective formulation incorporates uncertainty with a stochastic phosphorus inflow abstracting non-point source pollution. We performed comprehensive diagnostics using 6 algorithms: Borg, MOEAD, eMOEA, eNSGAII, GDE3, and NSGAII to ascertain their controllability, reliability, efficiency, and effectiveness. The lake problem abstracts elements of many current water resources and climate related management applications where there is the potential for crossing irreversible, nonlinear thresholds. We show that many modern MOEAs can fail on this test problem, indicating its suitability as a useful and nontrivial benchmarking problem.
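    The lake dynamics behind this benchmark are often written as a one-dimensional map in which phosphorus recycling follows a sigmoid term and natural inflows are stochastic. The sketch below uses a commonly cited form of that model to estimate the reliability objective (the fraction of years the lake stays below a critical level); the parameter values, inflow distribution, and threshold are all assumed for illustration, not taken from this abstract.

    ```python
    # Minimal sketch of lake-problem phosphorus dynamics:
    # x[t+1] = x[t] + loading + inflow + x^q/(1+x^q) - b*x
    import numpy as np

    def simulate_lake(loading, b=0.42, q=2.0, years=100, seed=0):
        """Fraction of years phosphorus stays below an assumed critical level."""
        rng = np.random.default_rng(seed)
        x, below = 0.0, 0
        for _ in range(years):
            inflow = rng.lognormal(mean=np.log(0.01), sigma=0.25)  # stochastic non-point inflow
            x = x + loading + inflow + x**q / (1 + x**q) - b * x
            below += x < 0.5  # assumed eutrophication threshold
        return below / years

    for a in (0.01, 0.03, 0.06):
        print(f"anthropogenic loading = {a:.2f}: reliability = {simulate_lake(a):.2f}")
    ```

    The sharp drop in reliability as loading crosses the recycling tipping point is the "decision cliff" that makes this problem hard for optimizers.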

  20. Markov chains for testing redundant software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
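    The estimation step described above can be illustrated in a few lines: count observed transitions between the error states of the multiple-version programs, normalize to obtain the transition matrix, and attach a confidence interval to the probability of interest. The counts and the normal-approximation interval below are invented placeholders, not the paper's experiment.

    ```python
    # Minimal sketch: estimating Markov transition probabilities from
    # observed state transitions during testing. Counts are invented.
    import numpy as np

    # Transition counts between 3 states (0 = all versions agree,
    # 1 = recoverable disagreement, 2 = system error); rows = from-state.
    counts = np.array([
        [970, 28, 2],
        [ 80, 18, 2],
        [  0,  0, 1],
    ])
    P_hat = counts / counts.sum(axis=1, keepdims=True)

    # Normal-approximation confidence interval for one transition probability.
    n = counts[0].sum()
    p = P_hat[0, 2]
    half_width = 1.96 * np.sqrt(p * (1 - p) / n)
    print(f"P(agree -> error) = {p:.4f} +/- {half_width:.4f}")
    ```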

  1. Uninformative Prior Multiple Target Tracking Using Evidential Particle Filters

    NASA Astrophysics Data System (ADS)

    Worthy, J. L., III; Holzinger, M. J.

    Space situational awareness requires the ability to initialize state estimation from short measurements and the reliable association of observations to support the characterization of the space environment. The electro-optical systems used to observe space objects cannot fully characterize the state of an object given a short, unobservable sequence of measurements. Further, it is difficult to associate these short-arc measurements if many such measurements are generated through the observation of a cluster of satellites, debris from a satellite break-up, or from spurious detections of an object. An optimization based, probabilistic short-arc observation association approach coupled with a Dempster-Shafer based evidential particle filter in a multiple target tracking framework is developed and proposed to address these problems. The optimization based approach is shown in literature to be computationally efficient and can produce probabilities of association, state estimates, and covariances while accounting for systemic errors. Rigorous application of Dempster-Shafer theory is shown to be effective at enabling ignorance to be properly accounted for in estimation by augmenting probability with belief and plausibility. The proposed multiple hypothesis framework will use a non-exclusive hypothesis formulation of Dempster-Shafer theory to assign belief mass to candidate association pairs and generate tracks based on the belief to plausibility ratio. The proposed algorithm is demonstrated using simulated observations of a GEO satellite breakup scenario.

  2. Foster Placement Disruptions Associated with Problem Behavior: Mitigating a Threshold Effect

    ERIC Educational Resources Information Center

    Fisher, Philip A.; Stoolmiller, Mike; Mannering, Anne M.; Takahashi, Aiko; Chamberlain, Patricia

    2011-01-01

    Objective: Placement disruptions have adverse effects on foster children. Identifying reliable predictors of placement disruptions might assist in the allocation of services to prevent disruptions. There were two objectives in this study: (a) to replicate a prior finding that the number of daily child problem behaviors at entry into a new foster…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agalgaonkar, Yashodhan P.; Hammerstrom, Donald J.

    The Pacific Northwest Smart Grid Demonstration (PNWSGD) was a smart grid technology performance evaluation project that included multiple U.S. states and cooperation from multiple electric utilities in the northwest region. One of the local objectives for the project was to achieve improved distribution system reliability. Toward this end, some PNWSGD utilities automated their distribution systems, including the application of fault detection, isolation, and restoration and advanced metering infrastructure. In light of this investment, a major challenge was to establish a correlation between implementation of these smart grid technologies and actual improvements of distribution system reliability. This paper proposes using Welch's t-test to objectively determine and quantify whether distribution system reliability is improving over time. The proposed methodology is generic, and it can be implemented by any utility after calculation of the standard reliability indices. The effectiveness of the proposed hypothesis testing approach is demonstrated through comprehensive practical results. It is believed that wider adoption of the proposed approach can help utilities to evaluate a realistic long-term performance of smart grid technologies.
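    Welch's t-test itself is a one-line computation once yearly reliability indices are in hand. The sketch below compares hypothetical before/after SAIFI values; the numbers are invented, and in practice a utility would substitute its own computed indices (SAIDI, SAIFI, etc.).

    ```python
    # Minimal sketch: Welch's t-test (unequal variances) on a standard
    # reliability index before vs. after a smart grid rollout. Data invented.
    from scipy import stats

    saifi_before = [1.30, 1.45, 1.22, 1.38, 1.51]   # interruptions/customer/year
    saifi_after = [1.05, 1.12, 0.98, 1.20, 1.01]

    t_stat, p_value = stats.ttest_ind(saifi_before, saifi_after, equal_var=False)
    print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
    ```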

  4. Assessing the Quality of Problems in Problem-Based Learning

    ERIC Educational Resources Information Center

    Sockalingam, Nachamma; Rotgans, Jerome; Schmidt, Henk

    2012-01-01

    This study evaluated the construct validity and reliability of a newly devised 32-item problem quality rating scale intended to measure the quality of problems in problem-based learning. The rating scale measured the following five characteristics of problems: the extent to which the problem (1) leads to learning objectives, (2) is familiar, (3)…

  5. Recent advances in computational structural reliability analysis methods

    NASA Astrophysics Data System (ADS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  6. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  7. The multiple sclerosis work difficulties questionnaire: translation and cross-cultural adaptation to Turkish and assessment of validity and reliability.

    PubMed

    Kahraman, Turhan; Özdoğar, Asiye Tuba; Honan, Cynthia Alison; Ertekin, Özge; Özakbaş, Serkan

    2018-05-09

    To linguistically and culturally adapt the Multiple Sclerosis Work Difficulties Questionnaire-23 (MSWDQ-23) for use in Turkey, and to examine its reliability and validity. Following standard forward-back translation of the MSWDQ-23, it was administered to 124 people with multiple sclerosis (MS). Validity was evaluated using related outcome measures, including those related to employment status and expectations, disability level, fatigue, walking, and quality of life. Randomly selected participants were asked to complete the MSWDQ-23 again to assess test-retest reliability. Confirmatory factor analysis on the MSWDQ-23 demonstrated a good fit to the data, and the internal consistency of each subscale was excellent. The test-retest reliability for the total score, psychological/cognitive barriers, physical barriers, and external barriers subscales was high. The MSWDQ-23 and its subscales were positively correlated with the employment, disability level, walking, and fatigue outcome measures. This study suggests that the Turkish version of the MSWDQ-23 has high reliability and adequate validity, and it can be used to determine the difficulties faced by people with multiple sclerosis in the workplace. Moreover, the study provides evidence of the test-retest reliability of the questionnaire. Implications for rehabilitation: Multiple sclerosis affects young people of working age. Understanding work-related problems is crucial to enhancing the likelihood that people with multiple sclerosis maintain their jobs. The MSWDQ-23 is a valid and reliable measure of perceived workplace difficulties in people with multiple sclerosis; we present its validation in Turkish. Professionals working in the field of vocational rehabilitation may benefit from using the MSWDQ-23 to predict current work outcomes and future employment expectations.

  8. Research in Mathematics Education: Multiple Methods for Multiple Uses

    ERIC Educational Resources Information Center

    Battista, Michael; Smith, Margaret S.; Boerst, Timothy; Sutton, John; Confrey, Jere; White, Dorothy; Knuth, Eric; Quander, Judith

    2009-01-01

    Recent federal education policies and reports have generated considerable debate about the meaning, methods, and goals of "scientific research" in mathematics education. Concentrating on the critical problem of determining which educational programs and practices reliably improve students' mathematics achievement, these policies and reports focus…

  9. Multi-Objective Lake Superior Regulation

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Razavi, S.; Tolson, B.

    2011-12-01

    At the direction of the International Joint Commission (IJC), the International Upper Great Lakes Study (IUGLS) Board is investigating possible changes to the present method of regulating the outflows of Lake Superior (SUP) to better meet the contemporary needs of the stakeholders. In this study, a new plan in the form of a rule curve that is directly interpretable for regulation of SUP is proposed. The proposed rule curve has 18 parameters to be optimized. The IUGLS Board is also interested in a regulation strategy that considers potential effects of climate uncertainty; therefore, the quality of the rule curve is assessed simultaneously for multiple supply sequences that represent various future climate scenarios. The rule curve parameters are obtained by solving a computationally intensive bi-objective simulation-optimization problem that maximizes the total increase in navigation and hydropower benefits of the new regulation plan and minimizes the sum of all normalized constraint violations. The objective and constraint values are obtained from a Microsoft Excel based Shared Vision Model (SVM) that compares any new SUP regulation plan with the current regulation policy. The underlying optimization problem is solved by a recently developed, highly efficient multi-objective optimization algorithm called Pareto Archived Dynamically Dimensioned Search (PA-DDS). To further improve the computational efficiency of the simulation-optimization problem, the model pre-emption strategy is used in a novel way to avoid the complete evaluation of regulation plans with low quality in both objectives. Results show that the generated rule curve is robust and typically more reliable when facing unpredictable climate conditions compared to other SUP regulation plans.

  10. Wireless Sensor Network Optimization: Multi-Objective Paradigm.

    PubMed

    Iqbal, Muhammad; Naeem, Muhammad; Anpalagan, Alagan; Ahmed, Ashfaq; Azam, Muhammad

    2015-07-20

    Optimization problems relating to wireless sensor network planning, design, deployment and operation often give rise to multi-objective optimization formulations where multiple desirable objectives compete with each other and the decision maker has to select one of the tradeoff solutions. These multiple objectives may or may not conflict with each other. Keeping in view the nature of the application, the sensing scenario and input/output of the problem, the type of optimization problem changes. To address the different natures of optimization problems relating to wireless sensor network design, deployment, operation, planning and placement, there exists a plethora of optimization solution types. We review and analyze different desirable objectives to show whether they conflict with each other, support each other or are design dependent. We also present a generic multi-objective optimization problem relating to wireless sensor networks which consists of input variables, required output, objectives and constraints. A list of constraints is also presented to give an overview of the different constraints which are considered while formulating optimization problems in wireless sensor networks. Keeping in view the multi-faceted coverage of this article relating to multi-objective optimization, it will open up new avenues of research in the area of multi-objective optimization relating to wireless sensor networks.

  11. [Using projective tests in forensic psychiatry may lead to wrong conclusions. Only empirically tested tests should be used].

    PubMed

    Trygg, L; Dåderman, A M; Wiklund, N; Meurling, A W; Lindgren, M; Lidberg, L; Levander, S

    2001-06-27

    The use of projective and psychometric psychological tests at the Department of Forensic Psychiatry in Stockholm (Huddinge), Sweden, was studied for a population of 60 men, including many patients with neuropsychological disabilities and multiple psychiatric disorders. The results showed that projective tests like the Rorschach, Object Relations Test, and House-Tree-Person were used more frequently than objective psychometric tests. Neuropsychological test batteries like the Halstead-Reitan Neuropsychological Test Battery or Luria-Nebraska Neuropsychological Battery were not used. The majority of patients were, however, assessed with intelligence scales like the WAIS-R. The questionable reliability and validity of the projective tests, and the risk of subjective interpretations, raise a problem when they are used in a forensic setting, since the courts' decisions about a sentence to prison or psychiatric care are based on the forensic psychiatric assessment. The use of objective psychometric neuropsychological tests and personality tests is recommended.

  12. Informed multi-objective decision-making in environmental management using Pareto optimality

    Treesearch

    Maureen C. Kennedy; E. David Ford; Peter Singleton; Mark Finney; James K. Agee

    2008-01-01

    Effective decisionmaking in environmental management requires the consideration of multiple objectives that may conflict. Common optimization methods use weights on the multiple objectives to aggregate them into a single value, neglecting valuable insight into the relationships among the objectives in the management problem.
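    The Pareto-optimality idea referenced above reduces to a dominance check: a solution is kept only if no other solution is at least as good in every objective and strictly better in at least one. A minimal sketch, assuming both objectives are minimized and using invented candidate points:

    ```python
    # Minimal sketch: filtering a candidate set down to its Pareto front
    # (both objectives minimized). Candidate values are illustrative.
    import numpy as np

    def pareto_front(points):
        """Return the subset of points not dominated by any other point."""
        keep = []
        for i, p in enumerate(points):
            dominated = any(
                np.all(q <= p) and np.any(q < p)
                for j, q in enumerate(points) if j != i
            )
            if not dominated:
                keep.append(p)
        return np.array(keep)

    candidates = np.array([[1.0, 5.0], [2.0, 3.0], [3.0, 4.0], [4.0, 1.0]])
    print(pareto_front(candidates))   # [3, 4] is dominated by [2, 3]
    ```

    Unlike a weighted aggregation, the retained front exposes the relationships among the objectives instead of collapsing them into a single value.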

  13. EasyDIAg: A tool for easy determination of interrater agreement.

    PubMed

    Holle, Henning; Rein, Robert

    2015-09-01

    Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since the categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear objective criteria to determine whether two raters' judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training.
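    Chance-corrected agreement of the kind EasyDIAg reports can be illustrated with Cohen's kappa, the classic two-rater form: observed agreement is discounted by the agreement expected from the raters' marginal category frequencies. The confusion matrix below is invented for illustration.

    ```python
    # Minimal sketch: Cohen's kappa from a two-rater confusion matrix.
    # The matrix is invented for illustration.
    import numpy as np

    confusion = np.array([
        [20,  5],
        [ 3, 12],
    ])  # rows = rater A's categories, columns = rater B's

    n = confusion.sum()
    p_observed = np.trace(confusion) / n
    # Expected agreement by chance from the marginal category frequencies.
    p_chance = (confusion.sum(axis=0) / n) @ (confusion.sum(axis=1) / n)
    kappa = (p_observed - p_chance) / (1 - p_chance)
    print(f"raw agreement = {p_observed:.2f}, kappa = {kappa:.2f}")
    ```

    The gap between the raw agreement and kappa is exactly the correction for chance that raw indices omit.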

  14. Reliability and Validity Evidence of Multiple Balance Assessments in Athletes With a Concussion

    PubMed Central

    Murray, Nicholas; Salvatore, Anthony; Powell, Douglas; Reed-Jones, Rebecca

    2014-01-01

    Context: An estimated 300 000 sport-related concussion injuries occur in the United States annually. Approximately 30% of individuals with concussions experience balance disturbances. Common methods of balance assessment include the Clinical Test of Sensory Organization and Balance (CTSIB), the Sensory Organization Test (SOT), the Balance Error Scoring System (BESS), and the Romberg test; however, the National Collegiate Athletic Association recommended the Wii Fit as an alternative measure of balance in athletes with a concussion. A central concern regarding the implementation of the Wii Fit is whether it is reliable and valid for measuring balance disturbance in athletes with concussion. Objective: To examine the reliability and validity evidence for the CTSIB, SOT, BESS, Romberg test, and Wii Fit for detecting balance disturbance in athletes with a concussion. Data Sources: Literature considered for review included publications with reliability and validity data for the assessments of balance (CTSIB, SOT, BESS, Romberg test, and Wii Fit) from PubMed, PsycINFO, and CINAHL. Data Extraction: We identified 63 relevant articles for consideration in the review. Of the 63 articles, 28 were considered appropriate for inclusion and 35 were excluded. Data Synthesis: No current reliability or validity information supports the use of the CTSIB, SOT, Romberg test, or Wii Fit for balance assessment in athletes with a concussion. The BESS demonstrated moderate to high reliability (intraclass correlation coefficient = 0.87) and low to moderate validity (sensitivity = 34%, specificity = 87%). However, the Romberg test and Wii Fit have been shown to be reliable tools in the assessment of balance in Parkinson patients. Conclusions: The BESS can evaluate balance problems after a concussion. However, it lacks the ability to detect balance problems after the third day of recovery. Further investigation is needed to establish the use of the CTSIB, SOT, Romberg test, and Wii Fit for assessing balance in athletes with concussions. PMID:24933431

  15. Real-time reliability measure-driven multi-hypothesis tracking using 2D and 3D features

    NASA Astrophysics Data System (ADS)

    Zúñiga, Marcos D.; Brémond, François; Thonnat, Monique

    2011-12-01

    We propose a new multi-target tracking approach, which is able to reliably track multiple objects even with poor segmentation results due to noisy environments. The approach takes advantage of a new dual object model combining 2D and 3D features through reliability measures. In order to obtain these 3D features, a new classifier associates with each moving region an object class label (e.g. person, vehicle), a parallelepiped model, and visual reliability measures of its attributes. These reliability measures allow the contribution of noisy, erroneous or false data to be properly weighted in order to better maintain the integrity of the object dynamics model. A new multi-target tracking algorithm then uses these object descriptions to generate tracking hypotheses about the objects moving in the scene. This tracking approach is able to manage many-to-many visual target correspondences. To achieve this, the algorithm takes advantage of 3D models for merging dissociated visual evidence (moving regions) potentially corresponding to the same real object, according to previously obtained information. The tracking approach has been validated using publicly accessible video surveillance benchmarks. It runs in real time and the results are competitive with other tracking algorithms, with minimal (or no) reconfiguration effort between different videos.

  16. Reliable Adaptive Data Aggregation Route Strategy for a Trade-off between Energy and Lifetime in WSNs

    PubMed Central

    Guo, Wenzhong; Hong, Wei; Zhang, Bin; Chen, Yuzhong; Xiong, Naixue

    2014-01-01

    Security is one of the most fundamental problems in Wireless Sensor Networks (WSNs): the data transmission path can be compromised when some nodes are disabled. To construct a secure and reliable network, designing an adaptive route strategy which optimizes the energy consumption and network lifetime of aggregation is of great importance. In this paper, we address the reliable data aggregation route problem for WSNs. Firstly, to ensure nodes work properly, we propose a data aggregation route algorithm which improves energy efficiency in the WSN. The construction process, achieved through discrete particle swarm optimization (DPSO), saves node energy costs. Then, to balance the network load and establish a reliable network, an adaptive route algorithm with minimal energy and maximum lifetime is proposed. Since this is a non-linear constrained multi-objective optimization problem, we propose a DPSO with a multi-objective fitness function combined with a phenotype sharing function and a penalty function to find available routes. Experimental results show that, compared with other tree routing algorithms, our algorithm can effectively reduce energy consumption and trade off energy consumption against network lifetime. PMID:25215944

  17. Wireless Sensor Network Optimization: Multi-Objective Paradigm

    PubMed Central

    Iqbal, Muhammad; Naeem, Muhammad; Anpalagan, Alagan; Ahmed, Ashfaq; Azam, Muhammad

    2015-01-01

    Optimization problems relating to wireless sensor network planning, design, deployment and operation often give rise to multi-objective optimization formulations where multiple desirable objectives compete with each other and the decision maker has to select one of the tradeoff solutions. These multiple objectives may or may not conflict with each other. Keeping in view the nature of the application, the sensing scenario and input/output of the problem, the type of optimization problem changes. To address the different natures of optimization problems relating to wireless sensor network design, deployment, operation, planning and placement, there exists a plethora of optimization solution types. We review and analyze different desirable objectives to show whether they conflict with each other, support each other or are design dependent. We also present a generic multi-objective optimization problem relating to wireless sensor networks which consists of input variables, required output, objectives and constraints. A list of constraints is also presented to give an overview of the different constraints which are considered while formulating optimization problems in wireless sensor networks. Keeping in view the multi-faceted coverage of this article relating to multi-objective optimization, it will open up new avenues of research in the area of multi-objective optimization relating to wireless sensor networks. PMID:26205271

  18. Evaluation of Complex Human Performance: The Promise of Computer-Based Simulation

    ERIC Educational Resources Information Center

    Newsom, Robert S.; And Others

    1978-01-01

    For the training and placement of professional workers, multiple-choice instruments are the norm for wide-scale measurement and evaluation efforts. These instruments contain fundamental problems. Computer-based management simulations may provide solutions to these problems, appear scoreable and reliable, offer increased validity, and are better…

  19. Test of understanding of vectors: A reliable multiple-choice vector concept test

    NASA Astrophysics Data System (ADS)

    Barniol, Pablo; Zavala, Genaro

    2014-06-01

    In this article we discuss the findings of our research on students' understanding of vector concepts in problems without physical context. First, we develop a complete taxonomy of the most frequent errors made by university students when learning vector concepts. This study is based on the results of several test administrations of open-ended problems in which a total of 2067 students participated. Using this taxonomy, we then designed a 20-item multiple-choice test [Test of understanding of vectors (TUV)] and administered it in English to 423 students who were completing the required sequence of introductory physics courses at a large private Mexican university. We evaluated the test's content validity, reliability, and discriminatory power. The results indicate that the TUV is a reliable assessment tool. We also conducted a detailed analysis of the students' understanding of the vector concepts evaluated in the test. The TUV is included in the Supplemental Material as a resource for other researchers studying vector learning, as well as instructors teaching the material.

  20. Reliability and Prevalence of an Atypical Development of Phonological Skills in French-Speaking Dyslexics

    ERIC Educational Resources Information Center

    Sprenger-Charolles, Liliane; Cole, Pascale; Kipffer-Piquard, Agnes; Pinton, Florence; Billard, Catherine

    2009-01-01

    In the present study, conducted with French-speaking children, we examined the reliability (group study) and the prevalence (multiple-case study) of dyslexics' phonological deficits in reading and reading-related skills in comparison with Reading Level (RL) controls. All dyslexics with no comorbidity problem schooled in a special institution for…

  1. Multiple Object Based RFID System Using Security Level

    NASA Astrophysics Data System (ADS)

    Kim, Jiyeon; Jung, Jongjin; Ryu, Ukjae; Ko, Hoon; Joe, Susan; Lee, Yongjun; Kim, Boyeon; Chang, Yunseok; Lee, Kyoonha

    2007-12-01

    RFID systems are increasingly applied for operational convenience in a wide range of industries and in individual life. However, it is difficult for a person to manage many tags, because common RFID systems restrict a tag to identifying just a single object. In addition, RFID systems can create serious privacy and security problems because of their radio frequency communication. In this paper, we propose a multiple-object RFID tag which can keep multiple object identifiers for different applications in the same tag. The proposed tag allows simultaneous access by the paired applications. We also propose an authentication protocol for the multiple-object tag to prevent serious security and privacy problems in RFID applications. In particular, we focus on the efficiency of the authentication protocol by considering the security levels of applications. In the proposed protocol, applications go through different authentication procedures according to the security level of the object identifier stored in the tag. We implemented the proposed RFID scheme and obtained experimental results on the efficiency and stability of the scheme.

  2. Reliability Standards of Complex Engineering Systems

    NASA Astrophysics Data System (ADS)

    Galperin, E. M.; Zayko, V. A.; Gorshkalev, P. A.

    2017-11-01

    Production and manufacturing play an important role in modern society. Industrial production is nowadays characterized by increasingly complex interconnections between its parts, so the problem of preventing accidents in a large industrial enterprise becomes especially relevant. In these circumstances, the reliability of enterprise functioning is of particular importance. Potential damage caused by an accident at such an enterprise may lead to substantial material losses and, in some cases, can even cause a loss of human lives. That is why the reliability of industrial enterprise functioning is immensely important. In terms of their reliability, industrial facilities (objects) are divided into simple and complex. Simple objects are characterized by only two conditions: operable and non-operable. A complex object exists in more than two conditions, and the main characteristic is the stability of its operation. This paper develops a reliability indicator combining the set theory methodology and the state space method, both of which are widely used to analyze dynamically developing probability processes. The research also introduces a set of reliability indicators for complex technical systems.

  3. Exploiting replication in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, T. A.

    1989-01-01

    Techniques are examined for replicating data and execution in directly distributed systems: systems in which multiple processes interact directly with one another while continuously respecting constraints on their joint behavior. Directly distributed systems are often required to solve difficult problems, ranging from management of replicated data to dynamic reconfiguration in response to failures. It is shown that these problems reduce to more primitive, order-based consistency problems, which can be solved using primitives such as the reliable broadcast protocols. Moreover, given a system that implements reliable broadcast primitives, a flexible set of high-level tools can be provided for building a wide variety of directly distributed application programs.

  4. Reliability based design including future tests and multiagent approaches

    NASA Astrophysics Data System (ADS)

    Villanueva, Diane

    The initial stages of reliability-based design optimization involve the formulation of objective functions and constraints, and building a model to estimate the reliability of the design with quantified uncertainties. However, even experienced hands often overlook important objective functions and constraints that affect the design. In addition, uncertainty reduction measures, such as tests and redesign, are often not considered in reliability calculations during the initial stages. This research considers two areas that concern the design of engineering systems: 1) the trade-off of the effect of a test and post-test redesign on reliability and cost and 2) the search for multiple candidate designs as insurance against unforeseen faults in some designs. In this research, a methodology was developed to estimate the effect of a single future test and post-test redesign on reliability and cost. The methodology uses assumed distributions of computational and experimental errors, together with redesign rules, to simulate alternative future test and redesign outcomes and form a probabilistic estimate of the reliability and cost for a given design. Further, it was explored how modeling a future test and redesign gives a company an opportunity to balance development costs against performance by simultaneously choosing the design and the post-test redesign rules during the initial design stage. The second area of this research considers the use of dynamic local surrogates, or surrogate-based agents, to locate multiple candidate designs. Surrogate-based global optimization algorithms often search multiple candidate regions of the design space anyway, expending most of the computation needed to define multiple alternate designs, so focusing solely on locating the best design may be wasteful. We extended adaptive sampling surrogate techniques to locate multiple optima by building local surrogates in sub-regions of the design space. The efficiency of this method was studied, and the method was compared to other surrogate-based optimization methods that aim to locate the global optimum using two two-dimensional test functions, a six-dimensional test function, and a five-dimensional engineering example.

  5. Combining heterogenous features for 3D hand-held object recognition

    NASA Astrophysics Data System (ADS)

    Lv, Xiong; Wang, Shuang; Li, Xiangyang; Jiang, Shuqiang

    2014-10-01

    Object recognition has wide applications in the areas of human-machine interaction and multimedia retrieval. However, due to the problems of visual polysemy and concept polymorphism, it is still a great challenge to obtain reliable recognition results for 2D images. Recently, with the emergence and easy availability of RGB-D equipment such as the Kinect, this challenge can be relieved because the depth channel brings more information. A special and important case of object recognition is hand-held object recognition, as the hand is a direct and natural medium for both human-human and human-machine interaction. In this paper, we study the problem of 3D object recognition by combining heterogeneous features with different modalities and extraction techniques. Hand-crafted features preserve low-level information such as shape and color, but are weaker at representing high-level semantic information than automatically learned features, especially deep features. Deep features have shown great advantages in large-scale dataset recognition, but are not always as robust to rotation or scale variance as hand-crafted features. In this paper, we propose a method to combine hand-crafted point cloud features and deep learned features in the RGB and depth channels. First, hand-held object segmentation is implemented by using depth cues and human skeleton information. Second, we combine the extracted heterogeneous 3D features in different stages using linear concatenation and multiple kernel learning (MKL). A trained model is then used to recognize 3D hand-held objects. Experimental results validate the effectiveness and generalization ability of the proposed method.
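
    The fusion step lends itself to a short sketch. Below, random arrays stand in for real hand-crafted point-cloud features and deep RGB/depth features (all names and dimensions are assumptions); the linear-concatenation stage is shown with a plain SVM, whereas the paper's MKL variant would instead learn a kernel per feature type.

    ```python
    # Minimal sketch of linear-concatenation feature fusion; synthetic data.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n = 200
    hand_crafted = rng.random((n, 64))    # stand-in: shape/color point-cloud features
    deep_rgb     = rng.random((n, 512))   # stand-in: learned features, RGB channel
    deep_depth   = rng.random((n, 512))   # stand-in: learned features, depth channel
    labels       = rng.integers(0, 5, n)  # five hypothetical object classes

    fused = np.hstack([hand_crafted, deep_rgb, deep_depth])  # linear concatenation
    clf = SVC(kernel="linear").fit(fused, labels)
    print(clf.predict(fused[:3]))
    ```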

  6. Structural reliability calculation method based on the dual neural network and direct integration method.

    PubMed

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty has received wide attention from engineers and scholars because it reflects structural characteristics and actual bearing conditions. The direct integration method, which starts from the definition of reliability, is easy to understand, but the required multiple integrals remain mathematically difficult to evaluate. Therefore, a dual neural network method is proposed in this paper for calculating multiple integrals. The dual neural network consists of two networks: neural network A learns the integrand function, and neural network B simulates the original (antiderivative) function. Using the derivative relationship between the network output and the network input, neural network B is derived from neural network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean value first-order second-moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.
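
    The derivative relationship at the heart of the dual network can be sketched in a few lines. The toy below trains a single network B so that dB/dx matches a 1-D integrand, then reads the integral off as B(b) - B(a); it is a minimal illustration of the antiderivative idea, not the authors' exact two-network construction or their normalized performance function.

    ```python
    # Sketch: train B so that dB/dx ≈ f, then ∫_a^b f dx ≈ B(b) - B(a).
    import torch

    f = lambda x: torch.exp(-x ** 2)  # toy integrand on [0, 2]

    B = torch.nn.Sequential(          # B plays the antiderivative ("original function")
        torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    opt = torch.optim.Adam(B.parameters(), lr=1e-2)

    for _ in range(2000):
        x = 2.0 * torch.rand(256, 1, requires_grad=True)      # sample the domain
        dB = torch.autograd.grad(B(x).sum(), x, create_graph=True)[0]
        loss = ((dB - f(x)) ** 2).mean()                      # derivative relation
        opt.zero_grad(); loss.backward(); opt.step()

    a, b = torch.zeros(1, 1), 2.0 * torch.ones(1, 1)
    print((B(b) - B(a)).item())   # ≈ 0.882, the true value of ∫_0^2 exp(-x²) dx
    ```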

  7. Reliability Analysis and Modeling of ZigBee Networks

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low-rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adopted the ZigBee open standard to develop various services that promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important, because these services stop if the system or network is unreliable. The ZigBee standard has three kinds of networks: star, tree, and mesh. This paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes the reliability and mean time to failure (MTTF) of each layer. Channel resource usage, device role, network topology, and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. In the star and tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve the reliability problem. For mesh networks, whose complexity is higher, a division technique is applied: a mesh network is decomposed into several non-reducible series systems and edge-parallel systems, so its reliability is easily solved as a series-parallel system through our proposed scheme. The numerical results demonstrate that reliability increases for mesh networks when the number of edges in the parallel systems increases, while reliability drops quickly for all three networks as the numbers of edges and nodes grow. Greater resource usage is another factor that decreases reliability, as are network complexity and complex object relationships.
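
    The series-parallel arithmetic the analysis rests on is compact enough to state directly; the layer reliabilities below are invented placeholders.

    ```python
    # Reliability block diagram (RBD) arithmetic: series and parallel blocks.
    from math import prod

    def series(rs):
        # all blocks must work: R = product of R_i
        return prod(rs)

    def parallel(rs):
        # at least one block must work: R = 1 - product of (1 - R_i)
        return 1 - prod(1 - r for r in rs)

    # Star/tree case: PHY, MAC, NET, APP layers in series (made-up values).
    print(series([0.999, 0.995, 0.99, 0.98]))
    # Mesh case: redundant network-layer routes in parallel, then layers in series.
    print(series([0.999, 0.995, parallel([0.9, 0.9, 0.9]), 0.98]))
    ```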

  8. Institutional Research Needs for U. S. Community Colleges.

    ERIC Educational Resources Information Center

    Washington State Board for Community Coll. Education, Seattle. Research and Planning Office.

    Seven problem areas where research is needed critically at the two-year institution level are identified: (1) establish reliability and stability of MIS/data base; (2) find reliable predictive instruments and/or formulae; (3) analyze support services and academic assistance objectives; (4) develop research methods to evaluate curricula; (5)…

  9. Factors that Affect Operational Reliability of Turbojet Engines

    NASA Technical Reports Server (NTRS)

    1956-01-01

    The problem of improving operational reliability of turbojet engines is studied in a series of papers. Failure statistics for this engine are presented, the theory and experimental evidence on how engine failures occur are described, and the methods available for avoiding failure in operation are discussed. The individual papers of the series are Objectives, Failure Statistics, Foreign-Object Damage, Compressor Blades, Combustor Assembly, Nozzle Diaphragms, Turbine Buckets, Turbine Disks, Rolling Contact Bearings, Engine Fuel Controls, and Summary Discussion.

  10. Construction of Economics Achievement Test for Assessment of Students

    ERIC Educational Resources Information Center

    Osadebe, P. U.

    2014-01-01

    The study was carried out to construct a valid and reliable test in Economics for secondary school students. Two research questions were drawn to guide the establishment of validity and reliability for the Economics Achievement Test (EAT). It is a multiple choice objective test of five options with 100 items. A sample of 1000 students was randomly…

  11. Reliability model of a monopropellant auxiliary propulsion system

    NASA Technical Reports Server (NTRS)

    Greenberg, J. S.

    1971-01-01

    A mathematical model and associated computer code have been developed to compute the reliability of a monopropellant blowdown hydrazine spacecraft auxiliary propulsion system as a function of time. The propulsion system is used to adjust or modify the spacecraft orbit over an extended period of time. The multiple orbit corrections are the multiple objectives that the auxiliary propulsion system is designed to achieve, so the reliability model computes the probability of successfully accomplishing each of the desired orbit corrections. To accomplish this, the reliability model interfaces with a computer code that models the performance of a blowdown (unregulated) monopropellant auxiliary propulsion system. That code acts as a performance model and as such gives an accurate time history of the system operating parameters. The basic timing and status information is passed on to and utilized by the reliability model, which establishes the probability of successfully accomplishing the orbit corrections.
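
    The record's core computation can be hinted at with a toy: a performance model supplies the firing times, and the reliability model evaluates the probability that the system is still working at each correction. A single constant failure rate is assumed below purely for illustration; all numbers are made up.

    ```python
    # Toy: P(k-th orbit correction achieved) under an exponential lifetime model.
    import math

    lam = 2e-5                                   # assumed failure rate, per hour
    correction_times = [500, 2000, 4500, 8000]   # hours, from a performance model

    for k, t in enumerate(correction_times, 1):
        print(f"P(correction {k} achieved) = {math.exp(-lam * t):.4f}")
    ```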

  12. Parts, Materials, and Processes Experience Summary. Volume 1; [Catalog of ALERT and Other Information on Basic Design, Reliability, Quality and Applications Programs

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The ALERT program, a system for communicating common problems with parts, materials, and processes, is condensed and catalogued. Expanded information on selected topics is provided by relating the problem area (failure) to the cause, the investigations and findings, the suggestions for avoidance (inspections, screening tests, proper part applications), and failure analysis procedures. The basic objective of ALERT is the avoidance of recurring parts, materials, and processes problems, thus improving the reliability of equipment produced for and used by the government.

  13. Identifying multiple influential spreaders based on generalized closeness centrality

    NASA Astrophysics Data System (ADS)

    Liu, Huan-Li; Ma, Chuang; Xiang, Bing-Bing; Tang, Ming; Zhang, Hai-Feng

    2018-02-01

    To maximize the spreading influence of multiple spreaders in complex networks, one important fact cannot be ignored: the multiple spreaders should be dispersively distributed in the network, which effectively reduces the redundancy of information spreading. For this purpose, we define a generalized closeness centrality (GCC) index by generalizing the closeness centrality index to a set of nodes. The problem then becomes how to identify multiple spreaders such that an objective function attains its minimal value. By comparing with the K-means clustering algorithm, we find that this optimization problem is very similar to minimizing the objective function in the K-means method, so finding the set of nodes with the highest GCC value can be approximately solved by the K-means method. Two typical transmission dynamics, the epidemic spreading process and the rumor spreading process, are implemented in real networks to verify the good performance of our proposed method.
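
    A rough approximation of the approach fits in a few lines: give each node its vector of shortest-path distances, cluster the vectors with K-means, and keep one well-placed representative per cluster as a spreader. This is a hedged stand-in for the paper's GCC optimization; the karate-club graph and the representative-selection rule are assumptions for illustration.

    ```python
    # Approximate K-means selection of dispersed spreaders; not the exact GCC method.
    import networkx as nx
    import numpy as np
    from sklearn.cluster import KMeans

    G = nx.karate_club_graph()
    nodes = list(G)
    # Feature vector for node u: its shortest-path distances to every node.
    D = np.array([[nx.shortest_path_length(G, u, v) for v in nodes] for u in nodes])

    k = 3
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(D)

    spreaders = []
    for c in range(k):
        members = [i for i, l in enumerate(labels) if l == c]
        # Representative: member with the smallest mean distance to its own
        # cluster, a closeness-style choice (an assumption).
        spreaders.append(min(members, key=lambda i: D[i][members].mean()))
    print(spreaders)
    ```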

  14. Choosing a reliability inspection plan for interval censored data

    DOE PAGES

    Lu, Lu; Anderson-Cook, Christine Michaela

    2017-04-19

    Reliability test plans are important for producing precise and accurate assessments of reliability characteristics. This paper explores different strategies for choosing between possible inspection plans for interval censored data given a fixed testing timeframe and budget. A new general cost structure is proposed for guiding precise quantification of the total cost of an inspection test plan. Multiple summaries of reliability are considered and compared as the criteria for choosing the best plans using an easily adapted method. Different cost structures and representative true underlying reliability curves demonstrate how to assess different strategies given the logistical constraints and nature of the problem. Results show several general patterns across a wide variety of scenarios. Given a fixed total cost, plans that inspect more units with less frequency at equally spaced time points are favored due to their ease of implementation and consistently good performance across a large number of case study scenarios. Plans with inspection times chosen at equally spaced probabilities offer improved reliability estimates for the shape of the distribution, the mean lifetime, and the failure time of a small fraction of the population, but only for applications with high infant mortality rates. The paper uses a Monte Carlo simulation based approach in addition to the common evaluation based on the asymptotic variance, and offers comparisons and recommendations for different applications with different objectives. Additionally, the paper outlines a variety of reliability metrics to use as criteria for optimization, presents a general method for evaluating different alternatives, and provides case study results for common scenarios.
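
    The two inspection-time strategies compared here are easy to make concrete. Under an assumed Weibull lifetime model (shape 1.5, scale 1000 h; all numbers invented), equally spaced probabilities place inspections earlier and denser, where failures accumulate:

    ```python
    # Equally spaced times vs. equally spaced probabilities for n inspections.
    import numpy as np

    shape, scale = 1.5, 1000.0      # assumed Weibull lifetime parameters
    T, n = 2000.0, 5                # test horizon (hours) and number of inspections

    equal_time = np.linspace(T / n, T, n)

    # Invert F(t) = 1 - exp(-(t/scale)^shape) at probabilities reachable by T.
    p_T = 1 - np.exp(-(T / scale) ** shape)
    ps = np.linspace(p_T / n, p_T, n)
    equal_prob = scale * (-np.log(1 - ps)) ** (1 / shape)

    print(np.round(equal_time))   # [ 400.  800. 1200. 1600. 2000.]
    print(np.round(equal_prob))   # earlier, denser inspection times
    ```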

  15. Choosing a reliability inspection plan for interval censored data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Lu; Anderson-Cook, Christine Michaela

    Reliability test plans are important for producing precise and accurate assessments of reliability characteristics. This paper explores different strategies for choosing between possible inspection plans for interval censored data given a fixed testing timeframe and budget. A new general cost structure is proposed for guiding precise quantification of the total cost of an inspection test plan. Multiple summaries of reliability are considered and compared as the criteria for choosing the best plans using an easily adapted method. Different cost structures and representative true underlying reliability curves demonstrate how to assess different strategies given the logistical constraints and nature of the problem. Results show several general patterns across a wide variety of scenarios. Given a fixed total cost, plans that inspect more units with less frequency at equally spaced time points are favored due to their ease of implementation and consistently good performance across a large number of case study scenarios. Plans with inspection times chosen at equally spaced probabilities offer improved reliability estimates for the shape of the distribution, the mean lifetime, and the failure time of a small fraction of the population, but only for applications with high infant mortality rates. The paper uses a Monte Carlo simulation based approach in addition to the common evaluation based on the asymptotic variance, and offers comparisons and recommendations for different applications with different objectives. Additionally, the paper outlines a variety of reliability metrics to use as criteria for optimization, presents a general method for evaluating different alternatives, and provides case study results for common scenarios.

  16. Reliable Multi-Label Learning via Conformal Predictor and Random Forest for Syndrome Differentiation of Chronic Fatigue in Traditional Chinese Medicine

    PubMed Central

    Wang, Huazhen; Liu, Xin; Lv, Bing; Yang, Fan; Hong, Yanzhu

    2014-01-01

    Objective The etiology, pathophysiology, nomenclature, and diagnostic criteria of Chronic Fatigue (CF) remain unsettled in the medical community. Traditional Chinese medicine (TCM) adopts a unique diagnostic method, namely ‘bian zheng lun zhi’ or syndrome differentiation, to diagnose CF with a set of syndrome factors, which can be regarded as a Multi-Label Learning (MLL) problem in the machine learning literature. To obtain an effective and reliable diagnostic tool, we use a Conformal Predictor (CP), Random Forest (RF), and a Problem Transformation method (PT) for the syndrome differentiation of CF. Methods and Materials In this work, using the PT method, CP-RF is extended to handle the MLL problem. CP-RF applies RF to measure the confidence level (p-value) of each label being the true label, and then selects the multiple labels whose p-values are larger than the pre-defined significance level as the region prediction. We compare the proposed CP-RF with typical CP-NBC (Naïve Bayes Classifier), CP-KNN (K-Nearest Neighbors), and ML-KNN on a CF dataset consisting of 736 cases. Specifically, 95 symptoms are used to identify CF, and four syndrome factors are employed in the syndrome differentiation: ‘spleen deficiency’, ‘heart deficiency’, ‘liver stagnation’, and ‘qi deficiency’. Results CP-RF demonstrates outstanding performance beyond CP-NBC, CP-KNN, and ML-KNN under the general metrics of subset accuracy, hamming loss, one-error, coverage, ranking loss, and average precision. Furthermore, the performance of CP-RF remains steady over the large range of confidence levels from 80% to 100%, which indicates its robustness to the threshold determination. In addition, the confidence evaluation provided by CP is valid and well calibrated. Conclusion CP-RF not only offers outstanding performance but also provides a valid confidence evaluation for CF syndrome differentiation. It would be well applicable to TCM practitioners and would facilitate the use of an objective, effective, and reliable computer-based diagnostic tool. PMID:24918430
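
    The region-prediction rule is simple to sketch with binary relevance (one forest per label) and an inductive conformal calibration split. The data below are synthetic, and the nonconformity score (one minus the probability assigned to the candidate class) is one common choice, not necessarily the paper's exact recipe.

    ```python
    # Hedged sketch of CP-RF region prediction via binary relevance; synthetic data.
    import numpy as np
    from sklearn.datasets import make_multilabel_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, Y = make_multilabel_classification(n_samples=600, n_features=95,
                                          n_classes=4, random_state=0)
    Xtr, Xcal, Ytr, Ycal = train_test_split(X, Y, test_size=0.3, random_state=0)
    x_new, eps = X[:1], 0.20  # significance level (confidence = 1 - eps)

    region = []
    for j in range(Y.shape[1]):
        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        rf.fit(Xtr, Ytr[:, j])
        proba_cal = rf.predict_proba(Xcal)   # columns follow rf.classes_ = [0, 1]
        cal = 1 - proba_cal[np.arange(len(Xcal)), Ycal[:, j]]  # calibration scores
        a_new = 1 - rf.predict_proba(x_new)[0, 1]   # candidate label: "present"
        p = (np.sum(cal >= a_new) + 1) / (len(cal) + 1)        # conformal p-value
        if p > eps:
            region.append(j)                 # keep labels with p above threshold
    print("predicted syndrome-factor indices:", region)
    ```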

  17. Personifying self in physics problem situations involving forces as a student help strategy

    NASA Astrophysics Data System (ADS)

    Tabor-Morris, A. E.

    2013-03-01

    How can physics teachers best guide students through problem situations involving forces? A suggestion made here is to personify oneself as the object in question, that is, to pretend to be the object undergoing forces and then qualify and quantify those forces according to their vectors for the system at hand. This personification is not meant to empower the object to act, just to sense the forces it is experiencing. The strategy may be especially useful to beginning physics learners attacking problems that involve both multiple forces AND multiple objects, since each object acted upon needs to be considered separately, using the idea that one cannot be in two places at once. One example of this type of problem expounded on here is Atwood's machine: two weights hung over a pulley with a single rope. Another example given is the electromagnetic forces on one charge caused by other charges in the vicinity. Classroom implementation strategies are discussed.
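
    For the Atwood's machine example, personification yields one Newton's-second-law equation per mass; the standard worked solution (assuming a massless, inextensible rope, a frictionless pulley, and m_1 > m_2) follows by adding the two equations to eliminate the tension:

    ```latex
    % From mass 1's point of view:  m_1 g - T = m_1 a
    % From mass 2's point of view:  T - m_2 g = m_2 a
    % Adding the two equations eliminates T:
    \[
    a = \frac{(m_1 - m_2)\,g}{m_1 + m_2},
    \qquad
    T = \frac{2\,m_1 m_2\,g}{m_1 + m_2}.
    \]
    ```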

  18. [Development of a proverb test for assessment of concrete thinking problems in schizophrenic patients].

    PubMed

    Barth, A; Küfferle, B

    2001-11-01

    Concretism is considered an important aspect of schizophrenic thought disorder. Traditionally it is measured using the method of proverb interpretation, in which metaphoric proverbs are presented and the subject is asked to state their meaning. Interpretations are recorded and scored for concretistic tendencies. However, this method has two problems: its reliability is doubtful, and it is rather complicated to administer. In this paper, a new version of a multiple choice proverb test is presented which solves these problems in a reliable and economical manner. Using the new test, it has been shown that schizophrenic patients have greater deficits in proverb interpretation than depressive patients.

  19. Multiple exposure photographic (MEP) technique: an objective assessment of sperm motility in infertility management.

    PubMed

    Adetoro, O O

    1988-06-01

    Multiple exposure photography (MEP), an objective technique, was used to determine the percentage of motile sperm in semen samples from 41 males being investigated for infertility. The technique was compared with the conventional, subjective assessment of sperm motility by ordinary microscopy. A satisfactory correlation was observed between the two methods' estimates of percentage sperm motility, but the MEP estimate was more consistent and reliable. The value of this technique of sperm motility study in the developing world is discussed.

  20. The reliability and validity of fatigue measures during multiple-sprint work: an issue revisited.

    PubMed

    Glaister, Mark; Howatson, Glyn; Pattison, John R; McInnes, Gill

    2008-09-01

    The ability to repeatedly produce a high-power output or sprint speed is a key fitness component of most field and court sports. The aim of this study was to evaluate the validity and reliability of eight different approaches to quantify this parameter in tests of multiple-sprint performance. Ten physically active men completed two trials of each of two multiple-sprint running protocols with contrasting recovery periods. Protocol 1 consisted of 12 x 30-m sprints repeated every 35 seconds; protocol 2 consisted of 12 x 30-m sprints repeated every 65 seconds. All testing was performed in an indoor sports facility, and sprint times were recorded using twin-beam photocells. All but one of the formulae showed good construct validity, as evidenced by similar within-protocol fatigue scores. However, the assumptions on which many of the formulae were based, combined with poor or inconsistent test-retest reliability (coefficient of variation range: 0.8-145.7%; intraclass correlation coefficient range: 0.09-0.75), suggested many problems regarding logical validity. In line with previous research, the results support the percentage decrement calculation as the most valid and reliable method of quantifying fatigue in tests of multiple-sprint performance.
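
    The percentage-decrement calculation the study supports compares total sprint time with the ideal of repeating the fastest sprint every repetition. A minimal sketch, using made-up 30-m sprint times:

    ```python
    # Percentage decrement = how far total time exceeds the "ideal" total.
    times = [4.31, 4.35, 4.40, 4.38, 4.47, 4.52,
             4.55, 4.61, 4.66, 4.70, 4.73, 4.79]   # 12 x 30-m sprints (invented)

    ideal = min(times) * len(times)             # fastest sprint repeated every time
    decrement = 100 * sum(times) / ideal - 100  # percent slower than ideal
    print(f"percentage decrement: {decrement:.1f}%")
    ```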

  1. The feasibility of manual parameter tuning for deformable breast MR image registration from a multi-objective optimization perspective.

    PubMed

    Pirpinia, Kleopatra; Bosman, Peter A N; Loo, Claudette E; Winter-Warnars, Gonneke; Janssen, Natasja N Y; Scholten, Astrid N; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja

    2017-06-23

    Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. For the latter, patient-specific, multi-objective weight optimization is needed; it obtained a mean TRE of 13.6 mm without guidance information, reduced to 7.3 mm with guidance information, while also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.

  2. The feasibility of manual parameter tuning for deformable breast MR image registration from a multi-objective optimization perspective

    NASA Astrophysics Data System (ADS)

    Pirpinia, Kleopatra; Bosman, Peter A. N.; Loo, Claudette E.; Winter-Warnars, Gonneke; Janssen, Natasja N. Y.; Scholten, Astrid N.; Sonke, Jan-Jakob; van Herk, Marcel; Alderliesten, Tanja

    2017-07-01

    Deformable image registration is typically formulated as an optimization problem involving a linearly weighted combination of terms that correspond to objectives of interest (e.g. similarity, deformation magnitude). The weights, along with multiple other parameters, need to be manually tuned for each application, a task currently addressed mainly via trial-and-error approaches. Such approaches can only be successful if there is a sensible interplay between parameters, objectives, and desired registration outcome. This, however, is not well established. To study this interplay, we use multi-objective optimization, where multiple solutions exist that represent the optimal trade-offs between the objectives, forming a so-called Pareto front. Here, we focus on weight tuning. To study the space a user has to navigate during manual weight tuning, we randomly sample multiple linear combinations. To understand how these combinations relate to desirability of registration outcome, we associate with each outcome a mean target registration error (TRE) based on expert-defined anatomical landmarks. Further, we employ a multi-objective evolutionary algorithm that optimizes the weight combinations, yielding a Pareto front of solutions, which can be directly navigated by the user. To study how the complexity of manual weight tuning changes depending on the registration problem, we consider an easy problem, prone-to-prone breast MR image registration, and a hard problem, prone-to-supine breast MR image registration. Lastly, we investigate how guidance information as an additional objective influences the prone-to-supine registration outcome. Results show that the interplay between weights, objectives, and registration outcome makes manual weight tuning feasible for the prone-to-prone problem, but very challenging for the harder prone-to-supine problem. For the latter, patient-specific, multi-objective weight optimization is needed; it obtained a mean TRE of 13.6 mm without guidance information, reduced to 7.3 mm with guidance information, while also providing a Pareto front that exhibits an intuitively sensible interplay between weights, objectives, and registration outcome, allowing outcome selection.

  3. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
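
    The evaluation and visualization steps presuppose a Pareto-filtering stage: keep the designs no other design beats on every objective. A tiny sketch with invented (impact, cost) pairs, both minimized:

    ```python
    # Keep non-dominated designs (minimization on both objectives); toy values.
    designs = {"A": (3.1, 9.0), "B": (2.5, 11.0), "C": (4.0, 8.0), "D": (3.5, 12.0)}

    def dominated(x, others):
        # x is dominated if some other point is <= on every objective and differs
        return any(all(o <= xi for o, xi in zip(other, x)) and other != x
                   for other in others)

    pareto = [name for name, obj in designs.items()
              if not dominated(obj, list(designs.values()))]
    print(pareto)   # ['A', 'B', 'C'] -- D is dominated by A
    ```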

  4. A Simulation Based Approach to Optimize Berth Throughput Under Uncertainty at Marine Container Terminals

    NASA Technical Reports Server (NTRS)

    Golias, Mihalis M.

    2011-01-01

    Berth scheduling is a critical function at marine container terminals, and determining the best berth schedule depends on several factors including the type and function of the port, its size and location, nearby competition, and the type of contractual agreement between the terminal and the carriers. In this paper we formulate the berth scheduling problem as a bi-objective mixed-integer problem with the objectives of maximizing customer satisfaction and the reliability of the berth schedule, under the assumption that vessel handling times are stochastic parameters following a discrete and known probability distribution. A combination of an exact algorithm, a Genetic Algorithm-based heuristic, and a simulation-based post-Pareto analysis is proposed as the solution approach. Based on a number of experiments, it is concluded that the proposed berth scheduling policy outperforms a policy in which reliability is not considered.

  5. Air Force Technical Objective Document, FY89.

    DTIC Science & Technology

    1988-04-01

    threat warning; multimegawatt stand-off jammers; a family of new, broadband, active decoy expendables; E4? subsystems and EW suites for Military...and monolithic integrated circuits. (3) Microwave TWTs Develop microwave tube technology and selected thermionic power sources and amplifiers for ECM...Improved design reliability and multiple application of tube technology are stressed. Improve Traveling Wave Tube (TWT) reliability by instrumenting a TWT

  6. Multi-objective optimization for generating a weighted multi-model ensemble

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2017-12-01

    Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on how to assign weighting factors based on a single evaluation metric: the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, it confronts a major challenge when there are multiple metrics under consideration. With multiple evaluation metrics, a simple averaging of performance scores or model ranks does not address the trade-off problem between conflicting metrics, and so far there seems to be no best method for generating weighted multi-model ensembles based on multiple performance metrics. The current study applies multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combine multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and to generate a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly, with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide more reliable future projections.

  7. The Challenges of Measuring Glycemic Variability

    PubMed Central

    Rodbard, David

    2012-01-01

    This commentary reviews several of the challenges encountered when attempting to quantify glycemic variability and correlate it with risk of diabetes complications. These challenges include (1) immaturity of the field, including problems of data accuracy, precision, reliability, cost, and availability; (2) larger relative error in the estimates of glycemic variability than in the estimates of the mean glucose; (3) high correlation between glycemic variability and mean glucose level; (4) multiplicity of measures; (5) correlation of the multiple measures; (6) duplication or reinvention of methods; (7) confusion of measures of glycemic variability with measures of quality of glycemic control; (8) the problem of multiple comparisons when assessing relationships among multiple measures of variability and multiple clinical end points; and (9) differing needs for routine clinical practice and clinical research applications. PMID:22768904

  8. A class of stochastic optimization problems with one quadratic & several linear objective functions and extended portfolio selection model

    NASA Astrophysics Data System (ADS)

    Xu, Jiuping; Li, Jun

    2002-09-01

    In this paper a class of stochastic multiple-objective programming problems with one quadratic objective function, several linear objective functions, and linear constraints is introduced. The stochastic model is transformed into a deterministic multiple-objective nonlinear programming model by taking the expectations of the random variables. The reference direction approach is used to deal with the linear objectives and results in a linear parametric optimization formulation with a single linear objective function. This objective function is combined with the quadratic function using weighted sums. The quadratic problem is transformed into a linear (parametric) complementarity problem, the basic formulation for the proposed approach. Sufficient and necessary conditions for (properly, weakly) efficient solutions and some construction characteristics of (weakly) efficient solution sets are obtained. An interactive algorithm based on the reference direction and weighted sums is proposed. By varying the parameter vector on the right-hand side of the model, the decision maker can freely search the efficient frontier. An extended portfolio selection model is formed when liquidity is considered as another objective to be optimized besides expectation and risk. The interactive approach is illustrated with a practical example.

  9. Communication and cooperation in underwater acoustic networks

    NASA Astrophysics Data System (ADS)

    Yerramalli, Srinivas

    In this thesis, we present a study of several problems related to underwater point-to-point communication and network formation. We explore techniques to improve the achievable data rate on a point-to-point link using better physical layer techniques, and then study sensor cooperation, which improves the throughput and reliability of an underwater network. Robust point-to-point communication in underwater networks has become increasingly critical in several military and civilian applications. We present several physical layer signaling and detection techniques tailored to the underwater channel model to improve the reliability of data detection. First, we consider a simplified underwater channel model in which the time-scale distortion on each path is assumed to be the same (a single-scale channel model, in contrast to the more general multi-scale model). A novel technique called Partial FFT Demodulation, which exploits the nature of OFDM signaling and the time-scale distortion, is derived. It is observed that this new technique has some unique interference suppression properties and performs better than traditional equalizers in several scenarios of interest. Next, we consider the multi-scale model for the underwater channel and assume that single-scale processing is performed at the receiver. We then derive optimized front-end pre-processing techniques to reduce the interference caused during single-scale processing of signals transmitted on a multi-scale channel. We then propose an improved channel estimation technique using dictionary optimization methods for compressive sensing and show that significant performance gains can be obtained. In the next part of this thesis, we consider the problem of cooperation among rational sensor nodes whose objective is to improve their individual data rates. We first consider transmitter cooperation in a multiple access channel, investigate the stability of the grand coalition of transmitters using tools from cooperative game theory, and show that the grand coalition is stable in both the asymptotic regimes of high and low SNR. Toward studying receiver cooperation in a broadcast channel, we propose a game theoretic model for the broadcast channel, derive a game theoretic duality between the multiple access channel and the broadcast channel, and show how the equilibria of the broadcast channel are related to those of the multiple access channel and vice versa.

  10. Algorithmic Perspectives on Problem Formulations in MDO

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    This work is concerned with an approach to formulating the multidisciplinary optimization (MDO) problem that reflects an algorithmic perspective on MDO problem solution. The algorithmic perspective focuses on formulating the problem in light of the abilities and inabilities of optimization algorithms, so that the resulting nonlinear programming problem can be solved reliably and efficiently by conventional optimization techniques. We propose a modular approach to formulating MDO problems that takes advantage of the problem structure, maximizes the autonomy of implementation, and allows for multiple easily interchangeable problem statements to be used depending on the available resources and the characteristics of the application problem.

  11. Multiple object tracking using the shortest path faster association algorithm.

    PubMed

    Xi, Zhenghao; Liu, Heping; Liu, Huaping; Yang, Bin

    2014-01-01

    To solve the problem of persistent multiple object tracking in cluttered environments, this paper presents a novel tracking association approach based on the shortest path faster algorithm. First, multiple object tracking is formulated as an integer programming problem over a flow network. We then relax the integer program to a standard linear programming problem, so the global optimum can be obtained quickly using the shortest path faster algorithm. The proposed method avoids the difficulties of integer programming, and it has a lower worst-case complexity than competing methods but better robustness and tracking accuracy in complex environments. Simulation results show that the proposed algorithm takes less time than other state-of-the-art methods and can operate in real time.
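
    The flow-network formulation is easy to picture on a toy instance: detections become unit-capacity arcs, linking costs become arc weights, and trajectories are read off a min-cost flow from source to sink. The solver below is networkx's generic min-cost-flow routine, used purely for illustration; the paper's contribution is solving the LP relaxation with the shortest path faster algorithm instead.

    ```python
    # Toy min-cost-flow tracking graph; detections a1,b1 (frame 1), a2,b2 (frame 2).
    import networkx as nx

    G = nx.DiGraph()
    links = [("S", "a1", 0), ("S", "b1", 0),    # track births
             ("a1", "a2", 1), ("a1", "b2", 4),  # frame-to-frame association costs
             ("b1", "b2", 1), ("b1", "a2", 5),
             ("a2", "T", 0), ("b2", "T", 0)]    # track terminations
    for u, v, c in links:
        G.add_edge(u, v, capacity=1, weight=c)

    flow = nx.max_flow_min_cost(G, "S", "T")
    tracks = [(u, v) for u in flow for v, f in flow[u].items() if f > 0]
    print(tracks)  # associates a1 -> a2 and b1 -> b2 (total cost 2)
    ```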

  12. Multiple Object Tracking Using the Shortest Path Faster Association Algorithm

    PubMed Central

    Liu, Heping; Liu, Huaping; Yang, Bin

    2014-01-01

    To solve the problem of persistent multiple object tracking in cluttered environments, this paper presents a novel tracking association approach based on the shortest path faster algorithm. First, multiple object tracking is formulated as an integer programming problem over a flow network. We then relax the integer program to a standard linear programming problem, so the global optimum can be obtained quickly using the shortest path faster algorithm. The proposed method avoids the difficulties of integer programming, and it has a lower worst-case complexity than competing methods but better robustness and tracking accuracy in complex environments. Simulation results show that the proposed algorithm takes less time than other state-of-the-art methods and can operate in real time. PMID:25215322

  13. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.

    PubMed

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and people are paying more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information about the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To overcome these drawbacks of incomplete information and uncertain time, this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that vehicles only follow the optimal path from the emergency logistics center to the affected point, and solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method.

  14. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty

    PubMed Central

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and people are paying more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information about the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To overcome these drawbacks of incomplete information and uncertain time, this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that vehicles only follow the optimal path from the emergency logistics center to the affected point, and solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946

  15. Decomposition-Based Decision Making for Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

    Borer, Nicholas K.; Mavris, DImitri N.

    2005-01-01

    Most practical engineering systems design problems have multiple and conflicting objectives. Furthermore, the satisfactory attainment level for each objective (requirement) is likely uncertain early in the design process. Systems with long design cycle times will exhibit more of this uncertainty throughout the design process. This is further complicated if the system is expected to perform for a relatively long period of time, as now it will need to grow as new requirements are identified and new technologies are introduced. These points identify a need for a systems design technique that enables decision making amongst multiple objectives in the presence of uncertainty. Traditional design techniques deal with a single objective or a small number of objectives that are often aggregates of the overarching goals sought through the generation of a new system. Other requirements, although uncertain, are viewed as static constraints to this single or multiple objective optimization problem. With either of these formulations, enabling tradeoffs between the requirements, objectives, or combinations thereof is a slow, serial process that becomes increasingly complex as more criteria are added. This research proposal outlines a technique that attempts to address these and other idiosyncrasies associated with modern aerospace systems design. The proposed formulation first recasts systems design into a multiple criteria decision making problem. The now multiple objectives are decomposed to discover the critical characteristics of the objective space. Tradeoffs between the objectives are considered amongst these critical characteristics by comparison to a probabilistic ideal tradeoff solution. The proposed formulation represents a radical departure from traditional methods. A pitfall of this technique is in the validation of the solution: in a multi-objective sense, how can a decision maker justify a choice between non-dominated alternatives? A series of examples help the reader to observe how this technique can be applied to aerospace systems design and compare the results of this so-called Decomposition-Based Decision Making to more traditional design approaches.

  16. Interactive Reference Point Procedure Based on the Conic Scalarizing Function

    PubMed Central

    2014-01-01

    In multiobjective optimization methods, multiple conflicting objectives are typically converted into a single objective optimization problem with the help of scalarizing functions. The conic scalarizing function is a general characterization of Benson proper efficient solutions of non-convex multiobjective problems in terms of saddle points of scalar Lagrangian functions. This approach preserves convexity. The conic scalarizing function, as a part of a posteriori or a priori methods, has successfully been applied to several real-life problems. In this paper, we propose a conic scalarizing function based interactive reference point procedure where the decision maker actively takes part in the solution process and directs the search according to her or his preferences. An algorithmic framework for the interactive solution of multiple objective optimization problems is presented and is utilized for solving some illustrative examples. PMID:24723795

  17. Robust object matching for persistent tracking with heterogeneous features.

    PubMed

    Guo, Yanlin; Hsu, Steve; Sawhney, Harpreet S; Kumar, Rakesh; Shan, Ying

    2007-05-01

    This paper addresses the problem of matching vehicles across multiple sightings under variations in illumination and camera poses. Since multiple observations of a vehicle are separated by large temporal and/or spatial gaps, prohibiting the use of standard frame-to-frame data association, we employ features extracted over a sequence during one time interval as a vehicle fingerprint that is used to compute the likelihood that two or more sequence observations are from the same or different vehicles. Furthermore, since our domain is aerial video tracking, in order to deal with poor image quality and large resolution and quality variations, our approach employs robust alignment and match measures for different stages of vehicle matching. Most notably, we employ a heterogeneous collection of features such as lines, points, and regions in an integrated matching framework. Heterogeneous features are shown to be important. Line and point features provide accurate localization and are employed for robust alignment across disparate views. The challenges of change in pose, aspect, and appearance across two disparate observations are handled by combining a novel feature-based quasi-rigid alignment with flexible matching between two or more sequences. However, since lines and points are relatively sparse, they are not adequate to delineate the object and provide a comprehensive matching set that covers the complete object. Region features provide a high degree of coverage and are employed for continuous frames to provide a delineation of the vehicle region for subsequent generation of a match measure. Our approach reliably delineates objects by representing regions as robust blob features and matching multiple regions to multiple regions using Earth Mover's Distance (EMD). Extensive experimentation under a variety of real-world scenarios and over hundreds of thousands of Confirmatory Identification (CID) trials has demonstrated about 95 percent accuracy in vehicle reacquisition with both visible and Infrared (IR) imaging cameras.

  18. Solving standard traveling salesman problem and multiple traveling salesman problem by using branch-and-bound

    NASA Astrophysics Data System (ADS)

    Saad, Shakila; Wan Jaafar, Wan Nurhadani; Jamil, Siti Jasmida

    2013-04-01

    The standard Traveling Salesman Problem (TSP) is the classical single-salesman formulation, while the Multiple Traveling Salesman Problem (MTSP) is an extension of the TSP in which more than one salesman is involved. The objective of the MTSP is to find the least costly set of routes for the salesmen to visit each of a list of n cities exactly once and then return to the home city. A few methods can be used to solve the MTSP. The objective of this research is to implement an exact method, the Branch-and-Bound (B&B) algorithm. Briefly, the idea of the B&B algorithm is to start with the associated Assignment Problem (AP). A branching strategy, breadth-first search (BFS), is then applied to the TSP and MTSP. Instances with 11 city nodes are solved for both problems, and the solutions are presented.
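
    Branch-and-bound on a tiny symmetric instance can be sketched directly. The bound below is the simplest optimistic one (current cost plus each remaining city's cheapest outgoing edge), and the search is depth-first; the paper's AP-based bound and breadth-first branching would slot into the same skeleton.

    ```python
    # Compact branch-and-bound for a 4-city symmetric TSP (toy distance matrix).
    import math

    D = [[0, 29, 20, 21],
         [29, 0, 15, 17],
         [20, 15, 0, 28],
         [21, 17, 28, 0]]
    n, best = len(D), [math.inf, None]          # best = [cost, tour]

    def bound(cost, visited):
        # Optimistic completion: each unvisited city leaves by its cheapest edge.
        return cost + sum(min(D[i][j] for j in range(n) if j != i)
                          for i in range(n) if i not in visited)

    def branch(tour, cost):
        if len(tour) == n:                      # all cities visited: close the tour
            total = cost + D[tour[-1]][0]
            if total < best[0]:
                best[:] = [total, tour + [0]]
            return
        for city in range(n):
            if city in tour:
                continue
            c = cost + D[tour[-1]][city]
            if bound(c, set(tour) | {city}) < best[0]:   # prune hopeless branches
                branch(tour + [city], c)

    branch([0], 0)
    print(best)   # [73, [0, 2, 1, 3, 0]] for this instance
    ```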

  19. A Multiple Ant Colony Metahuristic for the Air Refueling Tanker Assignment Problem

    DTIC Science & Technology

    2002-03-01

    Problem: The tanker assignment problem can be modeled as a job shop scheduling problem (JSSP). The JSSP is made up of n jobs, composed of m ordered...points) to be processed on all the machines (tankers). The problem with using the JSSP is that the tanker assignment problem has multiple objectives... The JSSP will minimize the time it takes for all jobs, but this may take an inordinate number of tankers. Thus using the JSSP alone is not necessarily a good

  20. Prevalence, Patterns, and Predictors of Sleep Problems and Daytime Sleepiness in Young Adolescents with Attention-Deficit/Hyperactivity Disorder

    ERIC Educational Resources Information Center

    Langberg, Joshua M.; Molitor, Stephen J.; Oddo, Lauren E.; Eadeh, Hana-May; Dvorsky, Melissa R.; Becker, Stephen P.

    2017-01-01

    Objective: The primary objective of this study was to evaluate the prevalence of multiple types of sleep problems in young adolescents with ADHD. Method: 262 adolescents comprehensively diagnosed with ADHD and their caregivers completed well-validated measures of sleep problems and daytime sleepiness. Participants also completed measures related…

  1. System of systems design: Evaluating aircraft in a fleet context using reliability and non-deterministic approaches

    NASA Astrophysics Data System (ADS)

    Frommer, Joshua B.

    This work develops and implements a solution framework that allows for an integrated solution to a resource-allocation system-of-systems problem associated with designing vehicles for integration into an existing fleet, to extend that fleet's capability while improving efficiency. Typically, aircraft design focuses on a specific design mission, while a fleet perspective would provide a broader capability. Aspects of design for both the vehicles and missions may be, for simplicity, deterministic in nature or, in a model that reflects actual conditions, uncertain. Toward this end, the set of tasks or goals for the to-be-planned system-of-systems is modeled more accurately with non-deterministic values, and the designed platforms are evaluated using reliability analysis. The reliability, defined as the probability of a platform or set of platforms completing possible missions, contributes to the fitness of the overall system. The framework includes building surrogate models for metrics such as capability and cost, and incorporates the idea of reliability into the overall system-level design space. The concurrent design and allocation system-of-systems problem is a multi-objective mixed-integer nonlinear programming (MINLP) problem. This study considered two system-of-systems problems that seek to simultaneously design new aircraft and allocate these aircraft into a fleet to provide a desired capability. The Coast Guard's Integrated Deepwater System program inspired the first problem, which consists of a suite of search-and-find missions for aircraft based on descriptions from the National Search and Rescue Manual. The second represents suppression of enemy air defense (SEAD) operations similar to those carried out by the U.S. Air Force, proposed as part of the Department of Defense Network Centric Warfare structure, and depicted in MIL-STD-3013. The two problems seem similar, with long surveillance segments, but because of the complex nature of aircraft design, the analysis of a vehicle for high-speed attack combined with a long loiter period is considerably different from that for quick cruise to an area combined with a low-speed search. However, the framework developed to solve this class of system-of-systems problem handles both scenarios and leads to a general solution approach for this kind of problem. At the vehicle level of the problem, different technologies can have an impact at the fleet level. One such technology is morphing, the ability to change shape, an ideal candidate for missions with dissimilar segments, such as the two above. A framework, using surrogate models based on optimally sized aircraft and using probabilistic parameters to define a concept of operations, is investigated; this has provided insight into the setup of the optimization problem, the use of the reliability metric, and the measurement of fleet-level impacts of morphing aircraft. The research consisted of four phases. The two initial phases built and defined the framework to solve the system-of-systems problem; these investigations used the search-and-find scenario as the example application. The first phase included the design of fixed-geometry and morphing aircraft for a range of missions and evaluated the aircraft capability using non-deterministic mission parameters. The second phase introduced the idea of multiple aircraft in a fleet, but only considered a fleet consisting of one aircraft type. The third phase incorporated the simultaneous design of a new vehicle and allocation into a fleet for the search-and-find scenario; in this phase, multiple types of aircraft are considered. The fourth phase repeated the simultaneous new-aircraft design and fleet allocation for the SEAD scenario to show that the approach is not specific to the search-and-find scenario. The framework presented in this work appears to be a viable approach for concurrently designing and allocating constituents in a system, specifically aircraft in a fleet. The research also shows that the impact of new technology can be assessed at the fleet level using conceptual design principles.
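
    As a minimal illustration of the reliability metric used in this record (not the dissertation's actual surrogate models), the sketch below estimates mission-completion reliability by Monte Carlo sampling of non-deterministic mission requirements; the capability limits and requirement distributions are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        # Assumed aircraft capability (surrogate-model outputs in a real study).
        RANGE_CAP_NMI = 900.0     # maximum mission range, nmi
        LOITER_CAP_HR = 6.0       # maximum loiter endurance, hr

        # Non-deterministic mission requirements (assumed distributions).
        n = 100_000
        req_range = rng.normal(750.0, 120.0, n)    # required range, nmi
        req_loiter = rng.gamma(4.0, 1.2, n)        # required loiter time, hr

        # Reliability = probability the platform can complete a sampled mission.
        reliability = np.mean((req_range <= RANGE_CAP_NMI)
                              & (req_loiter <= LOITER_CAP_HR))
        print(f"estimated mission reliability: {reliability:.3f}")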

  2. Inventory of Motive of Preference for Conventional Paper-and-Pencil Tests: A Study of Validity and Reliability

    ERIC Educational Resources Information Center

    Eser, Mehmet Taha; Dogan, Nuri

    2017-01-01

    Purpose: The objective of this study is to develop the Inventory of Motive of Preference for Conventional Paper-And-Pencil Tests and to evaluate students' motives for preferring written tests, short-answer tests, true/false tests or multiple-choice tests. This will add a measurement tool to the literature with valid and reliable results to help…

  3. Optimal Resource Allocation for NOMA-TDMA Scheme with α-Fairness in Industrial Internet of Things.

    PubMed

    Sun, Yanjing; Guo, Yiyu; Li, Song; Wu, Dapeng; Wang, Bin

    2018-05-15

    In this paper, a joint non-orthogonal multiple access and time division multiple access (NOMA-TDMA) scheme is proposed for the Industrial Internet of Things (IIoT), which allows multiple sensors to transmit in the same time-frequency resource block using NOMA. The user scheduling, time slot allocation, and power control are jointly optimized in order to maximize the system α-fair utility under a transmit power constraint and a minimum rate constraint. The optimization problem is nonconvex because of the fractional objective function and the nonconvex constraints. To deal with the original problem, we first convert the objective function into a difference-of-convex-functions (D.C.) form, and then propose a NOMA-TDMA-DC algorithm to find the global optimum. Numerical results show that the NOMA-TDMA scheme significantly outperforms the traditional orthogonal multiple access scheme in terms of both spectral efficiency and user fairness.
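
    The α-fair utility maximized above has a standard closed form; the sketch below evaluates it for a few values of α (the per-user rates are assumed numbers, not outputs of the paper's system model). α = 0 recovers sum-rate maximization, α = 1 proportional fairness, and large α approaches max-min fairness.

        import numpy as np

        def alpha_fair_utility(rates, alpha):
            """Sum of alpha-fair utilities over users: log(x) when alpha == 1,
            x**(1 - alpha) / (1 - alpha) otherwise (alpha >= 0, rates > 0)."""
            rates = np.asarray(rates, dtype=float)
            if np.isclose(alpha, 1.0):
                return float(np.sum(np.log(rates)))
            return float(np.sum(rates ** (1.0 - alpha) / (1.0 - alpha)))

        rates = [1.2, 0.8, 2.5]        # per-user rates (assumed values)
        for a in (0.0, 1.0, 2.0):
            print(f"alpha = {a}: utility = {alpha_fair_utility(rates, a):.3f}")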

  4. Network Design for Reliability and Resilience to Attack

    DTIC Science & Technology

    2014-03-01

    Excerpts from the DTIC record's glossary and indexed text: n: the attacker can destroy n arcs in the network; SPNI: Shortest-Path Network-Interdiction problem; TSP: Traveling Salesman Problem; UB: upper bound; UKR: Ukraine. The indexed fragments mention "…elimination from the traveling salesman problem (TSP)", note that the literature calls a walk that does not contain a cycle a path [19], treat arc lengths as random variables with known probability distributions, and state that the m-median problem seeks to design a network with minimum average travel cost.

  5. The Initial Development of Object Knowledge by a Learning Robot

    PubMed Central

    Modayil, Joseph; Kuipers, Benjamin

    2008-01-01

    We describe how a robot can develop knowledge of the objects in its environment directly from unsupervised sensorimotor experience. The object knowledge consists of multiple integrated representations: trackers that form spatio-temporal clusters of sensory experience, percepts that represent properties for the tracked objects, classes that support efficient generalization from past experience, and actions that reliably change object percepts. We evaluate how well this intrinsically acquired object knowledge can be used to solve externally specified tasks including object recognition and achieving goals that require both planning and continuous control. PMID:19953188

  6. U.S. Geological Survey Quality-Assurance Project for Sediment Analysis

    USGS Publications Warehouse

    Gordon, John D.; Newland, Carla

    2000-01-01

    Introduction Sediment is derived primarily from natural weathering of rock and is an assemblage of individual mineral grains that are then deposited by some physical agent, such as water, wind, ice, or gravity (Fetter, 1988). The U.S. Geological Survey (USGS) samples sediments and collects data on the amount of sediment in selected waterways. The most pressing sediment-related problems are associated with environmental questions, such as the transport and fate of attached pollutants, effects of sediment on aquatic biota and their habitats, and effects on sediment transport from land-use changes. Current (2000) sediment issues require that sediment studies address multiple objectives in water-resources management (Koltun and others, 1997). To support sediment research, the USGS operates laboratories for the analysis of the physical characteristics of sediment. Sediment laboratories producing data for the USGS have two principal functions: (1) the determination of suspended-sediment concentration in samples and (2) the determination of sand/fine separations. The reliability of these determinations and the usefulness of the data are dependent on the accuracy and reliability of the laboratory analyses (Guy, 1969).

  7. A computational framework for prime implicants identification in noncoherent dynamic systems.

    PubMed

    Di Maio, Francesco; Baronchelli, Samuele; Zio, Enrico

    2015-01-01

    Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. For this, the system dynamics is here described by a time-dependent model that includes the dependencies with the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are i) accounting for discrete stochastic transition events and ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple-valued logic (MVL) to consider stochastic transitions at discretized times. Then, PIs are originally identified by a differential evolution (DE) algorithm that looks for the optimal MVL solution of a covering problem formulated for MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases. © 2014 Society for Risk Analysis.
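
    To make the covering formulation concrete, here is a toy sketch in the spirit of the framework above: a differential evolution search (SciPy's general-purpose implementation, not the authors' specific DE variant) over rounded MVL literals that must cover the failure scenarios while excluding the safe ones. The scenario tables, coverage rule, and penalty weights are invented assumptions.

        import numpy as np
        from scipy.optimize import differential_evolution

        # Toy MVL scenarios over five components: 0 = no failure, 1-3 = the
        # discretized failure time class. `failed` rows lead to system failure.
        failed = np.array([[1, 0, 2, 0, 3],
                           [1, 0, 2, 1, 3],
                           [1, 1, 2, 0, 3]])
        safe = np.array([[0, 0, 2, 0, 3],
                         [1, 0, 0, 0, 3],
                         [2, 1, 2, 0, 1]])

        def covers(lit, rows):
            """A row is covered if it matches every specified literal
            (-1 means 'don't care')."""
            care = lit >= 0
            return np.all(rows[:, care] == lit[care], axis=1)

        def cost(x):
            """Penalize failure scenarios left uncovered and safe scenarios
            wrongly covered; a small term rewards shorter implicants."""
            lit = np.round(x).astype(int)
            return (np.sum(~covers(lit, failed)) + np.sum(covers(lit, safe))
                    + 0.01 * np.sum(lit >= 0))

        bounds = [(-1.49, 3.49)] * 5   # rounds to literals in {-1, 0, 1, 2, 3}
        res = differential_evolution(cost, bounds, seed=0)
        print("implicant literals (-1 = don't care):",
              np.round(res.x).astype(int))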

  8. Confounding Problems in Multifactor AOV When Using Several Organismic Variables of Limited Reliability

    ERIC Educational Resources Information Center

    Games, Paul A.

    1975-01-01

    A brief introduction is presented on how multiple regression and linear model techniques can handle data analysis situations that most educators and psychologists think of as appropriate for analysis of variance. (Author/BJG)

  9. A new compound control method for sine-on-random mixed vibration test

    NASA Astrophysics Data System (ADS)

    Zhang, Buyun; Wang, Ruochen; Zeng, Falin

    2017-09-01

    Vibration environmental testing (VET) is one of the important and effective methods of providing support for the strength design, reliability, and durability testing of mechanical products. A new separation control strategy is proposed for multiple-input multiple-output (MIMO) sine-on-random (SOR) mixed-mode vibration testing, an advanced and intensive type of VET. As the key element of the strategy, a correlation integral method is applied to separate the mixed signals, which include random and sinusoidal components. The feedback control formula of a MIMO linear random vibration system is systematically derived in the frequency domain, and a Jacobi control algorithm is proposed in view of elements such as the self-spectrum, coherence, and phase of the power spectral density (PSD) matrix. Because of the excessive correction of excitation in sine vibration testing, a compression factor is introduced to reduce the excitation correction, avoiding damage to the vibration table or other devices. The two methods are combined and applied in a MIMO SOR vibration test system. Finally, a verification test system with the vibration of a cantilever beam as the control object was established to verify the reliability and effectiveness of the proposed methods. The test results show that the exceedance values can be accurately controlled within the tolerance range of the references, and the method provides theoretical and practical support for mechanical engineering.

  10. Final Report: Resolving and Discriminating Overlapping Anomalies from Multiple Objects in Cluttered Environments

    DTIC Science & Technology

    2015-12-15

    Excerpts from the DTIC record (administrative personnel fields omitted): Distinguishing an object of interest from innocuous items is the main problem that the UXO community currently faces. This inverse problem demands fast and accurate representation of…

  11. Evaluation of the Validity and Reliability of the Waterlow Pressure Ulcer Risk Assessment Scale

    PubMed Central

    Charalambous, Charalambos; Koulori, Agoritsa; Vasilopoulos, Aristidis; Roupa, Zoe

    2018-01-01

    Introduction: Prevention is the ideal strategy to tackle the problem of pressure ulcers. Pressure ulcer risk assessment scales are one of the most pivotal measures applied to tackle the problem, but much criticism has been raised regarding the validity and reliability of these scales. Objective: To investigate the validity and reliability of the Waterlow pressure ulcer risk assessment scale. Method: The methodology used is a narrative literature review; the bibliography was reviewed through Cinahl, Pubmed, EBSCO, Medline and Google Scholar, and 26 scientific articles were identified. The articles were chosen due to their direct correlation with the objective under study and their scientific relevance. Results: The construct and face validity of the Waterlow appear adequate, but with regard to content validity, changes in the age and gender categories could be beneficial. The concurrent validity cannot be assessed. The predictive validity of the Waterlow is characterized by high specificity and low sensitivity. The inter-rater reliability has been demonstrated to be inadequate; this may be due to a lack of clear definitions within the categories and differing levels of knowledge among users. Conclusion: Due to the limitations presented regarding the validity and reliability of the Waterlow pressure ulcer risk assessment scale, the scale should be used in conjunction with clinical assessment to provide optimum results. PMID:29736104

  12. LED induced autofluorescence (LIAF) imager with eight multi-filters for oral cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Huang, Ting-Wei; Cheng, Nai-Lun; Tsai, Ming-Hsui; Chiou, Jin-Chern; Mang, Ou-Yang

    2016-03-01

    Oral cancer is a serious and growing problem in many developing and developed countries. Simple oral visual screening by a clinician can prevent 37,000 oral cancer deaths annually worldwide. However, conventional oral examination, with visual inspection and palpation of oral lesions, is not an objective and reliable approach for oral cancer diagnosis; it may delay hospital treatment for oral cancer patients or allow the cancer to progress out of control to a late stage. Therefore, a device for oral cancer detection was developed for early diagnosis and treatment. A portable LED-induced autofluorescence (LIAF) imager was developed by our group. It contains multiple wavelengths of LED excitation light and a rotary filter ring of eight channels to capture ex-vivo oral tissue autofluorescence images. The advantages of the LIAF imager compared to other devices for oral cancer diagnosis are that it has an L-shaped probe for fixing the object distance, blocking the effect of ambient light, and observing the blind spots deep in the mouth between the gums (gingiva) and the lining of the mouth. Besides, the multiple LED excitation wavelengths can induce multiple autofluorescence signals, and the rotary filter ring of eight channels can capture the spectral images of multiple narrow bands. The prototype of the portable LIAF imager has been applied in clinical trials for some cases in Taiwan, and the clinical-trial images under specific excitation show significant differences between normal and abnormal oral tissue in these cases.

  13. Evaluation of Genetic Algorithm Concepts using Model Problems. Part 1; Single-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

    A genetic-algorithm-based optimization approach is described and evaluated using a simple hill-climbing model problem. The model problem utilized herein allows for the broad specification of a large number of search spaces, including spaces with an arbitrary number of genes or decision variables and an arbitrary number of hills or modes. In the present study, only single-objective problems are considered. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all problems attempted. The most difficult problems - those with large hyper-volumes and multi-mode search spaces containing a large number of genes - require a large number of function evaluations for GA convergence, but they always converge.
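
    A minimal sketch of the kind of experiment described above; the Gaussian-hill landscape, population size, and operator settings are assumptions, not the report's actual model problem.

        import numpy as np

        rng = np.random.default_rng(0)

        def fitness(x):
            """Multi-modal landscape: superposition of Gaussian hills (toy)."""
            centers = np.array([[0.2, 0.8], [0.7, 0.3], [0.5, 0.5]])
            heights = np.array([0.8, 0.9, 1.0])
            d2 = ((x - centers) ** 2).sum(axis=1)
            return np.max(heights * np.exp(-d2 / 0.01))

        pop = rng.random((40, 2))          # 40 individuals, 2 genes in [0, 1]
        for gen in range(100):
            fit = np.array([fitness(ind) for ind in pop])
            # Fitness-proportional selection, uniform crossover, mutation.
            parents = pop[rng.choice(40, size=40, p=fit / fit.sum())]
            cross = rng.random((40, 2)) < 0.5
            children = np.where(cross, parents, parents[::-1])
            children += rng.normal(0.0, 0.02, children.shape)
            pop = np.clip(children, 0.0, 1.0)

        best = pop[np.argmax([fitness(ind) for ind in pop])]
        print("best individual:", best, "fitness:", fitness(best))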

  14. Online two-stage association method for robust multiple people tracking

    NASA Astrophysics Data System (ADS)

    Lv, Jingqin; Fang, Jiangxiong; Yang, Jie

    2011-07-01

    Robust multiple people tracking is very important for many applications. It is a challenging problem due to occlusion and interaction in crowded scenarios. This paper proposes an online two-stage association method for robust multiple people tracking. In the first stage, short tracklets generated by linking people-detection responses are grown longer by particle-filter-based tracking, with detection confidence embedded into the observation model, and an examination scheme runs at each frame to check the reliability of tracking. In the second stage, multiple people tracking is achieved by linking tracklets to generate trajectories. An online tracklet association method is proposed to solve the linking problem, which allows applications in time-critical scenarios. The method is evaluated on the popular CAVIAR dataset. The experimental results show that our two-stage method is robust.

  15. Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology

    NASA Astrophysics Data System (ADS)

    Morgan, T. W.; Thurgood, R. L.

    1984-05-01

    This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.

  16. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  17. Cost effective simulation-based multiobjective optimization in the performance of an internal combustion engine

    NASA Astrophysics Data System (ADS)

    Aittokoski, Timo; Miettinen, Kaisa

    2008-07-01

    Solving real-life engineering problems can be difficult because they often have multiple conflicting objectives, the objective functions involved are highly nonlinear and they contain multiple local minima. Furthermore, function values are often produced via a time-consuming simulation process. These facts suggest the need for an automated optimization tool that is efficient (in terms of number of objective function evaluations) and capable of solving global and multiobjective optimization problems. In this article, the requirements on a general simulation-based optimization system are discussed and such a system is applied to optimize the performance of a two-stroke combustion engine. In the example of a simulation-based optimization problem, the dimensions and shape of the exhaust pipe of a two-stroke engine are altered, and values of three conflicting objective functions are optimized. These values are derived from power output characteristics of the engine. The optimization approach involves interactive multiobjective optimization and provides a convenient tool to balance between conflicting objectives and to find good solutions.

  18. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  19. A guide to multi-objective optimization for ecological problems with an application to cackling goose management

    USGS Publications Warehouse

    Williams, Perry J.; Kendall, William L.

    2017-01-01

    Choices in ecological research and management are the result of balancing multiple, often competing, objectives. Multi-objective optimization (MOO) is a formal decision-theoretic framework for solving multiple objective problems. MOO is used extensively in other fields including engineering, economics, and operations research. However, its application for solving ecological problems has been sparse, perhaps due to a lack of widespread understanding. Thus, our objective was to provide an accessible primer on MOO, including a review of methods common in other fields, a review of their application in ecology, and a demonstration on an applied resource management problem. A large class of methods for solving MOO problems can be separated into two strategies: modelling preferences pre-optimization (the a priori strategy), or modelling preferences post-optimization (the a posteriori strategy). The a priori strategy requires describing preferences among objectives without knowledge of how preferences affect the resulting decision. In the a posteriori strategy, the decision maker simultaneously considers a set of solutions (the Pareto optimal set) and makes a choice based on the trade-offs observed in the set. We describe several methods for modelling preferences pre-optimization, including: the bounded objective function method, the lexicographic method, and the weighted-sum method. We discuss modelling preferences post-optimization through examination of the Pareto optimal set. We applied each MOO strategy to the natural resource management problem of selecting a population target for cackling goose (Branta hutchinsii minima) abundance. Cackling geese provide food security to Native Alaskan subsistence hunters in the goose's nesting area, but depredate crops on private agricultural fields in wintering areas. We developed objective functions to represent the competing objectives related to the cackling goose population target and identified an optimal solution first using the a priori strategy, and then by examining trade-offs in the Pareto set using the a posteriori strategy. We used four approaches for selecting a final solution within the a posteriori strategy: the most common optimal solution; the most robust optimal solution; and two solutions based on maximizing a restricted portion of the Pareto set. We discuss MOO with respect to natural resource management, but MOO is sufficiently general to cover any ecological problem that contains multiple competing objectives that can be quantified using objective functions.
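
    A compact sketch of the two strategies on a toy biobjective problem (not the cackling goose model; the objective functions and weights are illustrative): the a priori weighted-sum method fixes preferences first, while the a posteriori strategy sweeps weights to trace the Pareto set and inspects the trade-offs afterward.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def f1(x):                 # objective 1 (minimize)
            return (x - 1.0) ** 2

        def f2(x):                 # objective 2 (conflicting optimum)
            return (x - 3.0) ** 2

        # A priori: pick the weight first, then optimize the scalarization.
        w = 0.7
        res = minimize_scalar(lambda x: w * f1(x) + (1 - w) * f2(x))
        print("a priori solution for w = 0.7:", res.x)

        # A posteriori: sweep weights to trace the Pareto set, then inspect.
        for w in np.linspace(0.0, 1.0, 11):
            x = minimize_scalar(lambda x: w * f1(x) + (1 - w) * f2(x)).x
            print(f"w={w:.1f}  x={x:.2f}  f1={f1(x):.2f}  f2={f2(x):.2f}")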

  20. Efficient integration of spectral features for vehicle tracking utilizing an adaptive sensor

    NASA Astrophysics Data System (ADS)

    Uzkent, Burak; Hoffman, Matthew J.; Vodacek, Anthony

    2015-03-01

    Object tracking in urban environments is an important and challenging problem that is traditionally tackled using visible and near-infrared wavelengths. By incorporating extended data, such as spectral features of the objects, one can improve the reliability of the identification process. However, the huge increase in data created by hyperspectral imaging is usually prohibitive. To overcome the complexity problem, we propose a persistent air-to-ground target tracking system inspired by a state-of-the-art, adaptive, multi-modal sensor. The adaptive sensor is capable of providing panchromatic images as well as the spectra of desired pixels. This addresses the data challenge of hyperspectral tracking by recording spectral data only as needed. Spectral likelihoods are integrated into a data association algorithm in a Bayesian fashion to minimize the likelihood of misidentification. A framework for controlling spectral data collection is developed by incorporating motion segmentation information and prior information from Gaussian sum filter (GSF) movement predictions based on a multi-model forecasting set. An intersection mask of the surveillance area is extracted from the OpenStreetMap source and incorporated into the tracking algorithm to perform online refinement of the multiple-model set. The proposed system is tested using challenging and realistic scenarios generated in an adverse environment.

  1. Object classification for obstacle avoidance

    NASA Astrophysics Data System (ADS)

    Regensburger, Uwe; Graefe, Volker

    1991-03-01

    Object recognition is necessary for any mobile robot operating autonomously in the real world. This paper discusses an object classifier based on a 2-D object model. Obstacle candidates are tracked and analyzed; false alarms generated by the object detector are recognized and rejected. The methods have been implemented on a multi-processor system and tested in real-world experiments. They work reliably under favorable conditions, but problems sometimes occur, e.g., when objects contain many features (edges) or move in front of a structured background.

  2. Some Reliability Problems in a Criterion-Referenced Test.

    ERIC Educational Resources Information Center

    Roudabush, Glenn E.; Green, Donald Ross

    This paper describes the development of a criterion-referenced test. The Prescriptive Mathematics Inventory (PMI) was developed to measure 400 stated behavioral objectives. The test consists of three overlapping levels with the objectives chosen to cover 90 to 95 per cent of the mathematics curriculum nominally taught in grades 4 through 8. Each…

  3. Electromagnetic interference-aware transmission scheduling and power control for dynamic wireless access in hospital environments.

    PubMed

    Phunchongharn, Phond; Hossain, Ekram; Camorlinga, Sergio

    2011-11-01

    We study the multiple access problem for e-Health applications (referred to as secondary users) coexisting with medical devices (referred to as primary or protected users) in a hospital environment. In particular, we focus on transmission scheduling and power control of secondary users in multiple spatial reuse time-division multiple access (STDMA) networks. The objective is to maximize the spectrum utilization of secondary users and minimize their power consumption, subject to the electromagnetic interference (EMI) constraints for active and passive medical devices and a minimum throughput guarantee for secondary users. The multiple access problem is formulated as a dual-objective optimization problem, which is shown to be NP-complete. We propose a joint scheduling and power control algorithm based on a greedy approach to solve the problem with much lower computational complexity. To this end, an enhanced greedy algorithm is proposed to improve the performance of the greedy algorithm by finding the optimal sequence of secondary users for scheduling. Using extensive simulations, the tradeoff in performance in terms of spectrum utilization, energy consumption, and computational complexity is evaluated for both algorithms.

  4. Asymptotically reliable transport of multimedia/graphics over wireless channels

    NASA Astrophysics Data System (ADS)

    Han, Richard Y.; Messerschmitt, David G.

    1996-03-01

    We propose a multiple-delivery transport service tailored for graphics and video transported over connections with wireless access. This service operates at the interface between the transport and application layers, balancing the subjective delay and image quality objectives of the application with the low reliability and limited bandwidth of the wireless link. While techniques like forward error correction, interleaving, and retransmission improve reliability over wireless links, they also increase latency substantially when bandwidth is limited. Certain forms of interactive multimedia datatypes can benefit from an initial delivery of a corrupt packet to lower the perceptual latency, as long as reliable delivery occurs eventually. Multiple delivery of successively refined versions of the received packet, terminating when a sufficiently reliable version arrives, exploits the redundancy inherently required to improve reliability without a traffic penalty. Modifications to automatic repeat request (ARQ) methods to implement this transport service are proposed, which we term `leaky ARQ'. For the specific case of pixel-coded window-based text/graphics, we describe additional functions needed to more effectively support urgent delivery and asymptotic reliability. X server emulation suggests that users will accept a multi-second delay between a (possibly corrupt) packet and the ultimate reliably-delivered version. The relaxed delay for reliable delivery can be exploited to improve traffic capacity by scheduling retransmissions.

  5. Differential evolution-simulated annealing for multiple sequence alignment

    NASA Astrophysics Data System (ADS)

    Addawe, R. C.; Addawe, J. M.; Sueño, M. R. K.; Magadia, J. C.

    2017-10-01

    Multiple sequence alignments (MSAs) are used in the analysis of molecular evolution and sequence-structure relationships. In this paper, a hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is applied to optimizing MSAs based on structural information, non-gap percentage, and totally conserved columns. DESA is a robust algorithm characterized by self-organization, mutation, crossover, and an SA-like selection scheme for the strategy parameters. Here, the MSA problem is treated as a multi-objective optimization problem and solved with the hybrid evolutionary algorithm; we therefore name the algorithm DESA-MSA. Simulated sequences and alignments were generated to evaluate the accuracy and efficiency of DESA-MSA using different indel sizes, sequence lengths, deletion rates, and insertion rates. The proposed hybrid algorithm obtained acceptable solutions, particularly for the MSA problem evaluated on the three objectives.

  6. A leader-follower-interactive method for regional water resources management with considering multiple water demands and eco-environmental constraints

    NASA Astrophysics Data System (ADS)

    Chen, Yizhong; Lu, Hongwei; Li, Jing; Ren, Lixia; He, Li

    2017-05-01

    This study presents the mathematical formulation and implementation of a synergistic optimization framework based on an understanding of water availability and reliability together with the characteristics of multiple water demands. This framework simultaneously integrates a set of leader-follower-interactive objectives established by different decision makers during the synergistic optimization. The upper-level model (the leader's) determines the optimal pollutant discharge to satisfy the environmental target. The lower-level model (the follower's) accepts the dispatch requirement from the upper-level one and determines the optimal water-allocation strategy to maximize the economic benefits of the regional authority. The bi-level model significantly improves upon conventional programming methods through the mutual influence and restriction between the upper- and lower-level decision processes, particularly when limited water resources are available for multiple competing users. To solve the problem, a bi-level interactive solution algorithm based on satisfactory degree is introduced into the decision-making process to measure the extent to which the constraints are met and the objective reaches its optimum. The capabilities of the proposed model are illustrated through a real-world case study of the water resources management system in the district of Fengtai, Beijing, China. Feasible decisions concerning water resources allocation, wastewater emission, and pollutant discharge are sequentially generated to balance the objectives subject to the given water-related constraints, enabling stakeholders to grasp the inherent conflicts and trade-offs between environmental and economic interests. The performance of the developed bi-level model is assessed by comparison with single-level models. Moreover, in consideration of the uncertainty in water demand and availability, sensitivity analysis and policy analysis are employed to identify their impacts on the final decisions and improve practical applications.

  7. A new approach to power quality and electricity reliability monitoring-case study illustrations of the capabilities of the I-GridTM system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Divan, Deepak; Brumsickle, William; Eto, Joseph

    2003-04-01

    This report describes a new approach for collecting information on power quality and reliability and making it available in the public domain. Making this information readily available in a form that is meaningful to electricity consumers is necessary for enabling more informed private and public decisions regarding electricity reliability. The system dramatically reduces the cost (and expertise) needed for customers to obtain information on the most significant power quality events, called voltage sags and interruptions. The system also offers widespread access to information on power quality collected from multiple sites and the potential for capturing information on the impacts of power quality problems, together enabling a wide variety of analysis and benchmarking to improve system reliability. Six case studies demonstrate selected functionality and capabilities of the system, including: linking measured power quality events to process interruption and downtime; demonstrating the ability to correlate events recorded by multiple monitors to narrow and confirm the causes of power quality events; and benchmarking power quality and reliability on a firm and regional basis.

  8. A Study on Software-based Sensing Technology for Multiple Object Control in AR Video

    PubMed Central

    Jung, Sungmo; Song, Jae-gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo

    2010-01-01

    Research on Augmented Reality (AR) has recently received attention. With this, the Machine-to-Machine (M2M) market has started to become active, and there are numerous efforts to apply it to real life in all sectors of society. To date, the M2M market has applied existing marker-based AR technology in entertainment, business, and other industries. With existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. This creates a problem: the relevant marker should be extracted and printed on screen so that loading of multiple objects is enabled. However, since the distance between markers is not measured in the process of detecting and copying markers, the markers can overlap and thus the objects would not be augmented. To solve this problem, a circle having the longest radius needs to be created from a focal point of a marker to be copied, so that no object is copied within the confines of the circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed, and overlapping marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms. PMID:22163444

  9. A study on software-based sensing technology for multiple object control in AR video.

    PubMed

    Jung, Sungmo; Song, Jae-Gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo

    2010-01-01

    Research on Augmented Reality (AR) has recently received attention. With this, the Machine-to-Machine (M2M) market has started to become active, and there are numerous efforts to apply it to real life in all sectors of society. To date, the M2M market has applied existing marker-based AR technology in entertainment, business, and other industries. With existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. This creates a problem: the relevant marker should be extracted and printed on screen so that loading of multiple objects is enabled. However, since the distance between markers is not measured in the process of detecting and copying markers, the markers can overlap and thus the objects would not be augmented. To solve this problem, a circle having the longest radius needs to be created from a focal point of a marker to be copied, so that no object is copied within the confines of the circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed, and overlapping marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms.

  10. Multi-objective optimisation and decision-making of space station logistics strategies

    NASA Astrophysics Data System (ADS)

    Zhu, Yue-he; Luo, Ya-zhong

    2016-10-01

    Space station logistics strategy optimisation is a complex engineering problem with multiple objectives. Finding a decision-maker-preferred compromise solution becomes more significant when solving such a problem. However, the designer-preferred solution is not easy to determine using the traditional method. Thus, a hybrid approach that combines the multi-objective evolutionary algorithm, physical programming, and differential evolution (DE) algorithm is proposed to deal with the optimisation and decision-making of space station logistics strategies. A multi-objective evolutionary algorithm is used to acquire a Pareto frontier and help determine the range parameters of the physical programming. Physical programming is employed to convert the four-objective problem into a single-objective problem, and a DE algorithm is applied to solve the resulting physical programming-based optimisation problem. Five kinds of objective preference are simulated and compared. The simulation results indicate that the proposed approach can produce good compromise solutions corresponding to different decision-makers' preferences.

  11. Dynamic cellular manufacturing system considering machine failure and workload balance

    NASA Astrophysics Data System (ADS)

    Rabbani, Masoud; Farrokhi-Asl, Hamed; Ravanbakhsh, Mohammad

    2018-02-01

    Machines are a key element in the production system, and their failure causes irreparable effects in terms of cost and time. In this paper, a new multi-objective mathematical model for a dynamic cellular manufacturing system (DCMS) is provided with consideration of machine reliability and alternative process routes. In this dynamic model, we attempt to resolve the problem of integrated family (part/machine cell) formation as well as the assignment of operators to the cells. The first objective minimizes the costs associated with the DCMS. The second objective optimizes labor utilization and, finally, a minimum value of the variance of workload between different cells is obtained by the third objective function. Due to the NP-hard nature of the cellular manufacturing problem, the model is initially validated with the GAMS software on small-sized problems, and then solved by two well-known meta-heuristic methods, non-dominated sorting genetic algorithm and multi-objective particle swarm optimization, on large-scale problems. Finally, the results of the two algorithms are compared with respect to five different comparison metrics.

  12. Solving multi-objective optimization problems in conservation with the reference point method

    PubMed Central

    Dujardin, Yann; Chadès, Iadine

    2018-01-01

    Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
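
    A minimal sketch of the reference point idea on a toy biobjective problem (not the paper's conservation models): minimize an augmented weighted-Chebyshev achievement scalarizing function that measures distance from the decision-maker's reference point. The objectives, reference point, weights, and augmentation coefficient rho below are assumptions.

        import numpy as np
        from scipy.optimize import minimize

        def objectives(x):
            """Toy conflicting objectives to be minimized."""
            return np.array([(x[0] - 1.0) ** 2 + x[1] ** 2,
                             x[0] ** 2 + (x[1] - 1.0) ** 2])

        def achievement(x, ref, weights, rho=1e-4):
            """Weighted Chebyshev distance to the reference point, plus a small
            augmentation term to avoid weakly Pareto-optimal solutions."""
            f = objectives(x)
            return np.max(weights * (f - ref)) + rho * np.sum(f - ref)

        ref = np.array([0.2, 0.3])     # decision-maker's aspiration levels
        w = np.array([1.0, 1.0])
        res = minimize(achievement, x0=[0.5, 0.5], args=(ref, w),
                       method='Nelder-Mead')
        print("solution:", res.x, "objectives:", objectives(res.x))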

  13. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.
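
    A minimal sketch of the global strategy described above, assuming a toy closed-form stand-in for the expensive buckling analysis and a random design of experiments over two of the variables; the fitted second-order surface then replaces the analysis inside the optimization loop.

        import numpy as np

        rng = np.random.default_rng(2)

        def buckling_load(t, d):
            """Stand-in for the expensive analysis (toy closed form)."""
            return 3.0 * t ** 2 * d + 0.5 * d ** 2 - 0.1 * t * d

        # Design of experiments over thickness t and diameter d (assumed ranges).
        t = rng.uniform(0.5, 2.0, 50)
        d = rng.uniform(1.0, 4.0, 50)
        y = buckling_load(t, d)

        # Second-order polynomial basis: 1, t, d, t^2, d^2, t*d.
        X = np.column_stack([np.ones_like(t), t, d, t ** 2, d ** 2, t * d])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        t0, d0 = 1.2, 2.5
        approx = np.array([1, t0, d0, t0 ** 2, d0 ** 2, t0 * d0]) @ coef
        print("RS estimate:", approx, "true:", buckling_load(t0, d0))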

  14. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    NASA Astrophysics Data System (ADS)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.

  15. The 5K70SK automatically tuned, high power, S-band klystron

    NASA Technical Reports Server (NTRS)

    Goldfinger, A.

    1977-01-01

    Primary objectives include delivery of 44 5K70SK klystron amplifier tubes and 26 remote tuner assemblies with spare parts kits. Results of a reliability demonstration on a klystron test cavity are discussed, along with reliability tests performed on a remote tuning unit. Production problems and one design modification are reported and discussed. Results of PAT and DVT are included.

  16. Pre-Proposal Assessment of Reliability for Spacecraft Docking with Limited Information

    NASA Technical Reports Server (NTRS)

    Brall, Aron

    2013-01-01

    This paper addresses the problem of estimating the reliability of a critical system function as well as its impact on the system reliability when limited information is available. The approach addresses the basic function reliability, and then the impact of multiple attempts to accomplish the function. The dependence of subsequent attempts on prior failure to accomplish the function is also addressed. The autonomous docking of two spacecraft was the specific example that generated the inquiry, and the resultant impact on total reliability generated substantial interest in presenting the results due to the relative insensitivity of overall performance to basic function reliability and moderate degradation given sufficient attempts to try and accomplish the required goal. The application of the methodology allows proper emphasis on the characteristics that can be estimated with some knowledge, and to insulate the integrity of the design from those characteristics that can't be properly estimated with any rational value of uncertainty. The nature of NASA's missions contains a great deal of uncertainty due to the pursuit of new science or operations. This approach can be applied to any function where multiple attempts at success, with or without degradation, are allowed.
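
    The core idea admits a very small worked sketch: overall success probability under multiple attempts, where each attempt after a failure succeeds with a probability degraded by a constant factor. The per-attempt reliability and degradation factor below are assumed numbers, not mission values; note how the overall reliability becomes relatively insensitive to the basic value once several attempts are allowed.

        def mission_reliability(p_first, n_attempts, degradation=0.9):
            """P(success within n attempts), where each attempt after a
            failure succeeds with probability degraded by a constant factor."""
            p_fail_all = 1.0
            p = p_first
            for _ in range(n_attempts):
                p_fail_all *= (1.0 - p)
                p *= degradation   # dependence of later attempts on failure
            return 1.0 - p_fail_all

        for n in (1, 2, 3, 5):
            print(n, "attempts:", round(mission_reliability(0.8, n), 4))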

  17. MONSS: A multi-objective nonlinear simplex search approach

    NASA Astrophysics Data System (ADS)

    Zapotecas-Martínez, Saúl; Coello Coello, Carlos A.

    2016-01-01

    This article presents a novel methodology for dealing with continuous box-constrained multi-objective optimization problems (MOPs). The proposed algorithm adopts a nonlinear simplex search scheme in order to obtain multiple elements of the Pareto optimal set. The search is directed by a well-distributed set of weight vectors, each of which defines a scalarization problem that is solved by deforming a simplex according to the movements described by Nelder and Mead's method. Considering an MOP with n decision variables, the simplex is constructed using n+1 solutions which minimize different scalarization problems defined by n+1 neighbor weight vectors. All solutions found in the search are used to update a set of solutions considered to be the minima for each separate problem. In this way, the proposed algorithm collectively obtains multiple trade-offs among the different conflicting objectives, while maintaining a proper representation of the Pareto optimal front. In this article, it is shown that a well-designed strategy using just mathematical programming techniques can be competitive with respect to the state-of-the-art multi-objective evolutionary algorithms against which it was compared.
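
    A compact sketch of the scheme's flavor, with SciPy's general-purpose Nelder-Mead standing in for the article's tailored simplex search and a weighted-Chebyshev scalarization standing in for its scalarization scheme; the biobjective test problem and the weight spread are assumptions.

        import numpy as np
        from scipy.optimize import minimize

        def objs(x):
            """Toy biobjective problem with a convex Pareto front."""
            return np.array([x[0] ** 2 + x[1] ** 2,
                             (x[0] - 1.0) ** 2 + x[1] ** 2])

        front = []
        for lam in np.linspace(0.05, 0.95, 10):   # well-distributed weights
            w = np.array([lam, 1.0 - lam])
            res = minimize(lambda x: np.max(w * objs(x)),   # weighted Chebyshev
                           x0=[0.5, 0.5], method='Nelder-Mead')
            front.append(objs(res.x))

        for f in front:
            print(f"f1={f[0]:.3f}  f2={f[1]:.3f}")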

  18. The Reliability and Validity of the Dominic Interactive: A Computerized Child Report Instrument for Mental Health Problems

    ERIC Educational Resources Information Center

    Kuijpers, Rowella C. W. M.; Otten, Roy; Krol, Nicole P. C. M.; Vermulst, Ad A.; Engels, Rutger C. M. E.

    2013-01-01

    Background: Children and youths' self-report of mental health problems is considered essential but complicated. Objective: This study examines the psychometric properties of the Dominic Interactive, a computerized DSM-IV based self-report questionnaire and explores informant correspondence. Methods: The Dominic Interactive was administered to 214…

  19. Reliability of Fault Tolerant Control Systems. Part 2

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2000-01-01

    This paper reports Part II of a two-part effort intended to delineate the relationship between reliability and fault-tolerant control in a quantitative manner. Reliability properties peculiar to fault-tolerant control systems are emphasized, such as the presence of analytic redundancy in high proportion, the dependence of failures on control performance, and the high risks associated with decisions in redundancy management due to multiple sources of uncertainty and sometimes large processing requirements. As a consequence, coverage of failures through redundancy management can be severely limited. The paper proposes to formulate the fault-tolerant control problem as an optimization problem that maximizes coverage of failures through redundancy management. Coverage modeling is attempted in a way that captures its dependence on the control performance and on the diagnostic resolution. Under the proposed redundancy management policy, it is shown that enhanced overall system reliability can be achieved with a control law of superior robustness, with an estimator of higher resolution, and with a control performance requirement of lesser stringency.

  20. A parametric LQ approach to multiobjective control system design

    NASA Technical Reports Server (NTRS)

    Kyr, Douglas E.; Buchner, Marc

    1988-01-01

    The synthesis of a constant-parameter output feedback control law of constrained structure is set in a multiple-objective linear quadratic regulator (MOLQR) framework. The use of intuitive objective functions, such as model-following ability and closed-loop trajectory sensitivity, allows multiple-objective decision-making techniques, such as the surrogate worth tradeoff method, to be applied. For the continuous-time deterministic problem with an infinite time horizon, dynamic compensators as well as static output feedback controllers can be synthesized using a descent Anderson-Moore algorithm modified to impose linear equality constraints on the feedback gains by moving in feasible directions. Results of three different examples are presented, including a unique reformulation of the sensitivity reduction problem.

  1. Improving Predictions of Multiple Binary Models in ILP

    PubMed Central

    2014-01-01

    Despite the success of ILP systems in learning first-order rules from small numbers of examples and complexly structured data in various domains, they struggle in dealing with multiclass problems. In most cases they boil a multiclass problem down into multiple black-box binary problems following the one-versus-one or one-versus-rest binarisation techniques and learn a theory for each one. When evaluating the learned theories of multiple-class problems in the one-versus-rest paradigm particularly, there is a bias caused by the default rule toward the negative classes, leading to unrealistically high performance, besides the lack of prediction integrity between the theories. Here we discuss the problem of using the one-versus-rest binarisation technique when it comes to evaluating multiclass data and propose several methods to remedy this problem. We also illustrate the methods and highlight their link to binary trees and Formal Concept Analysis (FCA). Our methods allow learning of a simple, consistent, and reliable multiclass theory by combining the rules of the multiple one-versus-rest theories into one rule-list or rule-set theory. Empirical evaluation over a number of data sets shows that our proposed methods produce coherent and accurate rule models from the rules learned by the ILP system Aleph. PMID:24696657

  2. Intelligent composting assisted by a wireless sensing network.

    PubMed

    López, Marga; Martinez-Farre, Xavier; Casas, Oscar; Quilez, Marcos; Polo, Jose; Lopez, Oscar; Hornero, Gemma; Pinilla, Mirta R; Rovira, Carlos; Ramos, Pedro M; Borges, Beatriz; Marques, Hugo; Girão, Pedro Silva

    2014-04-01

    Monitoring the moisture and temperature of the composting process is a key factor in obtaining a quality product, beyond the quality of the raw materials. Current methodologies for monitoring these two parameters are time consuming for workers, sometimes not sufficiently reliable to support decision-making, and thus are ignored in some cases. This article describes an advance in the monitoring of the composting process through a Wireless Sensor Network (WSN), the Compo-ball system, which allows real-time measurement of temperature and moisture at multiple points in the composting material. To implement such on-line measurement capabilities, a WSN composed of multiple sensor nodes was designed and implemented to provide staff with an efficient composting-monitoring management tool. After framing the problem, the objectives and characteristics of the WSN are briefly discussed and a short description of the hardware and software of the network's components is presented. Presentation and discussion of practical issues and results obtained with the WSN during a demonstration stage that took place at several composting sites concludes the paper. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. An evaluation of exact methods for the multiple subset maximum cardinality selection problem.

    PubMed

    Brusco, Michael J; Köhn, Hans-Friedrich; Steinley, Douglas

    2016-05-01

    The maximum cardinality subset selection problem requires finding the largest possible subset from a set of objects, such that one or more conditions are satisfied. An important extension of this problem is to extract multiple subsets, where the addition of one more object to a larger subset would always be preferred to increases in the size of one or more smaller subsets. We refer to this as the multiple subset maximum cardinality selection problem (MSMCSP). A recently published branch-and-bound algorithm solves the MSMCSP as a partitioning problem. Unfortunately, the computational requirement associated with the algorithm is often enormous, thus rendering the method infeasible from a practical standpoint. In this paper, we present an alternative approach that successively solves a series of binary integer linear programs to obtain a globally optimal solution to the MSMCSP. Computational comparisons of the methods using published similarity data for 45 food items reveal that the proposed sequential method is computationally far more efficient than the branch-and-bound approach. © 2016 The British Psychological Society.
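
    One stage of the sequential idea above can be sketched as a binary integer linear program: choose the largest subset such that no two incompatible objects (e.g., pairs whose similarity falls below a threshold) are selected together. The pair data below are invented, and SciPy's milp solver (available in SciPy 1.9+) stands in for whatever solver the paper used.

        import numpy as np
        from scipy.optimize import milp, LinearConstraint, Bounds

        n = 8
        # Assumed incompatibility structure: pairs that may not appear in the
        # same subset (here they happen to form an 8-cycle).
        pairs = [(0, 1), (0, 2), (1, 3), (2, 4), (3, 5), (4, 6), (5, 7), (6, 7)]

        A = np.zeros((len(pairs), n))
        for k, (i, j) in enumerate(pairs):
            A[k, i] = A[k, j] = 1.0        # encodes x_i + x_j <= 1

        res = milp(c=-np.ones(n),          # milp minimizes, so negate to maximize
                   constraints=LinearConstraint(A, ub=np.ones(len(pairs))),
                   integrality=np.ones(n),
                   bounds=Bounds(0, 1))
        print("largest subset:", np.flatnonzero(res.x > 0.5))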

  4. Spatial resolution enhancement of satellite image data using fusion approach

    NASA Astrophysics Data System (ADS)

    Lestiana, H.; Sukristiyanti

    2018-02-01

    Object identification using remote sensing data is problematic when the spatial resolution does not match the object. The fusion approach is one method of solving this problem, improving object recognition and increasing object information by combining data from multiple sensors. Fusion images can be used to estimate environmental components that need to be monitored from multiple views, such as evapotranspiration estimation, 3D ground-based characterisation, smart-city applications, urban environments, terrestrial mapping, and water vegetation. With the fusion method, visible objects on land are easily recognized, and the variety of object information on land increases the variety of environmental components that can be estimated. The difficulty of recognizing invisible objects such as Submarine Groundwater Discharge (SGD), especially in tropical areas, may be reduced by the fusion method. The low variation of such objects in sea surface temperature remains a challenge to be solved.

  5. Rectangular Array Model Supporting Students' Spatial Structuring in Learning Multiplication

    ERIC Educational Resources Information Center

    Shanty, Nenden Octavarulia; Wijaya, Surya

    2012-01-01

    We examine how the rectangular array model can support students' spatial structuring in learning multiplication. To begin, we define what we mean by spatial structuring: the mental operation of constructing an organization or form for an object or set of objects. For that reason, the eggs problem was chosen as the starting point in which the…

  6. Children with mathematical learning disability fail in recruiting verbal and numerical brain regions when solving simple multiplication problems.

    PubMed

    Berteletti, Ilaria; Prado, Jérôme; Booth, James R

    2014-08-01

    Greater skill in solving single-digit multiplication problems requires a progressive shift from reliance on numerical to verbal mechanisms over development. Children with mathematical learning disability (MD), however, are thought to suffer from a specific impairment in numerical mechanisms. Here we tested the hypothesis that this impairment might prevent MD children from transitioning toward verbal mechanisms when solving single-digit multiplication problems. Brain activations during multiplication problems were compared in MD and typically developing (TD) children (3rd to 7th graders) in numerical and verbal regions that were identified by independent localizer tasks. We used small (e.g., 2 × 3) and large (e.g., 7 × 9) problems, as these problems likely differ in their reliance on verbal versus numerical mechanisms. Results indicate that MD children have reduced activations in both the verbal (i.e., left inferior frontal gyrus and left middle temporal to superior temporal gyri) and the numerical (i.e., right superior parietal lobule including the intra-parietal sulcus) regions, suggesting that both mechanisms are impaired. Moreover, the only reliable activation observed for MD children was in the numerical region when solving small problems, suggesting that MD children could effectively engage numerical mechanisms only for the easier problems. Conversely, TD children showed a modulation of activation with problem size in the verbal regions, suggesting that they were effectively engaging verbal mechanisms for the easier problems. Moreover, TD children with better language skills were more effective at engaging verbal mechanisms. In conclusion, the results suggest that the numerical- and language-related processes involved in solving multiplication problems are impaired in MD children. Published by Elsevier Ltd.

  7. "Notice of Violation of IEEE Publication Principles" Multiobjective Reinforcement Learning: A Comprehensive Overview.

    PubMed

    Liu, Chunming; Xu, Xin; Hu, Dewen

    2013-04-29

    Reinforcement learning is a powerful mechanism for enabling agents to learn in an unknown environment, and most reinforcement learning algorithms aim to maximize some numerical value, which represents only one long-term objective. However, many real-world decision and control problems exhibit multiple long-term objectives; therefore, there has recently been growing interest in solving multiobjective reinforcement learning (MORL) problems with multiple conflicting objectives. The aim of this paper is to present a comprehensive overview of MORL. The basic architecture, research topics, and naive solutions of MORL are introduced first. Then, several representative MORL approaches and some important directions of recent research are reviewed. The relationships between MORL and other related research are also discussed, including multiobjective optimization, hierarchical reinforcement learning, and multi-agent reinforcement learning. Finally, research challenges and open problems of MORL techniques are highlighted.
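    The naive linear-scalarization solution that such overviews cover first can be sketched in a few lines of Python. The toy environment API below (reset, step, sample_action) and all parameters are assumptions for illustration; a fixed weight vector collapses the per-objective reward vector to a scalar so that standard Q-learning applies.

        # Sketch of scalarized MORL: vector reward -> weighted scalar reward.
        import numpy as np

        def scalarized_q_learning(env, weights, episodes=500,
                                  alpha=0.1, gamma=0.95, eps=0.1):
            Q = np.zeros((env.n_states, env.n_actions))
            for _ in range(episodes):
                s, done = env.reset(), False
                while not done:
                    a = (env.sample_action() if np.random.rand() < eps
                         else int(np.argmax(Q[s])))
                    s2, r_vec, done = env.step(a)      # one reward per objective
                    r = float(np.dot(weights, r_vec))  # scalarize
                    Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
                    s = s2
            return Q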

  8. Behavioral pattern identification for structural health monitoring in complex systems

    NASA Astrophysics Data System (ADS)

    Gupta, Shalabh

    Estimation of structural damage and quantification of structural integrity are critical for safe and reliable operation of human-engineered complex systems, such as electromechanical, thermofluid, and petrochemical systems. Damage due to fatigue cracks is one of the most commonly encountered sources of structural degradation in mechanical systems. Early detection of fatigue damage is essential because the resulting structural degradation could potentially cause catastrophic failures, leading to loss of expensive equipment and human life. Therefore, for reliable operation and enhanced availability, it is necessary to develop capabilities for prognosis and estimation of impending failures, such as the onset of wide-spread fatigue crack damage in mechanical structures. This dissertation presents information-based online sensing of fatigue damage using the analytical tools of symbolic time series analysis (STSA). Anomaly detection using STSA is a pattern recognition method that has recently been developed based upon a fixed-structure, fixed-order Markov chain. The analysis procedure is built upon the principles of Symbolic Dynamics, Information Theory, and Statistical Pattern Recognition. The dissertation demonstrates real-time fatigue damage monitoring based on time series data of ultrasonic signals. Statistical pattern changes are measured using STSA to monitor the evolution of fatigue damage. Real-time anomaly detection is presented as a solution to both the forward (analysis) problem and the inverse (synthesis) problem: (1) the forward problem, whose primary objective is identification of the statistical changes in the time series data of ultrasonic signals due to gradual evolution of fatigue damage; and (2) the inverse problem, whose objective is to infer the anomalies from the observed time series data in real time, based on the statistical information generated during the forward problem. A computer-controlled special-purpose fatigue test apparatus, equipped with multiple sensing devices (e.g., ultrasonics and an optical microscope) for damage analysis, has been used to experimentally validate the STSA method for early detection of anomalous behavior. The sensor information is integrated with a software module consisting of the STSA algorithm for real-time monitoring of fatigue damage. Experiments have been conducted under different loading conditions on specimens constructed from the ductile aluminium alloy 7075-T6. The dissertation has also investigated the application of the STSA method for early detection of anomalies in other engineering disciplines. Two primary applications include combustion instability in a generic thermal pulse combustor model and the whirling phenomenon in a typical misaligned shaft.
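    A minimal Python sketch of the STSA idea, under simplifying assumptions that are not taken from the dissertation: the signal is symbolized by quantile partitioning, symbol blocks of fixed depth define the states of a fixed-order Markov chain, and the anomaly measure is the distance between the nominal and current state-probability vectors.

        # Illustrative symbolic time series analysis (STSA) anomaly measure.
        import numpy as np
        from itertools import product

        def symbolize(x, n_symbols, edges=None):
            if edges is None:                  # learn partition on nominal data
                qs = np.linspace(0, 1, n_symbols + 1)[1:-1]
                edges = np.quantile(x, qs)
            return np.digitize(x, edges), edges

        def state_probs(symbols, n_symbols, depth):
            states = {s: k for k, s in
                      enumerate(product(range(n_symbols), repeat=depth))}
            counts = np.zeros(len(states))
            for t in range(len(symbols) - depth + 1):
                counts[states[tuple(symbols[t:t + depth])]] += 1
            return counts / counts.sum()

        def anomaly_measure(nominal, current, n_symbols=8, depth=2):
            sym0, edges = symbolize(nominal, n_symbols)
            sym1, _ = symbolize(current, n_symbols, edges)
            p0 = state_probs(sym0, n_symbols, depth)
            p1 = state_probs(sym1, n_symbols, depth)
            return np.linalg.norm(p1 - p0)     # deviation from nominal behavior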

  9. Reliability of a single objective measure in assessing sleepiness.

    PubMed

    Sunwoo, Bernie Y; Jackson, Nicholas; Maislin, Greg; Gurubhagavatula, Indira; George, Charles F; Pack, Allan I

    2012-01-01

    To evaluate the reliability of single objective tests in assessing sleepiness. Subjects who completed polysomnography underwent a 4-nap multiple sleep latency test (MSLT) the following day. Prior to each nap opportunity on the MSLT, subjects performed the psychomotor vigilance test (PVT) and the divided attention driving task (DADT). Results of single versus multiple test administrations were compared using the intraclass correlation coefficient (ICC) and adjusted for test administration order to explore time-of-day effects. Measures were explored as continuous and binary (i.e., impaired or not impaired). The setting was a community-based sample evaluated at a tertiary, university-based sleep center; participants were 372 adult commercial vehicle operators oversampled for increased obstructive sleep apnea risk. As continuous measures, ICCs were as follows: MSLT 0.45, PVT median response time 0.69, PVT number of lapses 0.51, 10-min DADT tracking error 0.87, 20-min DADT tracking error 0.90. Based on binary outcomes, ICCs were: MSLT 0.63, PVT number of lapses 0.85, 10-min DADT 0.95, 20-min DADT 0.96. Statistically significant time-of-day effects were seen in both the MSLT and PVT but not the DADT. Correlation between the ESS and the different objective tests was strongest for the MSLT, range [-0.270 to -0.195], and persisted across all time points. Single DADT and PVT administrations are reliable measures of sleepiness. A single MSLT administration can reasonably discriminate individuals with MSL < 8 minutes. These results support the use of a single administration of some objective tests of sleepiness when performed under controlled conditions in routine clinical care.

  10. Learning of Rule Ensembles for Multiple Attribute Ranking Problems

    NASA Astrophysics Data System (ADS)

    Dembczyński, Krzysztof; Kotłowski, Wojciech; Słowiński, Roman; Szeląg, Marcin

    In this paper, we consider the multiple attribute ranking problem from a Machine Learning perspective. We propose two approaches to statistical learning of an ensemble of decision rules from decision examples provided by the Decision Maker in terms of pairwise comparisons of some objects. The first approach consists in learning a preference function defining a binary preference relation for a pair of objects. The result of application of this function on all pairs of objects to be ranked is then exploited using the Net Flow Score procedure, giving a linear ranking of objects. The second approach consists in learning a utility function for single objects. The utility function also gives a linear ranking of objects. In both approaches, the learning is based on the boosting technique. The presented approaches to Preference Learning share good properties of the decision rule preference model and have good performance in the massive-data learning problems. As Preference Learning and Multiple Attribute Decision Aiding share many concepts and methodological issues, in the introduction, we review some aspects bridging these two fields. To illustrate the two approaches proposed in this paper, we solve with them a toy example concerning the ranking of a set of cars evaluated by multiple attributes. Then, we perform a large data experiment on real data sets. The first data set concerns credit rating. Since recent research in the field of Preference Learning is motivated by the increasing role of modeling preferences in recommender systems and information retrieval, we chose two other massive data sets from this area - one comes from movie recommender system MovieLens, and the other concerns ranking of text documents from 20 Newsgroups data set.
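    The exploitation step of the first approach can be illustrated with a short Python sketch: given an already-learned binary preference function pref(a, b) (here faked from a toy utility table, an assumption for illustration), the Net Flow Score ranks each object by its wins minus its losses over all ordered pairs.

        # Net Flow Score ranking from a pairwise preference function (sketch).
        def net_flow_ranking(objects, pref):
            score = {a: sum(pref(a, b) - pref(b, a) for b in objects if b != a)
                     for a in objects}
            return sorted(objects, key=lambda a: score[a], reverse=True)

        # Toy usage with a hypothetical utility-based preference function:
        cars = ["car_A", "car_B", "car_C"]
        utility = {"car_A": 0.7, "car_B": 0.4, "car_C": 0.9}
        ranking = net_flow_ranking(
            cars, lambda a, b: float(utility[a] > utility[b]))
        # ranking == ["car_C", "car_A", "car_B"]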

  11. Local search heuristic for the discrete leader-follower problem with multiple follower objectives

    NASA Astrophysics Data System (ADS)

    Kochetov, Yury; Alekseeva, Ekaterina; Mezmaz, Mohand

    2016-10-01

    We study a discrete bilevel problem, also called a leader-follower problem, with multiple objectives at the lower level. It is assumed that constraints at the upper level can include variables of both levels. For such an ill-posed problem we define feasible and optimal solutions for the pessimistic case. The central point of this work is a two-stage method for obtaining a feasible solution in the pessimistic case, given a leader decision. The target of the first stage is a follower solution that violates the leader constraints; the target of the second stage is a pessimistic feasible solution. Each stage calls a heuristic and a solver for a series of particular mixed integer programs. The method is integrated into a local search based heuristic designed to find near-optimal leader solutions.

  12. A collaborative scheduling model for the supply-hub with multiple suppliers and multiple manufacturers.

    PubMed

    Li, Guo; Lv, Fei; Guan, Xu

    2014-01-01

    This paper investigates a collaborative scheduling model in an assembly system wherein multiple suppliers must deliver their components to multiple manufacturers under the operation of a Supply-Hub. We first develop two different scenarios to examine the impact of the Supply-Hub: in one, suppliers and manufacturers make their decisions separately; in the other, the Supply-Hub makes joint decisions with collaborative scheduling. The results show that our scheduling model with the Supply-Hub is an NP-complete problem; therefore, we propose an auto-adapted differential evolution algorithm to solve it. Moreover, we illustrate that the performance of collaborative scheduling by the Supply-Hub is superior to separate decisions made by each manufacturer and supplier. Furthermore, we show that the proposed algorithm has good convergence and reliability, and can be applied to more complicated supply chain environments.
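    For readers unfamiliar with differential evolution, the generic loop is sketched below in Python on a continuous encoding; the cost function, bounds, and the random jitter of F and CR standing in for the paper's auto-adaptation are all illustrative assumptions, not the authors' algorithm.

        # Generic differential evolution loop (illustrative sketch).
        import numpy as np

        def differential_evolution(cost, dim, bounds, pop_size=30, gens=200):
            lo, hi = bounds
            pop = np.random.uniform(lo, hi, (pop_size, dim))
            fit = np.array([cost(p) for p in pop])
            for _ in range(gens):
                for i in range(pop_size):
                    a, b, c = pop[np.random.choice(pop_size, 3, replace=False)]
                    F = np.random.uniform(0.4, 0.9)   # jittered scale factor
                    CR = np.random.uniform(0.1, 0.9)  # jittered crossover rate
                    mutant = np.clip(a + F * (b - c), lo, hi)
                    cross = np.random.rand(dim) < CR
                    trial = np.where(cross, mutant, pop[i])
                    f = cost(trial)
                    if f < fit[i]:                    # greedy selection
                        pop[i], fit[i] = trial, f
            return pop[np.argmin(fit)], fit.min()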

  13. A Collaborative Scheduling Model for the Supply-Hub with Multiple Suppliers and Multiple Manufacturers

    PubMed Central

    Lv, Fei; Guan, Xu

    2014-01-01

    This paper investigates a collaborative scheduling model in an assembly system wherein multiple suppliers must deliver their components to multiple manufacturers under the operation of a Supply-Hub. We first develop two different scenarios to examine the impact of the Supply-Hub: in one, suppliers and manufacturers make their decisions separately; in the other, the Supply-Hub makes joint decisions with collaborative scheduling. The results show that our scheduling model with the Supply-Hub is an NP-complete problem; therefore, we propose an auto-adapted differential evolution algorithm to solve it. Moreover, we illustrate that the performance of collaborative scheduling by the Supply-Hub is superior to separate decisions made by each manufacturer and supplier. Furthermore, we show that the proposed algorithm has good convergence and reliability, and can be applied to more complicated supply chain environments. PMID:24892104

  14. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
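    The importance-sampling estimate that AIS builds on can be sketched in Python with the adaptive re-centering omitted: samples are drawn from a density shifted toward the failure region and reweighted by the ratio of the nominal PDF to the sampling PDF. The standard-normal variables, the linear limit state g(x) < 0 for failure, and the chosen center are illustrative assumptions.

        # Importance sampling for a failure probability (non-adaptive sketch).
        import numpy as np
        from scipy import stats

        def is_failure_probability(g, center, n=10_000):
            dim = len(center)
            nominal = stats.multivariate_normal(np.zeros(dim), np.eye(dim))
            sampler = stats.multivariate_normal(center, np.eye(dim))
            x = sampler.rvs(n)
            w = nominal.pdf(x) / sampler.pdf(x)        # importance weights
            fails = np.array([g(xi) < 0 for xi in x])
            return float(np.mean(fails * w))

        # Example: g(x) = 3 - x1 - x2, sampling near the assumed design point.
        pf = is_failure_probability(lambda x: 3 - x[0] - x[1], center=[1.5, 1.5])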

  15. A Delay-Aware and Reliable Data Aggregation for Cyber-Physical Sensing

    PubMed Central

    Zhang, Jinhuan; Long, Jun; Zhang, Chengyuan; Zhao, Guihu

    2017-01-01

    Physical information sensed by various sensors in a cyber-physical system should be collected for further operation. In many applications, data aggregation should take reliability and delay into consideration. To address these problems, a novel Tiered Structure Routing-based Delay-Aware and Reliable Data Aggregation scheme named TSR-DARDA for spherical physical objects is proposed. By dividing the spherical network constructed by dispersed sensor nodes into circular tiers with specifically designed widths and cells, TSTR-DARDA tries to enable as many nodes as possible to transmit data simultaneously. In order to ensure transmission reliability, lost packets are retransmitted. Moreover, to minimize the latency while maintaining reliability for data collection, in-network aggregation and broadcast techniques are adopted to deal with the transmission between data collecting nodes in the outer layer and their parent data collecting nodes in the inner layer. Thus, the optimization problem is transformed to minimize the delay under reliability constraints by controlling the system parameters. To demonstrate the effectiveness of the proposed scheme, we have conducted extensive theoretical analysis and comparisons to evaluate the performance of TSR-DARDA. The analysis and simulations show that TSR-DARDA leads to lower delay with reliability satisfaction. PMID:28218668

  16. Towards lexicographic multi-objective linear programming using grossone methodology

    NASA Astrophysics Data System (ADS)

    Cococcioni, Marco; Pappalardo, Massimo; Sergeyev, Yaroslav D.

    2016-10-01

    Lexicographic Multi-Objective Linear Programming (LMOLP) problems can be solved in two ways: preemptive and nonpreemptive. The preemptive approach requires the solution of a series of LP problems, with changing constraints (each time the next objective is added, a new constraint appears). The nonpreemptive approach is based on a scalarization of the multiple objectives into a single-objective linear function by a weighted combination of the given objectives. It requires the specification of a set of weights, which is not straightforward and can be time consuming. In this work we present both mathematical and software ingredients necessary to solve LMOLP problems using a recently introduced computational methodology (allowing one to work numerically with infinities and infinitesimals) based on the concept of grossone. The ultimate goal of such an attempt is an implementation of a simplex-like algorithm, able to solve the original LMOLP problem by solving only one single-objective problem and without the need to specify finite weights. The expected advantages are therefore obvious.
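    For contrast with the grossone-based method, the preemptive approach described above can be sketched with a standard LP solver: each priority level is optimized in turn, and its optimal value is frozen as a constraint before the next level is solved. The SciPy call and the tolerance are illustrative.

        # Preemptive lexicographic LP via a sequence of single-objective LPs.
        import numpy as np
        from scipy.optimize import linprog

        def lexicographic_lp(objectives, A_ub, b_ub, bounds):
            """objectives: cost vectors to minimize, highest priority first."""
            A_ub, b_ub = [list(r) for r in A_ub], list(b_ub)
            res = None
            for c in objectives:
                res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                              bounds=bounds, method="highs")
                A_ub.append(list(c))         # freeze this level's optimum:
                b_ub.append(res.fun + 1e-9)  # c @ x <= f* (tiny tolerance)
            return res.x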

  17. Constructing objective tests

    NASA Astrophysics Data System (ADS)

    Aubrecht, Gordon J.; Aubrecht, Judith D.

    1983-07-01

    True-false or multiple-choice tests can be useful instruments for evaluating student progress. We examine strategies for planning objective tests which serve to test the material covered in science (physics) courses. We also examine strategies for writing questions for tests within a test blueprint. The statistical basis for judging the quality of test items is discussed. Reliability, difficulty, and discrimination indices are defined and examples presented. Our recommendations are rather easily put into practice.
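    As a concrete illustration of the indices mentioned above, the Python sketch below computes item difficulty as the proportion of correct answers and discrimination as the difference in difficulty between the top and bottom scoring groups (the conventional upper-lower 27% split); the 0/1 response matrix format is an assumption.

        # Classical item analysis: difficulty and discrimination (sketch).
        import numpy as np

        def item_statistics(responses):
            """responses: (n_students, n_items) array of 0/1 item scores."""
            totals = responses.sum(axis=1)
            order = np.argsort(totals)
            k = max(1, int(0.27 * len(totals)))   # conventional 27% groups
            lower, upper = responses[order[:k]], responses[order[-k:]]
            difficulty = responses.mean(axis=0)   # proportion correct per item
            discrimination = upper.mean(axis=0) - lower.mean(axis=0)
            return difficulty, discrimination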

  18. Break-even Analysis: Tool for Budget Planning

    ERIC Educational Resources Information Center

    Lohmann, Roger A.

    1976-01-01

    Multiple funding creates special management problems for the administrator of a human service agency. This article presents a useful analytic technique adapted from business practice that can help the administrator draw up and balance a unified budget. Such a budget also affords a reliable overview of the agency's financial status. (Author)
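    The arithmetic behind the technique is simple: the break-even volume is the level of activity at which revenue covers fixed plus variable costs. A one-function Python sketch with purely illustrative figures:

        # Break-even volume = fixed costs / contribution margin per unit.
        def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
            return fixed_costs / (price_per_unit - variable_cost_per_unit)

        units = break_even_units(fixed_costs=120_000,
                                 price_per_unit=85.0,
                                 variable_cost_per_unit=35.0)
        # 120,000 / (85 - 35) = 2,400 service units to break even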

  19. Probabilistic Design of a Wind Tunnel Model to Match the Response of a Full-Scale Aircraft

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Stroud, W. Jefferson; Krishnamurthy, T.; Spain, Charles V.; Naser, Ahmad S.

    2005-01-01

    An approach is presented for carrying out the reliability-based design of a plate-like wing that is part of a wind tunnel model. The goal is to design the wind tunnel model to match the stiffness characteristics of the wing box of a flight vehicle while satisfying strength-based risk/reliability requirements that prevent damage to the wind tunnel model and fixtures. The flight vehicle is a modified F/A-18 aircraft. The design problem is solved using reliability-based optimization techniques. The objective function to be minimized is the difference between the displacements of the wind tunnel model and the corresponding displacements of the flight vehicle. The design variables control the thickness distribution of the wind tunnel model. Displacements of the wind tunnel model change with the thickness distribution, while displacements of the flight vehicle are a set of fixed data. The only constraint imposed is that the probability of failure is less than a specified value. Failure is assumed to occur if the stress caused by aerodynamic pressure loading is greater than the specified strength allowable. Two uncertain quantities are considered: the allowable stress and the thickness distribution of the wind tunnel model. Reliability is calculated using Monte Carlo simulation with response surfaces that provide approximate values of stresses. The response surface equations are, in turn, computed from finite element analyses of the wind tunnel model at specified design points. Because the response surface approximations were fit over a small region centered about the current design, the response surfaces were refit periodically as the design variables changed. Coarse-grained parallelism was used to simultaneously perform multiple finite element analyses. Studies carried out in this paper demonstrate that this scheme of using moving response surfaces and coarse-grained computational parallelism reduces the execution time of the Monte Carlo simulation enough to make the design problem tractable. The results of the reliability-based designs performed in this paper show that large decreases in the probability of stress-based failure can be realized with only small sacrifices in the ability of the wind tunnel model to represent the displacements of the full-scale vehicle.

  20. A new multi-objective optimization model for preventive maintenance and replacement scheduling of multi-component systems

    NASA Astrophysics Data System (ADS)

    Moghaddam, Kamran S.; Usher, John S.

    2011-07-01

    In this article, a new multi-objective optimization model is developed to determine the optimal preventive maintenance and replacement schedules in a repairable and maintainable multi-component system. In this model, the planning horizon is divided into discrete and equally-sized periods in which three possible actions must be planned for each component, namely maintenance, replacement, or do nothing. The objective is to determine a plan of actions for each component in the system that minimizes total cost and maximizes overall system reliability simultaneously over the planning horizon. Because of the complex, combinatorial, and highly nonlinear structure of the mathematical model, two metaheuristic solution methods, a generational genetic algorithm and simulated annealing, are applied to tackle the problem. The solution approach yields Pareto optimal solutions that provide good tradeoffs between the total cost and the overall reliability of the system. Such a modeling approach should be useful for maintenance planners and engineers tasked with developing recommended maintenance plans for complex systems of components.
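    The notion of Pareto optimality used above can be made concrete with a short Python sketch: a candidate schedule is kept only if no other candidate is at least as good on both criteria while differing from it. The (total cost, reliability) tuples are illustrative.

        # Extract the Pareto front: minimize cost, maximize reliability.
        def pareto_front(plans):
            front = []
            for p in plans:
                dominated = any(q[0] <= p[0] and q[1] >= p[1] and q != p
                                for q in plans)
                if not dominated:
                    front.append(p)
            return front

        front = pareto_front([(100, 0.90), (120, 0.95), (110, 0.88), (130, 0.99)])
        # -> [(100, 0.90), (120, 0.95), (130, 0.99)]; (110, 0.88) is dominated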

  1. On the Discovery of Evolving Truth

    PubMed Central

    Li, Yaliang; Li, Qi; Gao, Jing; Su, Lu; Zhao, Bo; Fan, Wei; Han, Jiawei

    2015-01-01

    In the era of big data, information regarding the same objects can be collected from increasingly more sources. Unfortunately, there usually exist conflicts among the information coming from different sources. To tackle this challenge, truth discovery, i.e., to integrate multi-source noisy information by estimating the reliability of each source, has emerged as a hot topic. In many real world applications, however, the information may come sequentially, and as a consequence, the truth of objects as well as the reliability of sources may be dynamically evolving. Existing truth discovery methods, unfortunately, cannot handle such scenarios. To address this problem, we investigate the temporal relations among both object truths and source reliability, and propose an incremental truth discovery framework that can dynamically update object truths and source weights upon the arrival of new data. Theoretical analysis is provided to show that the proposed method is guaranteed to converge at a fast rate. The experiments on three real world applications and a set of synthetic data demonstrate the advantages of the proposed method over state-of-the-art truth discovery methods. PMID:26705502
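    The static building block that the incremental framework extends can be sketched as follows in Python: truths are estimated by reliability-weighted voting, and each source's weight is then re-estimated from how often its claims match the current truths, iterating between the two steps. The data layout and smoothing constants are illustrative assumptions.

        # Iterative truth discovery: weighted voting + source-weight update.
        import numpy as np

        def truth_discovery(claims, n_sources, iters=20):
            """claims: dict object -> list of (source_id, claimed_value)."""
            w = np.ones(n_sources)
            truths = {}
            for _ in range(iters):
                for obj, votes in claims.items():   # weighted majority vote
                    tally = {}
                    for s, v in votes:
                        tally[v] = tally.get(v, 0.0) + w[s]
                    truths[obj] = max(tally, key=tally.get)
                hits = np.full(n_sources, 1e-6)     # smoothed accuracy counts
                counts = np.full(n_sources, 2e-6)
                for obj, votes in claims.items():
                    for s, v in votes:
                        counts[s] += 1
                        hits[s] += (v == truths[obj])
                w = hits / counts                   # updated source reliability
            return truths, w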

  2. Evaluating cognition in individuals with Huntington disease: Neuro-QoL cognitive functioning measures.

    PubMed

    Lai, Jin-Shei; Goodnight, Siera; Downing, Nancy R; Ready, Rebecca E; Paulsen, Jane S; Kratz, Anna L; Stout, Julie C; McCormack, Michael K; Cella, David; Ross, Christopher; Russell, Jenna; Carlozzi, Noelle E

    2018-03-01

    Cognitive functioning impacts health-related quality of life (HRQOL) for individuals with Huntington disease (HD). The Neuro-QoL includes two patient-reported outcome (PRO) measures of cognition: Executive Function (EF) and General Concerns (GC). These measures have not previously been validated for use in HD. The purpose of this analysis is to evaluate the reliability and validity of the Neuro-QoL Cognitive Function measures for use in HD. Five hundred ten individuals with prodromal or manifest HD completed the Neuro-QoL Cognition measures, two other PRO measures of HRQOL (WHODAS 2.0 and EQ5D), and a depression measure (PROMIS Depression). Measures of functioning (the Total Functional Capacity) and behavior (the Problem Behaviors Assessment) were completed by clinician interview. Objective measures of cognition were obtained using the clinician-administered Symbol Digit Modalities Test and the Stroop Test (Word, Color, and Interference). Self-rated, clinician-rated, and objective composite scores were developed. We examined the Neuro-QoL measures for reliability, convergent validity, discriminant validity, and known-groups validity. Excellent reliabilities (Cronbach's alphas ≥ 0.94) were found. Convergent validity was supported, with strong relationships between self-reported measures of cognition. Discriminant validity was supported by less robust correlations between self-reported cognition and other constructs. Prodromal participants reported fewer cognitive problems than manifest groups, and early-stage HD participants reported fewer problems than late-stage HD participants. The Neuro-QoL Cognition measures provide reliable and valid assessments of self-reported cognitive functioning for individuals with HD. Findings support the utility of these measures for assessing self-reported cognition.

  3. Multiple-Objective Optimal Designs for Studying the Dose Response Function and Interesting Dose Levels

    PubMed Central

    Hyun, Seung Won; Wong, Weng Kee

    2016-01-01

    We construct an optimal design to simultaneously estimate three common interesting features in a dose-finding trial with possibly different emphasis on each feature. These features are (1) the shape of the dose-response curve, (2) the median effective dose and (3) the minimum effective dose level. A main difficulty of this task is that an optimal design for a single objective may not perform well for other objectives. There are optimal designs for dual objectives in the literature but we were unable to find optimal designs for 3 or more objectives to date with a concrete application. A reason for this is that the approach for finding a dual-objective optimal design does not work well for a 3 or more multiple-objective design problem. We propose a method for finding multiple-objective optimal designs that estimate the three features with user-specified higher efficiencies for the more important objectives. We use the flexible 4-parameter logistic model to illustrate the methodology but our approach is applicable to find multiple-objective optimal designs for other types of objectives and models. We also investigate robustness properties of multiple-objective optimal designs to mis-specification in the nominal parameter values and to a variation in the optimality criterion. We also provide computer code for generating tailor made multiple-objective optimal designs. PMID:26565557

  4. Multiple-Objective Optimal Designs for Studying the Dose Response Function and Interesting Dose Levels.

    PubMed

    Hyun, Seung Won; Wong, Weng Kee

    2015-11-01

    We construct an optimal design to simultaneously estimate three common interesting features in a dose-finding trial with possibly different emphasis on each feature. These features are (1) the shape of the dose-response curve, (2) the median effective dose and (3) the minimum effective dose level. A main difficulty of this task is that an optimal design for a single objective may not perform well for other objectives. There are optimal designs for dual objectives in the literature but we were unable to find optimal designs for 3 or more objectives to date with a concrete application. A reason for this is that the approach for finding a dual-objective optimal design does not work well for a 3 or more multiple-objective design problem. We propose a method for finding multiple-objective optimal designs that estimate the three features with user-specified higher efficiencies for the more important objectives. We use the flexible 4-parameter logistic model to illustrate the methodology but our approach is applicable to find multiple-objective optimal designs for other types of objectives and models. We also investigate robustness properties of multiple-objective optimal designs to mis-specification in the nominal parameter values and to a variation in the optimality criterion. We also provide computer code for generating tailor made multiple-objective optimal designs.

  5. Algorithms and Array Design Criteria for Robust Imaging in Interferometry

    NASA Astrophysics Data System (ADS)

    Kurien, Binoy George

    Optical interferometry is a technique for obtaining high-resolution imagery of a distant target by interfering light from multiple telescopes. Image restoration from interferometric measurements poses a unique set of challenges. The first challenge is that the measurement set provides only a sparse-sampling of the object's Fourier Transform and hence image formation from these measurements is an inherently ill-posed inverse problem. Secondly, atmospheric turbulence causes severe distortion of the phase of the Fourier samples. We develop array design conditions for unique Fourier phase recovery, as well as a comprehensive algorithmic framework based on the notion of redundant-spaced-calibration (RSC), which together achieve reliable image reconstruction in spite of these challenges. Within this framework, we see that classical interferometric observables such as the bispectrum and closure phase can limit sensitivity, and that generalized notions of these observables can improve both theoretical and empirical performance. Our framework leverages techniques from lattice theory to resolve integer phase ambiguities in the interferometric phase measurements, and from graph theory, to select a reliable set of generalized observables. We analyze the expected shot-noise-limited performance of our algorithm for both pairwise and Fizeau interferometric architectures and corroborate this analysis with simulation results. We apply techniques from the field of compressed sensing to perform image reconstruction from the estimates of the object's Fourier coefficients. The end result is a comprehensive strategy to achieve well-posed and easily-predictable reconstruction performance in optical interferometry.
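    The classical observables mentioned above are easy to state in code: for a telescope triple (i, j, k), the bispectrum is the product of the three baseline visibilities, and its argument, the closure phase, cancels the per-telescope atmospheric phase errors. The NumPy check below, with purely illustrative numbers, verifies the cancellation.

        # Bispectrum and closure phase for one telescope triple (sketch).
        import numpy as np

        def closure_phase(v_ij, v_jk, v_ki):
            return np.angle(v_ij * v_jk * v_ki)     # arg of the bispectrum

        # Per-telescope pistons phi enter each baseline as (phi_i - phi_j)
        # and cancel around the closed triangle:
        rng = np.random.default_rng(0)
        phi = rng.uniform(-np.pi, np.pi, 3)
        obj = [0.3, -0.8, 0.5]                      # object-only baseline phases
        v01 = np.exp(1j * (obj[0] + phi[0] - phi[1]))
        v12 = np.exp(1j * (obj[1] + phi[1] - phi[2]))
        v20 = np.exp(1j * (obj[2] + phi[2] - phi[0]))
        assert np.isclose(closure_phase(v01, v12, v20), sum(obj))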

  6. Quality Evaluation of Raw Moutan Cortex Using the AHP and Gray Correlation-TOPSIS Method

    PubMed Central

    Zhou, Sujuan; Liu, Bo; Meng, Jiang

    2017-01-01

    Background: Raw Moutan cortex (RMC) is an important Chinese herbal medicine. Comprehensive and objective quality evaluation of Chinese herbal medicine has been one of the most important issues in modern herbs development. Objective: To evaluate and compare the quality of RMC using the weighted gray correlation-Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. Materials and Methods: The percentage composition of gallic acid, catechin, oxypaeoniflorin, paeoniflorin, quercetin, benzoylpaeoniflorin, and paeonol in different batches of RMC was determined, and MATLAB programming was then used to construct the gray correlation-TOPSIS assessment model for quality evaluation of RMC. Results: The quality evaluation results of model evaluation and objective evaluation were consistent, reliable, and stable. Conclusion: The gray correlation-TOPSIS model can be well applied to the quality evaluation of traditional Chinese medicine with multiple components and has broad application prospects. SUMMARY The experiment constructs a model to evaluate the quality of RMC using the weighted gray correlation-TOPSIS method. Results show the model is reliable and provides a feasible way of evaluating the quality of traditional Chinese medicine with multiple components. PMID:28839384
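    A generic weighted TOPSIS ranking, the core of the method described above, can be sketched in a few lines of Python; the random decision matrix of seven component contents, the equal weights standing in for AHP-derived ones, and the assumption that all criteria are benefit-type are illustrative, not the authors' data.

        # Weighted TOPSIS: closeness of each batch to the ideal solution.
        import numpy as np

        def topsis(X, w):
            """X: (batches, criteria) matrix; w: weights summing to 1."""
            Z = X / np.linalg.norm(X, axis=0)          # vector-normalize columns
            V = Z * w                                  # weighted normalized matrix
            ideal, anti = V.max(axis=0), V.min(axis=0) # benefit criteria assumed
            d_pos = np.linalg.norm(V - ideal, axis=1)
            d_neg = np.linalg.norm(V - anti, axis=1)
            return d_neg / (d_pos + d_neg)             # higher is better

        scores = topsis(np.random.rand(10, 7), np.full(7, 1 / 7))
        ranking = np.argsort(-scores)                  # best batch first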

  7. Multiple player tracking in sports video: a dual-mode two-way bayesian inference approach with progressive observation modeling.

    PubMed

    Xing, Junliang; Ai, Haizhou; Liu, Liwei; Lao, Shihong

    2011-06-01

    Multiple object tracking (MOT) is a very challenging task, yet of fundamental importance for many practical applications. In this paper, we focus on the problem of tracking multiple players in sports video, which is even more difficult due to the abrupt movements of players and their complex interactions. To handle the difficulties in this problem, we present a new MOT algorithm that contributes at both the observation modeling level and the tracking strategy level. For observation modeling, we develop a progressive observation modeling process that is able to provide strong tracking observations and greatly facilitate the tracking task. For the tracking strategy, we propose a dual-mode two-way Bayesian inference approach that dynamically switches between an offline general model and an online dedicated model to deal with single isolated object tracking and multiple occluded object tracking integrally, by forward filtering and backward smoothing. Extensive experiments on different kinds of sports videos, including football, basketball, and hockey, demonstrate the effectiveness and efficiency of the proposed method.

  8. Application of decentralized cooperative problem solving in dynamic flexible scheduling

    NASA Astrophysics Data System (ADS)

    Guan, Zai-Lin; Lei, Ming; Wu, Bo; Wu, Ya; Yang, Shuzi

    1995-08-01

    The object of this study is to discuss an intelligent solution to the problem of task-allocation in shop floor scheduling. For this purpose, the technique of distributed artificial intelligence (DAI) is applied. Intelligent agents (IAs) are used to realize decentralized cooperation, and negotiation is realized by using message passing based on the contract net model. Multiple agents, such as manager agents, workcell agents, and workstation agents, make game-like decisions based on multiple criteria evaluations. This procedure of decentralized cooperative problem solving makes local scheduling possible. And by integrating such multiple local schedules, dynamic flexible scheduling for the whole shop floor production can be realized.

  9. Multi-Objective Hybrid Optimal Control for Multiple-Flyby Interplanetary Mission Design Using Chemical Propulsion

    NASA Technical Reports Server (NTRS)

    Englander, Jacob; Vavrina, Matthew

    2015-01-01

    The customer (scientist or project manager) most often does not want just one point solution to the mission design problem. Instead, an exploration of a multi-objective trade space is required. For a typical main-belt asteroid mission the customer might wish to see the trade-space of launch date vs. flight time vs. deliverable mass, while varying the destination asteroid, planetary flybys, launch year, etcetera. To address this question we use a multi-objective discrete outer-loop that defines many single-objective, real-valued inner-loop problems.

  10. The Utility and Challenges of Using ICD Codes in Child Maltreatment Research: A Review of Existing Literature

    ERIC Educational Resources Information Center

    Scott, Debbie; Tonmyr, Lil; Fraser, Jenny; Walker, Sue; McKenzie, Kirsten

    2009-01-01

    Objective: The objectives of this article are to explore the extent to which the International Statistical Classification of Diseases and Related Health Problems (ICD) has been used in child abuse research, to describe how the ICD system has been applied, and to assess factors affecting the reliability of ICD coded data in child abuse research.…

  11. Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis

    NASA Astrophysics Data System (ADS)

    Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.

    As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging in nature. In practice, it is often seen that assumptions of underlying linearity and/or Gaussianity are used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are, at times, detrimental to tracking data and provide statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions of the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided. This is done by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering as well as hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss computational limitations that hinder proper analysis of large breakup events.

  12. Tracking Algorithm of Multiple Pedestrians Based on Particle Filters in Video Sequences

    PubMed Central

    Liu, Yun; Wang, Chuanxu; Zhang, Shujun; Cui, Xuehong

    2016-01-01

    Pedestrian tracking is a critical problem in the field of computer vision. Particle filters have been proven very useful in pedestrian tracking for nonlinear and non-Gaussian estimation problems. However, pedestrian tracking in complex environments still faces many problems due to changes in pedestrian posture and scale, moving backgrounds, mutual occlusion, and pedestrians entering and leaving the scene. To surmount these difficulties, this paper presents a tracking algorithm for multiple pedestrians based on particle filters in video sequences. The algorithm acquires confidence values for the object and the background by extracting a priori knowledge, thus achieving multi-pedestrian detection; it incorporates color and texture features into the particle filter to obtain better observations and then automatically adjusts the weight of each feature according to the current tracking environment. During tracking, the algorithm handles severe occlusion to prevent the drift and loss caused by object occlusion, and associates detection results with particle states to discriminate object disappearance and emergence, thus achieving robust tracking of multiple pedestrians. Experimental verification and analysis on video sequences demonstrate that the proposed algorithm improves tracking performance and yields better tracking results. PMID:27847514
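    To make the particle-filtering mechanics concrete, here is a bare-bones bootstrap filter step for a single target in Python; the random-walk motion model, the likelihood placeholder standing in for the color/texture observation model, and the resampling threshold are illustrative assumptions rather than the authors' algorithm.

        # One predict-update-resample step of a bootstrap particle filter.
        import numpy as np

        def particle_filter_step(particles, weights, likelihood, motion_std=5.0):
            n = len(particles)
            # 1. propagate with a random-walk motion model
            particles = particles + np.random.normal(0, motion_std,
                                                     particles.shape)
            # 2. weight each particle by how well it matches the observation
            weights = weights * np.array([likelihood(p) for p in particles])
            weights /= weights.sum()
            # 3. resample when the effective sample size collapses
            if 1.0 / np.sum(weights ** 2) < n / 2:
                idx = np.random.choice(n, n, p=weights)
                particles, weights = particles[idx], np.full(n, 1.0 / n)
            estimate = np.average(particles, axis=0, weights=weights)
            return particles, weights, estimate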

  13. Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management.

    PubMed

    Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E; Choi, Tsan-Ming

    2016-08-01

    In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, with hybridization of a known algorithm called NSGA-II and an adaptive population-based simulated annealing (APBSA) method is developed to solve the systems reliability optimization problems. In the first step, to create a good algorithm, we use a coevolutionary strategy. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate the appropriate parameters of the algorithm. Moreover, to examine the performance of our proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies have shown that the proposed algorithm is an effective approach for systems reliability and risk management.

  14. Multiple-foil microabrasion package (A0023)

    NASA Technical Reports Server (NTRS)

    Mcdonnell, J. A. M.; Ashworth, D. G.; Carey, W. C.; Flavill, R. P.; Jennison, R. C.

    1984-01-01

    The specific scientific objectives of this experiment are to measure the spatial distribution, size, velocity, radiance, and composition of microparticles in near-Earth space. The technological objectives are to measure erosion rates resulting from microparticle impacts and to evaluate thin-foil meteor 'bumpers'. The combinations of sensitivity and reliability in this experiment will provide up to 1000 impacts per month for laboratory analysis and will extend current sensitivity limits by 5 orders of magnitude in mass.

  15. [Reliability and validity of warning signs checklist for screening psychological, behavioral and developmental problems of children].

    PubMed

    Huang, X N; Zhang, Y; Feng, W W; Wang, H S; Cao, B; Zhang, B; Yang, Y F; Wang, H M; Zheng, Y; Jin, X M; Jia, M X; Zou, X B; Zhao, C X; Robert, J; Jing, Jin

    2017-06-02

    Objective: To evaluate the reliability and validity of the warning signs checklist developed by the National Health and Family Planning Commission of the People's Republic of China (NHFPC), so as to determine the screening effectiveness of warning signs on developmental problems of early childhood. Method: A stratified random sampling method was used to assess the reliability and validity of the checklist of warning signs, and 2 110 children 0 to 6 years of age (1 513 low-risk subjects and 597 high-risk subjects) were recruited from 11 provinces of China. The reliability evaluation for the warning signs included test-retest reliability and interrater reliability. With the use of the Age and Stage Questionnaire (ASQ) and the Gesell Development Diagnosis Scale (GESELL) as the criterion scales, criterion validity was assessed by determining the correlation and consistency between the screening results of warning signs and the criterion scales. Result: In terms of the warning signs, the screening positive rates at different ages ranged from 10.8% (21/141) to 26.2% (51/137). The median (interquartile) testing time for each subject was 1 (0.6) minute. Both the test-retest reliability and interrater reliability of warning signs reached 0.7 or above, indicating that the stability was good. In terms of validity assessment, there was remarkable consistency between the ASQ and warning signs, with a Kappa value of 0.63. With the use of GESELL as criterion, it was determined that the sensitivity of warning signs in children with suspected developmental delay was 82.2%, and the specificity was 77.7%. The overall Youden index was 0.6. Conclusion: The reliability and validity of the warning signs checklist for screening early childhood developmental problems have met the basic requirements of psychological screening scales, with the characteristics of short testing time and easy operation. Thus, this warning signs checklist can be used for screening psychological and behavioral problems of early childhood, especially in community settings.

  16. Biologically-inspired approaches for self-organization, adaptation, and collaboration of heterogeneous autonomous systems

    NASA Astrophysics Data System (ADS)

    Steinberg, Marc

    2011-06-01

    This paper presents a selective survey of theoretical and experimental progress in the development of biologically-inspired approaches for complex surveillance and reconnaissance problems with multiple, heterogeneous autonomous systems. The focus is on approaches that may address ISR problems that can quickly become mathematically intractable, or otherwise impractical to implement using traditional optimization techniques, as the size and complexity of the problem increases. These problems require dealing with complex spatiotemporal objectives and constraints at a variety of levels, from motion planning to task allocation. There is also a need to ensure solutions are reliable and robust to uncertainty and communications limitations. First, the paper will provide a short introduction to the current state of relevant biological research as it relates to collective animal behavior. Second, the paper will describe research on largely decentralized, reactive, or swarm approaches that have been inspired by biological phenomena such as schools of fish, flocks of birds, ant colonies, and insect swarms. Next, the paper will discuss approaches towards more complex organizational and cooperative mechanisms in team and coalition behaviors in order to provide mission coverage of large, complex areas. Relevant team behavior may be derived from recent advances in understanding the social and cooperative behaviors used for collaboration by tens of animals with higher-level cognitive abilities, such as mammals and birds. Finally, the paper will briefly discuss challenges involved in user interaction with these types of systems.

  17. Testing jumps via false discovery rate control.

    PubMed

    Yen, Yu-Min

    2013-01-01

    Many recently developed nonparametric jump tests can be viewed as multiple hypothesis testing problems. For such multiple hypothesis tests, it is well known that controlling the type I error of each individual test often produces a large proportion of erroneous rejections overall, and the situation becomes even worse when jump occurrence is a rare event. To obtain more reliable results, we aim to control the false discovery rate (FDR), an efficient compound error measure for erroneous rejections in multiple testing problems. We perform the test via the Barndorff-Nielsen and Shephard (BNS) test statistic and control the FDR with the Benjamini and Hochberg (BH) procedure. We provide asymptotic results for the FDR control. From simulations, we examine the relevant theoretical results and demonstrate the advantages of controlling the FDR. The hybrid approach is then applied to an empirical analysis of two benchmark stock indices with high frequency data.
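    The Benjamini and Hochberg step-up rule used above takes only a few lines: sort the m p-values (here, one BNS jump test per period), find the largest k with p_(k) <= kq/m, and reject the hypotheses with the k smallest p-values. A NumPy sketch:

        # Benjamini-Hochberg procedure at FDR level q.
        import numpy as np

        def benjamini_hochberg(pvals, q=0.05):
            m = len(pvals)
            order = np.argsort(pvals)
            sorted_p = np.asarray(pvals)[order]
            thresh = (np.arange(1, m + 1) / m) * q
            below = np.nonzero(sorted_p <= thresh)[0]
            reject = np.zeros(m, dtype=bool)
            if below.size:
                reject[order[:below[-1] + 1]] = True  # reject 1..k (largest k)
            return reject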

  18. Satellite Fault Diagnosis Using Support Vector Machines Based on a Hybrid Voting Mechanism

    PubMed Central

    Yang, Shuqiang; Zhu, Xiaoqian; Jin, Songchang; Wang, Xiang

    2014-01-01

    Satellite fault diagnosis plays an important role in enhancing the safety, reliability, and availability of a satellite system. However, the enormous number of parameters and the presence of multiple faults make satellite fault diagnosis challenging: interactions between parameters and misclassifications from multiple faults increase both the false alarm rate and the false negative rate. On the other hand, for each satellite fault there is not enough fault data for training, which degrades the performance of most classification algorithms. In this paper, we propose an improved SVM based on a hybrid voting mechanism (HVM-SVM) to deal with the problems of enormous parameters, multiple faults, and small samples. Many experimental results show that the accuracy of fault diagnosis using HVM-SVM is improved. PMID:25215324

  19. On Space Exploration and Human Error: A Paper on Reliability and Safety

    NASA Technical Reports Server (NTRS)

    Bell, David G.; Maluf, David A.; Gawdiak, Yuri

    2005-01-01

    NASA space exploration should largely address a problem class in reliability and risk management stemming primarily from human error, system risk, and multi-objective trade-off analysis, by conducting research into system complexity, risk characterization and modeling, and system reasoning. In general, in every mission we can distinguish risk in three possible ways: a) known-known, b) known-unknown, and c) unknown-unknown. It is almost certain that space exploration will partially experience known or unknown risks similar to those embedded in the Apollo, Shuttle, or Station programs, unless something alters how NASA perceives and manages safety and reliability.

  20. Trauma Focused CBT for Children with Co-Occurring Trauma and Behavior Problems

    ERIC Educational Resources Information Center

    Cohen, Judith A.; Berliner, Lucy; Mannarino, Anthony

    2010-01-01

    Objective: Childhood trauma impacts multiple domains of functioning including behavior. Traumatized children commonly have behavioral problems that therapists must effectively evaluate and manage in the context of providing trauma-focused treatment. This manuscript describes practical strategies for managing behavior problems in the context of…

  1. An object-oriented approach to risk and reliability analysis: methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

  2. Unreliability as a Threat to Understanding Psychopathology: The Cautionary Tale of Attentional Bias

    PubMed Central

    Rodebaugh, Thomas L.; Scullin, Rachel B.; Langer, Julia K.; Dixon, David J.; Huppert, Jonathan D.; Bernstein, Amit; Zvielli, Ariel; Lenze, Eric J.

    2016-01-01

    The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically-oriented measures can only be certain if such measurements are reliable. Two pillars of NIMH’s portfolio – the Research Domain Criteria (RDoC) initiative for psychopathology and the target engagement initiative in clinical trials – cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally-used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time. We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. PMID:27322741

  3. Competing Biases in Mental Arithmetic: When Division Is More and Multiplication Is Less.

    PubMed

    Shaki, Samuel; Fischer, Martin H

    2017-01-01

    Mental arithmetic exhibits various biases. Among those is a tendency to overestimate addition and to underestimate subtraction outcomes. Does such "operational momentum" (OM) also affect multiplication and division? Twenty-six adults produced lines whose lengths corresponded to the correct outcomes of multiplication and division problems shown in symbolic format. We found a reliable tendency to over-estimate division outcomes, i.e., reverse OM. We suggest that anchoring on the first operand (a tendency to use this number as a reference for further quantitative reasoning) contributes to cognitive biases in mental arithmetic.

  4. Recognition of partially occluded threat objects using the annealed Hopfield network

    NASA Technical Reports Server (NTRS)

    Kim, Jung H.; Yoon, Sung H.; Park, Eui H.; Ntuen, Celestine A.

    1992-01-01

    Recognition of partially occluded objects has been an important issue for airport security because occlusion causes significant problems in identifying and locating objects during baggage inspection. The neural network approach is suitable for these problems in the sense that the inherent parallelism of neural networks pursues many hypotheses in parallel, resulting in high computation rates. Moreover, neural networks provide a greater degree of robustness or fault tolerance than conventional computers. The annealed Hopfield network, which is derived from mean field annealing (MFA), has been developed to find global solutions of a nonlinear system. In the study, it has been proven that the system temperature of MFA is equivalent to the gain of the sigmoid function of a Hopfield network. In our early work, we developed the hybrid Hopfield network (HHN) for fast and reliable matching. However, HHN does not guarantee global solutions and yields false matching under heavily occluded conditions because it is by nature dependent on initial states. In this paper, we present the annealed Hopfield network (AHN) for occluded object matching problems. In AHN, the mean field theory is applied to the hybrid Hopfield network in order to improve computational complexity and provide reliable matching under heavily occluded conditions. AHN is slower than HHN. However, AHN provides near-global solutions without initial restrictions and produces fewer false matches than HHN. In conclusion, a new algorithm based upon a neural network approach was developed to demonstrate the feasibility of automated inspection of threat objects from x-ray images. The robustness of the algorithm is proved by identifying occluded target objects with large tolerance of their features.

  5. Salient object detection method based on multiple semantic features

    NASA Astrophysics Data System (ADS)

    Wang, Chunyang; Yu, Chunyan; Song, Meiping; Wang, Yulei

    2018-04-01

    Existing salient object detection models can only detect the approximate location of a salient object or mistakenly highlight the background. To resolve this problem, a salient object detection method was proposed based on image semantic features. First of all, three novel saliency features were presented in this paper: an object edge density feature (EF), an object semantic feature based on the convex hull (CF), and an object lightness contrast feature (LF). Secondly, the multiple salient features were trained with random detection windows. Thirdly, a Naive Bayesian model was used to combine these features for salient object detection. The results on public datasets showed that our method performed well: the location of the salient object can be fixed, and the salient object can be accurately detected and marked by a specific window.

  6. Electronics reliability and measurement technology

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Editor)

    1987-01-01

    A summary is presented of the Electronics Reliability and Measurement Technology Workshop. The meeting examined the U.S. electronics industry with particular focus on reliability and state-of-the-art technology. A general consensus of the approximately 75 attendees was that "the U.S. electronics industries are facing a crisis that may threaten their existence". The workshop had specific objectives to discuss mechanisms to improve areas such as reliability, yield, and performance while reducing failure rates, delivery times, and cost. The findings of the workshop addressed various aspects of the industry from wafers to parts to assemblies. Key problem areas that were singled out for attention are identified, and action items necessary to accomplish their resolution are recommended.

  7. A framework for multi-stakeholder decision-making and ...

    EPA Pesticide Factsheets

    We propose a decision-making framework to compute compromise solutions that balance conflicting priorities of multiple stakeholders on multiple objectives. In our setting, we shape the stakeholder dissatisfaction distribution by solving a conditional-value-at-risk (CVaR) minimization problem. The CVaR problem is parameterized by a probability level that shapes the tail of the dissatisfaction distribution. The proposed approach allows us to compute a family of compromise solutions and generalizes multi-stakeholder settings previously proposed in the literature that minimize average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework that involve complex decision-making processes. We demonstrate the developments using a biowaste facility location case study in which we seek to balance stakeholder priorities on transportation, safety, water quality, and capital costs. This manuscript describes the methodology of a new decision-making framework that computes compromise solutions balancing the conflicting priorities of multiple stakeholders on multiple objectives, as needed for the SHC Decision Science and Support Tools project. A biowaste facility location is employed as the case study.
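
    A minimal sketch of the probability-level dial described above: the CVaR of a dissatisfaction sample at level beta is the mean of the worst (1 - beta) fraction, so beta = 0 recovers the average-dissatisfaction compromise and beta near 1 approaches the worst case. The candidate sites and dissatisfaction values are invented for illustration.

      import numpy as np

      def cvar(dissatisfaction, beta):
          # CVaR of a finite dissatisfaction sample: the mean of the worst
          # (1 - beta) fraction of values.
          d = np.sort(np.asarray(dissatisfaction))
          k = max(1, int(np.ceil((1.0 - beta) * len(d))))
          return d[-k:].mean()

      # Hypothetical dissatisfaction of four stakeholders under three sites.
      candidates = {"site A": [0.2, 0.3, 0.9, 0.1],
                    "site B": [0.4, 0.4, 0.5, 0.4],
                    "site C": [0.1, 0.2, 0.8, 0.6]}

      for beta in (0.0, 0.75):
          best = min(candidates, key=lambda s: cvar(candidates[s], beta))
          print(f"beta = {beta}: compromise choice is {best}")
      # beta = 0 picks the best average (site A); beta = 0.75 shifts the
      # choice to site B, which protects the most dissatisfied stakeholder.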

  8. Decision-making for foot-and-mouth disease control: Objectives matter

    USGS Publications Warehouse

    Probert, William J. M.; Shea, Katriona; Fonnesbeck, Christopher J.; Runge, Michael C.; Carpenter, Tim E.; Durr, Salome; Garner, M. Graeme; Harvey, Neil; Stevenson, Mark A.; Webb, Colleen T.; Werkman, Marleen; Tildesley, Michael J.; Ferrari, Matthew J.

    2016-01-01

    Formal decision-analytic methods can be used to frame disease control problems, the first step of which is to define a clear and specific objective. We demonstrate the imperative of framing clearly-defined management objectives in finding optimal control actions for control of disease outbreaks. We illustrate an analysis that can be applied rapidly at the start of an outbreak when there are multiple stakeholders involved with potentially multiple objectives, and when there are also multiple disease models upon which to compare control actions. The output of our analysis frames subsequent discourse between policy-makers, modellers and other stakeholders, by highlighting areas of discord among different management objectives and also among different models used in the analysis. We illustrate this approach in the context of a hypothetical foot-and-mouth disease (FMD) outbreak in Cumbria, UK using outputs from five rigorously-studied simulation models of FMD spread. We present both relative rankings and relative performance of controls within each model and across a range of objectives. Results illustrate how control actions change across both the base metric used to measure management success and across the statistic used to rank control actions according to said metric. This work represents a first step towards reconciling the extensive modelling work on disease control problems with frameworks for structured decision making.

  9. Why do people appear not to extrapolate trajectories during multiple object tracking? A computational investigation

    PubMed Central

    Zhong, Sheng-hua; Ma, Zheng; Wilson, Colin; Liu, Yan; Flombaum, Jonathan I

    2014-01-01

    Intuitively, extrapolating object trajectories should make visual tracking more accurate. This has proven to be true in many contexts that involve tracking a single item. But surprisingly, when tracking multiple identical items in what is known as “multiple object tracking,” observers often appear to ignore direction of motion, relying instead on basic spatial memory. We investigated potential reasons for this behavior through probabilistic models that were endowed with perceptual limitations in the range of typical human observers, including noisy spatial perception. When we compared a model that weights its extrapolations relative to other sources of information about object position, and one that does not extrapolate at all, we found no reliable difference in performance, belying the intuition that extrapolation always benefits tracking. In follow-up experiments we found this to be true for a variety of models that weight observations and predictions in different ways; in some cases we even observed worse performance for models that use extrapolations compared to a model that does not extrapolate at all. Ultimately, the best-performing models either did not extrapolate, or extrapolated very conservatively, relying heavily on observations. These results illustrate the difficulty and attendant hazards of using noisy inputs to extrapolate the trajectories of multiple objects simultaneously in situations with targets and featurally confusable nontargets. PMID:25311300
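
    The intuition can be probed with a toy tracker, sketched below under strong assumptions: identical objects, greedy nearest-neighbor matching, and a single weight w on a velocity estimated from noisy past matches (w = 0 is the spatial-memory-only model). It is not the paper's Bayesian model, but it shows why extrapolating from noisy inputs can fail to help.

      import numpy as np

      rng = np.random.default_rng(0)

      def tracking_accuracy(w, n=8, steps=50, noise=1.5):
          # Toy multiple-object tracker: unlabeled noisy observations are
          # matched greedily to each track's guess; the guess adds w times a
          # velocity estimated from the two previous noisy matches.
          pos = rng.uniform(0, 100, (n, 2))
          vel = rng.normal(0, 1.0, (n, 2))
          prev = pos + rng.normal(0, noise, (n, 2))
          last = prev.copy()
          correct = 0
          for _ in range(steps):
              pos += vel                               # true motion
              obs = pos + rng.normal(0, noise, (n, 2)) # identity-free samples
              guess = last + w * (last - prev)         # noisy extrapolation
              picks = [int(np.argmin(np.linalg.norm(obs - g, axis=1)))
                       for g in guess]
              correct += sum(p == i for i, p in enumerate(picks))
              prev, last = last, obs[picks]
          return correct / (steps * n)

      for w in (0.0, 0.5, 1.0):
          print(f"w = {w}: correspondence accuracy {tracking_accuracy(w):.3f}")
      # Because the velocity estimate amplifies observation noise, full
      # extrapolation (w = 1) is often no better, and sometimes worse, than
      # relying on the last observed positions alone.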

  10. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras.

    PubMed

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-06-24

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system.

  12. Evaluation of Genetic Algorithm Concepts Using Model Problems. Part 2; Multi-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

    A genetic algorithm approach suitable for solving multi-objective optimization problems is described and evaluated using a series of simple model problems. Several new features, including a binning selection algorithm and a gene-space transformation procedure, are included. The genetic algorithm is suitable for finding Pareto optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all optimization problems attempted. The binning algorithm generally provides Pareto front quality enhancements and moderate convergence efficiency improvements for most of the model problems. The gene-space transformation procedure provides a large convergence efficiency enhancement for problems with non-convoluted Pareto fronts and a degradation in efficiency for problems with convoluted Pareto fronts. The most difficult problems (multi-mode search spaces with a large number of genes and convoluted Pareto fronts) require a large number of function evaluations for GA convergence, but always converge.
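
    For readers unfamiliar with the terminology, the sketch below implements the non-dominated (Pareto) filter at the core of any such multi-objective GA, assuming minimization on all objectives; the binning selection and gene-space transformation themselves are not reproduced here.

      import numpy as np

      def pareto_front(costs):
          # Boolean mask of non-dominated points for minimization: a point is
          # kept unless some other point is at least as good on every
          # objective and strictly better on at least one.
          costs = np.asarray(costs)
          keep = np.ones(len(costs), dtype=bool)
          for i in range(len(costs)):
              if keep[i]:
                  dominated = (np.all(costs >= costs[i], axis=1)
                               & np.any(costs > costs[i], axis=1))
                  keep[dominated] = False
          return keep

      pop = np.array([[1.0, 5.0], [2.0, 3.0], [3.0, 4.0], [4.0, 1.0]])
      print(pop[pareto_front(pop)])   # keeps [1,5], [2,3], [4,1];
                                      # [3,4] is dominated by [2,3]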

  13. Optimizing multiple reliable forward contracts for reservoir allocation using multitime scale streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Lu, Mengqian; Lall, Upmanu; Robertson, Andrew W.; Cook, Edward

    2017-03-01

    Streamflow forecasts at multiple time scales provide a new opportunity for reservoir management to address competing objectives. Market instruments such as forward contracts with specified reliability are considered as a tool that may help address the perceived risk associated with the use of such forecasts in lieu of traditional operation and allocation strategies. A water allocation process that enables multiple contracts for water supply and hydropower production with different durations, while maintaining a prescribed level of flood risk reduction, is presented. The allocation process is supported by an optimization model that considers multitime scale ensemble forecasts of monthly streamflow and flood volume over the upcoming season and year, the desired reliability, and the pricing of proposed contracts for hydropower and water supply. It solves for the size of contracts at each reliability level that can be allocated for each future period, while meeting target end-of-period reservoir storage with a prescribed reliability. The contracts may be insurable, given that their reliability is verified through retrospective modeling. The process can allow reservoir operators to overcome their concerns as to the appropriate skill of probabilistic forecasts, while providing water users with short-term and long-term guarantees as to how much water or energy they may be allocated. An application of the optimization model to the Bhakra Dam, India, provides an illustration of the process. The issues of forecast skill and contract performance are examined. A field engagement of the idea would be useful for developing a real-world perspective and requires a suitable institutional environment.
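
    The reliability-constrained sizing idea can be sketched with ensemble quantiles: a contract honored in at least a fraction r of ensemble members is bounded by the (1 - r) quantile of available water. The storage, reserve, and gamma-distributed inflows below are invented; the actual model additionally couples hydropower, contract pricing, and end-of-period storage targets.

      import numpy as np

      def contract_size(inflow_ensemble, storage, flood_reserve, reliability):
          # Largest commitment that can be met in at least `reliability` of
          # ensemble members after keeping a flood-control reserve empty.
          available = storage + np.asarray(inflow_ensemble) - flood_reserve
          return np.quantile(available, 1.0 - reliability)

      ensemble = np.random.default_rng(1).gamma(shape=4.0, scale=25.0, size=500)
      for r in (0.5, 0.9, 0.99):
          size = contract_size(ensemble, storage=50.0, flood_reserve=30.0,
                               reliability=r)
          print(f"reliability {r:.0%}: contract {size:.1f} units")
      # Higher reliability yields a smaller insurable contract, which is the
      # trade-off the allocation model prices.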

  14. Verification of road databases using multiple road models

    NASA Astrophysics Data System (ADS)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper, a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas, or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions: the first for the state of a database object (correct or incorrect), and the second for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect, and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
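
    The fusion step can be sketched with Dempster's rule on the frame {correct, incorrect}, where mass on the full frame plays the role of the "unknown" class arising from an inapplicable road model. The module outputs below are hypothetical masses, not values from the paper.

      def combine(m1, m2):
          # Dempster's rule of combination on the frame {correct, incorrect}.
          # Each argument is a dict with masses on 'C' (correct), 'I'
          # (incorrect), and 'U' (the full frame, i.e. unknown), summing to 1.
          conflict = m1['C'] * m2['I'] + m1['I'] * m2['C']
          k = 1.0 - conflict                       # normalization constant
          return {
              'C': (m1['C'] * m2['C'] + m1['C'] * m2['U'] + m1['U'] * m2['C']) / k,
              'I': (m1['I'] * m2['I'] + m1['I'] * m2['U'] + m1['U'] * m2['I']) / k,
              'U': (m1['U'] * m2['U']) / k,
          }

      # One module finds the road object plausible; the other's road model
      # barely applies, so most of its mass sits on 'unknown'.
      module_a = {'C': 0.7, 'I': 0.1, 'U': 0.2}
      module_b = {'C': 0.2, 'I': 0.1, 'U': 0.7}
      print(combine(module_a, module_b))   # mass on 'U' shrinks as evidence accrues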

  15. Multiple task performance as a predictor of the potential of air traffic controller trainees.

    DOT National Transportation Integrated Search

    1972-01-01

    Two hundred and twenty-nine air traffic controller trainees were tested on the CAMI Multiple Task Performance Battery. The battery provides objective measures of monitoring, arithmetical skills, visual discrimination, and group problem solving. The c...

  16. Dynamic electrical impedance imaging with the interacting multiple model scheme.

    PubMed

    Kim, Kyung Youn; Kim, Bong Seok; Kim, Min Chan; Kim, Sin; Isaacson, David; Newell, Jonathan C

    2005-04-01

    In this paper, an effective dynamic EIT imaging scheme is presented for on-line monitoring of an abruptly changing resistivity distribution inside the object, based on the interacting multiple model (IMM) algorithm. The inverse problem is treated as a stochastic nonlinear state-estimation problem, with the time-varying resistivity (state) estimated on-line with the aid of the IMM algorithm. In the design of the IMM algorithm, multiple models with different process-noise covariances are incorporated to reduce the modeling uncertainty. Simulations and phantom experiments are provided to illustrate the proposed algorithm.
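
    The sketch below shows the IMM mechanics in the simplest possible setting: a scalar random-walk state with two models that differ only in process-noise variance (slow drift versus abrupt change). The transition matrix, noise levels, and one-dimensional simplification are assumptions; the paper's filter estimates a full resistivity distribution.

      import numpy as np

      PI = np.array([[0.95, 0.05], [0.05, 0.95]])    # model transition matrix

      def imm_step(x, P, mu, z, Qs, R=0.5):
          # One scalar IMM step: x, P, mu are per-model means, variances, and
          # model probabilities; z is the new measurement; Qs are the two
          # process-noise variances (small = slow drift, large = abrupt change).
          c = PI.T @ mu                              # predicted model probs
          w = (PI * mu[:, None]) / c[None, :]        # mixing weights w[i, j]
          x0 = w.T @ x                               # mixed initial means
          P0 = np.array([np.sum(w[:, j] * (P + (x - x0[j]) ** 2))
                         for j in range(2)])         # mixed initial variances
          lik = np.empty(2)
          for j, Q in enumerate(Qs):                 # per-model Kalman update
              Pp = P0[j] + Q                         # predict (F = 1)
              S = Pp + R                             # innovation variance
              K = Pp / S                             # Kalman gain
              x[j] = x0[j] + K * (z - x0[j])
              P[j] = (1 - K) * Pp
              lik[j] = np.exp(-0.5 * (z - x0[j]) ** 2 / S) / np.sqrt(2 * np.pi * S)
          mu = c * lik
          mu = mu / mu.sum()                         # model-probability update
          return x, P, mu, float(mu @ x)             # fused estimate

      x, P, mu = np.zeros(2), np.ones(2), np.array([0.5, 0.5])
      for z in [0.1, 0.0, 0.2, 3.0, 3.1]:            # abrupt jump at sample 4
          x, P, mu, fused = imm_step(x, P, mu, z, Qs=(0.01, 1.0))
          print(f"z = {z:+.1f}  fused = {fused:+.2f}  P(abrupt) = {mu[1]:.2f}")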

  17. Application of fuzzy theories to formulation of multi-objective design problems. [for helicopters

    NASA Technical Reports Server (NTRS)

    Dhingra, A. K.; Rao, S. S.; Miura, H.

    1988-01-01

    Much of the decision making in the real world takes place in an environment in which the goals, the constraints, and the consequences of possible actions are not known precisely. In order to deal with imprecision quantitatively, the tools of fuzzy set theory can be used. This paper demonstrates the effectiveness of fuzzy theories in the formulation and solution of two types of helicopter design problems involving multiple objectives. The first problem deals with the determination of optimal flight parameters to accomplish a specified mission in the presence of three competing objectives. The second problem addresses the optimal design of the main rotor of a helicopter involving eight objective functions. A method of solving these multi-objective problems using nonlinear programming techniques is presented. Results obtained using the fuzzy formulation are compared with those obtained using crisp optimization techniques. The outlined procedures are expected to be useful in situations where doubt arises about the exactness of permissible values, degree of credibility, and correctness of statements and judgements.
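
    One standard fuzzy formulation of such problems is the max-min approach: each objective receives a membership function expressing its degree of satisfaction, and the design maximizes the smallest membership. The sketch below uses a hypothetical one-variable, two-objective trade-off, not the paper's helicopter models.

      import numpy as np

      f1 = lambda x: 10.0 - 6.0 * x      # objective 1 improves with x
      f2 = lambda x: 2.0 + 5.0 * x       # objective 2 worsens with x

      def membership(f, worst, best):
          # Linear fuzzy membership: 1 at the aspiration level, 0 at the
          # worst acceptable value, in between otherwise.
          return np.clip((worst - f) / (worst - best), 0.0, 1.0)

      xs = np.linspace(0.0, 1.0, 1001)
      satisfaction = np.minimum(membership(f1(xs), worst=10.0, best=4.0),
                                membership(f2(xs), worst=7.0, best=2.0))
      i = int(np.argmax(satisfaction))   # max-min compromise design
      print(f"compromise x = {xs[i]:.3f}, satisfaction = {satisfaction[i]:.3f}")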

  18. A pricing approach for mitigating congestion in multimodal transportation systems.

    DOT National Transportation Integrated Search

    2010-02-19

    The problem addressed in this research is to determine usage prices for a system with multiple modes of transportation with the objective of reducing congestion. With multiple modes, these prices can take on several forms. On road networks, the u...

  19. Recent advances in applying decision science to managing national forests

    Treesearch

    Bruce G. Marcot; Matthew P. Thompson; Michael C. Runge; Frank R. Thompson; Steven McNulty; David Cleaves; Monica Tomosy; Larry A. Fisher; Andrew Bliss

    2012-01-01

    Management of federal public forests to meet sustainability goals and multiple use regulations is an immense challenge. To succeed, we suggest use of formal decision science procedures and tools in the context of structured decision making (SDM). SDM entails four stages: problem structuring (framing the problem and defining objectives and evaluation criteria), problem...

  20. Computer aided reliability, availability, and safety modeling for fault-tolerant computer systems with commentary on the HARP program

    NASA Technical Reports Server (NTRS)

    Shooman, Martin L.

    1991-01-01

    Many of the most challenging reliability problems of our present decade involve complex distributed systems such as interconnected telephone switching computers, air traffic control centers, aircraft and space vehicles, and local area and wide area computer networks. In addition to the challenge of complexity, modern fault-tolerant computer systems require very high levels of reliability, e.g., avionic computers with MTTF goals of one billion hours. Most analysts find that it is too difficult to model such complex systems without computer-aided design programs. In response to this need, NASA has developed a suite of computer-aided reliability modeling programs beginning with CARE 3 and including a group of new programs such as HARP, HARP-PC, the Reliability Analysts Workbench (a combination of the model solvers SURE, STEM, and PAWS with the common front-end model ASSIST), and the Fault Tree Compiler. This report studies the HARP program and investigates how well the user can model systems with it. One of the important objectives was to study how user-friendly this program is, e.g., how easy it is to model the system, provide the input information, and interpret the results. The experiences of the author and his graduate students who used HARP in two graduate courses are described. Some brief comparisons were made with the ARIES program, which the students also used. Theoretical studies of the modeling techniques used in HARP are also included. Of course, no answer can be more accurate than the fidelity of the model; thus, an appendix is included which discusses modeling accuracy. A broad viewpoint is taken, and all problems which occurred in the use of HARP are discussed. Such problems include computer system problems, installation manual problems, user manual problems, program inconsistencies, program limitations, confusing notation, long run times, accuracy problems, etc.

  1. Pursuing the Qualities of a "Good" Test

    ERIC Educational Resources Information Center

    Coniam, David

    2014-01-01

    This article examines the issue of the quality of teacher-produced tests, limiting itself in the current context to objective, multiple-choice tests. The article investigates a short, two-part 20-item English language test. After a brief overview of the key test qualities of reliability and validity, the article examines the two subtests in terms…

  2. View-Invariant Object Category Learning, Recognition, and Search: How Spatial and Object Attention are Coordinated Using Surface-Based Attentional Shrouds

    ERIC Educational Resources Information Center

    Fazl, Arash; Grossberg, Stephen; Mingolla, Ennio

    2009-01-01

    How does the brain learn to recognize an object from multiple viewpoints while scanning a scene with eye movements? How does the brain avoid the problem of erroneously classifying parts of different objects together? How are attention and eye movements intelligently coordinated to facilitate object learning? A neural model provides a unified…

  3. Government/Industry Workshop on Payload Loads Technology

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The discussion concerns a fully operational space shuttle, which will offer science the opportunity to explore near-Earth orbit and, eventually, interplanetary space on a nearly limitless basis. This multiplicity of payload/experiment combinations and frequency of launches places many burdens on dynamicists to predict launch and landing environments accurately and efficiently. Two major problems are apparent in the attempt to design for the diverse environments: (1) balancing the design criteria (loads, etc.) between launch and orbit operations, and (2) developing analytical techniques that are reliable, accurate, efficient, and low cost to meet the challenge of multiple launches and payloads. This paper deals with the key issues inherent in these problems, the key trades required, the basic approaches needed, and a summary of the state-of-the-art techniques.

  4. Distributed computation: the new wave of synthetic biology devices.

    PubMed

    Macía, Javier; Posas, Francesc; Solé, Ricard V

    2012-06-01

    Synthetic biology (SB) offers a unique opportunity for designing complex molecular circuits able to perform predefined functions. But the goal of achieving a flexible toolbox of reusable molecular components has been shown to be limited due to circuit unpredictability, incompatible parts or random fluctuations. Many of these problems arise from the challenges posed by engineering the molecular circuitry: multiple wires are usually difficult to implement reliably within one cell and the resulting systems cannot be reused in other modules. These problems are solved by means of a nonstandard approach to single cell devices, using cell consortia and allowing the output signal to be distributed among different cell types, which can be combined in multiple, reusable and scalable ways. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. An Analysis of the Multiple Objective Capital Budgeting Problem via Fuzzy Linear Integer (0-1) Programming.

    DTIC Science & Technology

    1980-05-31

  6. An experimental investigation of fault tolerant software structures in an avionics application

    NASA Technical Reports Server (NTRS)

    Caglayan, Alper K.; Eckhardt, Dave E., Jr.

    1989-01-01

    The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.
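
    The two structures compared in the experiment can be sketched generically. The three detectors below are hypothetical stand-ins for the diverse failure detection and isolation algorithms (each flags the sensor most discrepant from a different consensus estimate); only the voting and acceptance-check skeletons are the point.

      import statistics

      def n_version(inputs, versions):
          # N-version structure: run independently developed versions and
          # take a majority vote on the verdict.
          votes = [v(inputs) for v in versions]
          return max(set(votes), key=votes.count)

      def recovery_block(inputs, versions, acceptance_test):
          # Recovery-block structure: try alternates in order and return the
          # first result that passes the acceptance check.
          for v in versions:
              result = v(inputs)
              if acceptance_test(inputs, result):
                  return result
          return None                     # all alternates rejected

      # Hypothetical diverse detectors for a redundant sensor array: flag the
      # sensor with the largest residual against a consensus estimate.
      detect_vs_mean   = lambda s: max(range(len(s)),
                                       key=lambda i: abs(s[i] - statistics.mean(s)))
      detect_vs_median = lambda s: max(range(len(s)),
                                       key=lambda i: abs(s[i] - statistics.median(s)))
      detect_vs_trim   = lambda s: max(range(len(s)),
                                       key=lambda i: abs(s[i] - sorted(s)[1]))

      readings = [1.0, 1.1, 5.7, 0.9]                    # sensor 2 has failed
      versions = [detect_vs_mean, detect_vs_median, detect_vs_trim]
      print(n_version(readings, versions))               # majority verdict: 2
      accept = lambda s, i: abs(s[i] - statistics.median(s)) > 1.0
      print(recovery_block(readings, versions, accept))  # first accepted: 2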

  7. Test-Retest Reliability of the Multiple Sleep Latency Test in Narcolepsy without Cataplexy and Idiopathic Hypersomnia

    PubMed Central

    Trotti, Lynn Marie; Staab, Beth A.; Rye, David B.

    2013-01-01

    Study Objectives: Differentiation of narcolepsy without cataplexy from idiopathic hypersomnia relies entirely upon the multiple sleep latency test (MSLT). However, the test-retest reliability for these central nervous system hypersomnias has never been determined. Methods: Patients with narcolepsy without cataplexy, idiopathic hypersomnia, and physiologic hypersomnia who underwent two diagnostic multiple sleep latency tests were identified retrospectively. Correlations between the mean sleep latencies on the two studies were evaluated, and we probed for demographic and clinical features associated with reproducibility versus change in diagnosis. Results: Thirty-six patients (58% women, mean age 34 years) were included. Inter-test interval was 4.2 ± 3.8 years (range 2.5 months to 16.9 years). Mean sleep latencies on the first and second tests were 5.5 (± 3.7 SD) and 7.3 (± 3.9) minutes, respectively, with no significant correlation (r = 0.17, p = 0.31). A change in diagnosis occurred in 53% of patients, and was accounted for by a difference in the mean sleep latency (N = 15, 42%) or the number of sleep onset REM periods (N = 11, 31%). The only feature predictive of a diagnosis change was a history of hypnagogic or hypnopompic hallucinations. Conclusions: The multiple sleep latency test demonstrates poor test-retest reliability in a clinical population of patients with central nervous system hypersomnia evaluated in a tertiary referral center. Alternative diagnostic tools are needed. Citation: Trotti LM; Staab BA; Rye DB. Test-retest reliability of the multiple sleep latency test in narcolepsy without cataplexy and idiopathic hypersomnia. J Clin Sleep Med 2013;9(8):789-795. PMID:23946709

  8. Test of Understanding of Vectors: A Reliable Multiple-Choice Vector Concept Test

    ERIC Educational Resources Information Center

    Barniol, Pablo; Zavala, Genaro

    2014-01-01

    In this article we discuss the findings of our research on students' understanding of vector concepts in problems without physical context. First, we develop a complete taxonomy of the most frequent errors made by university students when learning vector concepts. This study is based on the results of several test administrations of open-ended…

  9. Generalizability and Dependability of a Multi-Item Direct Behavior Rating Scale in a Kindergarten Classroom Setting

    ERIC Educational Resources Information Center

    Wickerd, Garry; Hulac, David

    2017-01-01

    Accurate and rapid identification of students displaying behavioral problems requires instrumentation that is user friendly and reliable. The purpose of the study was to evaluate a multi-item direct behavior rating scale called the Direct Behavior Rating-Multiple Item Scale (DBR-MIS) for disruptive behavior to determine the number of…

  10. Validation of the Chinese Handwriting Analysis System (CHAS) for primary school students in Hong Kong.

    PubMed

    Li-Tsang, Cecilia W P; Wong, Agnes S K; Leung, Howard W H; Cheng, Joyce S; Chiu, Billy H W; Tse, Linda F L; Chung, Raymond C K

    2013-09-01

    More children have been diagnosed with specific learning difficulties in recent years as awareness of these conditions has grown. A diagnostic tool has been validated to screen for this condition in the population (the SpLD test for Hong Kong children). However, for the specific assessment of handwriting problems, there is a lack of standardized and objective evaluation tools. The objective of this study was to validate the Chinese Handwriting Analysis System (CHAS), which is designed to measure both the process and the product of handwriting. The construct validity, convergent validity, internal consistency, and test-retest reliability of CHAS were analyzed using data from 734 grade 1-6 students from 6 primary schools in Hong Kong. Principal Component Analysis revealed that the measurements of CHAS loaded onto 4 components, which accounted for 77.73% of the variance. The correlation between the handwriting accuracy obtained from CHAS and visual inspection was r=.73. Cronbach's alpha of all measurement items was .65. Except for the SD of writing time per character, all measurement items regarding handwriting speed, handwriting accuracy, and pen pressure showed good to excellent test-retest reliability (r=.72-.96), while the measurement of the number of characters exceeding the grid showed moderate reliability (r=.48). Although there are still ergonomic, biomechanical, or unspecified aspects that may not be captured by the system, the CHAS can assist therapists in identifying primary school students with handwriting problems and implementing interventions accordingly. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Quality of data in multiethnic health surveys.

    PubMed Central

    Pasick, R. J.; Stewart, S. L.; Bird, J. A.; D'Onofrio, C. N.

    2001-01-01

    OBJECTIVE: There has been insufficient research on the influence of ethno-cultural and language differences in public health surveys. Using data from three independent studies, the authors examine methods to assess data quality and to identify causes of problematic survey questions. METHODS: Qualitative and quantitative methods were used in this exploratory study, including secondary analyses of data from three baseline surveys (conducted in English, Spanish, Cantonese, Mandarin, and Vietnamese). Collection of additional data included interviews with investigators and interviewers; observations of item development; focus groups; think-aloud interviews; a test-retest assessment survey; and a pilot test of alternatively worded questions. RESULTS: The authors identify underlying causes for the 12 most problematic variables in three multiethnic surveys and describe them in terms of ethnic differences in reliability, validity, and cognitive processes (interpretation, memory retrieval, judgment formation, and response editing), and differences with regard to cultural appropriateness and translation problems. CONCLUSIONS: Multiple complex elements affect measurement in a multiethnic survey, many of which are neither readily observed nor understood through standard tests of data quality. Multiethnic survey questions are best evaluated using a variety of quantitative and qualitative methods that reveal different types and causes of problems. PMID:11889288

  12. Determinants of Psychosocial Difficulties Experienced by Persons with Brain Disorders: Towards a ‘Horizontal Epidemiology’ Approach

    PubMed Central

    Sabariego, Carla; Coenen, Michaela; Ballert, Carolina; Cabello, Maria; Leonardi, Matilde; Anczewska, Marta; Pitkänen, Tuuli; Raggi, Alberto; Mellor, Blanca; Covelli, Venusia; Świtaj, Piotr; Levola, Jonna; Schiavolin, Silvia; Chrostek, Anna; Bickenbach, Jerome; Chatterji, Somnath; Cieza, Alarcos

    2015-01-01

    Background Persons with brain disorders experience significant psychosocial difficulties (PSD) in daily life, e.g. problems with managing daily routine or emotional lability, and the level of the PSD depends on social, physical and political environments, and psychologic-personal determinants. Our objective is to determine a brief set of environmental and psychologic-personal factors that are shared determinants of PSD among persons with different brain disorders. Methods Cross-sectional study, convenience sample of persons with either dementia, stroke, multiple sclerosis, epilepsy, migraine, depression, schizophrenia, substance dependence or Parkinson’s disease. Random forest regression and classical linear regression were used in the analyses. Results 722 subjects were interviewed in four European countries. The brief set of determinants encompasses presence of comorbidities, health status appraisal, stressful life events, personality changes, adaptation, self-esteem, self-worth, built environment, weather, and health problems in the family. Conclusions The identified brief set of common determinants of PSD can be used to support the implementation of cross-cutting interventions, social actions and policy tools to lower PSD experienced by persons with brain disorders. This set complements a recently proposed reliable and valid direct metric of PSD for brain disorders called PARADISE24. PMID:26675663

  13. Resolving future fire management conflicts using multicriteria decision making.

    PubMed

    Driscoll, Don A; Bode, Michael; Bradstock, Ross A; Keith, David A; Penman, Trent D; Price, Owen F

    2016-02-01

    Management strategies to reduce the risks to human life and property from wildfire commonly involve burning native vegetation. However, planned burning can conflict with other societal objectives such as human health and biodiversity conservation. These conflicts are likely to intensify as fire regimes change under future climates and as growing human populations encroach farther into fire-prone ecosystems. Decisions about managing fire risks are therefore complex and warrant more sophisticated approaches than are typically used. We applied a multicriteria decision making approach (MCDA) with the potential to improve fire management outcomes to the case of a highly populated, biodiverse, and flammable wildland-urban interface. We considered the effects of 22 planned burning options on 8 objectives: house protection, maximizing water quality, minimizing carbon emissions and impacts on human health, and minimizing declines of 5 distinct species types. The MCDA identified a small number of management options (burning forest adjacent to houses) that performed well for most objectives, but not for one species type (arboreal mammal) or for water quality. Although MCDA made the conflict between objectives explicit, resolution of the problem depended on the weighting assigned to each objective. Additive weighting of criteria traded off the arboreal mammal and water quality objectives for other objectives. Multiplicative weighting identified scenarios that avoided poor outcomes for any objective, which is important for avoiding potentially irreversible biodiversity losses. To distinguish reliably among management options, future work should focus on reducing uncertainty in outcomes across a range of objectives. Considering management actions that have more predictable outcomes than landscape fuel management will be important. We found that, where data were adequate, an MCDA can support decision making in the complex and often conflicted area of fire management. © 2015 Society for Conservation Biology.
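
    The difference between the two weighting schemes is easy to see in miniature. The option names, scores, and equal weights below are invented (scores normalized so that 1 is best): additive aggregation tolerates a near-zero objective, while multiplicative aggregation punishes it, which is how it avoids potentially irreversible losses.

      import numpy as np

      # Hypothetical normalized performance of three burning options on four
      # objectives; the last option scores near zero for one species type.
      options = {"no burn":          [0.9, 0.3, 0.9, 0.4],
                 "burn near houses": [0.3, 0.9, 0.6, 0.7],
                 "landscape burn":   [0.5, 0.8, 0.05, 0.9]}
      weights = np.array([0.25, 0.25, 0.25, 0.25])

      for name, perf in options.items():
          p = np.array(perf)
          additive = float(weights @ p)                  # weighted average
          multiplicative = float(np.prod(p ** weights))  # weighted geometric mean
          print(f"{name:18s} additive = {additive:.2f}  "
                f"multiplicative = {multiplicative:.2f}")
      # Additive weighting rates "landscape burn" as mid-pack despite the 0.05
      # score; multiplicative weighting demotes it sharply.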

  14. Utilization of wireless structural health monitoring as decision making tools for a condition and reliability-based assessment of railroad bridges

    NASA Astrophysics Data System (ADS)

    Flanigan, Katherine A.; Johnson, Nephi R.; Hou, Rui; Ettouney, Mohammed; Lynch, Jerome P.

    2017-04-01

    The ability to quantitatively assess the condition of railroad bridges facilitates objective evaluation of their robustness in the face of hazard events. Of particular importance is the need to assess the condition of railroad bridges in networks that are exposed to multiple hazards. Data collected from structural health monitoring (SHM) can be used to better maintain a structure by prompting preventative (rather than reactive) maintenance strategies and supplying quantitative information to aid in recovery. To that end, a wireless monitoring system is validated and installed on the Harahan Bridge which is a hundred-year-old long-span railroad truss bridge that crosses the Mississippi River near Memphis, TN. This bridge is exposed to multiple hazards including scour, vehicle/barge impact, seismic activity, and aging. The instrumented sensing system targets non-redundant structural components and areas of the truss and floor system that bridge managers are most concerned about based on previous inspections and structural analysis. This paper details the monitoring system and the analytical method for the assessment of bridge condition based on automated data-driven analyses. Two primary objectives of monitoring the system performance are discussed: 1) monitoring fatigue accumulation in critical tensile truss elements; and 2) monitoring the reliability index values associated with sub-system limit states of these members. Moreover, since the reliability index is a scalar indicator of the safety of components, quantifiable condition assessment can be used as an objective metric so that bridge owners can make informed damage mitigation strategies and optimize resource management on single bridge or network levels.
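
    As background for the second objective, the sketch below computes a textbook first-order reliability index for a component limit state g = R - S with independent normal capacity and demand; beta is the mean safety margin in standard deviations. The normality assumption and the numbers are illustrative, not the project's calibrated models.

      import math

      def reliability_index(mu_R, sigma_R, mu_S, sigma_S):
          # First-order reliability index for g = R - S with independent
          # normal capacity R and demand S: beta = E[g] / std[g]. In the
          # monitored setting, the demand-side statistics would be estimated
          # from measured strain data.
          return (mu_R - mu_S) / math.sqrt(sigma_R ** 2 + sigma_S ** 2)

      # Hypothetical fatigue limit state for one tensile member (stress, MPa).
      beta = reliability_index(mu_R=180.0, sigma_R=15.0, mu_S=110.0, sigma_S=20.0)
      print(f"beta = {beta:.2f}")   # larger beta means a safer component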

  15. Multiple task performance as a predictor of the potential of air traffic controller trainees : a followup study.

    DOT National Transportation Integrated Search

    1974-11-01

    Two hundred and twenty-nine air traffic controller trainees were tested on the CAMI Multiple Task Performance Battery. The battery provides objective measures of monitoring, arithmetical skills, visual discrimination, and group problem solving. The c...

  16. Overview of the SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division Activities and Technical Projects

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for which problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data, and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true: there are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - a probabilistic reliability literature search has been completed, along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - correct reliability computations at both the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - a draft of volume 1, comprising aerospace applications, has been released. Volume 2, a compilation of real-world applications of probabilistic methods with essential information demonstrating application type and time and cost savings by the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - too often, we say, 'The proof is in the pudding'. With help from many contributors, we hope to produce such a document. The problem is that not many people are coming forward, due to the proprietary nature of the material; so we are asking contributors to document only minimum information, including the problem description, what method was used, whether it resulted in any savings, and how much; (5) Software Reliability - software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood and addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance and life.

  17. A Microsoft Kinect-Based Point-of-Care Gait Assessment Framework for Multiple Sclerosis Patients.

    PubMed

    Gholami, Farnood; Trojan, Daria A; Kovecses, Jozsef; Haddad, Wassim M; Gholami, Behnood

    2017-09-01

    Gait impairment is a prevalent and important difficulty for patients with multiple sclerosis (MS), a common neurological disorder. An easy-to-use tool that objectively evaluates gait in MS patients in a clinical setting can assist clinicians in performing an objective assessment. The overall objective of this study is to develop a framework to quantify gait abnormalities in MS patients using the Microsoft Kinect for Windows sensor: an inexpensive, easy-to-use, portable camera. Specifically, we aim to evaluate its feasibility for utilization in a clinical setting, assess its reliability, evaluate the validity of the gait indices obtained, and evaluate a novel set of gait indices based on the concept of dynamic time warping. In this study, ten ambulatory MS patients and ten age- and sex-matched normal controls were studied at one session in a clinical setting with gait assessment using a Kinect camera. The expanded disability status scale (EDSS) clinical ambulation score was calculated for the MS subjects, and patients completed the Multiple Sclerosis walking scale (MSWS). Based on this study, we established the potential feasibility of using a Microsoft Kinect camera in a clinical setting. Seven out of the eight gait indices obtained using the proposed method were reliable, with intraclass correlation coefficients ranging from 0.61 to 0.99. All eight MS gait indices were significantly different from those of the controls (p-values less than 0.05). Finally, seven out of the eight MS gait indices were correlated with the objective and subjective gait measures (Pearson's correlation coefficients greater than 0.40). This study shows that the Kinect camera is an easy-to-use tool for assessing gait in MS patients in a clinical setting.
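
    Dynamic time warping, the concept behind the proposed gait indices, aligns two sequences that differ in speed or phase before measuring their distance. The sketch below is the classic textbook recurrence applied to hypothetical knee-angle curves; the paper's actual indices and joint signals are not reproduced.

      import numpy as np

      def dtw_distance(a, b):
          # Classic dynamic-time-warping distance between two 1-D sequences:
          # D[i, j] is the cheapest cumulative cost of aligning a[:i] with b[:j].
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      # Hypothetical knee-angle trajectories over one gait cycle: the patient's
      # cycle resembles the control's but is time-warped, which a plain
      # point-wise distance would overstate.
      t = np.linspace(0.0, 1.0, 60)
      control = np.sin(2 * np.pi * t)
      patient = np.sin(2 * np.pi * t ** 1.3)
      print(f"DTW index: {dtw_distance(control, patient):.2f}")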

  18. Characteristics of objective daytime sleep among individuals with earthquake-related posttraumatic stress disorder: A pilot community-based polysomnographic and multiple sleep latency test study.

    PubMed

    Zhang, Yan; Li, Yun; Zhu, Hongru; Cui, Haofei; Qiu, Changjian; Tang, Xiangdong; Zhang, Wei

    2017-01-01

    Little is known about the objective sleep characteristics of patients with posttraumatic stress disorder (PTSD). The present study examines the association between PTSD symptom severity and objective daytime sleep characteristics measured using the Multiple Sleep Latency Test (MSLT) in therapy-naïve patients with earthquake-related PTSD. A total of 23 PTSD patients and 13 trauma-exposed non-PTSD (TEN-PTSD) subjects completed one-night in-lab polysomnography (PSG) followed by a standard MSLT. Eight of the 23 PTSD patients received paroxetine treatment. Compared to the TEN-PTSD subjects, no significant nighttime sleep disturbances were detected by PSG in the subjects with PTSD; however, a shorter mean MSLT value was found in the subjects with PTSD. After adjustment for age, sex, and body mass index, PTSD symptoms, particularly hyperarousal, were found to be independently associated with a shorter MSLT value. Further, the mean MSLT value increased significantly after therapy in PTSD subjects. A shorter MSLT value may be a reliable index of the medical severity of PTSD, while an improvement in MSLT values might also be a reliable marker for evaluating therapeutic efficacy in PTSD patients. Copyright © 2016. Published by Elsevier Ireland Ltd.

  19. Multi-objective Decision Based Available Transfer Capability in Deregulated Power System Using Heuristic Approaches

    NASA Astrophysics Data System (ADS)

    Pasam, Gopi Krishna; Manohar, T. Gowri

    2016-09-01

    Determination of available transfer capability (ATC) requires the use of experience, intuition, and exact judgment in order to meet several significant aspects of the deregulated environment. Based on these points, this paper proposes two heuristic approaches to compute ATC. The first proposed heuristic algorithm integrates five methods, namely continuation repeated power flow, repeated optimal power flow, radial basis function neural network, back propagation neural network, and adaptive neuro-fuzzy inference system, to obtain ATC. The second proposed heuristic model is used to obtain multiple ATC values. Out of these, a specific ATC value is selected based on a number of social, economic, and deregulated environmental constraints and in relation to specific applications like optimization, on-line monitoring, and ATC forecasting; this is known as multi-objective decision based optimal ATC. The validity of the results obtained through these proposed methods is scrupulously verified on various buses of the IEEE 24-bus reliable test system. The results and conclusions presented in this paper are very useful for the planning, operation, and maintenance of reliable power in any power system, and for its on-line monitoring in a deregulated power system. In this way, the proposed heuristic methods contribute the best possible approach to assessing multi-objective ATC using integrated methods.

  20. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.
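
    A toy version of the reliability-based objective is sketched below: the probability of violating the requirements is estimated by Monte Carlo sampling of uncertain parameters and minimized over a feedback gain. The first-order plant, requirement thresholds, and distributions are all invented, and the paper's hybrid sampling/asymptotic estimator is replaced by plain sampling.

      import numpy as np

      rng = np.random.default_rng(0)

      def violation_probability(gain, n=4000):
          # Monte Carlo estimate of the probability of violating the design
          # requirements for a toy plant dx/dt = -a x + u under u = -gain * x.
          a = rng.normal(1.0, 0.3, n)             # uncertain plant pole
          u_max = rng.normal(3.5, 0.5, n)         # uncertain actuator limit
          pole = a + gain                         # closed-loop pole magnitude
          settle = 4.0 / np.maximum(pole, 1e-9)   # ~4 time constants to settle
          violated = (pole <= 0) | (settle > 2.0) | (gain > u_max)
          return violated.mean()

      # Crude search over the feedback gain for the most reliable design: low
      # gains settle too slowly, high gains risk actuator saturation.
      gains = np.linspace(0.0, 5.0, 51)
      probs = [violation_probability(g) for g in gains]
      best = gains[int(np.argmin(probs))]
      print(f"most reliable gain ≈ {best:.1f}, P(violation) ≈ {min(probs):.3f}")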

  1. On the interrelation of multiplication and division in secondary school children.

    PubMed

    Huber, Stefan; Fischer, Ursula; Moeller, Korbinian; Nuerk, Hans-Christoph

    2013-01-01

    Each division problem can be transformed into a multiplication problem and vice versa. Recent research has indicated strong developmental parallels between multiplication and division in primary school children. In this study, we were interested in (i) whether these developmental parallels persist into secondary school, (ii) whether similar developmental parallels can be observed for simple and complex problems, (iii) whether skill level modulates this relationship, and (iv) whether the correlations are specific and not driven by general cognitive or arithmetic abilities. Therefore, we assessed performance of 5th and 6th graders attending two secondary school types of the German educational system in simple and complex multiplication as well as division while controlling for non-verbal intelligence, short-term memory, and other arithmetic abilities. Accordingly, we collected data from students differing in skill levels due to either age (5th < 6th grade) or school type (general < intermediate secondary school). We observed moderate to strong bivariate and partial correlations between multiplication and division, with correlations being higher for simple tasks but nevertheless reliable for complex tasks. Moreover, the association between simple multiplication and division depended on students' skill levels as reflected by school types, but not by age. Partial correlations were higher for intermediate than for general secondary school children. In sum, these findings emphasize the importance of the inverse relationship between multiplication and division, which persists into later developmental stages. However, evidence for skill-related differences in the relationship between multiplication and division was restricted to the differences for school types.

  2. A vacuum microgripping tool with integrated vibration releasing capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rong, Weibin; Fan, Zenghua, E-mail: zenghua-fan@163.com; Wang, Lefeng

    2014-08-01

    Pick-and-place of micro-objects is a basic task in various micromanipulation applications. Reliable release of micro-objects is usually disturbed by strong scale effects. This paper focuses on a vacuum microgripper with vibration-releasing functionality, which was designed and assembled for reliable micromanipulation tasks. Accordingly, a vibration releasing strategy implementing a piezoelectric actuator on the vacuum microgripping tool is presented to address the releasing problem. The releasing mechanism was illustrated using a dynamic micro-contact model. This model was developed via theoretical analysis, simulations, and pull-off force measurement using atomic force microscopy. Micromanipulation experiments were conducted to verify the performance of the vacuum microgripper. The results show that, with the assistance of the vibration releasing, the vacuum microgripping tool can achieve reliable release of micro-objects. A releasing location accuracy of 4.5±0.5 μm and a successful releasing rate of around 100% (based on 110 trials) were achieved for manipulating polystyrene microspheres with radii of 35–100 μm.

  3. System Design under Uncertainty: Evolutionary Optimization of the Gravity Probe-B Spacecraft

    NASA Technical Reports Server (NTRS)

    Pullen, Samuel P.; Parkinson, Bradford W.

    1994-01-01

    This paper discusses the application of evolutionary random-search algorithms (Simulated Annealing and Genetic Algorithms) to the problem of spacecraft design under performance uncertainty. Traditionally, spacecraft performance uncertainty has been measured by reliability. Published algorithms for reliability optimization are seldom used in practice because they oversimplify reality. The algorithm developed here uses random-search optimization to allow us to model the problem more realistically. Monte Carlo simulations are used to evaluate the objective function for each trial design solution. These methods have been applied to the Gravity Probe-B (GP-B) spacecraft being developed at Stanford University for launch in 1999. Results of the algorithm developed here for GP-B are shown, and their implications for design optimization by evolutionary algorithms are discussed.

  4. Reach on sound: a key to object permanence in visually impaired children.

    PubMed

    Fazzi, Elisa; Signorini, Sabrina Giovanna; Bomba, Monica; Luparia, Antonella; Lanners, Josée; Balottin, Umberto

    2011-04-01

    The capacity to reach an object presented through a sound cue indicates, in the blind child, the acquisition of object permanence and gives information about his or her cognitive development. To assess cognitive development in congenitally blind children with or without multiple disabilities. Cohort study. Thirty-seven congenitally blind subjects (17 with associated multiple disabilities, 20 mainly blind) were enrolled. We used Bigelow's protocol to evaluate "reach on sound" capacity over time (at 6, 12, 18, 24, and 36 months), and a battery of clinical, neurophysiological, and cognitive instruments to assess clinical features. Tasks 1 to 5 were acquired by most of the mainly blind children by 12 months of age. Task 6 coincided with a drop in performance, and the acquisition of the subsequent tasks showed a less age-homogeneous pattern. In blind children with multiple disabilities, task acquisition rates were lower, with the curves dipping in relation to the more complex tasks. The mainly blind subjects managed to overcome Fraiberg's "conceptual problem"--i.e., they acquired the ability to attribute to an external object an identity and substance even when it manifested its presence through sound only--and thus developed the ability to reach an object presented through sound. Instead, most of the blind children with multiple disabilities performed poorly on the "reach on sound" protocol and were unable, before 36 months of age, to develop the strategies needed to resolve Fraiberg's "conceptual problem". Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Behavioral and cognitive outcomes for clinical trials in children with neurofibromatosis type 1.

    PubMed

    van der Vaart, Thijs; Rietman, André B; Plasschaert, Ellen; Legius, Eric; Elgersma, Ype; Moll, Henriëtte A

    2016-01-12

    To evaluate the appropriateness of cognitive and behavioral outcome measures in clinical trials in neurofibromatosis type 1 (NF1) by analyzing the degree of deficits compared to reference groups, test-retest reliability, and how scores correlate between outcome measures. Data were analyzed from the Simvastatin for cognitive deficits and behavioral problems in patients with neurofibromatosis type 1 (NF1-SIMCODA) trial, a randomized placebo-controlled trial of simvastatin for cognitive deficits and behavioral problems in children with NF1. Outcome measures were compared with age-specific reference groups to identify domains of dysfunction. Pearson r was computed for before and after measurements within the placebo group to assess test-retest reliability. Principal component analysis was used to identify the internal structure in the outcome data. Strongest mean score deviations from the reference groups were observed for full-scale intelligence (-1.1 SD), Rey Complex Figure Test delayed recall (-2.0 SD), attention problems (-1.2 SD), and social problems (-1.1 SD). Long-term test-retest reliability was excellent for Wechsler scales (r > 0.88), but poor to moderate for other neuropsychological tests (r range 0.52-0.81) and Child Behavior Checklist subscales (r range 0.40-0.79). The correlation structure revealed two strong components in the outcome measures, behavior and cognition, with no correlation between them. Scores on psychosocial quality of life correlate strongly with behavioral problems and less with cognitive deficits. Children with NF1 show distinct deficits in multiple domains. Many outcome measures showed weak test-retest correlations over the 1-year trial period. Cognitive and behavioral outcomes are complementary. This analysis demonstrates the need to include reliable outcome measures on a variety of cognitive and behavioral domains in clinical trials for NF1. © 2015 American Academy of Neurology.

  6. Target recognition and scene interpretation in image/video understanding systems based on network-symbolic models

    NASA Astrophysics Data System (ADS)

    Kuvich, Gary

    2004-08-01

    Vision is only a part of a system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, which is an interpretation of visual information in terms of these knowledge models. These mechanisms provide reliable recognition if the object is occluded or cannot be recognized as a whole. It is hard to split the entire system apart, and reliable solutions to target recognition problems are possible only within the solution of a more generic Image Understanding Problem. The brain reduces informational and computational complexity, using implicit symbolic coding of features, hierarchical compression, and selective processing of visual information. Biologically inspired Network-Symbolic representation, where both systematic structural/logical methods and neural/statistical methods are parts of a single mechanism, is the most feasible for such models. It converts visual information into relational Network-Symbolic structures, avoiding artificial precise computations of 3-dimensional models. Network-Symbolic Transformations derive abstract structures, which allows for invariant recognition of an object as an exemplar of a class. Active vision helps create consistent models. Attention, separation of figure from ground, and perceptual grouping are special kinds of network-symbolic transformations. Such Image/Video Understanding Systems will reliably recognize targets.

  7. Video based object representation and classification using multiple covariance matrices.

    PubMed

    Zhang, Yurong; Liu, Quan

    2017-01-01

    Video-based object recognition and classification has been widely studied in the computer vision and image processing areas. One main issue of this task is to develop an effective representation for video. This problem can generally be formulated as image set representation. In this paper, we present a new method called Multiple Covariance Discriminative Learning (MCDL) for the image set representation and classification problem. The core idea of MCDL is to represent an image set using multiple covariance matrices, with each covariance matrix representing one cluster of images. First, we use the Nonnegative Matrix Factorization (NMF) method to cluster the images within each image set, and then adopt Covariance Discriminative Learning on each cluster (subset) of images. Finally, we adopt KLDA and a nearest-neighbor classification method for image set classification. Promising experimental results on several datasets show the effectiveness of our MCDL method.
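    The clustering-then-covariance step of MCDL can be sketched as follows; this is an illustrative reconstruction from the abstract, with invented data sizes, scikit-learn's NMF standing in for the paper's clustering step, and the KLDA/classification stage omitted:

    ```python
    # Sketch: cluster an image set with NMF, then summarize each cluster by a
    # (regularized) covariance matrix, as in the MCDL representation step.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(1)
    images = rng.random((40, 64))       # hypothetical image set: 40 frames x 64-dim features

    k = 3                               # number of clusters, i.e. covariance matrices per set
    W = NMF(n_components=k, init="nndsvda", random_state=0, max_iter=500).fit_transform(images)
    labels = W.argmax(axis=1)           # each image joins its dominant NMF component

    cov_set = []
    for c in range(k):
        subset = images[labels == c]
        if len(subset) > 1:             # need at least two samples for a covariance estimate
            # a small ridge keeps each matrix symmetric positive definite
            cov_set.append(np.cov(subset, rowvar=False) + 1e-3 * np.eye(images.shape[1]))
    print([C.shape for C in cov_set])   # one 64x64 covariance matrix per non-trivial cluster
    ```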

  8. Development and validation of a physics problem-solving assessment rubric

    NASA Astrophysics Data System (ADS)

    Docktor, Jennifer Lynn

    Problem solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem solving throughout the educational system, there is no standard way to evaluate written problem solving that is valid, reliable, and easy to use. Most tests of problem solving performance given in the classroom focus on the correctness of the end result or partial results rather than the quality of the procedures and reasoning leading to the result, which gives an inadequate description of a student's skills. A more detailed and meaningful measure is necessary if different curricular materials or pedagogies are to be compared. This measurement tool could also allow instructors to diagnose student difficulties and focus their coaching. It is important that the instrument be applicable to any problem solving format used by a student and to a range of problem types and topics typically used by instructors. Complex processes such as problem solving are typically assessed using a rubric, which divides a skill into multiple quasi-independent categories and defines criteria for attaining a score in each. This dissertation describes the development of a problem solving rubric for the purpose of assessing written solutions to physics problems and presents evidence for the validity, reliability, and utility of score interpretations on the instrument.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piekarski, D.; Brad, D.

    This report describes a work effort whose overall objectives were to establish a methodology and approach for selected transmission and distribution (T&D) grid modernization; monitor the results; and report on the findings, recommendations, and lessons learned. The work reported addressed T&D problems and solutions, related reliability issues, equipment and operation upgrades, and the respective field testing.

  10. Evaluation of the ToxRTool's ability to rate the reliability of toxicological data for human health hazard assessments

    EPA Science Inventory

    Regulatory agencies often utilize results from peer-reviewed publications for hazard assessments. A problem in doing so is the lack of well-accepted tools to objectively, efficiently and systematically assess the quality of published toxicological studies. Herein, we evaluated the...

  11. Automatic multiple zebrafish larvae tracking in unconstrained microscopic video conditions.

    PubMed

    Wang, Xiaoying; Cheng, Eva; Burnett, Ian S; Huang, Yushi; Wlodkowic, Donald

    2017-12-14

    The accurate tracking of zebrafish larvae movement is fundamental to research in many biomedical, pharmaceutical, and behavioral science applications. However, the locomotive characteristics of zebrafish larvae are significantly different from those of adult zebrafish, so existing adult zebrafish tracking systems cannot reliably track zebrafish larvae. Further, the much smaller size of larvae relative to the container makes the detection of water impurities inevitable, which further degrades the tracking of zebrafish larvae or requires very strict video imaging conditions that typically yield unreliable tracking results under realistic experimental conditions. This paper investigates the adaptation of advanced computer vision segmentation techniques and multiple object tracking algorithms to develop an accurate, efficient and reliable multiple zebrafish larvae tracking system. The proposed system has been tested on a set of single and multiple adult and larval zebrafish videos in a wide variety of (complex) video conditions, including shadowing, labels, water bubbles and background artifacts. Compared with existing state-of-the-art and commercial multiple organism tracking systems, the proposed system improves tracking accuracy by up to 31.57% in unconstrained video imaging conditions. To facilitate the evaluation of zebrafish segmentation and tracking research, a dataset with annotated ground truth is also presented. The software is also publicly accessible.

  12. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  13. Object oriented development of engineering software using CLIPS

    NASA Technical Reports Server (NTRS)

    Yoon, C. John

    1991-01-01

    Engineering applications involve numeric complexity and the manipulation of large amounts of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software has become larger and more complex, the management of resources such as data, rather than numeric complexity, has become the major software design problem. Object-oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object-oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language's (COOL) object-oriented features are more versatile than those of C++. A software design methodology, based on object-oriented and procedural approaches, appropriate for engineering software and implemented in CLIPS, is outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.

  14. Energy-efficient fault tolerance in multiprocessor real-time systems

    NASA Astrophysics Data System (ADS)

    Guo, Yifeng

    The recent progress in multiprocessor/multicore systems has important implications for real-time system design and operation. From vehicle navigation to space applications as well as industrial control systems, the trend is to deploy multiple processors in real-time systems: systems with 4-8 processors are common, and it is expected that many-core systems with dozens of processing cores will be available in the near future. For such systems, in addition to the general temporal requirements common to all real-time systems, two additional operational objectives are seen as critical: energy efficiency and fault tolerance. An intriguing dimension of the problem is that energy efficiency and fault tolerance are typically conflicting objectives, due to the fact that tolerating faults (e.g., permanent/transient) often requires extra resources with high energy consumption potential. In this dissertation, various techniques for energy-efficient fault tolerance in multiprocessor real-time systems have been investigated. First, the Reliability-Aware Power Management (RAPM) framework, which can preserve the system reliability with respect to transient faults when Dynamic Voltage Scaling (DVS) is applied for energy savings, is extended to support parallel real-time applications with precedence constraints. Next, the traditional Standby-Sparing (SS) technique for dual processor systems, which takes both transient and permanent faults into consideration while saving energy, is generalized to support multiprocessor systems with an arbitrary number of identical processors. Observing the inefficient usage of slack time in the SS technique, a Preference-Oriented Scheduling Framework is designed to address the problem where tasks are given preferences for being executed as soon as possible (ASAP) or as late as possible (ALAP). A preference-oriented earliest deadline (POED) scheduler is proposed and its application in multiprocessor systems for energy-efficient fault tolerance is investigated, where tasks' main copies are executed ASAP while backup copies are executed ALAP to reduce the overlapped execution of the main and backup copies of the same task and thus reduce energy consumption. All proposed techniques are evaluated through extensive simulations and compared with other state-of-the-art approaches. The simulation results confirm that the proposed schemes can preserve the system reliability while still achieving substantial energy savings. Finally, for both SS and POED based Energy-Efficient Fault-Tolerant (EEFT) schemes, a series of recovery strategies is designed for cases where more than one fault (transient or permanent) must be tolerated.
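    The ASAP/ALAP intuition behind the POED scheme can be illustrated with a toy sketch; all task parameters are invented, and the real scheduler operates under EDF-style rules rather than this fixed placement:

    ```python
    # Toy sketch: each task's main copy starts as soon as possible on one
    # processor while its backup is pushed as late as possible on a spare; the
    # smaller the overlap, the more often the backup can be cancelled (saving
    # energy) once the main copy completes successfully.
    tasks = [  # (name, release time, worst-case execution time, deadline)
        ("t1", 0, 3, 10),
        ("t2", 2, 4, 12),
    ]

    for name, release, wcet, deadline in tasks:
        main_start, main_end = release, release + wcet          # ASAP on primary processor
        backup_start, backup_end = deadline - wcet, deadline    # ALAP on spare processor
        overlap = max(0, min(main_end, backup_end) - max(main_start, backup_start))
        print(f"{name}: main [{main_start},{main_end}) backup [{backup_start},{backup_end}) "
              f"overlap = {overlap}")
    ```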

  15. Applications of numerical methods to simulate the movement of contaminants in groundwater.

    PubMed Central

    Sun, N Z

    1989-01-01

    This paper reviews mathematical models and numerical methods that have been extensively used to simulate the movement of contaminants through the subsurface. The major emphasis is placed on the numerical methods for advection-dominated transport problems and inverse problems. Several mathematical models that are commonly used in field problems are listed. A variety of numerical solutions for three-dimensional models are introduced, including the multiple cell balance method, which can be considered a variation of the finite element method. The multiple cell balance method is easy to understand and convenient for solving field problems. When advection transport dominates dispersion transport, two kinds of numerical difficulties, overshoot and numerical dispersion, arise when standard finite difference and finite element methods are used. To overcome these numerical difficulties, various numerical techniques have been developed, such as upstream weighting methods and moving point methods. A complete review of these methods is given, and we also discuss the problems of parameter identification, reliability analysis, and optimal experiment design, which are essential for constructing a practical model. PMID:2695327
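    The overshoot and numerical dispersion difficulties mentioned above are easy to reproduce in one dimension; the sketch below advects a sharp front with first-order upstream weighting versus centered (FTCS) differences, on an invented grid:

    ```python
    # Sketch of the two classic difficulties for advection-dominated transport:
    # upstream weighting stays monotone but smears the front (numerical
    # dispersion), while the centered FTCS scheme (in fact unstable for pure
    # advection) produces growing over- and undershoots.
    import numpy as np

    nx, c = 100, 0.5                                 # cells; Courant number v*dt/dx
    u_up = np.where(np.arange(nx) < 20, 1.0, 0.0)    # sharp concentration front
    u_cd = u_up.copy()

    for _ in range(40):
        u_up[1:] -= c * (u_up[1:] - u_up[:-1])             # first-order upstream weighting
        u_cd[1:-1] -= 0.5 * c * (u_cd[2:] - u_cd[:-2])     # centered (FTCS) differences

    print("upstream min/max:", u_up.min().round(3), u_up.max().round(3))  # stays within [0, 1]
    print("centered min/max:", u_cd.min().round(3), u_cd.max().round(3))  # overshoots badly
    ```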

  16. Silviculture for multiple objectives in the Douglas-fir region.

    Treesearch

    R.O. Curtis; D.S. DeBell; C.A. Harrington; D.P. Lavender; J.B. St. Clair; J.C. Tappeiner; J.D. Walstad

    1998-01-01

    Silvicultural knowledge and practice have been evolving in the Pacific Northwest for nearly a century. Most research and management activities to date have focused on two major topics: (1) methods to regenerate older, naturally established forests after fire or timber harvest; and (2) growth and management of young stands. Today forest managers can reliably regenerate...

  17. Pareto-front shape in multiobservable quantum control

    NASA Astrophysics Data System (ADS)

    Sun, Qiuyang; Wu, Re-Bing; Rabitz, Herschel

    2017-03-01

    Many scenarios in the sciences and engineering require simultaneous optimization of multiple objective functions, which are usually conflicting or competing. In such problems the Pareto front, where none of the individual objectives can be further improved without degrading some others, shows the tradeoff relations between the competing objectives. This paper analyzes the Pareto-front shape for the problem of quantum multiobservable control, i.e., optimizing the expectation values of multiple observables in the same quantum system. Analytic and numerical results demonstrate that with two commuting observables the Pareto front is a convex polygon consisting of flat segments only, while with noncommuting observables the Pareto front includes convexly curved segments. We also assess the capability of a weighted-sum method to continuously capture the points along the Pareto front. Illustrative examples with realistic physical conditions are presented, including NMR control experiments on a 1H-13C two-spin system with two commuting or noncommuting observables.
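    The weighted-sum probe assessed in the paper can be sketched on a toy convex two-objective problem (functions, weights, and starting point invented); for convex fronts the scan traces the tradeoff curve, whereas concave segments of a non-convex front would be skipped:

    ```python
    # Scan the weight w, minimize w*f1 + (1-w)*f2, and collect (f1, f2) points
    # along the tradeoff curve of two competing toy objectives.
    import numpy as np
    from scipy.optimize import minimize

    f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
    f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2

    front = []
    for w in np.linspace(0.0, 1.0, 11):
        res = minimize(lambda x: w * f1(x) + (1 - w) * f2(x), x0=[0.5, 0.5])
        front.append((round(f1(res.x), 3), round(f2(res.x), 3)))
    print(front)   # f1 decreases as f2 increases along the Pareto front
    ```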

  18. Design and Implementation of Replicated Object Layer

    NASA Technical Reports Server (NTRS)

    Koka, Sudhir

    1996-01-01

    One of the widely used techniques for constructing fault-tolerant applications is the replication of resources, so that if one copy fails, sufficient copies may still remain operational to allow the application to continue to function. This thesis involves the design and implementation of an object-oriented framework for replicating data on multiple sites and across different platforms. Our approach, called the Replicated Object Layer (ROL), provides a mechanism for consistent replication of data over dynamic networks. ROL uses the Reliable Multicast Protocol (RMP) as a communication protocol that provides reliable delivery, serialization and fault tolerance. Besides providing type registration, this layer facilitates distributed atomic transactions on replicated data. A novel algorithm called the RMP Commit Protocol, which commits transactions efficiently in a reliable multicast environment, is presented. ROL provides recovery procedures to ensure that site and communication failures do not corrupt persistent data, and to make the system fault tolerant to network partitions. ROL facilitates building distributed fault-tolerant applications by handling the burdensome details of replica consistency operations and making them completely transparent to the application. Replicated databases are a major class of applications which could be built on top of ROL.

  19. Interrater reliability of the Korean version of the International Spinal Cord Injury Basic Pain Data Set.

    PubMed

    Kim, H R; Kim, H B; Lee, B S; Ko, H Y; Shin, H I

    2014-11-01

    To provide a Korean translation of the International Spinal Cord Injury Basic Pain Data Set (ISCIBPDS) and evaluate the interrater reliability of the translated version. Survey of community-dwelling people with spinal cord injury (SCI) in South Korea. The initial translation was performed by two translators with an in-depth knowledge of SCI, and was then checked by another person with a similar background. A total of 115 SCI participants (87 men, 28 women; 48.4±14.1 years) were evaluated using the Korean version of the ISCIBPDS by two different raters. Intraclass correlation coefficient (ICC) or Cohen's kappa (κ) was used for analysis. All 115 participants had at least one pain problem on both surveys. Seventeen (14.8%) participants described their pain as a single pain problem to one rater while reporting the same pain as two or more different pain problems to the other rater. Twenty-two (19.1%) other participants reported their pain problems in a different order of severity on the surveys. The Korean version of the ISCIBPDS had acceptable interrater reliability, except in the 'limit activities (how much do you limit your activities in order to keep your pain from getting worse?)' item (ICC=0.318). Provision of criteria for pain separation may facilitate the consistent application of ISCIBPDS. In addition, the ISCIBPDS, which evaluated pain problems separately, reflected the multiple and complex characteristics of SCI-related pain; this was a strength of this data set.
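    For the categorical items, interrater agreement of the kind reported above reduces to Cohen's kappa over paired ratings; a minimal sketch with invented labels follows (the ICC used for the ordinal/continuous items would be computed analogously):

    ```python
    # Cohen's kappa between two raters scoring the same categorical item.
    from sklearn.metrics import cohen_kappa_score

    rater_a = ["neuropathic", "nociceptive", "neuropathic", "other", "nociceptive", "neuropathic"]
    rater_b = ["neuropathic", "nociceptive", "other",       "other", "nociceptive", "neuropathic"]

    print(f"kappa = {cohen_kappa_score(rater_a, rater_b):.2f}")  # chance-corrected agreement
    ```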

  20. Flight Testing of the Capillary Pumped Loop 3 Experiment

    NASA Technical Reports Server (NTRS)

    Ottenstein, Laura; Butler, Dan; Ku, Jentung; Cheung, Kwok; Baldauff, Robert; Hoang, Triem

    2002-01-01

    The Capillary Pumped Loop 3 (CAPL 3) experiment was a multiple evaporator capillary pumped loop experiment that flew in the Space Shuttle payload bay in December 2001 (STS-108). The main objective of CAPL 3 was to demonstrate in micro-gravity a multiple evaporator capillary pumped loop system, capable of reliable start-up, reliable continuous operation, and heat load sharing, with hardware for a deployable radiator. Tests performed on orbit included start-ups, power cycles, low power tests (100 W total), high power tests (up to 1447 W total), heat load sharing, variable/fixed conductance transition tests, and saturation temperature change tests. The majority of the tests were completed successfully, although the experiment did exhibit an unexpected sensitivity to shuttle maneuvers. This paper describes the experiment, the tests performed during the mission, and the test results.

  1. On the interrelation of multiplication and division in secondary school children

    PubMed Central

    Huber, Stefan; Fischer, Ursula; Moeller, Korbinian; Nuerk, Hans-Christoph

    2013-01-01

    Multiplication and division are conceptually inversely related: Each division problem can be transformed into a multiplication problem and vice versa. Recent research has indicated strong developmental parallels between multiplication and division in primary school children. In this study, we were interested in (i) whether these developmental parallels persist into secondary school, (ii) whether similar developmental parallels can be observed for simple and complex problems, (iii) whether skill level modulates this relationship, and (iv) whether the correlations are specific and not driven by general cognitive or arithmetic abilities. Therefore, we assessed the performance of 5th and 6th graders attending two secondary school types of the German educational system in simple and complex multiplication as well as division, while controlling for non-verbal intelligence, short-term memory, and other arithmetic abilities. Accordingly, we collected data from students differing in skill levels due to either age (5th < 6th grade) or school type (general < intermediate secondary school). We observed moderate to strong bivariate and partial correlations between multiplication and division, with correlations being higher for simple tasks but nevertheless reliable for complex tasks. Moreover, the association between simple multiplication and division depended on students' skill levels as reflected by school type, but not by age. Partial correlations were higher for intermediate than for general secondary school children. In sum, these findings emphasize the importance of the inverse relationship between multiplication and division, which persists into later developmental stages. However, evidence for skill-related differences in the relationship between multiplication and division was restricted to differences between school types. PMID:24133476
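    Partial correlation of the kind reported above can be computed by correlating regression residuals; a sketch with synthetic scores and a single invented covariate standing in for the study's control variables:

    ```python
    # Partial correlation between two skill scores, controlling for a covariate,
    # via the residual method: regress both scores on the covariate, then
    # correlate what is left over.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(2)
    iq = rng.normal(100, 15, 200)                         # invented control variable
    mult = 0.5 * iq + rng.normal(0, 10, 200)              # multiplication scores
    div = 0.4 * iq + 0.3 * mult + rng.normal(0, 10, 200)  # division scores

    def residuals(y, x):
        # residuals of y after regressing out x (with intercept)
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return y - X @ beta

    r_partial, _ = pearsonr(residuals(mult, iq), residuals(div, iq))
    print(f"partial r (controlling for IQ) = {r_partial:.2f}")
    ```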

  2. Pattern-set generation algorithm for the one-dimensional multiple stock sizes cutting stock problem

    NASA Astrophysics Data System (ADS)

    Cui, Yaodong; Cui, Yi-Ping; Zhao, Zhigang

    2015-09-01

    A pattern-set generation algorithm (PSG) for the one-dimensional multiple stock sizes cutting stock problem (1DMSSCSP) is presented. The solution process contains two stages. In the first stage, the PSG solves the residual problems repeatedly to generate the patterns in the pattern set, where each residual problem is solved by the column-generation approach, and each pattern is generated by solving a single large object placement problem. In the second stage, the integer linear programming model of the 1DMSSCSP is solved using a commercial solver, where only the patterns in the pattern set are considered. The computational results of benchmark instances indicate that the PSG outperforms existing heuristic algorithms and rivals the exact algorithm in solution quality.
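    The pattern-generation (pricing) subproblem at the heart of such column-generation schemes is a knapsack over item dual values; below is a simplified single-stock-length sketch with invented lengths and duals (the paper's method additionally handles multiple stock sizes):

    ```python
    # Given dual values for the item types, find the cutting pattern of one
    # stock length that maximizes total dual value (unbounded knapsack DP).
    def best_pattern(stock_len, item_lens, duals):
        value = [0.0] * (stock_len + 1)   # value[c]: best dual value within capacity c
        keep = [None] * (stock_len + 1)   # keep[c]: item chosen at capacity c
        for c in range(1, stock_len + 1):
            for i, L in enumerate(item_lens):
                if L <= c and value[c - L] + duals[i] > value[c]:
                    value[c], keep[c] = value[c - L] + duals[i], i
        pattern, c = [0] * len(item_lens), stock_len      # backtrack the chosen items
        while c > 0 and keep[c] is not None:
            pattern[keep[c]] += 1
            c -= item_lens[keep[c]]
        return pattern, value[stock_len]

    print(best_pattern(100, [45, 36, 31, 14], [0.5, 0.4, 0.35, 0.15]))
    ```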

  3. Evaluation of multiple muscle loads through multi-objective optimization with prediction of subjective satisfaction level: illustration by an application to handrail position for standing.

    PubMed

    Chihara, Takanori; Seo, Akihiko

    2014-03-01

    Proposed here is an evaluation of multiple muscle loads and a procedure for determining optimum solutions to ergonomic design problems. The simultaneous muscle load evaluation is formulated as a multi-objective optimization problem, and optimum solutions are obtained for each participant. In addition, one optimum solution for all participants, which is defined as the compromise solution, is also obtained. Moreover, the proposed method provides both objective and subjective information to support the decision making of designers. The proposed method was applied to the problem of designing the handrail position for the sit-to-stand movement. The height and distance of the handrails were the design variables, and surface electromyograms of four muscles were measured. The optimization results suggest that the proposed evaluation represents the impressions of participants more completely than an independent use of muscle loads. In addition, the compromise solution is determined, and the benefits of the proposed method are examined. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Clustering "N" Objects into "K" Groups under Optimal Scaling of Variables.

    ERIC Educational Resources Information Center

    van Buuren, Stef; Heiser, Willem J.

    1989-01-01

    A method based on homogeneity analysis (multiple correspondence analysis or multiple scaling) is proposed to reduce many categorical variables to one variable with "k" categories. The method is a generalization of the sum of squared distances cluster analysis problem to the case of mixed measurement level variables. (SLD)

  5. A robust approach towards unknown transformation, regional adjacency graphs, multigraph matching, and segmentation of video frames from unmanned aerial vehicles (UAV)

    NASA Astrophysics Data System (ADS)

    Gohatre, Umakant Bhaskar; Patil, Venkat P.

    2018-04-01

    In computer vision, real-time detection and tracking of multiple objects is an important research field that has attracted considerable attention in recent years for finding non-stationary entities in image sequences. Object detection is a step towards following a moving object through a video, and object representation is a prerequisite for tracking. Recognizing multiple objects in a video sequence is one of the most challenging such tasks. Image registration has long been used as a basis for detecting moving objects: registration techniques find correspondences between consecutive frame pairs based on image appearance under rigid and affine transformations. However, image registration is not well suited to handling events that can result in missed objects. To address such problems, this paper proposes a novel approach. Video frames are segmented using region adjacency graphs of visual appearance and geometric properties; matching between graph sequences is then performed by multigraph matching, and matched regions are labeled by a proposed graph coloring algorithm that assigns a foreground label to each respective region. The design is robust to unknown transformations and offers a significant improvement over existing work on real-time detection of multiple moving objects.

  6. MANGO: a new approach to multiple sequence alignment.

    PubMed

    Zhang, Zefeng; Lin, Hao; Li, Ming

    2007-01-01

    Multiple sequence alignment is a classical and challenging task for biological sequence analysis. The problem is NP-hard. Full dynamic programming takes too much time. The progressive alignment heuristics adopted by most state-of-the-art multiple sequence alignment programs suffer from the 'once a gap, always a gap' phenomenon. Is there a radically new way to do multiple sequence alignment? This paper introduces a novel and orthogonal multiple sequence alignment method, using multiple optimized spaced seeds and new algorithms to handle these seeds efficiently. Our new algorithm processes information of all sequences as a whole, avoiding problems caused by the popular progressive approaches. Because optimized spaced seeds are provably significantly more sensitive than consecutive k-mers, the new approach promises to be more accurate and reliable. To validate our new approach, we have implemented MANGO: Multiple Alignment with N Gapped Oligos. Experiments were carried out on large 16S RNA benchmarks, showing that MANGO compares favorably, in both accuracy and speed, against state-of-the-art multiple sequence alignment methods, including ClustalW 1.83, MUSCLE 3.6, MAFFT 5.861, ProbConsRNA 1.11, Dialign 2.2.1, DIALIGN-T 0.2.1, T-Coffee 4.85, POA 2.0 and Kalign 2.0.
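    The spaced-seed idea can be sketched in a few lines: a seed such as `11*1*11` demands matches only at the `1` positions, so similar regions that differ at the don't-care positions still produce hits. The sequences below are toy data:

    ```python
    # Report all (i, j) offsets where two sequences match under a spaced seed.
    def seed_hits(seed, s1, s2):
        care = [i for i, ch in enumerate(seed) if ch == "1"]   # positions that must match
        span = len(seed)
        hits = []
        for i in range(len(s1) - span + 1):
            for j in range(len(s2) - span + 1):
                if all(s1[i + k] == s2[j + k] for k in care):
                    hits.append((i, j))
        return hits

    # The sequences differ only at the seed's don't-care positions, so the
    # spaced seed fires while a consecutive 7-mer ("1111111") would not.
    print(seed_hits("11*1*11", "ACGTACG", "ACATGCG"))   # -> [(0, 0)]
    ```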

  7. Multiple degree of freedom object recognition using optical relational graph decision nets

    NASA Technical Reports Server (NTRS)

    Casasent, David P.; Lee, Andrew J.

    1988-01-01

    Multiple-degree-of-freedom object recognition concerns objects with no stable rest position, for which all scale, rotation, and aspect distortions are possible. It is assumed that the objects are in a fairly benign background, so that feature extractors are usable. In-plane distortion invariance is provided by use of a polar-log coordinate transform feature space, and out-of-plane distortion invariance is provided by linear discriminant function design. Relational graph decision nets are considered for multiple-degree-of-freedom pattern recognition. The design of Fisher (1936) linear discriminant functions and synthetic discriminant functions for use at the nodes of binary and multidecision nets is discussed. Case studies are detailed for two-class and multiclass problems. Simulation results demonstrate the robustness of the processors to quantization of the filter coefficients and to noise.

  8. Interactive Multiple Object Tracking (iMOT)

    PubMed Central

    Thornton, Ian M.; Bülthoff, Heinrich H.; Horowitz, Todd S.; Rynning, Aksel; Lee, Seong-Whan

    2014-01-01

    We introduce a new task for exploring the relationship between action and attention. In this interactive multiple object tracking (iMOT) task, implemented as an iPad app, participants were presented with a display of multiple, visually identical disks which moved independently. The task was to prevent any collisions during a fixed duration. Participants could perturb object trajectories via the touchscreen. In Experiment 1, we used a staircase procedure to measure the ability to control moving objects. Object speed was set to 1°/s. On average participants could control 8.4 items without collision. Individual control strategies were quite variable, but did not predict overall performance. In Experiment 2, we compared iMOT with standard MOT performance using identical displays. Object speed was set to 2°/s. Participants could reliably control more objects (M = 6.6) than they could track (M = 4.0), but performance in the two tasks was positively correlated. In Experiment 3, we used a dual-task design. Compared to single-task baseline, iMOT performance decreased and MOT performance increased when the two tasks had to be completed together. Overall, these findings suggest: 1) There is a clear limit to the number of items that can be simultaneously controlled, for a given speed and display density; 2) participants can control more items than they can track; 3) task-relevant action appears not to disrupt MOT performance in the current experimental context. PMID:24498288

  9. Reliability and Validity of Three Instruments (DSM-IV, CPGI, and PPGM) in the Assessment of Problem Gambling in South Korea.

    PubMed

    Back, Ki-Joon; Williams, Robert J; Lee, Choong-Ki

    2015-09-01

    Most research on the assessment, epidemiology, and treatment of problem gambling has occurred in Western jurisdictions. This potentially limits the cross-cultural validity of problem gambling assessment instruments as well as etiological models of problem gambling. The primary objective of the present research was to investigate the reliability and validity of three problem gambling assessment instruments within a South Korean context. A total of 4,330 South Korean adults participated in a comprehensive assessment of their gambling behavior that included the administration of the DSM-IV criteria for pathological gambling (NODS), the Canadian Problem Gambling Index (CPGI), and the Problem and Pathological Gambling Measure (PPGM). Cronbach alpha showed that all three instruments had good internal consistency. Concurrent validity was established by the significant associations observed between scores on the instruments and measures of gambling involvement (number of gambling formats engaged in; frequency of gambling; and gambling expenditure). Most importantly, kappa statistics showed that all instruments have satisfactory classification accuracy against clinical assessment of problem gambling conducted by South Korean clinicians (NODS κ = .66; PPGM κ = .62; CPGI κ = .51). These results confirm that Western-derived operationalizations of problem gambling have applicability in a South Korean setting.

  10. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    PubMed

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.

  11. A methodology to find the elementary landscape decomposition of combinatorial optimization problems.

    PubMed

    Chicano, Francisco; Whitley, L Darrell; Alba, Enrique

    2011-01-01

    A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
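    For reference, the defining property can be stated compactly. Using the convention that the Laplacian of a d-regular neighborhood graph is Δ = A - dI (a notational assumption, since the abstract does not fix one), a landscape is elementary when the objective f satisfies Grover's wave equation:

    ```latex
    % f elementary: f - \bar{f} is an eigenfunction of the Laplacian \Delta = A - dI,
    % where \bar{f} is the mean of f over the search space and k > 0 a constant.
    \operatorname{avg}_{y \in N(x)} f(y) \;=\; f(x) + \frac{k}{d}\,\bigl(\bar{f} - f(x)\bigr),
    \qquad\text{equivalently}\qquad
    \Delta\bigl(f - \bar{f}\bigr) \;=\; -k\,\bigl(f - \bar{f}\bigr).
    ```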

  12. Symmetric caging formation for convex polygonal object transportation by multiple mobile robots based on fuzzy sliding mode control.

    PubMed

    Dai, Yanyan; Kim, YoonGu; Wee, SungGil; Lee, DongHa; Lee, SukGyu

    2016-01-01

    In this paper, the problem of object caging and transporting is considered for multiple mobile robots. With the aim of minimizing the number of robots and decreasing the rotation of the object, proper points are calculated and assigned to the multiple mobile robots to allow them to form a symmetric caging formation. The caging formation guarantees that all of the Euclidean distances between any two adjacent robots are smaller than the minimal width of the polygonal object, so that the object cannot escape. In order to avoid collisions among robots, the robot radius parameter is utilized in designing the caging formation, and the A* algorithm is used so that the mobile robots can move to the proper points. In order to avoid obstacles, the robots and the object are regarded as a rigid body, and the artificial potential field method is applied. The fuzzy sliding mode control method is applied for tracking control of the nonholonomic mobile robots. Finally, the simulation and experimental results show that multiple mobile robots are able to cage and transport the polygonal object to the goal position while avoiding obstacles. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
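    The escape condition described above reduces to a pairwise distance check around the formation; a minimal sketch with invented positions, radius, and object width:

    ```python
    # Check the caging condition: every gap between adjacent robots (accounting
    # for robot radius) must be smaller than the object's minimal width.
    import numpy as np

    robots = np.array([[1.0, 0.0], [-0.5, 0.87], [-0.5, -0.87]])  # formation around the object
    robot_radius = 0.10
    object_min_width = 1.80

    caged = all(
        np.linalg.norm(robots[i] - robots[(i + 1) % len(robots)]) - 2 * robot_radius
        < object_min_width
        for i in range(len(robots))
    )
    print("object is caged:", caged)
    ```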

  13. Reliability of Autism-Tics, AD/HD, and other Comorbidities (A-TAC) inventory in a test-retest design.

    PubMed

    Larson, Tomas; Kerekes, Nóra; Selinus, Eva Norén; Lichtenstein, Paul; Gumpert, Clara Hellner; Anckarsäter, Henrik; Nilsson, Thomas; Lundström, Sebastian

    2014-02-01

    The Autism-Tics, AD/HD, and other Comorbidities (A-TAC) inventory is used in epidemiological research to assess neurodevelopmental problems and coexisting conditions. Although the A-TAC has been applied in various populations, data on retest reliability are limited. The objective of the present study was to present additional reliability data. The A-TAC was administered by lay assessors and was completed on two occasions by parents of 400 individual twins, with an average interval of 70 days between test sessions. Intra- and inter-rater reliability were analysed with intraclass correlations and Cohen's kappa. The A-TAC showed excellent test-retest intraclass correlations for both autism spectrum disorder and attention deficit hyperactivity disorder (each at .84). Most modules in the A-TAC had intra- and inter-rater reliability intraclass correlation coefficients of ≥ .60. Cohen's kappa indicated acceptable reliability. The current study provides statistical evidence that the A-TAC yields good test-retest reliability in a population-based cohort of children.

  14. Optimal design of groundwater remediation system using a probabilistic multi-objective fast harmony search algorithm under uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun

    2014-11-01

    This study develops a new probabilistic multi-objective fast harmony search algorithm (PMOFHS) for optimal design of groundwater remediation systems under uncertainty associated with the hydraulic conductivity (K) of aquifers. The PMOFHS integrates the previously developed deterministic multi-objective optimization method, namely multi-objective fast harmony search algorithm (MOFHS) with a probabilistic sorting technique to search for Pareto-optimal solutions to multi-objective optimization problems in a noisy hydrogeological environment arising from insufficient K data. The PMOFHS is then coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, to identify the optimal design of groundwater remediation systems for a two-dimensional hypothetical test problem and a three-dimensional Indiana field application involving two objectives: (i) minimization of the total remediation cost through the engineering planning horizon, and (ii) minimization of the mass remaining in the aquifer at the end of the operational period, whereby the pump-and-treat (PAT) technology is used to clean up contaminated groundwater. Also, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology. Comprehensive analysis indicates that the proposed PMOFHS can find Pareto-optimal solutions with low variability and high reliability and is a potentially effective tool for optimizing multi-objective groundwater remediation problems under uncertainty.

  15. M-OSCE as a method to measure dental hygiene students' critical thinking: a pilot study.

    PubMed

    McComas, Martha J; Wright, Rebecca A; Mann, Nancy K; Cooper, Mary D; Jacks, Mary E

    2013-04-01

    Educators in all academic disciplines have been encouraged to utilize assessment strategies to evaluate students' critical thinking. The purpose of this study was to assess the viability of the modified objective structured clinical examination (m-OSCE) to evaluate critical thinking in dental hygiene education. This evaluation utilized a convenience sample of senior dental hygiene students. Students participated in the m-OSCE in which portions of a patient case were revealed at four stations. The exam consisted of multiple-choice questions intended to measure students' ability to utilize critical thinking skills. Additionally, there was one fill-in-the-blank question and a treatment plan that was completed at the fifth station. The results of this study revealed that the m-OSCE did not reliably measure dental hygiene students' critical thinking. Statistical analysis found no satisfactory reliability within the multiple-choice questions and moderately reliable results within the treatment planning portion of the examination. In addition, the item analysis found gaps in students' abilities to transfer clinical evidence/data to basic biomedical knowledge as demonstrated through the multiple-choice questioning results. This outcome warrants further investigation of the utility of the m-OSCE, with a focus on modifications to the evaluation questions, grading rubric, and patient case.

  16. Biomarkers: background, classification and guidelines for applications in nutritional epidemiology

    USDA-ARS?s Scientific Manuscript database

    One of the main problems in nutritional epidemiology is to assess food intake as well as nutrient/food component intake to a high level of validity and reliability. To help in this process, the need to have good biomarkers that more objectively allow us to evaluate the diet consumed in a more standa...

  17. Analysis of Open-Ended Statistics Questions with Many Facet Rasch Model

    ERIC Educational Resources Information Center

    Güler, Nese

    2014-01-01

    Problem Statement: The most significant disadvantage of open-ended items that allow the valid measurement of upper level cognitive behaviours, such as synthesis and evaluation, is scoring. The difficulty associated with objectively scoring the answers to the items contributes to the reduction of the reliability of the scores. Moreover, other…

  18. Flexible Unicast-Based Group Communication for CoAP-Enabled Devices †

    PubMed Central

    Ishaq, Isam; Hoebeke, Jeroen; Van den Abeele, Floris; Rossey, Jen; Moerman, Ingrid; Demeester, Piet

    2014-01-01

    Smart embedded objects will become an important part of what is called the Internet of Things. Applications often require concurrent interactions with several of these objects and their resources. Existing solutions have several limitations in terms of reliability, flexibility and manageability of such groups of objects. To overcome these limitations we propose an intermediate level of intelligence to easily manipulate a group of resources across multiple smart objects, building upon the Constrained Application Protocol (CoAP). We describe the design of our solution to create and manipulate a group of CoAP resources using a single client request. Furthermore, we introduce the concept of profiles for the created groups. The use of profiles allows the client to specify in more detail how the group should behave. We have implemented our solution and demonstrate that it covers the complete group life-cycle, i.e., creation, validation, flexible usage and deletion. Finally, we quantitatively analyze the performance of our solution and compare it against multicast-based CoAP group communication. The results show that our solution improves reliability and flexibility with a trade-off in increased communication overhead. PMID:24901978
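    A client-side approximation of the unicast fan-out can be sketched with the aiocoap Python library (an assumption; the paper's entity lives on an intermediary and adds group creation, validation, and profiles). The URIs and the group itself are hypothetical:

    ```python
    # Fan one logical group read out as concurrent unicast CoAP GETs and
    # aggregate the member responses into a single result.
    import asyncio
    import aiocoap

    GROUP = [
        "coap://[2001:db8::1]/sensors/temperature",
        "coap://[2001:db8::2]/sensors/temperature",
    ]

    async def read_group():
        ctx = await aiocoap.Context.create_client_context()
        async def read_one(uri):
            response = await ctx.request(aiocoap.Message(code=aiocoap.GET, uri=uri)).response
            return uri, response.payload
        # issue the unicast requests concurrently, then return one group result
        return dict(await asyncio.gather(*(read_one(u) for u in GROUP)))

    print(asyncio.run(read_group()))
    ```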

  19. Electricity system expansion studies to consider uncertainties and interactions in restructured markets

    NASA Astrophysics Data System (ADS)

    Jin, Shan

    This dissertation concerns power system expansion planning under different market mechanisms. The thesis follows a three paper format, in which each paper emphasizes a different perspective. The first paper investigates the impact of market uncertainties on a long term centralized generation expansion planning problem. The problem is modeled as a two-stage stochastic program with uncertain fuel prices and demands, which are represented as probabilistic scenario paths in a multi-period tree. Two measurements, expected cost (EC) and Conditional Value-at-Risk (CVaR), are used to minimize, respectively, the total expected cost among scenarios and the risk of incurring high costs in unfavorable scenarios. We sample paths from the scenario tree to reduce the problem scale and determine the sufficient number of scenarios by computing confidence intervals on the objective values. The second paper studies an integrated electricity supply system including generation, transmission and fuel transportation with a restructured wholesale electricity market. This integrated system expansion problem is modeled as a bi-level program in which a centralized system expansion decision is made in the upper level and the operational decisions of multiple market participants are made in the lower level. The difficulty of solving a bi-level programming problem to global optimality is discussed and three problem relaxations obtained by reformulation are explored. The third paper solves a more realistic market-based generation and transmission expansion problem. It focuses on interactions among a centralized transmission expansion decision and decentralized generation expansion decisions. It allows each generator to make its own strategic investment and operational decisions both in response to a transmission expansion decision and in anticipation of a market price settled by an Independent System Operator (ISO) market clearing problem. The model poses a complicated tri-level structure including an equilibrium problem with equilibrium constraints (EPEC) sub-problem. A hybrid iterative algorithm is proposed to solve the problem efficiently and reliably.
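    The EC-versus-CVaR distinction in the first paper can be sketched as a small linear program using the standard CVaR linearization; PuLP is assumed as the modeling layer, and all costs, probabilities, and demands are invented:

    ```python
    # Choose capacity before demand is known; blend expected cost (EC) with
    # Conditional Value-at-Risk (CVaR) via the usual eta/excess linearization.
    import pulp

    scenarios = {"low": (0.5, 80.0), "high": (0.5, 140.0)}    # name: (probability, demand)
    build_cost, shortfall_cost, alpha, lam = 1.0, 5.0, 0.9, 0.5

    m = pulp.LpProblem("expansion", pulp.LpMinimize)
    x = pulp.LpVariable("capacity", lowBound=0)               # first-stage build decision
    short = {s: pulp.LpVariable(f"short_{s}", lowBound=0) for s in scenarios}
    eta = pulp.LpVariable("eta")                              # value-at-risk level (free)
    excess = {s: pulp.LpVariable(f"excess_{s}", lowBound=0) for s in scenarios}

    for s, (p, demand) in scenarios.items():
        m += short[s] >= demand - x                           # second-stage unmet demand
        m += excess[s] >= build_cost * x + shortfall_cost * short[s] - eta  # cost above eta

    expected = pulp.lpSum(p * (build_cost * x + shortfall_cost * short[s])
                          for s, (p, _) in scenarios.items())
    cvar = eta + pulp.lpSum(p * excess[s] for s, (p, _) in scenarios.items()) / (1 - alpha)
    m += lam * expected + (1 - lam) * cvar                    # blended EC/CVaR objective
    m.solve(pulp.PULP_CBC_CMD(msg=False))
    print("build capacity:", x.value())
    ```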

  20. A dissociation of objective and subjective workload measures in assessing the impact of speech controls in advanced helicopters

    NASA Technical Reports Server (NTRS)

    Vidulich, Michael A.; Bortolussi, Michael R.

    1988-01-01

    Among the new technologies that are expected to aid helicopter designers are speech controls. Proponents suggest that speech controls could reduce the potential for manual control overloads and improve time-sharing performance in environments that have heavy demands for manual control. This was tested in a simulation of an advanced single-pilot, scout/attack helicopter. Objective performance indicated that the speech controls were effective in decreasing the interference of discrete responses during moments of heavy flight control activity. However, subjective ratings indicated that the use of speech controls required extra effort to speak precisely and to attend to feedback. Although the operational reliability of speech controls must be improved, the present results indicate that reliable speech controls could enhance the time-sharing efficiency of helicopter pilots. Furthermore, the results demonstrated the importance of using multiple assessment techniques to completely assess a task. Neither the objective nor the subjective measures alone provided complete information. It was the contrast between the measures that was most informative.

  1. Near-Field Scanning Optical Microscopy of Soft, Biological, or Rough Objects in Aqueous Environment: Challenges and some Remedies to Circumvent

    NASA Technical Reports Server (NTRS)

    Vikram, C. S.; Witherow, W. K.

    1999-01-01

    Near-field scanning optical microscopy is an established technique for achieving sub-wavelength spatial resolution in imaging, spectroscopy, material science, surface chemistry, polarimetry, etc. A significant amount of confidence has been established for thin, hard specimens in air. However, when soft, biological, or rough objects, objects in aqueous environments, or some combination thereof are involved, progress has been slow. Tip-sample mechanical interaction, heating of the sample, drag effects on the probe, difficulty in controlling tip-sample separation for rough objects, light scattering from the sample thickness, and related effects create problems. Although these problems are not yet fully understood, there have been attempts to study them with the aim of performing reliable operations. In this review we describe these attempts. Starting with the general problems encountered, various effects, such as polarization, thermal, and media effects, are covered. The roles of independent tip-sample distance control tools in the relevant situations are then described. Finally, progress on the fluid cell aspect is summarized.

  2. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem. A controller structure is defined using this as the basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, then determine what changes to the process input or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion in which a tradeoff should be made, and make input changes to improve all other criteria. The process is not operating at an optimal point in any sense if a new operating point can be reached without making any tradeoff. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point. The multiobjective algorithm was implemented in two different injection molding scenarios: tuning of process controllers to meet specified performance objectives and tuning of process inputs to meet specified quality objectives. Five case studies are presented.

  3. Reliable Control Using Disturbance Observer and Equivalent Transfer Function for Position Servo System in Current Feedback Loop Failure

    NASA Astrophysics Data System (ADS)

    Ishikawa, Kaoru; Nakamura, Taro; Osumi, Hisashi

    A reliable control method is proposed for multiple-loop control systems. After a feedback loop failure, such as in the case of a sensor breakdown, the control system becomes unstable and exhibits large fluctuations even if it has a disturbance observer. To cope with this problem, the proposed method uses an equivalent transfer function (ETF) as active redundancy compensation after the loop failure. The ETF is designed so that it does not change the transfer function of the whole system before and after the loop failure. In this paper, the characteristics of a reliable control system that uses an ETF and a disturbance observer are examined in an experiment that uses a DC servo motor with a current feedback loop failure in the position servo system.

  4. An automated methodology development. [software design for combat simulation]

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  5. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies.

    PubMed

    Madill, A; Jordan, A; Shirley, C

    2000-02-01

    The effect of the individual analyst on research findings can create a credibility problem for qualitative approaches from the perspective of evaluative criteria utilized in quantitative psychology. This paper explicates the ways in which objectivity and reliability are understood in qualitative analysis conducted from within three distinct epistemological frameworks: realism, contextual constructionism, and radical constructionism. It is argued that quality criteria utilized in quantitative psychology are appropriate to the evaluation of qualitative analysis only to the extent that it is conducted within a naive or scientific realist framework. The discussion is illustrated with reference to the comparison of two independent grounded theory analyses of identical material. An implication of this illustration is to identify the potential to develop a radical constructionist strand of grounded theory.

  6. Design guidelines for use of adhesives and organic coatings in hybrid microcircuits

    NASA Technical Reports Server (NTRS)

    Caruso, S. V.; Licari, J. J.; Perkins, K. L.; Schramm, W. A.

    1974-01-01

    A study was conducted to investigate the reliability of organic adhesives in hybrid microcircuits. The objectives were twofold: (1) to identify and investigate problem areas that could result from the use of organic adhesives and (2) to develop evaluation tests to quantify the extent to which these problems occur for commercially available adhesives. Efforts were focused on electrically conductive adhesives. Also, a study was made to evaluate selected organic coatings for contamination protection for hybrid microcircuits.

  7. Smart Acoustic Network Using Combined FSK-PSK, Adaptive Beamforming and Equalization

    DTIC Science & Technology

    2002-09-30

    sonar data transmission from underwater vehicle during mission. The two-year objectives for the high-reliability acoustic network using multiple... sonar laboratory) and used for acoustic networking during underwater vehicle operation. The joint adaptive coherent path beamformer method consists...broadband communications transducer, while the low noise preamplifier conditions received signals for analog to digital conversion. External user

  8. Smart Acoustic Network Using Combined FSK-PSK, Adaptive, Beamforming and Equalization

    DTIC Science & Technology

    2001-09-30

    sonar data transmission from underwater vehicle during mission. The two-year objectives for the high-reliability acoustic network using multiple... sonar laboratory) and used for acoustic networking during underwater vehicle operation. The joint adaptive coherent path beamformer method consists...broadband communications transducer, while the low noise preamplifier conditions received signals for analog to digital conversion. External user

  9. High-quality slab-based intermixing method for fusion rendering of multiple medical objects.

    PubMed

    Kim, Dong-Joon; Kim, Bohyoung; Lee, Jeongjin; Shin, Juneseuk; Kim, Kyoung Won; Shin, Yeong-Gil

    2016-01-01

    The visualization of multiple 3D objects has been increasingly required for recent applications in medical fields. Due to heterogeneity in data representation or data configuration, it is difficult to efficiently render multiple medical objects in high quality. In this paper, we present a novel intermixing scheme for fusion rendering of multiple medical objects while preserving real-time performance. First, we present an in-slab visibility interpolation method for the representation of subdivided slabs. Second, we introduce virtual zSlab, which extends an infinitely thin boundary (such as polygonal objects) into a slab with a finite thickness. Finally, based on virtual zSlab and in-slab visibility interpolation, we propose a slab-based visibility intermixing method with a newly proposed rendering pipeline. Experimental results demonstrate that the proposed method delivers more effective multiple-object renderings in terms of rendering quality, compared to conventional approaches. The proposed intermixing scheme provides high-quality intermixing results for the visualization of intersecting and overlapping surfaces by resolving aliasing and z-fighting problems. Moreover, two case studies are presented that apply the proposed method to real clinical applications. These case studies demonstrate that the proposed method has outstanding advantages in rendering independence and reusability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
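    The basic operation that slab-based visibility intermixing refines is front-to-back alpha compositing; a generic sketch with invented samples follows (the paper's in-slab visibility interpolation and virtual zSlab are not reproduced here):

    ```python
    # Generic front-to-back compositing of (color, opacity) samples along a ray.
    import numpy as np

    def composite_front_to_back(samples):
        """samples: iterable of (RGB color, opacity), ordered front to back."""
        color = np.zeros(3)
        alpha = 0.0
        for c, a in samples:
            color += (1.0 - alpha) * a * np.asarray(c)   # remaining visibility times contribution
            alpha += (1.0 - alpha) * a
            if alpha > 0.99:                             # early ray termination
                break
        return color, alpha

    print(composite_front_to_back([([1, 0, 0], 0.4), ([0, 1, 0], 0.5), ([0, 0, 1], 0.8)]))
    ```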

  10. Behavioral Modeling of Adversaries with Multiple Objectives in Counterterrorism.

    PubMed

    Mazicioglu, Dogucan; Merrick, Jason R W

    2018-05-01

    Attacker/defender models have primarily assumed that each decisionmaker optimizes the cost of the damage inflicted and its economic repercussions from their own perspective. Two streams of recent research have sought to extend such models. One stream suggests that it is more realistic to consider attackers with multiple objectives, but this research has not included the adaption of the terrorist with multiple objectives to defender actions. The other stream builds off experimental studies that show that decisionmakers deviate from optimal rational behavior. In this article, we extend attacker/defender models to incorporate multiple objectives that a terrorist might consider in planning an attack. This includes the tradeoffs that a terrorist might consider and their adaption to defender actions. However, we must also consider experimental evidence of deviations from the rationality assumed in the commonly used expected utility model in determining such adaption. Thus, we model the attacker's behavior using multiattribute prospect theory to account for the attacker's multiple objectives and deviations from rationality. We evaluate our approach by considering an attacker with multiple objectives who wishes to smuggle radioactive material into the United States and a defender who has the option to implement a screening process to hinder the attacker. We discuss the problems with implementing such an approach, but argue that research in this area must continue to avoid misrepresenting terrorist behavior in determining optimal defensive actions. © 2017 Society for Risk Analysis.

  11. Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John N.

    1997-01-01

    A multidisciplinary design optimization procedure has been developed that couples formal multiobjective techniques with complex analysis procedures, such as computational fluid dynamics (CFD) codes. The procedure has been demonstrated on a specific high-speed flow application involving aerodynamics and acoustics (sonic boom minimization). To account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to pose the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure transforms a constrained, multiple-objective-function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are applied to each objective function during the transformation, giving the designer the capability to emphasize specific design objectives during the optimization process. The demonstration uses a CFD code that solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, together with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique is used within the optimizer to improve the overall computational efficiency of the procedure, making it suitable for design applications in an industrial setting.
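
    The K-S aggregation step can be made concrete with a short sketch. The weight values, objective values, and ρ setting below are hypothetical; this is a minimal illustration of the envelope function, not the NASA implementation.

    ```python
    import numpy as np

    def ks_aggregate(f_values, rho=50.0):
        """Kreisselmeier-Steinhauser envelope: a smooth, differentiable
        approximation of max(f_i), suitable for gradient-based methods
        such as BFGS.  KS = f_max + (1/rho) * ln(sum_i exp(rho*(f_i - f_max)));
        larger rho follows the true maximum more closely."""
        f = np.asarray(f_values, dtype=float)
        f_max = f.max()  # shift the exponent for numerical stability
        return f_max + np.log(np.exp(rho * (f - f_max)).sum()) / rho

    # Three normalized design objectives with emphasis weights (hypothetical).
    weights = np.array([1.0, 0.5, 2.0])
    objectives = np.array([0.8, 0.3, 0.6])
    print(ks_aggregate(weights * objectives))  # single unconstrained objective
    ```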

  12. Predicting MHC-II binding affinity using multiple instance regression

    PubMed Central

    EL-Manzalawy, Yasser; Dobbs, Drena; Honavar, Vasant

    2011-01-01

    Reliably predicting the ability of antigen peptides to bind to major histocompatibility complex class II (MHC-II) molecules is an essential step in developing new vaccines. Uncovering the amino acid sequence correlates of the binding affinity of MHC-II binding peptides is important for understanding pathogenesis and immune response. The task of predicting MHC-II binding peptides is complicated by the significant variability in their length. Most existing computational methods for predicting MHC-II binding peptides focus on identifying a nine-amino-acid core region in each binding peptide. We formulate the problems of qualitatively and quantitatively predicting flexible length MHC-II peptides as multiple instance learning and multiple instance regression problems, respectively. Based on this formulation, we introduce MHCMIR, a novel method for predicting MHC-II binding affinity using multiple instance regression. We present results of experiments using several benchmark datasets that show that MHCMIR is competitive with the state-of-the-art methods for predicting MHC-II binding peptides. An online web server that implements the MHCMIR method for MHC-II binding affinity prediction is freely accessible at http://ailab.cs.iastate.edu/mhcmir. PMID:20855923
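
    The multiple instance formulation can be sketched in a few lines: each peptide is a bag whose instances are its 9-mer windows, and under the common "primary instance" assumption the bag's affinity is carried by its best-scoring window. The helper names and the max-aggregation rule below are illustrative assumptions, not MHCMIR's actual model.

    ```python
    def peptide_windows(seq, k=9):
        """Instances of a bag: all k-mer windows of a variable-length peptide."""
        return [seq[i:i + k] for i in range(len(seq) - k + 1)]

    def bag_affinity(seq, score_9mer):
        """Bag-level prediction: the best instance score stands in for the
        binding core's contribution to the peptide's affinity."""
        return max(score_9mer(w) for w in peptide_windows(seq))

    # Toy instance scorer (hypothetical): count hydrophobic residues.
    score = lambda w: sum(c in "AVLIFMW" for c in w)
    print(bag_affinity("GIKEWLVLAAGTSAK", score))
    ```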

  13. Reliability of Classifying Multiple Sclerosis Disease Activity Using Magnetic Resonance Imaging in a Multiple Sclerosis Clinic

    PubMed Central

    Altay, Ebru Erbayat; Fisher, Elizabeth; Jones, Stephen E.; Hara-Cleaver, Claire; Lee, Jar-Chi; Rudick, Richard A.

    2013-01-01

    Objective To assess the reliability of new magnetic resonance imaging (MRI) lesion counts by clinicians in a multiple sclerosis specialty clinic. Design An observational study. Setting A multiple sclerosis specialty clinic. Patients Eighty-five patients with multiple sclerosis participating in a National Institutes of Health–supported longitudinal study were included. Intervention Each patient had a brain MRI scan at entry and 6 months later using a standardized protocol. Main Outcome Measures The number of new T2 lesions, newly enlarging T2 lesions, and gadolinium-enhancing lesions were measured on the 6-month MRI using a computer-based image analysis program for the original study. For this study, images were reanalyzed by an expert neuroradiologist and 3 clinician raters. The neuroradiologist evaluated the original image pairs; the clinicians evaluated image pairs that were modified to simulate clinical practice. New lesion counts were compared across raters, as was classification of patients as MRI active or inactive. Results Agreement on lesion counts was highest for gadolinium-enhancing lesions, intermediate for new T2 lesions, and poor for enlarging T2 lesions. In 18% to 25% of the cases, MRI activity was classified differently by the clinician raters compared with the neuroradiologist or computer program. Variability among the clinical raters for estimates of new T2 lesions was affected most strongly by the image modifications that simulated low image quality and different head position. Conclusions Between-rater variability in new T2 lesion counts may be reduced by improved standardization of image acquisitions, but this approach may not be practical in most clinical environments. Ultimately, more reliable, robust, and accessible image analysis methods are needed for accurate multiple sclerosis disease-modifying drug monitoring and decision making in the routine clinic setting. PMID:23599930

  14. Acoustic and elastic waveform inversion best practices

    NASA Astrophysics Data System (ADS)

    Modrak, Ryan T.

    Reaching the global minimum of a waveform misfit function requires careful choices about the nonlinear optimization, preconditioning and regularization methods underlying an inversion. Because waveform inversion problems are susceptible to erratic convergence, one or two test cases are not enough to reliably inform such decisions. We identify best practices instead using two global, one regional and four near-surface acoustic test problems. To obtain meaningful quantitative comparisons, we carry out hundreds of acoustic inversions, varying one aspect of the implementation at a time. Comparing nonlinear optimization algorithms, we find that L-BFGS provides computational savings over nonlinear conjugate gradient methods in a wide variety of test cases. Comparing preconditioners, we show that a new diagonal scaling derived from the adjoint of the forward operator provides better performance than two conventional preconditioning schemes. Comparing regularization strategies, we find that projection, convolution, Tikhonov regularization, and total variation regularization are effective in different contexts. Beyond these issues, reliability and efficiency in waveform inversion depend on close numerical attention and care. Implementation details have a strong effect on computational cost, regardless of the chosen material parameterization or nonlinear optimization algorithm. Building on the acoustic inversion results, we carry out elastic experiments with four test problems, three objective functions, and four material parameterizations. The choice of parameterization for isotropic elastic media is found to be more complicated than previous studies suggest, with "wavespeed-like" parameters performing well with phase-based objective functions and Lame parameters performing well with amplitude-based objective functions. Reliability and efficiency can be even harder to achieve in transversely isotropic elastic inversions because the rotation angle parameters describing the fast-axis direction are difficult to recover. Using Voigt or Chen-Tromp parameters avoids the need to include rotation angles explicitly and provides an effective strategy for anisotropic inversion. The need for flexible and portable workflow management tools for seismic inversion also poses a major challenge. In a final chapter, the software used to carry out the above experiments is described and instructions for reproducing experimental results are given.
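
    As a minimal illustration of the quasi-Newton machinery compared in the study, the sketch below runs SciPy's L-BFGS-B on a least-squares misfit with an analytic gradient. The random linear operator is a stand-in for a wave-equation forward solver, and all names are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    G = rng.standard_normal((200, 50))   # toy linear forward operator
    m_true = rng.standard_normal(50)
    d_obs = G @ m_true                   # synthetic "observed" waveforms

    def misfit(m):
        r = G @ m - d_obs                # data residual
        return 0.5 * r @ r

    def misfit_grad(m):
        return G.T @ (G @ m - d_obs)     # plays the role of an adjoint-state gradient

    result = minimize(misfit, np.zeros(50), jac=misfit_grad, method="L-BFGS-B")
    print(result.nit, misfit(result.x))  # iterations used, final misfit
    ```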

  15. Photographs and Committees: Activities That Help Students Discover Permutations and Combinations.

    ERIC Educational Resources Information Center

    Szydlik, Jennifer Earles

    2000-01-01

    Presents problem situations that support students when discovering the multiplication principle, permutations, combinations, Pascal's triangle, and relationships among those objects in a concrete context. (ASK)

  16. Evaluation of the Validity and Reliability of the Waterlow Pressure Ulcer Risk Assessment Scale.

    PubMed

    Charalambous, Charalambos; Koulori, Agoritsa; Vasilopoulos, Aristidis; Roupa, Zoe

    2018-04-01

    Prevention is the ideal strategy to tackle the problem of pressure ulcers. Pressure ulcer risk assessment scales are one of the most pivotal measures applied to tackle the problem, much criticisms has been developed regarding the validity and reliability of these scales. To investigate the validity and reliability of the Waterlow pressure ulcer risk assessment scale. The methodology used is a narrative literature review, the bibliography was reviewed through Cinahl, Pubmed, EBSCO, Medline and Google scholar, 26 scientific articles where identified. The articles where chosen due to their direct correlation with the objective under study and their scientific relevance. The construct and face validity of the Waterlow appears adequate, but with regards to content validity changes in the category age and gender can be beneficial. The concurrent validity cannot be assessed. The predictive validity of the Waterlow is characterized by high specificity and low sensitivity. The inter-rater reliability has been demonstrated to be inadequate, this may be due to lack of clear definitions within the categories and differentiating level of knowledge between the users. Due to the limitations presented regarding the validity and reliability of the Waterlow pressure ulcer risk assessment scale, the scale should be used in conjunction with clinical assessment to provide optimum results.

  17. Multiobjective evolutionary optimization of water distribution systems: Exploiting diversity with infeasible solutions.

    PubMed

    Tanyimboh, Tiku T; Seyoum, Alemtsehay G

    2016-12-01

    This article investigates the computational efficiency of constraint handling in multi-objective evolutionary optimization algorithms for water distribution systems. The methodology investigated here encourages the co-existence and simultaneous development, including crossbreeding, of subpopulations of cost-effective feasible and infeasible solutions based on Pareto dominance. This yields a boundary search approach that also promotes diversity in the gene pool throughout the progress of the optimization by exploiting the full spectrum of non-dominated infeasible solutions. The relative effectiveness of small and moderate population sizes with respect to the number of decision variables is also investigated. The results reveal the optimization algorithm to be efficient, stable and robust. It found optimal and near-optimal solutions reliably and efficiently. The optimization problem, based on a real-world system, involved multiple variable head supply nodes, 29 fire-fighting flows, extended period simulation and multiple demand categories including water loss. The least cost solutions found satisfied the flow and pressure requirements consistently. The best solutions achieved indicative savings of 48.1% and 48.2% based on the cost of the pipes in the existing network, for populations of 200 and 1000, respectively. The population of 1000 achieved slightly better results overall. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
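
    The boundary-search idea of ranking feasible and infeasible solutions separately by Pareto dominance can be sketched as follows. The tuple layout and the choice of appending the constraint violation as an extra objective for the infeasible subpopulation are illustrative assumptions, not the authors' exact operators.

    ```python
    def dominates(a, b):
        """Pareto dominance for minimization: no worse in every objective,
        strictly better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def nondominated_subpopulations(population):
        """Each individual is (objectives_tuple, constraint_violation).
        Feasible and infeasible members co-exist; infeasible ones are
        ranked with their violation appended as an additional objective."""
        def front(group, key):
            return [p for p in group
                    if not any(dominates(key(q), key(p)) for q in group if q is not p)]
        feasible = [p for p in population if p[1] == 0.0]
        infeasible = [p for p in population if p[1] > 0.0]
        return (front(feasible, lambda p: p[0]),
                front(infeasible, lambda p: p[0] + (p[1],)))

    pop = [((500.0, 2.0), 0.0), ((450.0, 3.0), 0.0), ((400.0, 2.5), 1.2)]
    print(nondominated_subpopulations(pop))
    ```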

  18. Clinical and Epidemiological Aspects of Multiple Sclerosis in Children

    PubMed Central

    NASEHI, Mohammad Mehdi; SAHRAIAN, Mohammad Ali; NASER MOGHADASI, Abdorreza; GHOFRANI, Mohammad; ASHTARI, Fereshteh; TAGHDIRI, Mohammad Mahdi; TONEKABONI, Seyed Hassan; KARIMZADEH, Parvaneh; AFSHARI, Mahdi; MOOSAZADEH, Mahmood

    2017-01-01

    Objective Overall, 2%-5% of patients with multiple sclerosis (MS) experience the first episode of disease before the age of 18. Since the age of onset among children differs from that in the general population, clinicians often fail to diagnose the disease early. This study aimed to determine the epidemiological and clinical patterns of MS among Iranian children. Materials & Methods In this cross-sectional study, carried out in Iran in 2014-2015, information was collected using a checklist with approved reliability and validity. The sampling method was consensus. Data were analyzed using frequency, mean, and standard deviation indices by means of SPSS ver. 20 software. Results Totally, 177 children with MS were investigated; 75.7% of them were female. The mean (SD), minimum, and maximum ages of subjects were 15.9 (2), 7, and 18 yr, respectively. The most reported symptoms were sensory (28.2%), motor (29.4%), diplopia (20.3%) and visual (32.8%). Primary MRI results showed periventricular (91.5%) and spinal cord (53.1%) lesions. Conclusion MS is significantly more common among women. The most common age of onset is during the second decade. Visual and motor problems are the most common symptoms, while periventricular and spinal cord lesions are the most common MRI findings. PMID:28698726

  19. A Method for the Microanalysis of Pre-Algebra Transfer

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Yudelson, Michael; Koedinger, Kenneth R.

    2011-01-01

    The objective of this research was to better understand the transfer of learning between different variations of pre-algebra problems. While the authors could have addressed a specific variation that might produce transfer, they were interested in developing a general model of transfer, so they gathered data from multiple problem types and their…

  20. The quandaries and promise of risk management: a scientist's perspective on integration of science and management.

    Treesearch

    B.G. Marcot

    2007-01-01

    This paper briefly lists constraints and problems of traditional approaches to natural resource risk analysis and risk management. Such problems include disparate definitions of risk, multiple and conflicting objectives and decisions, conflicting interpretations of uncertainty, and failure of articulating decision criteria, risk attitudes, modeling assumptions, and...

  1. Multiple object tracking with non-unique data-to-object association via generalized hypothesis testing. [tracking several aircraft near each other or ships at sea

    NASA Technical Reports Server (NTRS)

    Porter, D. W.; Lefler, R. M.

    1979-01-01

    A generalized hypothesis testing approach is applied to the problem of tracking several objects where several different associations of data with objects are possible. Such problems occur, for instance, when attempting to distinctly track several aircraft maneuvering near each other or when tracking ships at sea. Conceptually, the problem is solved by first associating data with objects in a statistically reasonable fashion and then tracking with a bank of Kalman filters. The objects are assumed to have motion characterized by a fixed but unknown deterministic portion plus a random process portion modeled by a shaping filter. For example, the object might be assumed to have a mean straight line path about which it maneuvers in a random manner. Several hypothesized associations of data with objects are possible because of ambiguity as to which object the data comes from, false alarm/detection errors, and possible uncertainty in the number of objects being tracked. The statistical likelihood function is computed for each possible hypothesized association of data with objects. Then the generalized likelihood is computed by maximizing the likelihood over parameters that define the deterministic motion of the object.
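
    A stripped-down version of the hypothesis-scoring step: each hypothesized one-to-one association of measurements with tracked objects is scored by the Gaussian innovation likelihood of the objects' Kalman filters, and the maximum-likelihood hypothesis wins. False alarms and an unknown object count, which the paper also handles, are omitted here, and the data layout is a hypothetical simplification.

    ```python
    import numpy as np
    from itertools import permutations

    def innovation_loglik(z, x_pred, H, S):
        """Log-likelihood of measurement z under a track's Kalman prediction,
        with innovation covariance S = H P H^T + R."""
        v = z - H @ x_pred
        _, logdet = np.linalg.slogdet(2.0 * np.pi * S)
        return -0.5 * (logdet + v @ np.linalg.solve(S, v))

    def best_association(measurements, tracks):
        """Enumerate one-to-one data-to-object hypotheses and keep the one
        maximizing the joint likelihood."""
        best, best_ll = None, -np.inf
        for perm in permutations(range(len(measurements))):
            ll = sum(innovation_loglik(measurements[j], t["x"], t["H"], t["S"])
                     for j, t in zip(perm, tracks))
            if ll > best_ll:
                best, best_ll = perm, ll
        return best, best_ll

    H = np.eye(2)
    tracks = [{"x": np.array([0.0, 0.0]), "H": H, "S": np.eye(2)},
              {"x": np.array([5.0, 5.0]), "H": H, "S": np.eye(2)}]
    meas = [np.array([4.8, 5.1]), np.array([0.2, -0.1])]
    print(best_association(meas, tracks))  # measurement 1 -> track 0, 0 -> track 1
    ```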

  2. An efficient and accurate solution methodology for bilevel multi-objective programming problems using a hybrid evolutionary-local-search algorithm.

    PubMed

    Deb, Kalyanmoy; Sinha, Ankur

    2010-01-01

    Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
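
    To see why the nested procedure criticized above is computationally expensive, consider a minimal single-objective bilevel sketch: every upper-level trial triggers a full lower-level optimization. The toy objectives are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def lower_level_optimum(xu):
        """Follower's rational reaction: minimize its own objective for the
        given upper-level decision xu (one inner solve per outer trial)."""
        return minimize_scalar(lambda xl: (xl - xu) ** 2).x

    def upper_objective(xu):
        xl = lower_level_optimum(xu)          # the nested, costly step
        return (xu - 1.0) ** 2 + (xl + 1.0) ** 2

    # Brute-force leader search: 201 outer trials => 201 inner optimizations.
    best = min(np.linspace(-2.0, 2.0, 201), key=upper_objective)
    print(best, lower_level_optimum(best))
    ```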

  3. Motion and force control of multiple robotic manipulators

    NASA Technical Reports Server (NTRS)

    Wen, John T.; Kreutz-Delgado, Kenneth

    1992-01-01

    This paper addresses the motion and force control problem of multiple robot arms manipulating a cooperatively held object. A general control paradigm is introduced which decouples the motion and force control problems. For motion control, different control strategies are constructed based on the variables used as the control input in the controller design. There are three natural choices: acceleration of a generalized coordinate, arm tip force vectors, and the joint torques. The first two choices require full model information but produce simple models for the control design problem. The last choice results in a class of relatively model-independent control laws by exploiting the Hamiltonian structure of the open loop system. The motion control only determines the joint torque to within a manifold, due to the multiple-arm kinematic constraint. To resolve the nonuniqueness of the joint torques, two methods are introduced. If the arm and object models are available, an optimization can be performed to best allocate the desired end-effector control force to the joint actuators. The other possibility is to control the internal force about some set point. It is shown that effective force regulation can be achieved even if little model information is available.

  4. Adaptive object tracking via both positive and negative models matching

    NASA Astrophysics Data System (ADS)

    Li, Shaomei; Gao, Chao; Wang, Yawen

    2015-03-01

    To address the tracking drift that often occurs in adaptive tracking, an algorithm based on the fusion of tracking and detection is proposed in this paper. First, object tracking is posed as a binary classification problem and is modeled by partial least squares (PLS) analysis. Second, the object is tracked frame by frame via particle filtering. Third, tracking reliability is validated by matching against both positive and negative models. Finally, when drift occurs, the object is relocated based on SIFT feature matching and voting, and the object appearance model is updated at the same time. The algorithm can not only sense tracking drift but also relocate the object whenever needed. Experimental results demonstrate that this algorithm outperforms state-of-the-art algorithms on many challenging sequences.

  5. An adaptive evolutionary multi-objective approach based on simulated annealing.

    PubMed

    Li, H; Landa-Silva, D

    2011-01-01

    A multi-objective optimization problem can be solved by decomposing it into one or more single objective subproblems in some multi-objective metaheuristic algorithms. Each subproblem corresponds to one weighted aggregation function. For example, MOEA/D is an evolutionary multi-objective optimization (EMO) algorithm that attempts to optimize multiple subproblems simultaneously by evolving a population of solutions. However, the performance of MOEA/D highly depends on the initial setting and diversity of the weight vectors. In this paper, we present an improved version of MOEA/D, called EMOSA, which incorporates an advanced local search technique (simulated annealing) and adapts the search directions (weight vectors) corresponding to various subproblems. In EMOSA, the weight vector of each subproblem is adaptively modified at the lowest temperature in order to diversify the search toward the unexplored parts of the Pareto-optimal front. Our computational results show that EMOSA outperforms six other well established multi-objective metaheuristic algorithms on both the (constrained) multi-objective knapsack problem and the (unconstrained) multi-objective traveling salesman problem. Moreover, the effects of the main algorithmic components and parameter sensitivities on the search performance of EMOSA are experimentally investigated.
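
    The two ingredients of EMOSA named above, scalarizing subproblems with weight vectors and accepting moves by a simulated-annealing rule, can be sketched compactly. The weighted-sum aggregation and the acceptance rule below are standard textbook forms and only approximate the paper's operators.

    ```python
    import math, random

    def weighted_sum(objs, w):
        """Aggregation function of one subproblem: g(x | w) = sum_k w_k * f_k(x)."""
        return sum(wi * fi for wi, fi in zip(w, objs))

    def sa_accept(curr_objs, cand_objs, w, temperature):
        """Annealing rule on the scalarized subproblem: always accept
        improvements; accept worse moves with probability exp(-delta/T),
        which shrinks as the temperature is lowered."""
        delta = weighted_sum(cand_objs, w) - weighted_sum(curr_objs, w)
        return delta <= 0 or random.random() < math.exp(-delta / temperature)

    print(sa_accept((1.0, 2.0), (1.2, 1.9), w=(0.5, 0.5), temperature=0.1))
    ```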

  6. Balancing Exploration, Uncertainty Representation and Computational Time in Many-Objective Reservoir Policy Optimization

    NASA Astrophysics Data System (ADS)

    Zatarain-Salazar, J.; Reed, P. M.; Quinn, J.; Giuliani, M.; Castelletti, A.

    2016-12-01

    As we confront the challenges of managing river basin systems with a large number of reservoirs and increasingly uncertain tradeoffs impacting their operations (due to, e.g. climate change, changing energy markets, population pressures, ecosystem services, etc.), evolutionary many-objective direct policy search (EMODPS) solution strategies will need to address the computational demands associated with simulating more uncertainties and therefore optimizing over increasingly noisy objective evaluations. Diagnostic assessments of state-of-the-art many-objective evolutionary algorithms (MOEAs) to support EMODPS have highlighted that search time (or number of function evaluations) and auto-adaptive search are key features for successful optimization. Furthermore, auto-adaptive MOEA search operators are themselves sensitive to having a sufficient number of function evaluations to learn successful strategies for exploring complex spaces and for escaping from local optima when stagnation is detected. Fortunately, recent parallel developments allow coordinated runs that enhance auto-adaptive algorithmic learning and can handle scalable and reliable search with limited wall-clock time, but at the expense of the total number of function evaluations. In this study, we analyze this tradeoff between parallel coordination and depth of search using different parallelization schemes of the Multi-Master Borg on a many-objective stochastic control problem. We also consider the tradeoff between better representing uncertainty in the stochastic optimization, and simplifying this representation to shorten the function evaluation time and allow for greater search. Our analysis focuses on the Lower Susquehanna River Basin (LSRB) system where multiple competing objectives for hydropower production, urban water supply, recreation and environmental flows need to be balanced. Our results provide guidance for balancing exploration, uncertainty, and computational demands when using the EMODPS framework to discover key tradeoffs within the LSRB system.

  7. Addressing subjective decision-making inherent in GLUE-based multi-criteria rainfall-runoff model calibration

    NASA Astrophysics Data System (ADS)

    Shafii, Mahyar; Tolson, Bryan; Shawn Matott, L.

    2015-04-01

    GLUE is one of the most commonly used informal methodologies for uncertainty estimation in hydrological modelling. Despite the ease-of-use of GLUE, it involves a number of subjective decisions such as the strategy for identifying the behavioural solutions. This study evaluates the impact of behavioural solution identification strategies in GLUE on the quality of model output uncertainty. Moreover, two new strategies are developed to objectively identify behavioural solutions. The first strategy considers Pareto-based ranking of parameter sets, while the second one is based on ranking the parameter sets based on an aggregated criterion. The proposed strategies, as well as the traditional strategies in the literature, are evaluated with respect to reliability (coverage of observations by the envelope of model outcomes) and sharpness (width of the envelope of model outcomes) in different numerical experiments. These experiments include multi-criteria calibration and uncertainty estimation of three rainfall-runoff models with different number of parameters. To demonstrate the importance of behavioural solution identification strategy more appropriately, GLUE is also compared with two other informal multi-criteria calibration and uncertainty estimation methods (Pareto optimization and DDS-AU). The results show that the model output uncertainty varies with the behavioural solution identification strategy, and furthermore, a robust GLUE implementation would require considering multiple behavioural solution identification strategies and choosing the one that generates the desired balance between sharpness and reliability. The proposed objective strategies prove to be the best options in most of the case studies investigated in this research. Implementing such an approach for a high-dimensional calibration problem enables GLUE to generate robust results in comparison with Pareto optimization and DDS-AU.
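
    The reliability and sharpness criteria used to compare behavioural-selection strategies map directly onto code. Below is a minimal sketch of one strategy, a likelihood threshold, with the two evaluation metrics as defined in the text; the synthetic data and the 5-95% envelope are assumptions.

    ```python
    import numpy as np

    def glue_uncertainty(sims, likelihoods, threshold, obs):
        """sims: (n_sets, n_time) outputs, one row per parameter set.
        Behavioural sets exceed the likelihood threshold; their envelope
        gives the GLUE prediction bounds."""
        behavioural = sims[likelihoods >= threshold]
        lo = np.percentile(behavioural, 5, axis=0)
        hi = np.percentile(behavioural, 95, axis=0)
        reliability = np.mean((obs >= lo) & (obs <= hi))  # coverage of observations
        sharpness = np.mean(hi - lo)                      # mean envelope width
        return lo, hi, reliability, sharpness

    rng = np.random.default_rng(1)
    sims = rng.normal(0.0, 1.0, size=(500, 30)).cumsum(axis=1)
    lik = rng.uniform(size=500)
    obs = sims[lik.argmax()]  # pretend the highest-likelihood set generated obs
    print(glue_uncertainty(sims, lik, np.quantile(lik, 0.9), obs)[2:])
    ```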

  8. Cross-cultural adaptation and validation to Brazil of the Obesity-related Problems Scale

    PubMed Central

    Brasil, Andreia Mara Brolezzi; Brasil, Fábio; Maurício, Angélica Aparecida; Vilela, Regina Maria

    2017-01-01

    ABSTRACT Objective To validate a reliable version of the Obesity-related Problems Scale in Portuguese for use in Brazil. Methods The Obesity-related Problems Scale was translated and transculturally adapted. It was then self-applied, together with a 12-item version of the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), to 50 obese patients and 50 non-obese individuals, and applied again to half of them after 14 days. Results The Obesity-related Problems Scale was able to differentiate obese from non-obese individuals with higher accuracy than WHODAS 2.0, correlating with this scale and with body mass index. The factor analysis determined a two-dimensional structure, which was confirmed with χ²/df=1.81, SRMR=0.05, and CFI=0.97. The general α coefficient was 0.90, and the inter-item intra-class correlation on reapplication ranged from 0.75 to 0.87. Conclusion The scale proved to be valid and reliable for use in the Brazilian population, without the need to exclude items. PMID:29091155

  9. Validating Neuro-QoL short forms and targeted scales with people who have multiple sclerosis.

    PubMed

    Miller, Deborah M; Bethoux, Francois; Victorson, David; Nowinski, Cindy J; Buono, Sarah; Lai, Jin-Shei; Wortman, Katy; Burns, James L; Moy, Claudia; Cella, David

    2016-05-01

    Multiple sclerosis (MS) is a chronic, progressive, and disabling disease of the central nervous system with dramatic variations in the combination and severity of symptoms it can produce. The lack of reliable disease-specific health-related quality of life (HRQL) measures for use in clinical trials prompted the development of the Neurology Quality of Life (Neuro-QOL) instrument, which includes 13 scales that assess physical, emotional, cognitive, and social domains, for use in a variety of neurological illnesses. The objective of this research paper is to conduct an initial assessment of the reliability and validity of the Neuro-QOL short forms (SFs) in MS. We assessed reliability, concurrent validity, known groups validity, and responsiveness between cross-sectional and longitudinal data in 161 recruited MS patients. Internal consistency was high for all measures (α = 0.81-0.95) and ICCs were within the acceptable range (0.76-0.91); concurrent and known groups validity were highest with the Global HRQL question. Longitudinal assessment was limited by the lack of disease progression in the group. The Neuro-QOL SFs demonstrate good internal consistency, test-retest reliability, and concurrent and known groups validity in this MS population, supporting the validity of Neuro-QOL in adults with MS. © The Author(s), 2015.

  10. Chapter 6: The scientific basis for conserving forest carnivores: considerations for management

    Treesearch

    L. Jack Lyon; Keith B. Aubry; William J. Zielinski; Steven W. Buskirk; Leonard F. Ruggiero

    1994-01-01

    The reviews presented in previous chapters reveal substantial gaps in our knowledge about marten, fisher, lynx, and wolverine. These gaps severely constrain our ability to design reliable conservation strategies. This problem will be explored in depth in Chapter 7. In this chapter, our objective is to discuss management considerations resulting from what we currently...

  11. African American Dementia Caregiver Problem Inventory: Descriptive analysis and initial psychometric evaluation.

    PubMed

    Wells, Brittny A; Glueckauf, Robert L; Bernabe, Daniel; Kazmer, Michelle M; Schettini, Gabriel; Springer, Jane; Sharma, Dinesh; Meng, Hongdao; Willis, Floyd B; Graff-Radford, Neill

    2017-02-01

    The primary objectives of the present study were: (a) to develop the African American Dementia Caregiver Problem Inventory (DCPI-A) that assesses the types and frequency of problems reported by African American dementia caregivers seeking cognitive-behavioral intervention, (b) to evaluate the intercoder reliability of the DCPI-A, and (c) to measure the perceived severity of common problems reported by this caregiver population. The development of the DCPI-A was divided into 3 major steps: (a) creating an initial sample pool of caregiver problems derived from 2 parent randomized clinical trials, (b) formulating a preliminary version of the DCPI-A, and (c) finalizing the development of the DCPI-A that includes 20 problem categories with explicit coding rules, definitions, and illustrative examples. The most commonly reported caregiver problems fell into 5 major categories: (a) communication problems with care recipients, family members, and/or significant others, (b) problems with socialization, recreation, and personal enhancement time; (c) problems with physical health and health maintenance, (d) problems in managing care recipients' activities of daily living; and (e) problems with care recipients' difficult behaviors. Intercoder reliability was moderately high for both percent agreement and Cronbach's kappa. A similar positive pattern of results was obtained for the analysis of coder drift. The descriptive analysis of the types and frequency of problems of African American dementia caregivers coupled with the outcomes of the psychometric evaluation bode well for the adoption of the DCPI-A in clinical settings. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  12. A scalable parallel algorithm for multiple objective linear programs

    NASA Technical Reports Server (NTRS)

    Wiecek, Malgorzata M.; Zhang, Hong

    1994-01-01

    This paper presents an ADBASE-based parallel algorithm for solving multiple objective linear programs (MOLP's). Job balance, speedup and scalability are of primary interest in evaluating the efficiency of the new algorithm. Implementation results on Intel iPSC/2 and Paragon multiprocessors show that the algorithm significantly speeds up the process of solving MOLP's, which is understood as generating all or some efficient extreme points and unbounded efficient edges. The algorithm gives especially good results for large and very large problems. Motivation and justification for solving such large MOLP's are also included.

  13. PubMed Central

    Labrecque, M; Dostaler, L P; Dumont, H; Huard, G; Laflamme, L

    1993-01-01

    OBJECTIVE: To determine the interobserver reliability of tympanograms obtained with the MicroTymp, a portable tympanometer. SETTING: Family medicine teaching unit in a tertiary care hospital. PATIENTS: Thirty-three patients who presented to the ear, nose and throat clinic in August 1990 for an ear problem. INTERVENTION: Three residents in family medicine independently attempted to record with the MicroTymp one tympanogram for the 66 ears. We excluded the results for seven ears for which tympanograms could not be obtained. MAIN OUTCOME MEASURE: Using objective criteria, two family physicians and two residents in family medicine independently classified the 177 tympanograms into five categories (normal, possible effusion, possible perforation, possible tympano-ossicular dysfunction and unclassifiable). Reliability was estimated by means of the kappa (κ) coefficient on 161 tympanograms from 59 ears for which the interpretation of the three tympanograms agreed. MAIN RESULTS: The interpretation of the three tympanograms agreed for 34 of the 59 ears (0.58) (κ = 0.52, 95% confidence limits 0.45 and 0.59). There was no significant difference in interobserver reliability between pairs of observers or between symptomatic and asymptomatic ears. CONCLUSIONS: The interobserver reliability of the MicroTymp is moderate. The tympanograms obtained with the instrument should be interpreted in the context of the clinical findings. PMID:8431817

  14. Hybrid Optimization Parallel Search PACKage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2009-11-10

    HOPSPACK is open source software for solving optimization problems without derivatives. Application problems may have a fully nonlinear objective function, bound constraints, and linear and nonlinear constraints. Problem variables may be continuous, integer-valued, or a mixture of both. The software provides a framework that supports any derivative-free type of solver algorithm. Through the framework, solvers request parallel function evaluation, which may use MPI (multiple machines) or multithreading (multiple processors/cores on one machine). The framework provides a Cache and Pending Cache of saved evaluations that reduces execution time and facilitates restarts. Solvers can dynamically create other algorithms to solve subproblems, a useful technique for handling multiple start points and integer-valued variables. HOPSPACK ships with the Generating Set Search (GSS) algorithm, developed at Sandia as part of the APPSPACK open source software project.
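
    For readers unfamiliar with GSS-type methods, here is a minimal serial poll loop over the coordinate generating set; HOPSPACK evaluates such trial points in parallel through its framework, and this sketch does not use HOPSPACK's actual API.

    ```python
    import numpy as np

    def gss_minimize(f, x0, step=1.0, tol=1e-6, max_iter=10000):
        """Generating Set Search, serial form: poll along +/- coordinate
        directions (a generating set for R^n); move on success, contract
        the step on an unsuccessful poll, stop when the step is tiny."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        directions = np.vstack([np.eye(len(x)), -np.eye(len(x))])
        for _ in range(max_iter):
            for d in directions:
                trial = x + step * d
                ft = f(trial)
                if ft < fx:            # successful poll: accept and re-poll
                    x, fx = trial, ft
                    break
            else:                      # no direction improved: contract
                step *= 0.5
                if step < tol:
                    break
        return x, fx

    print(gss_minimize(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0]))
    ```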

  15. The Multiple Doppler Radar Workshop, November 1979.

    NASA Astrophysics Data System (ADS)

    Carbone, R. E.; Harris, F. I.; Hildebrand, P. H.; Kropfli, R. A.; Miller, L. J.; Moninger, W.; Strauch, R. G.; Doviak, R. J.; Johnson, K. W.; Nelson, S. P.; Ray, P. S.; Gilet, M.

    1980-10-01

    The findings of the Multiple Doppler Radar Workshop are summarized by a series of six papers. Part I of this series briefly reviews the history of multiple Doppler experimentation, fundamental concepts of Doppler signal theory, and organization and objectives of the Workshop. Invited presentations by dynamicists and cloud physicists are also summarized.Experimental design and procedures (Part II) are shown to be of critical importance. Well-defined and limited experimental objectives are necessary in view of technological limitations. Specified radar scanning procedures that balance temporal and spatial resolution considerations are discussed in detail. Improved siting for suppression of ground clutter as well as scanning procedures to minimize errors at echo boundaries are discussed. The need for accelerated research using numerically simulated proxy data sets is emphasized.New technology to eliminate various sampling limitations is cited as an eventual solution to many current problems in Part III. Ground clutter contamination may be curtailed by means of full spectral processing, digital filters in real time, and/or variable pulse repetition frequency. Range and velocity ambiguities also may be minimized by various pulsing options as well as random phase transmission. Sidelobe contamination can be reduced through improvements in radomes, illumination patterns, and antenna feed types. Radar volume-scan time can be sharply reduced by means of wideband transmission, phased array antennas, multiple beam antennas, and frequency agility.Part IV deals with synthesis of data from several radars in the context of scientific requirements in cumulus clouds, widespread precipitation, and severe convective storms. The important temporal and spatial scales are examined together with the accuracy required for vertical air motion in each phenomenon. Factors that introduce errors in the vertical velocity field are identified and synthesis techniques are discussed separately for the dual Doppler and multiple Doppler cases. Various filters and techniques, including statistical and variational approaches, are mentioned. Emphasis is placed on the importance of experiment design and procedures, technological improvements, incorporation of all information from supporting sensors, and analysis priority for physically simple cases. Integrated reliability is proposed as an objective tool for radar siting.Verification of multiple Doppler-derived vertical velocity is discussed in Part V. Three categories of verification are defined as direct, deductive, and theoretical/numerical. Direct verification consists of zenith-pointing radar measurements (from either airborne or ground-based systems), air motion sensing aircraft, instrumented towers, and tracking of radar chaff. Deductive sources include mesonetworks, aircraft (thermodynamic and microphysical) measurements, satellite observations, radar reflectivity, multiple Doppler consistency, and atmospheric soundings. Theoretical/numerical sources of verification include proxy data simulation, momentum checking, and numerical cloud models. New technology, principally in the form of wide bandwidth radars, is seen as a development that may reduce the need for extensive verification of multiple Doppler-derived vertical air motions. Airborne Doppler radar is perceived as the single most important source of verification within the bounds of existing technology.Nine stages of data processing and display are identified in Part VI. 
The stages are identified as field checks, archival, selection, editing, coordinate transformation, synthesis of Cartesian fields, filtering, display, and physical analysis. Display of data is considered to be a problem critical to assimilation of data at all stages. Interactive computing systems and software are concluded to be very important, particularly for the editing stage. Three- and four-dimensional displays are considered essential for data assimilation, particularly at the physical analysis stage. The concept of common data tape formats is approved both for data in radar spherical space and for synthesized Cartesian output.

  16. The medication appropriateness index at 20: where it started, where it has been, and where it may be going.

    PubMed

    Hanlon, Joseph T; Schmader, Kenneth E

    2013-11-01

    Potentially inappropriate prescribing for older adults is a major public health concern. While there are multiple measures of potentially inappropriate prescribing, the medication appropriateness index (MAI) is one of the most common implicit approaches published in the scientific literature. The objective of this narrative review is to describe findings regarding the MAI's reliability, comparison of the MAI with other quality measures of potentially inappropriate prescribing, its predictive validity with important health outcomes, and its responsiveness to change within the framework of randomized controlled trials. A search restricted to English-language literature involving humans aged 65+ years from January 1992 to June 2013 was conducted using MEDLINE and EMBASE databases using the search term 'medication appropriateness index'. A manual search of the reference lists from identified articles and the authors' article files, book chapters, and recent reviews was conducted to identify additional articles. A total of 26 articles were identified for inclusion in this narrative review. The main findings were that the MAI has acceptable inter- and intra-rater reliability, it more frequently detects potentially inappropriate prescribing than a commonly used set of explicit criteria, it predicts adverse health outcomes, and it is able to demonstrate the positive impact of interventions to improve this public health problem. We conclude that the MAI may serve as a valuable tool for measuring potentially inappropriate prescribing in older adults.

  17. Development and validation of a visual grading scale for assessing image quality of AP pelvis radiographic images

    PubMed Central

    England, Andrew; Cassidy, Simon; Eachus, Peter; Dominguez, Alejandro; Hogg, Peter

    2016-01-01

    Objective: The aim of this article was to apply psychometric theory to develop and validate a visual grading scale for assessing the visual perception of digital image quality in anteroposterior (AP) pelvis radiographs. Methods: Psychometric theory was used to guide scale development. Seven phantom and seven cadaver images of visually and objectively predetermined quality were used to help assess scale reliability and validity. 151 volunteers scored phantom images, and 184 volunteers scored cadaver images. Factor analysis and Cronbach's alpha were used to assess scale validity and reliability. Results: A 24-item scale was produced. Aggregated mean volunteer scores for each image correlated with the rank order of the visually and objectively predetermined image qualities. Scale items had good interitem correlation (≥0.2) and high factor loadings (≥0.3). Cronbach's alpha (reliability) revealed that the scale has acceptable levels of internal reliability for both phantom and cadaver images (α = 0.8 and 0.9, respectively). Factor analysis suggested that the scale is multidimensional (assessing multiple quality themes). Conclusion: This study represents the first full development and validation of a visual image quality scale using psychometric theory. It is likely that this scale will have clinical, training and research applications. Advances in knowledge: This article presents data to create and validate visual grading scales for radiographic examinations. The visual grading scale, for AP pelvis examinations, can act as a validated tool for future research, teaching and clinical evaluations of image quality. PMID:26943836
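
    The internal-consistency figures quoted above come from Cronbach's alpha, which is simple to compute; a minimal sketch with made-up item scores follows (the formula is the standard one, not code from the study).

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """scores: (n_respondents, n_items).
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    # Four volunteers scoring four scale items (hypothetical ratings).
    print(round(cronbach_alpha([[4, 5, 4, 5],
                                [3, 4, 3, 4],
                                [5, 5, 4, 5],
                                [2, 3, 2, 3]]), 2))
    ```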

  18. Rarity-weighted richness: a simple and reliable alternative to integer programming and heuristic algorithms for minimum set and maximum coverage problems in conservation planning.

    PubMed

    Albuquerque, Fabio; Beier, Paul

    2015-01-01

    Here we report that prioritizing sites in order of rarity-weighted richness (RWR) is a simple, reliable way to identify sites that represent all species in the fewest number of sites (minimum set problem) or to identify sites that represent the largest number of species within a given number of sites (maximum coverage problem). We compared the number of species represented in sites prioritized by RWR to numbers of species represented in sites prioritized by the Zonation software package for 11 datasets in which the size of individual planning units (sites) ranged from <1 ha to 2,500 km². On average, RWR solutions were more efficient than Zonation solutions. Integer programming remains the only guaranteed way to find an optimal solution, and heuristic algorithms remain superior for conservation prioritizations that consider compactness and multiple near-optimal solutions in addition to species representation. But because RWR can be implemented easily and quickly in R or a spreadsheet, it is an attractive alternative to integer programming or heuristic algorithms in some conservation prioritization contexts.
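
    RWR is simple enough to state in a few lines, which is the paper's point: each species is weighted by the reciprocal of the number of sites it occupies, each site scores the sum of the weights of its species, and sites are ranked by score. The toy presence-absence matrix below is hypothetical.

    ```python
    import numpy as np

    def rarity_weighted_richness(presence):
        """presence: (n_sites, n_species) 0/1 matrix.
        Returns site indices ranked from highest to lowest RWR."""
        presence = np.asarray(presence, dtype=float)
        rarity = 1.0 / presence.sum(axis=0)   # weight = 1 / occupied sites
        rwr = presence @ rarity               # per-site score
        return np.argsort(rwr)[::-1]

    # Site 2 holds the range-restricted third species, so it ranks first.
    sites = [[1, 1, 0],
             [1, 0, 0],
             [1, 1, 1]]
    print(rarity_weighted_richness(sites))    # -> [2 0 1]
    ```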

  19. FY04 Advanced Life Support Architecture and Technology Studies: Mid-Year Presentation

    NASA Technical Reports Server (NTRS)

    Lange, Kevin; Anderson, Molly; Duffield, Bruce; Hanford, Tony; Jeng, Frank

    2004-01-01

    Long-Term Objective: Identify optimal advanced life support system designs that meet existing and projected requirements for future human spaceflight missions. a) Include failure-tolerance, reliability, and safe-haven requirements. b) Compare designs based on multiple criteria including equivalent system mass (ESM), technology readiness level (TRL), simplicity, commonality, etc. c) Develop and evaluate new, more optimal, architecture concepts and technology applications.

  20. Implications of Preference and Problem Formulation on the Operating Policies of Complex Multi-Reservoir Systems

    NASA Astrophysics Data System (ADS)

    Quinn, J.; Reed, P. M.; Giuliani, M.; Castelletti, A.

    2016-12-01

    Optimizing the operations of multi-reservoir systems poses several challenges: 1) the high dimension of the problem's states and controls, 2) the need to balance conflicting multi-sector objectives, and 3) understanding how uncertainties impact system performance. These difficulties motivated the development of the Evolutionary Multi-Objective Direct Policy Search (EMODPS) framework, in which multi-reservoir operating policies are parameterized in a given family of functions and then optimized for multiple objectives through simulation over a set of stochastic inputs. However, properly framing these objectives remains a severe challenge and a neglected source of uncertainty. Here, we use EMODPS to optimize operating policies for a 4-reservoir system in the Red River Basin in Vietnam, exploring the consequences of optimizing to different sets of objectives related to 1) hydropower production, 2) meeting multi-sector water demands, and 3) providing flood protection to the capital city of Hanoi. We show how coordinated operation of the reservoirs can differ markedly depending on how decision makers weigh these concerns. Moreover, we illustrate how formulation choices that emphasize the mean, tail, or variability of performance across objective combinations must be evaluated carefully. Our results show that these choices can significantly improve attainable system performance, or yield severe unintended consequences. Finally, we show that satisfactory validation of the operating policies on a set of out-of-sample stochastic inputs depends as much or more on the formulation of the objectives as on effective optimization of the policies. These observations highlight the importance of carefully considering how we abstract stakeholders' objectives and of iteratively optimizing and visualizing multiple problem formulation hypotheses to ensure that we capture the most important tradeoffs that emerge from different stakeholder preferences.

  1. Toward automated formation of microsphere arrangements using multiplexed optical tweezers

    NASA Astrophysics Data System (ADS)

    Rajasekaran, Keshav; Bollavaram, Manasa; Banerjee, Ashis G.

    2016-09-01

    Optical tweezers offer certain advantages such as multiplexing using a programmable spatial light modulator, flexibility in the choice of the manipulated object and the manipulation medium, precise control, easy object release, and minimal object damage. However, automated manipulation of multiple objects in parallel, which is essential for efficient and reliable formation of micro-scale assembly structures, poses a difficult challenge. There are two primary research issues in addressing this challenge. First, the stochastic Langevin force giving rise to Brownian motion requires motion control of all manipulated objects at fast rates of several Hz. Second, the object dynamics is nonlinear and difficult to represent analytically due to the interaction of the multiple optical traps manipulating neighboring objects. As a result, automated controllers have not been realized for tens of objects, particularly for three-dimensional motions with guaranteed collision avoidance. In this paper, we model the effect of interacting optical traps on microspheres with significant Brownian motions in stationary fluid media, and develop simplified state-space representations. These representations are used to design a model predictive controller to coordinate the motions of several spheres in real time. Preliminary experiments demonstrate the utility of the controller in automatically forming desired arrangements of varying configurations starting with randomly dispersed microspheres.
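
    A rough sense of the controller can be given with an unconstrained receding-horizon sketch on a hypothetical linearized bead model: stack the horizon predictions, solve a regularized least-squares problem for the input sequence, and apply only the first input. The dynamics matrices, penalty, and horizon are all assumptions; the paper's interacting-trap model is not reproduced.

    ```python
    import numpy as np

    dt = 0.01
    A = np.array([[1.0, dt], [0.0, 0.9]])   # toy bead state: position, damped velocity
    B = np.array([[0.0], [dt]])             # trap command enters through velocity

    def mpc_control(x0, x_ref, horizon=10, reg=1e-3):
        """Batch receding-horizon control: x_k = A^k x0 + sum_j A^(k-1-j) B u_j.
        Solve for the input sequence tracking x_ref, return the first input."""
        n, m = A.shape[0], B.shape[1]
        F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(horizon)])
        G = np.zeros((n * horizon, m * horizon))
        for k in range(horizon):
            for j in range(k + 1):
                G[k*n:(k+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, k - j) @ B
        target = np.tile(x_ref, horizon) - F @ x0
        u = np.linalg.solve(G.T @ G + reg * np.eye(m * horizon), G.T @ target)
        return u[:m]   # first move; re-solved every step against Brownian kicks

    print(mpc_control(np.array([1.0, 0.0]), np.array([0.0, 0.0])))
    ```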

  2. Genetic Algorithms Applied to Multi-Objective Aerodynamic Shape Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    2004-01-01

    A genetic algorithm approach suitable for solving multi-objective optimization problems is described and evaluated using a series of aerodynamic shape optimization problems. Several new features including two variations of a binning selection algorithm and a gene-space transformation procedure are included. The genetic algorithm is suitable for finding Pareto optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. A new masking array capability is included allowing any gene or gene subset to be eliminated as decision variables from the design space. This allows determination of the effect of a single gene or gene subset on the Pareto optimal solution. Results indicate that the genetic algorithm optimization approach is flexible in application and reliable. The binning selection algorithms generally provide Pareto front quality enhancements and moderate convergence efficiency improvements for most of the problems solved.

  3. Genetic Algorithms Applied to Multi-Objective Aerodynamic Shape Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    2005-01-01

    A genetic algorithm approach suitable for solving multi-objective problems is described and evaluated using a series of aerodynamic shape optimization problems. Several new features including two variations of a binning selection algorithm and a gene-space transformation procedure are included. The genetic algorithm is suitable for finding Pareto optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. A new masking array capability is included allowing any gene or gene subset to be eliminated as decision variables from the design space. This allows determination of the effect of a single gene or gene subset on the Pareto optimal solution. Results indicate that the genetic algorithm optimization approach is flexible in application and reliable. The binning selection algorithms generally provide Pareto front quality enhancements and moderate convergence efficiency improvements for most of the problems solved.

  4. Multi-view video segmentation and tracking for video surveillance

    NASA Astrophysics Data System (ADS)

    Mohammadi, Gelareh; Dufaux, Frederic; Minh, Thien Ha; Ebrahimi, Touradj

    2009-05-01

    Tracking moving objects is a critical step for smart video surveillance systems. Despite the complexity increase, multiple camera systems exhibit the undoubted advantages of covering wide areas and handling the occurrence of occlusions by exploiting the different viewpoints. The technical problems in multiple camera systems are several: installation, calibration, object matching, switching, data fusion, and occlusion handling. In this paper, we address the issue of tracking moving objects in an environment covered by multiple un-calibrated cameras with overlapping fields of view, typical of most surveillance setups. Our main objective is to create a framework that can be used to integrate object-tracking information from multiple video sources. The proposed technique consists of the following steps. We first perform a single-view tracking algorithm on each camera view, and then apply a consistent object labeling algorithm on all views. In the next step, we verify objects in each view separately for inconsistencies. Corresponding objects are extracted through a homography transform from one view to the other and vice versa. Having found the corresponding objects of different views, we partition each object into homogeneous regions. In the last step, we apply the homography transform to find the region map of the first view in the second view and vice versa. For each region (in the main frame and mapped frame) a set of descriptors is extracted to find the best match between the two views based on region descriptor similarity. This method is able to deal with multiple objects. Track management issues such as occlusion, appearance and disappearance of objects are resolved using information from all views. This method is capable of tracking rigid and deformable objects, and this versatility makes it suitable for different application scenarios.
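
    The cross-view mapping step relies on a planar homography estimated once per camera pair from ground-plane correspondences. A minimal OpenCV sketch follows; the point coordinates are hypothetical, and for automatically matched tracks one would pass cv2.RANSAC to reject outliers.

    ```python
    import cv2
    import numpy as np

    # Ground-plane correspondences for one camera pair (hypothetical pixels).
    pts_view1 = np.float32([[100, 200], [400, 210], [380, 500], [120, 480]])
    pts_view2 = np.float32([[ 90, 180], [410, 200], [400, 520], [100, 470]])

    H, _ = cv2.findHomography(pts_view1, pts_view2)

    def map_to_view2(footprints_view1):
        """Project object footprint points from view 1 into view 2 so that
        tracks from the two single-view trackers can be matched."""
        pts = np.float32(footprints_view1).reshape(-1, 1, 2)
        return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

    print(map_to_view2([[250, 350]]))
    ```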

  5. Distribution System Reliability Analysis for Smart Grid Applications

    NASA Astrophysics Data System (ADS)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has raised hopes of developing an intelligent, self-healing network capable of overcoming the interruption problems that face utilities and cost them tens of millions of dollars in repairs and losses. To address these reliability concerns, power utilities and interested parties have spent extensive time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection point between power providers and consumers where most electricity problems occur. In this work, we examine the effect of smart grid applications on the reliability of power distribution networks. The test system used in this thesis is the IEEE 34-node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system, measured by changes in the reliability indices SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing distributed generators (DGs) on the utility's distribution system and measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
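
    The indices named above have standard IEEE 1366 definitions that are easy to compute from outage records; the sketch below uses made-up outage data for a hypothetical feeder.

    ```python
    def reliability_indices(interruptions, customers_served):
        """interruptions: list of (customers_affected, duration_hours, unserved_kwh).
        SAIFI = total customer interruptions / customers served
        SAIDI = total customer-hours of interruption / customers served
        EUE   = total energy not supplied (kWh)."""
        saifi = sum(n for n, _, _ in interruptions) / customers_served
        saidi = sum(n * d for n, d, _ in interruptions) / customers_served
        eue = sum(e for _, _, e in interruptions)
        return saifi, saidi, eue

    # Two outages on a 10,000-customer feeder (hypothetical numbers).
    print(reliability_indices([(1200, 1.5, 900.0), (300, 4.0, 650.0)], 10_000))
    ```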

  6. Blinded evaluation of interrater reliability of an operative competency assessment tool for direct laryngoscopy and rigid bronchoscopy.

    PubMed

    Ishman, Stacey L; Benke, James R; Johnson, Kaalan Erik; Zur, Karen B; Jacobs, Ian N; Thorne, Marc C; Brown, David J; Lin, Sandra Y; Bhatti, Nasir; Deutsch, Ellen S

    2012-10-01

    OBJECTIVES To confirm interrater reliability using blinded evaluation of a skills-assessment instrument to assess the surgical performance of resident and fellow trainees performing pediatric direct laryngoscopy and rigid bronchoscopy in simulated models. DESIGN Prospective, paired, blinded observational validation study. SUBJECTS Paired observers from multiple institutions simultaneously evaluated residents and fellows who were performing surgery in an animal laboratory or using high-fidelity manikins. The evaluators had no previous affiliation with the residents and fellows and did not know their year of training. INTERVENTIONS One- and 2-page versions of an objective structured assessment of technical skills (OSATS) instrument composed of global and task-specific surgical items were used to evaluate surgical performance. RESULTS Fifty-two evaluations were completed by 17 attending evaluators. Agreement for the 2-page assessment was 71.4% when measured as a binary variable (ie, competent vs not competent) (κ = 0.38; P = .08); evaluation as a continuous variable revealed 42.9% agreement (κ = 0.18; P = .14). The intraclass correlation was 0.53, considered substantial/good interrater reliability (69% reliable). For the 1-page instrument, agreement was 77.4% when measured as a binary variable (κ = 0.53, P = .0015) and 71.0% when evaluated as a continuous measure (κ = 0.54, P < .001). The intraclass correlation was 0.73, considered high interrater reliability (85% reliable). CONCLUSIONS The OSATS instrument is an effective tool for evaluating surgical performance among trainees, with acceptable interrater reliability in a simulator setting. Reliability was good for both the 1- and 2-page OSATS checklists, and both serve as excellent tools for providing immediate formative feedback on operative competency.
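
    The κ values reported above are Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch of the computation for a binary competent/not-competent rating; the paired ratings are hypothetical, not the study data:

      def cohens_kappa(r1, r2):
          n = len(r1)
          po = sum(a == b for a, b in zip(r1, r2)) / n   # observed agreement
          cats = set(r1) | set(r2)
          pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
          return (po - pe) / (1 - pe)

      rater_a = [1, 1, 0, 1, 0, 1, 1, 0]   # 1 = competent, 0 = not competent
      rater_b = [1, 0, 0, 1, 0, 1, 1, 1]
      print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")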

  7. Problems of Implementing SCORM in an Enterprise Distance Learning Architecture: SCORM Incompatibility across Multiple Web Domains.

    ERIC Educational Resources Information Center

    Engelbrecht, Jeffrey C.

    2003-01-01

    Delivering content to distant users located in dispersed networks, separated by firewalls and different web domains requires extensive customization and integration. This article outlines some of the problems of implementing the Sharable Content Object Reference Model (SCORM) in the Marine Corps' Distance Learning System (MarineNet) and extends…

  8. Measurement of in vivo local shear modulus using MR elastography multiple-phase patchwork offsets.

    PubMed

    Suga, Mikio; Matsuda, Tetsuya; Minato, Kotaro; Oshiro, Osamu; Chihara, Kunihiro; Okamoto, Jun; Takizawa, Osamu; Komori, Masaru; Takahashi, Takashi

    2003-07-01

    Magnetic resonance elastography (MRE) is a method that can visualize the propagating and standing shear waves in an object being measured. The quantitative value of the shear modulus can be calculated by estimating the local shear wavelength. Low-frequency mechanical motion must be used for soft, tissue-like objects because a propagating shear wave rapidly attenuates at higher frequencies. Moreover, a propagating shear wave is distorted by reflections from the boundaries of objects; the distortions, however, are minimal around the wave front of the propagating shear wave. Therefore, we can avoid the effect of reflection on a region of interest (ROI) by adjusting the duration of the mechanical vibrations. The ROI is thus often shorter than the propagating shear wavelength. In the MRE sequence, a motion-sensitizing gradient (MSG) is synchronized with the cyclic mechanical motion. MRE images with multiple initial phase offsets can be generated with increasing delays between the MSG and the mechanical vibrations. This paper proposes a method for measuring the local shear wavelength using MRE multiple initial-phase patchwork offsets, which can be used when the object being measured is shorter than the local wavelength. To confirm the reliability of the proposed method, computer simulations, a simulated-tissue study, and in vitro and in vivo studies were performed.
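
    The quantitative step mentioned above follows from the shear-wave relation c = fλ and μ = ρc², so μ = ρ(fλ)² once the local wavelength has been estimated from the phase-offset images. A minimal numeric sketch with illustrative values:

      rho = 1000.0         # tissue density, kg/m^3 (assumed)
      freq = 100.0         # mechanical vibration frequency, Hz (assumed)
      wavelength = 0.025   # estimated local shear wavelength, m (assumed)

      shear_modulus = rho * (freq * wavelength) ** 2   # Pa
      print(f"local shear modulus ~ {shear_modulus / 1000:.1f} kPa")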

  9. Parallel Computing for Probabilistic Response Analysis of High Temperature Composites

    NASA Technical Reports Server (NTRS)

    Sues, R. H.; Lua, Y. J.; Smith, M. D.

    1994-01-01

    The objective of this Phase I research was to establish the required software and hardware strategies to achieve large scale parallelism in solving PCM problems. To meet this objective, several investigations were conducted. First, we identified the multiple levels of parallelism in PCM and the computational strategies to exploit these parallelisms. Next, several software and hardware efficiency investigations were conducted. These involved the use of three different parallel programming paradigms and solution of two example problems on both a shared-memory multiprocessor and a distributed-memory network of workstations.

  10. Universal approximators for multi-objective direct policy search in water reservoir management problems: a comparative analysis

    NASA Astrophysics Data System (ADS)

    Giuliani, Matteo; Mason, Emanuele; Castelletti, Andrea; Pianosi, Francesca

    2014-05-01

    The optimal operation of water resources systems is a wide and challenging problem due to nonlinearities in the model and the objectives, a high-dimensional state-control space, and strong uncertainties in the hydroclimatic regimes. The application of classical optimization techniques (e.g., SDP, Q-learning, gradient descent-based algorithms) is strongly limited by the dimensionality of the system and by the presence of multiple, conflicting objectives. This study presents a novel approach which combines Direct Policy Search (DPS) and Multi-Objective Evolutionary Algorithms (MOEAs) to solve high-dimensional state and control space problems involving multiple objectives. DPS, also known as parameterization-simulation-optimization in the water resources literature, is a simulation-based approach where the reservoir operating policy is first parameterized within a given family of functions and the parameters are then optimized with respect to the objectives of the management problem. The selection of a suitable class of functions to which the operating policy belongs is a key step, as it might restrict the search for the optimal policy to a subspace of the decision space that does not include the optimal solution. In the water reservoir literature a number of classes have been proposed, but many of these rules are based largely on empirical or experimental successes, and they were designed mostly via simulation and for single-purpose reservoirs. In a multi-objective context, similar rules cannot easily be inferred from experience, and the use of universal function approximators is generally preferred. In this work, we comparatively analyze two of the most common universal approximators, artificial neural networks (ANN) and radial basis functions (RBF), under different problem settings to estimate their scalability and flexibility in dealing with increasingly complex problems. The multi-purpose HoaBinh water reservoir in Vietnam, accounting for hydropower production and flood control, is used as a case study. Preliminary results show that the RBF policy parameterization is more effective than the ANN one. In particular, the approximated Pareto front obtained with RBF control policies successfully explores the full tradeoff space between the two conflicting objectives, while most of the ANN solutions turn out to be Pareto-dominated by the RBF ones.
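
    A minimal sketch of an RBF policy parameterization of the kind compared above: the release decision is a weighted sum of Gaussian radial basis functions of the (normalized) system state, and the centers, radii, and weights are the decision variables a MOEA would tune. All shapes and numbers are illustrative, not the paper's configuration.

      import numpy as np

      def rbf_policy(state, centers, radii, weights):
          """Map a normalized state vector (e.g., storage, inflow, day of year)
          to a release fraction in [0, 1]."""
          phi = np.exp(-np.sum(((state - centers) / radii) ** 2, axis=1))
          return float(np.clip(weights @ phi, 0.0, 1.0))

      rng = np.random.default_rng(0)
      n_rbf, n_inputs = 4, 3
      centers = rng.uniform(0, 1, (n_rbf, n_inputs))    # decision variables
      radii   = rng.uniform(0.2, 1.0, (n_rbf, n_inputs))
      weights = rng.uniform(0, 1, n_rbf)

      state = np.array([0.6, 0.3, 0.5])                 # normalized system state
      print("release fraction:", rbf_policy(state, centers, radii, weights))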

  11. Gender approaches to evolutionary multi-objective optimization using pre-selection of criteria

    NASA Astrophysics Data System (ADS)

    Kowalczuk, Zdzisław; Białaszewski, Tomasz

    2018-01-01

    A novel idea for performing evolutionary computations (ECs) to solve high-dimensional multi-objective optimization (MOO) problems is proposed. Following the general idea of evolution, information about gender is used to distinguish between various groups of objectives and to identify the (aggregate) nature of optimality of individuals (solutions). This identification is drawn from the fitness of individuals and applied during parental crossover in the process of evolutionary multi-objective optimization (EMOO). The article introduces the principles of the genetic-gender approach (GGA) and the virtual gender approach (VGA), which are not just evolutionary techniques but constitute a new philosophy for solving MOO tasks. The proposed approaches are validated against principal state-of-the-art EMOO algorithms on benchmark problems in the light of recognized EC performance criteria. The research shows the superiority of the gender approach in terms of effectiveness, reliability, transparency, intelligibility, and MOO problem simplification, resulting in the great usefulness and practicability of GGA and VGA. Moreover, an important feature of GGA and VGA is that they alleviate the 'curse' of dimensionality typical of many engineering designs.

  12. The merits and problems of Neuropsychiatric Inventory as an assessment tool in people with dementia and other neurological disorders.

    PubMed

    Lai, Claudia K Y

    2014-01-01

    The Neuropsychiatric Inventory (NPI) is one of the most commonly used assessment scales for assessing symptoms in people with dementia and other neurological disorders. This paper analyzes its conceptual framework, measurement mode, psychometric properties, and merits and problems. All articles discussing the psychometric properties and factor structure of the NPI were searched for in Medline via Ovid. The abstracts of these papers were read to determine their relevance to the purpose of this paper; if deemed appropriate, the full paper was then obtained and read. The NPI has reasonably good content validity and internal consistency, and good test-retest and interrater reliability. There is limited information about its sensitivity, specificity, positive and negative predictive values, and, in particular, responsiveness. Merits of the NPI include being comprehensive, avoiding symptom overlap, ease of use, and flexibility. It has problems in scoring (because each domain score is the product of frequency and severity ratings, certain values such as 5, 7, and 11 cannot occur), so analysis using parametric tests may not be appropriate. The use of individual subscales also warrants further investigation. In terms of its content and concurrent validity, intra- and interrater reliability, test-retest reliability, and internal consistency, the NPI can be considered valid and reliable, and can be used across different ethnic groups. The tool is most likely less able to discriminate between different disorders. More studies are required to further evaluate its psychometric properties, particularly in the areas of factor structure and responsiveness. The clinical utility of the NPI also needs to be further explored.

  13. The reliability and validity study of the Kinesthetic and Visual Imagery Questionnaire in individuals with Multiple Sclerosis

    PubMed Central

    Tabrizi, Yousef Moghadas; Zangiabadi, Nasser; Mazhari, Shahrzad; Zolala, Farzaneh

    2013-01-01

    Objective Motor imagery (MI) has recently been considered as an adjunct to physical rehabilitation in patients with multiple sclerosis (MS). It is necessary to assess MI abilities and benefits in patients with MS using a reliable tool. The Kinesthetic and Visual Imagery Questionnaire (KVIQ) was recently developed to assess MI ability in patients with stroke and other disabilities. Considering the different underlying pathologies, the present study aimed to examine the validity and reliability of the KVIQ in MS patients. Method Fifteen MS patients were assessed using the KVIQ in 2 sessions (5-14 days apart) by the same examiner. In the second session, the participants also completed the revised MI questionnaire (MIQ-R) as the gold standard. Intra-class correlation coefficients (ICCs) were measured to determine test-retest reliability. Spearman's correlation analysis was performed to assess concurrent validity with the MIQ-R. Furthermore, the internal consistency (Cronbach's alpha) and factorial structure of the KVIQ were studied. Results The test-retest reliability of the KVIQ was good (ICCs: total KVIQ=0.89, visual KVIQ=0.85, and kinesthetic KVIQ=0.93), and the concurrent validity between the KVIQ and MIQ-R was good (r=0.79). The KVIQ had good internal consistency, with a high Cronbach's alpha (alpha=0.84). Factorial analysis showed a bi-factorial structure of the KVIQ, with the visual factor explaining 57.6% of the variance and the kinesthetic factor 32.4%. Conclusions The results of the present study revealed that the KVIQ is a valid and reliable tool for assessing MI in MS patients. PMID:24271091

  14. The One to Multiple Automatic High Accuracy Registration of Terrestrial LIDAR and Optical Images

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Hu, C.; Xia, G.; Xue, H.

    2018-04-01

    The registration of terrestrial laser point clouds with close-range images is key to the high-precision 3D reconstruction of cultural relics. Because cultural-relic applications currently demand high texture resolution, registering point-cloud and image data for object reconstruction leads to a one-point-cloud-to-multiple-images problem. In current commercial software, registration of the two data types is achieved by manually partitioning the point cloud, manually matching point-cloud and image data, and manually selecting corresponding 2D points in the image and the point cloud; this process not only greatly reduces working efficiency but also limits registration accuracy and produces seams in the colored point-cloud texture. To solve these problems, this paper uses the whole-object image as intermediate data and applies matching techniques to establish an automatic one-to-one correspondence between the point cloud and multiple images. Matching the reflection-intensity image obtained by central projection of the point cloud against the optical images automatically matches corresponding feature points, and a Rodrigues-matrix spatial similarity transformation model with iterative weight selection achieves automatic, high-accuracy registration of the two data types. This method is expected to serve high-precision, high-efficiency automatic 3D reconstruction of cultural relics, and has scientific research value and practical significance.
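
    A minimal sketch of the spatial similarity transformation step: given corresponding 3D points in two frames, estimate scale, rotation, and translation by a Procrustes/least-squares fit. This is a stand-in for the paper's Rodrigues-matrix formulation with iterative weight selection, and the point pairs are synthetic.

      import numpy as np

      def fit_similarity(src, dst):
          """Return (s, R, t) with dst ~ s * R @ src + t."""
          mu_s, mu_d = src.mean(0), dst.mean(0)
          A, B = src - mu_s, dst - mu_d
          U, S, Vt = np.linalg.svd(B.T @ A)
          D = np.diag([1, 1, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
          R = U @ D @ Vt
          s = (S * np.diag(D)).sum() / (A ** 2).sum()
          t = mu_d - s * R @ mu_s
          return s, R, t

      rng = np.random.default_rng(1)
      src = rng.uniform(0, 1, (6, 3))
      R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
      if np.linalg.det(R_true) < 0:
          R_true[:, 0] *= -1                 # force a proper rotation
      dst = 2.0 * src @ R_true.T + np.array([1.0, -0.5, 0.3])
      s, R, t = fit_similarity(src, dst)
      print("recovered scale:", round(s, 3))  # ~2.0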

  15. Boosting Bayesian parameter inference of nonlinear stochastic differential equation models by Hamiltonian scale separation.

    PubMed

    Albert, Carlo; Ulzega, Simone; Stoop, Ruedi

    2016-04-01

    Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.
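
    A minimal sketch of the Hamiltonian Monte Carlo machinery the approach builds on, sampling a toy one-dimensional posterior (standard normal) with a leapfrog integrator; the polymer reinterpretation and multiple time-scale integration of the paper are not reproduced here.

      import numpy as np

      def U(q):                  # toy potential energy: -log posterior
          return 0.5 * q ** 2

      def grad_U(q):
          return q

      def hmc_step(q, rng, eps=0.1, n_leap=20):
          p = rng.normal()                       # resample momentum
          q_new, p_new = q, p - 0.5 * eps * grad_U(q)
          for _ in range(n_leap - 1):            # leapfrog integration
              q_new += eps * p_new
              p_new -= eps * grad_U(q_new)
          q_new += eps * p_new
          p_new -= 0.5 * eps * grad_U(q_new)
          dH = (U(q_new) + 0.5 * p_new ** 2) - (U(q) + 0.5 * p ** 2)
          return q_new if rng.random() < np.exp(-dH) else q   # Metropolis accept

      rng = np.random.default_rng(2)
      q, samples = 0.0, []
      for _ in range(2000):
          q = hmc_step(q, rng)
          samples.append(q)
      print("posterior mean ~", round(float(np.mean(samples)), 2))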

  16. A two-stage path planning approach for multiple car-like robots based on PH curves and a modified harmony search algorithm

    NASA Astrophysics Data System (ADS)

    Zeng, Wenhui; Yi, Jin; Rao, Xiao; Zheng, Yun

    2017-11-01

    In this article, collision-avoidance path planning for multiple car-like robots with variable motion is formulated as a two-stage optimization problem minimizing both the total length of all paths and the task's completion time. Accordingly, a new approach based on Pythagorean Hodograph (PH) curves and a Modified Harmony Search algorithm is proposed to solve the two-stage path-planning problem subject to kinematic constraints such as velocity, acceleration, and minimum turning radius. First, a method of path planning based on PH curves for a single robot is proposed. Second, a mathematical model of the two-stage path-planning problem for multiple car-like robots with variable motion subject to kinematic constraints is constructed, in which the first stage minimizes the total length of all paths and the second stage minimizes the task's completion time. Finally, a modified harmony search algorithm is applied to solve the two-stage optimization problem. A set of experiments demonstrates the effectiveness of the proposed approach.

  17. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed, whereas in many real situations the failures may arise from many causes, depending on the age and environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analysis is more beneficial than classical analysis in such cases: it allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing-risk systems, limited to models with independent causes of failure and using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that changing the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimates are better than the maximum likelihood ones. The sensitivity analyses show some sensitivity to shifts of the prior location, and also show that the Bayesian analysis is robust within the range between the true value and the maximum-likelihood estimate.
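
    A minimal sketch of the competing-risk setup described above: each independent failure cause draws a Weibull lifetime, the system fails at the earliest one, and the observed data are (time, cause) pairs. The shape and scale values are illustrative, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(3)
      shapes = [1.5, 2.5]        # Weibull shape per failure cause (assumed)
      scales = [100.0, 80.0]     # Weibull scale per failure cause (assumed)
      n = 500

      draws = np.column_stack([sc * rng.weibull(sh, n) for sh, sc in zip(shapes, scales)])
      times = draws.min(axis=1)          # system fails at the first cause to occur
      causes = draws.argmin(axis=1)      # which cause produced the failure

      for c in range(len(shapes)):
          frac = (causes == c).mean()
          print(f"cause {c}: {frac:.1%} of failures, mean time {times[causes == c].mean():.1f}")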

  18. Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera

    NASA Astrophysics Data System (ADS)

    Dziri, Aziz; Duranton, Marc; Chapuis, Roland

    2016-07-01

    Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.

  19. On the optimization of electromagnetic geophysical data: Application of the PSO algorithm

    NASA Astrophysics Data System (ADS)

    Godio, A.; Santilano, A.

    2018-01-01

    The particle swarm optimization (PSO) algorithm solves constrained multi-parameter problems and is suitable for the simultaneous optimization of linear and nonlinear problems, under the assumption that the forward modeling rests on a good understanding of the ill-posed geophysical inverse problem. We apply PSO to the geophysical inverse problem of inferring an Earth model, i.e., the electrical resistivity at depth, consistent with the observed geophysical data. The method does not require an initial model and can easily be constrained according to external information for each single sounding. The optimization process that estimates the model parameters from the electromagnetic soundings centers on the choice of the objective function to be minimized. We discuss the possibility of introducing vertical and lateral constraints into the objective function, with an Occam-like regularization. A sensitivity analysis allowed us to check the performance of the algorithm. The reliability of the approach is tested on synthetic and real Audio-Magnetotelluric (AMT) and Long Period MT data. The method is able to solve complex problems and allows us to estimate the a posteriori distribution of the model parameters.
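
    A minimal sketch of the standard PSO update loop, minimizing a toy data-misfit function in place of the resistivity objective (which would also carry the Occam-like regularization terms discussed above). All settings are illustrative.

      import numpy as np

      def misfit(m):             # toy objective: distance to a "true" model
          return np.sum((m - np.array([2.0, -1.0, 0.5])) ** 2)

      rng = np.random.default_rng(4)
      n_part, n_dim, w, c1, c2 = 20, 3, 0.7, 1.5, 1.5
      x = rng.uniform(-5, 5, (n_part, n_dim))          # particle positions (models)
      v = np.zeros_like(x)
      pbest, pbest_f = x.copy(), np.array([misfit(m) for m in x])
      gbest = pbest[pbest_f.argmin()].copy()

      for _ in range(200):
          r1, r2 = rng.random((2, n_part, n_dim))
          v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
          x = x + v
          f = np.array([misfit(m) for m in x])
          improved = f < pbest_f
          pbest[improved], pbest_f[improved] = x[improved], f[improved]
          gbest = pbest[pbest_f.argmin()].copy()

      print("best model:", np.round(gbest, 3))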

  20. Using a Pareto-optimal solution set to characterize trade-offs between a broad range of values and preferences in climate risk management

    NASA Astrophysics Data System (ADS)

    Garner, Gregory; Reed, Patrick; Keller, Klaus

    2015-04-01

    Integrated assessment models (IAMs) are often used to inform the design of climate risk management strategies. Previous IAM studies have broken important new ground on analyzing the effects of parametric uncertainties, but they are often silent on the implications of uncertainties regarding the problem formulation. Here we use the Dynamic Integrated model of Climate and the Economy (DICE) to analyze the effects of uncertainty surrounding the definition of the objective(s). The standard DICE model adopts a single objective: to maximize a weighted sum of utilities of per-capita consumption. Decision makers, however, are often concerned with a broader range of values and preferences that may be poorly captured by this a priori definition of utility. We reformulate the problem by introducing three additional objectives that represent values such as (i) reliably limiting global average warming to two degrees Celsius and minimizing (ii) the costs of abatement and (iii) the climate change damages. We use advanced multi-objective optimization methods to derive a set of Pareto-optimal solutions over which decision makers can trade off and assess performance criteria a posteriori. We illustrate the potential for myopia in the traditional problem formulation and discuss the capability of this multiobjective formulation to provide decision support.

  1. A cross-sectional assessment of the prevalence of multiple chronic conditions and medication use in a sample of community-dwelling adults with fibromyalgia in Olmsted County, Minnesota

    PubMed Central

    Vincent, Ann; Whipple, Mary O; McAllister, Samantha J; Aleman, Katherine M; St Sauver, Jennifer L

    2015-01-01

    Objectives The objective of this study was to evaluate the problem of multiple chronic conditions and polypharmacy in patients with fibromyalgia. Design Retrospective medical record review. Setting Olmsted County, Minnesota. Participants 1111 adults with fibromyalgia. Primary and secondary outcome measures Number and type of chronic medical and psychiatric conditions, medication use. Results Medical record review demonstrated that more than 50% of the sample had seven or more chronic conditions. Chronic joint pain/degenerative arthritis was the most frequent comorbidity (88.7%), followed by depression (75.1%), migraines/chronic headaches (62.4%) and anxiety (56.5%). Approximately 40% of patients were taking three or more medications for symptoms of fibromyalgia. Sleep aids were the most commonly prescribed medications in our sample (33.3%), followed by selective serotonin reuptake inhibitors (28.7%), opioids (22.4%) and serotonin norepinephrine reuptake inhibitors (21.0%). Conclusions The results of our study highlight the problem of multiple chronic conditions and the high prevalence of polypharmacy in fibromyalgia. Clinicians who care for patients with fibromyalgia should take into consideration the presence of multiple chronic conditions when recommending medications. PMID:25735301

  2. Human Factors in Financial Trading

    PubMed Central

    Leaver, Meghan; Reader, Tom W.

    2016-01-01

    Objective This study tests the reliability of a system (FINANS) to collect and analyze incident reports in the financial trading domain and is guided by a human factors taxonomy used to describe error in the trading domain. Background Research indicates the utility of applying human factors theory to understand error in finance, yet empirical research is lacking. We report on the development of the first system for capturing and analyzing human factors–related issues in operational trading incidents. Method In the first study, 20 incidents are analyzed by an expert user group against a referent standard to establish the reliability of FINANS. In the second study, 750 incidents are analyzed using distribution, mean, pathway, and associative analysis to describe the data. Results Kappa scores indicate that categories within FINANS can be reliably used to identify and extract data on human factors–related problems underlying trading incidents. Approximately 1% of trades (n = 750) lead to an incident. Slip/lapse (61%), situation awareness (51%), and teamwork (40%) were found to be the most common problems underlying incidents. For the most serious incidents, problems in situation awareness and teamwork were most common. Conclusion We show that (a) experts in the trading domain can reliably and accurately code human factors in incidents, (b) 1% of trades incur error, and (c) poor teamwork skills and situation awareness underpin the most critical incidents. Application This research provides data crucial for ameliorating risk within financial trading organizations, with implications for regulation and policy. PMID:27142394

  3. Online Hierarchical Sparse Representation of Multifeature for Robust Object Tracking

    PubMed Central

    Qu, Shiru

    2016-01-01

    Object tracking based on sparse representation has given promising tracking results in recent years. However, trackers under the sparse representation framework tend to overemphasize the sparse representation and ignore the correlation of visual information; in addition, sparse coding methods encode each local region independently and ignore the spatial neighborhood information of the image. In this paper, we propose a robust tracking algorithm. First, multiple complementary features are used to describe the object appearance; the appearance model of the tracked target is modeled by instantaneous and stable appearance features simultaneously. A two-stage sparse-coding method, which takes the spatial neighborhood information of the image patch and the computational burden into consideration, is used to compute the reconstructed object appearance. Then, the reliability of each tracker is measured by the tracking likelihood function of the transient and reconstructed appearance models. Finally, the most reliable tracker is obtained within a well-established particle filter framework; the training set and the template library are incrementally updated based on the current tracking results. Experimental results on challenging video sequences show that the proposed algorithm performs well, with superior tracking accuracy and robustness. PMID:27630710

  4. Confirmatory Factor Analysis of the PKBS-2 Subscales for Assessing Social Skills and Behavioral Problems in Preschool Education

    ERIC Educational Resources Information Center

    Fernandez, Maria; Benitez, Juan L.; Pichardo, M. Carmen; Fernandez, Eduardo; Justicia, Fernando; Garcia, Trinidad; Garcia-Berben, Ana; Justicia, Ana; Alba, Guadalupe

    2010-01-01

    Introduction: Different research studies point out the importance of social competence as a protective factor against antisocial behavior. They likewise alert us of the importance of having valid, reliable instruments that measure these constructs in early childhood. Method: The objective of this research is to validate the subscales of the…

  5. Efficient Network Coding-Based Loss Recovery for Reliable Multicast in Wireless Networks

    NASA Astrophysics Data System (ADS)

    Chi, Kaikai; Jiang, Xiaohong; Ye, Baoliu; Horiguchi, Susumu

    Recently, network coding has been applied to the loss recovery of reliable multicast in wireless networks [19], where multiple lost packets are XOR-ed together as one packet and forwarded via a single retransmission, resulting in a significant reduction of bandwidth consumption. In this paper, we first prove that maximizing the number of lost packets for XOR-ing, which is the key part of the available network coding-based reliable multicast schemes, is actually a complex NP-complete problem. To address this limitation, we then propose an efficient heuristic algorithm for finding an approximately optimal solution to this optimization problem. Furthermore, we show that the packet-coding principle of maximizing the number of lost packets for XOR-ing sometimes cannot fully exploit the potential coding opportunities, and we therefore propose new heuristic-based schemes with a new coding principle. Simulation results demonstrate that the heuristic-based schemes have very low computational complexity and can achieve almost the same transmission efficiency as the current coding-based high-complexity schemes. Furthermore, the heuristic-based schemes with the new coding principle not only have very low complexity but also slightly outperform the current high-complexity ones.
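
    A minimal greedy sketch of the coding decision discussed above (not the paper's algorithm): XOR together as many lost packets as possible subject to the decodability condition that no receiver is missing more than one packet in the combination, so each receiver can recover its missing packet from the XOR plus the packets it already holds. The loss table is hypothetical.

      def greedy_xor_batch(loss):
          """loss[r]: set of packet ids receiver r failed to get. Returns one
          batch of packet ids that can be XOR-ed into a single retransmission."""
          pending = sorted(set().union(*loss.values()))
          batch = set()
          for pkt in pending:
              candidate = batch | {pkt}
              # decodable iff every receiver misses at most one packet of the batch
              if all(len(candidate & lost) <= 1 for lost in loss.values()):
                  batch = candidate
          return sorted(batch)

      losses = {"r1": {1, 4}, "r2": {2}, "r3": {1, 3}}
      print("XOR batch:", greedy_xor_batch(losses))   # -> [1, 2]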

  6. Data Reduction Algorithm Using Nonnegative Matrix Factorization with Nonlinear Constraints

    NASA Astrophysics Data System (ADS)

    Sembiring, Pasukat

    2017-12-01

    Processing of data with very large dimensions has been a hot topic in recent decades, and various techniques have been proposed to extract the desired information or structure. Nonnegative Matrix Factorization (NMF), which operates on nonnegative data, has become one of the popular methods for dimensionality reduction. The main strength of this method is that it models a nonnegative object as a combination of basic nonnegative parts, providing a physical interpretation of the object's construction. NMF has been used widely in numerous applications, including computer vision, text mining, pattern recognition, and bioinformatics. The mathematical formulation of NMF is not a convex optimization problem, and various types of algorithms have been proposed to solve it. The Alternating Nonnegative Least Squares (ANLS) framework is a block-coordinate formulation that has proven theoretically reliable and empirically efficient. This paper proposes a new algorithm for the NMF problem based on the ANLS framework; the algorithm inherits the convergence property of the ANLS framework and extends it to NMF formulations with nonlinear constraints.
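
    A minimal sketch of the plain ANLS idea underlying the proposed algorithm, assuming SciPy is available: alternately solve the nonnegative least-squares subproblems for H (columns) and W (rows) exactly with scipy.optimize.nnls. The paper's nonlinear constraints are not included.

      import numpy as np
      from scipy.optimize import nnls

      def anls_nmf(V, k, n_iter=50, seed=0):
          """Factor V ~ W @ H with W, H >= 0 by alternating NNLS solves."""
          rng = np.random.default_rng(seed)
          m, n = V.shape
          W = rng.uniform(0, 1, (m, k))
          for _ in range(n_iter):
              # fix W, solve one NNLS per column of V for H
              H = np.column_stack([nnls(W, V[:, j])[0] for j in range(n)])
              # fix H, solve one NNLS per row of V for W
              W = np.column_stack([nnls(H.T, V[i, :])[0] for i in range(m)]).T
          return W, H

      V = np.abs(np.random.default_rng(1).normal(size=(8, 6)))
      W, H = anls_nmf(V, k=2)
      print("reconstruction error:", round(float(np.linalg.norm(V - W @ H)), 3))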

  7. An experiment in software reliability: Additional analyses using data from automated replications

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Lauterbach, Linda A.

    1988-01-01

    A study undertaken to collect software error data of laboratory quality for use in developing credible methods for predicting the reliability of software used in life-critical applications is summarized. The software error data reported were acquired through automated repetitive-run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications to accumulate a sufficient sample size for error rate estimation. The data collected are used to confirm the results of two Boeing studies, reported in NASA-CR-165836, Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378, Software Reliability: Additional Investigations into Modeling with Replicated Experiments. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that a program's failure rate is a constant multiple of the number of residual bugs, an assumption which underlies some current models of software reliability. The data also raise new questions concerning the phenomenon of interacting faults.

  8. A multi-objective optimization model for hub network design under uncertainty: An inexact rough-interval fuzzy approach

    NASA Astrophysics Data System (ADS)

    Niakan, F.; Vahdani, B.; Mohammadi, M.

    2015-12-01

    This article proposes a multi-objective mixed-integer model to optimize the location of hubs within a hub network design problem under uncertainty. The considered objectives include minimizing the maximum accumulated travel time, minimizing the total costs including transportation, fuel consumption and greenhouse emissions costs, and maximizing the minimum service reliability. In the proposed model, it is assumed that two nodes can be connected by several types of arc, which differ in capacity, transportation mode, travel time, and transportation and construction costs. Moreover, determining the capacity of the hubs is part of the decision-making procedure, and balancing requirements are imposed on the network. To solve the model, a hybrid solution approach is utilized based on inexact programming, interval-valued fuzzy programming and rough interval programming. Furthermore, a hybrid multi-objective metaheuristic algorithm, namely multi-objective invasive weed optimization (MOIWO), is developed for the given problem. Finally, various computational experiments are carried out to assess the proposed model and solution approaches.

  9. Interactive multi-objective path planning through a palette-based user interface

    NASA Astrophysics Data System (ADS)

    Shaikh, Meher T.; Goodrich, Michael A.; Yi, Daqing; Hoehne, Joseph

    2016-05-01

    In a problem where a human uses supervisory control to manage robot path planning, there are times when the human does the path planning and, if satisfied, commits those paths for execution by the robot. In planning a path, the robot often uses an optimization algorithm that maximizes or minimizes an objective. When a human is assigned the task of path planning for a robot, the human may care about multiple objectives. This work proposes a graphical user interface (GUI) designed for interactive robot path planning when an operator may prefer one objective over others or care about how multiple objectives are traded off. The GUI represents multiple objectives using the metaphor of an artist's palette: a distinct color represents each objective, and tradeoffs among objectives are balanced the way an artist mixes colors to get a desired shade. Human intent is thus analogous to the artist's shade of color. We call the GUI an "Adverb Palette", where "adverb" denotes a specific type of objective for the path, such as the adverbs "quickly" and "safely" in the commands "travel the path quickly" and "make the journey safely". The interactive interface gives the user an opportunity to evaluate alternatives that trade off between different objectives by visualizing the instantaneous outcomes of her actions on the interface. In addition to assisting analysis of the solutions given by an optimization algorithm, the palette has the additional feature of allowing the user to define and visualize her own paths by means of waypoints (guiding locations), thereby broadening the variety of plans. The goal of the Adverb Palette is thus to provide a way for the user and robot to find an acceptable solution even though they use very different representations of the problem. Subjective evaluations suggest that even non-experts in robotics can carry out the planning tasks with a great deal of flexibility using the Adverb Palette.

  10. Evaluation of the CEAS model for barley yields in North Dakota and Minnesota

    NASA Technical Reports Server (NTRS)

    Barnett, T. L. (Principal Investigator)

    1981-01-01

    The CEAS yield model is based upon multiple regression analysis at the CRD and state levels. For the historical time series, yield is regressed on a set of variables derived from monthly mean temperature and monthly precipitation. Technological trend is represented by piecewise linear and/or quadratic functions of year. Indicators of yield reliability obtained from a ten-year bootstrap test (1970-79) demonstrated that biases are small and that performance, as indicated by the root mean square errors, is acceptable for the intended application; however, model response for individual years, particularly unusual years, is not very reliable and shows some large errors. The model is objective, adequate, timely, simple, and not costly. It considers scientific knowledge on a broad scale but not in detail, and does not provide a good current measure of modeled yield reliability.

  11. The reliability of photoneutron cross sections for 90,91,92,94Zr

    NASA Astrophysics Data System (ADS)

    Varlamov, V. V.; Davydov, A. I.; Ishkhanov, B. S.; Orlin, V. N.

    2018-05-01

    Data on partial photoneutron reaction cross sections (γ,1n) and (γ,2n) for 90,91,92,94Zr obtained at Livermore (USA) and for 90Zr obtained at Saclay (France) were analyzed. The experimental data were obtained using quasimonoenergetic photon beams from the annihilation in flight of relativistic positrons, with the partial reactions separated by photoneutron multiplicity sorting based on neutron energy measurement. The analysis applies objective physical criteria of data reliability. Large systematic uncertainties were found in the partial cross sections, which do not satisfy these criteria. To obtain reliable cross sections for the partial (γ,1n) and (γ,2n) and total (γ,1n) + (γ,2n) reactions on 90,91,92,94Zr and the (γ,3n) reaction on 94Zr, the experimental-theoretical method was used. It is based on the experimental data for the neutron yield cross section, which is rather independent of the neutron multiplicity, and on theoretical equations of the combined photonucleon reaction model (CPNRM). The newly evaluated data are compared with the experimental ones, and the reasons for the noticeable disagreements between them are discussed.

  12. Preliminary Development of an Object-Oriented Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.

  13. Multi-object segmentation using coupled nonparametric shape and relative pose priors

    NASA Astrophysics Data System (ADS)

    Uzunbas, Mustafa Gökhan; Soldea, Octavian; Çetin, Müjdat; Ünal, Gözde; Erçil, Aytül; Unay, Devrim; Ekin, Ahmet; Firat, Zeynep

    2009-02-01

    We present a new method for multi-object segmentation in a maximum a posteriori estimation framework. Our method is motivated by the observation that neighboring or coupling objects in images generate configurations and co-dependencies which could potentially aid in segmentation if properly exploited. Our approach employs coupled shape and inter-shape pose priors that are computed using training images in a nonparametric multi-variate kernel density estimation framework. The coupled shape prior is obtained by estimating the joint shape distribution of multiple objects and the inter-shape pose priors are modeled via standard moments. Based on such statistical models, we formulate an optimization problem for segmentation, which we solve by an algorithm based on active contours. Our technique provides significant improvements in the segmentation of weakly contrasted objects in a number of applications. In particular for medical image analysis, we use our method to extract brain Basal Ganglia structures, which are members of a complex multi-object system posing a challenging segmentation problem. We also apply our technique to the problem of handwritten character segmentation. Finally, we use our method to segment cars in urban scenes.

  14. A 3-year prospective study of the effects of adjuvant treatments on cognition in women with early stage breast cancer

    PubMed Central

    Jenkins, V; Shilling, V; Deutsch, G; Bloomfield, D; Morris, R; Allan, S; Bishop, H; Hodson, N; Mitra, S; Sadler, G; Shah, E; Stein, R; Whitehead, S; Winstanley, J

    2006-01-01

    The neuropsychological performance of 85 women with early stage breast cancer scheduled for chemotherapy, 43 women scheduled for endocrine therapy and/or radiotherapy and 49 healthy control subjects was assessed at baseline (T1), postchemotherapy (or 6 months) (T2) and at 18 months (T3). Repeated measures analysis found no significant interactions or main effect of group after controlling for age and intelligence. Using a calculation to examine performance at an individual level, reliable decline on multiple tasks was seen in 20% of chemotherapy patients, 26% of nonchemotherapy patients and 18% of controls at T2 (18%, 14% and 11%, respectively, at T3). Patients who had experienced a treatment-induced menopause were more likely to show reliable decline on multiple measures at T2 (OR=2.6, 95% confidence interval (CI) 0.823–8.266, P=0.086). Psychological distress, quality of life measures and self-reported cognitive failures did not impact on objective tests of cognitive function, but were significantly associated with each other. The results show that a few women experienced objective measurable change in their concentration and memory following standard adjuvant therapy, but the majority were either unaffected or even improved over time. PMID:16523200

  15. SAPT units turn-on in an interference-dominant environment. [Stand Alone Pressure Transducer

    NASA Technical Reports Server (NTRS)

    Peng, W.-C.; Yang, C.-C.; Lichtenberg, C.

    1990-01-01

    A stand alone pressure transducer (SAPT) is a credit-card-sized smart pressure sensor inserted between the tile and the aluminum skin of a space shuttle. Reliably initiating the SAPT units via RF signals in a prelaunch environment is a challenging problem. Multiple-source interference may exist if more than one GSE (ground support equipment) antenna is turned on at the same time to meet the simultaneity requirement of 10 ms. A polygon model for orbiter, external tank, solid rocket booster, and tail service masts is used to simulate the prelaunch environment. Geometric optics is then applied to identify the coverage areas and the areas which are vulnerable to multipath and/or multiple-source interference. Simulation results show that the underside areas of an orbiter have incidence angles exceeding 80 deg. For multipath interference, both sides of the cargo bay areas are found to be vulnerable to a worst-case multipath loss exceeding 20 dB. Multiple-source interference areas are also identified. Mitigation methods for the coverage and interference problem are described. It is shown that multiple-source interference can be eliminated (or controlled) using the time-division-multiplexing method or the time-stamp approach.

  16. Bayes Error Rate Estimation Using Classifier Ensembles

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Ghosh, Joydeep

    2003-01-01

    The Bayes error rate gives a statistical lower bound on the error achievable for a given classification problem and the associated choice of features. By reliably estimating this rate, one can assess the usefulness of the feature set being used for classification. Moreover, by comparing the accuracy achieved by a given classifier with the Bayes rate, one can quantify how effective that classifier is. Classical approaches for estimating or bounding the Bayes error generally yield rather weak results for small sample sizes unless the problem has some simple characteristics, such as Gaussian class-conditional likelihoods. This article shows how the outputs of a classifier ensemble can be used to provide reliable and easily obtainable estimates of the Bayes error with negligible extra computation. Three methods of varying sophistication are described. First, we present a framework that estimates the Bayes error when multiple classifiers, each providing an estimate of the a posteriori class probabilities, are combined through averaging. Second, we bolster this approach by adding an information-theoretic measure of output correlation to the estimate. Finally, we discuss a more general method that looks only at the class labels indicated by ensemble members and provides error estimates based on the disagreements among classifiers. The methods are illustrated on artificial data, a difficult four-class problem involving underwater acoustic data, and two real-world benchmark problems. For data sets with known Bayes error, the combiner-based methods introduced in this article outperform existing methods. The estimates obtained by the proposed methods also seem quite reliable for the real-life data sets for which the true Bayes rates are unknown.
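
    A minimal sketch of the first method described: average the class-posterior estimates of several classifiers and use the plug-in rule, the mean of 1 - max_c p(c|x), as a Bayes-error estimate. The posteriors below are simulated, not data from the article.

      import numpy as np

      rng = np.random.default_rng(5)
      n_samples, n_classes, n_classifiers = 1000, 4, 5

      true_post = rng.dirichlet(np.ones(n_classes) * 2.0, size=n_samples)
      # each classifier sees a noisy version of the true posterior
      ests = [np.clip(true_post + rng.normal(0, 0.05, true_post.shape), 1e-6, None)
              for _ in range(n_classifiers)]
      ests = [e / e.sum(axis=1, keepdims=True) for e in ests]

      avg_post = np.mean(ests, axis=0)                     # ensemble average
      bayes_est = np.mean(1.0 - avg_post.max(axis=1))      # plug-in estimate
      bayes_ref = np.mean(1.0 - true_post.max(axis=1))
      print(f"estimated {bayes_est:.3f} vs reference {bayes_ref:.3f}")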

  17. Fuzzy probabilistic design of water distribution networks

    NASA Astrophysics Data System (ADS)

    Fu, Guangtao; Kapelan, Zoran

    2011-05-01

    The primary aim of this paper is to present a fuzzy probabilistic approach for optimal design and rehabilitation of water distribution systems, combining aleatoric and epistemic uncertainties in a unified framework. The randomness and imprecision in future water consumption are characterized using fuzzy random variables whose realizations are not real but fuzzy numbers, and the nodal head requirements are represented by fuzzy sets, reflecting the imprecision in customers' requirements. The optimal design problem is formulated as a two-objective optimization problem, with minimization of total design cost and maximization of system performance as objectives. The system performance is measured by the fuzzy random reliability, defined as the probability that the fuzzy head requirements are satisfied across all network nodes. The satisfactory degree is represented by necessity measure or belief measure in the sense of the Dempster-Shafer theory of evidence. An efficient algorithm is proposed, within a Monte Carlo procedure, to calculate the fuzzy random system reliability and is effectively combined with the nondominated sorting genetic algorithm II (NSGAII) to derive the Pareto optimal design solutions. The newly proposed methodology is demonstrated with two case studies: the New York tunnels network and Hanoi network. The results from both cases indicate that the new methodology can effectively accommodate and handle various aleatoric and epistemic uncertainty sources arising from the design process and can provide optimal design solutions that are not only cost-effective but also have higher reliability to cope with severe future uncertainties.

  18. Label-free, single-object sensing with a microring resonator: FDTD simulation.

    PubMed

    Nguyen, Dan T; Norwood, Robert A

    2013-01-14

    Label-free, single-object sensing with a microring resonator is investigated numerically using the finite difference time-domain (FDTD) method. A pulse with ultra-wide bandwidth that spans over several resonant modes of the ring and of the sensing object is used for simulation, enabling a single-shot simulation of the microring sensing. The FDTD simulation not only can describe the circulation of the light in a whispering-gallery-mode (WGM) microring and multiple interactions between the light and the sensing object, but also other important factors of the sensing system, such as scattering and radiation losses. The FDTD results show that the simulation can yield a resonant shift of the WGM cavity modes. Furthermore, it can also extract eigenmodes of the sensing object, and therefore information from deep inside the object. The simulation method is not only suitable for a single object (single molecule, nano-, micro-scale particle) but can be extended to the problem of multiple objects as well.

  19. Students' Problem Solving as Mediated by Their Cognitive Tool Use: A Study of Tool Use Patterns

    ERIC Educational Resources Information Center

    Liu, M.; Horton, L. R.; Corliss, S. B.; Svinicki, M. D.; Bogard, T.; Kim, J.; Chang, M.

    2009-01-01

    The purpose of this study was to use multiple data sources, both objective and subjective, to capture students' thinking processes as they were engaged in problem solving, examine the cognitive tool use patterns, and understand what tools were used and why they were used. The findings of this study confirmed previous research and provided clear…

  20. Do Personality Problems Improve During Psychodynamic Supportive-Expressive Psychotherapy? Secondary Outcome Results From a Randomized Controlled Trial for Psychiatric Outpatients with Personality Disorders

    PubMed Central

    Vinnars, Bo; Thormählen, Barbro; Gallop, Robert; Norén, Kristina; Barber, Jacques P.

    2009-01-01

    Studies involving patients with personality disorders (PD) have not focused on improvement of core aspects of the PD. This paper examines changes in quality of object relations, interpersonal problems, psychological mindedness, and personality traits in a sample of 156 patients with DSM-IV PD diagnoses randomized to either manualized or nonmanualized dynamic psychotherapy. Effect sizes adjusted for symptomatic change and reliable change indices were calculated. We found that both treatments were equally effective at reducing personality pathology. Only in neuroticism did the nonmanualized group do better during the follow-up period. The largest improvement was found in quality of object relations. For the remaining variables, only small and clinically insignificant magnitudes of change were found. PMID:20161588

  1. Towards a multilevel cognitive probabilistic representation of space

    NASA Astrophysics Data System (ADS)

    Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland

    2005-03-01

    This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The method proposed is a combination of probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological optic, in terms of objects and relationships between them. The hierarchical representation that we propose permits efficient and reliable modeling of the information that the mobile agent would perceive from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of real-world images that validate the approach are reported. This framework draws on the general understanding of human cognition and perception and contributes towards the overall efforts to build cognitive robot companions.

  2. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with potential to apply the platform at larger scale (e.g. country level) and with multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (approximately 28,100 km2 and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in executing the complex workflows of satellite data processing required by large scale applications such as crop mapping. The study discusses the strengths and weaknesses of the classifiers, assesses the accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark classifier using a neural network approach developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree and random forest classifiers available in GEE.

  3. Dynamic Educational e-Content Selection Using Multiple Criteria in Web-Based Personalized Learning Environments.

    ERIC Educational Resources Information Center

    Manouselis, Nikos; Sampson, Demetrios

    This paper focuses on the way a multi-criteria decision making methodology is applied in the case of agent-based selection of offered learning objects. The problem of selection is modeled as a decision making one, with the decision variables being the learner model and the learning objects' educational description. In this way, selection of…

  4. Motion and force control for multiple cooperative manipulators

    NASA Technical Reports Server (NTRS)

    Wen, John T.; Kreutz, Kenneth

    1989-01-01

    The motion and force control of multiple robot arms manipulating a commonly held object is addressed. A general control paradigm that decouples the motion and force control problems is introduced. For motion control, there are three natural choices: (1) joint torques, (2) arm-tip force vectors, and (3) the acceleration of a generalized coordinate. Choice (1) allows a class of relatively model-independent control laws by exploiting the Hamiltonian structure of the open-loop system; (2) and (3) require the full model information but produce simpler problems. To resolve the nonuniqueness of the joint torques, two methods are introduced. If the arm and object models are available, the allocation of the desired end-effector control force to the joint actuators can be optimized; otherwise the internal force can be controlled about some set point. It is shown that effective force regulation can be achieved even if little model information is available.
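
    A minimal sketch of the optimized-allocation idea: when the models are known, the grasp map G relates the stacked arm-tip forces f to the net object wrench w (w = G f), and the pseudoinverse picks the allocation with the smallest internal force. The planar geometry and numbers below are illustrative, not taken from the paper.

    ```python
    # Minimum-norm allocation of a desired object wrench to two arm tips.
    import numpy as np

    # Planar object wrench w = [Fx, Fy, Mz]; each arm applies [fx, fy] at a
    # known contact point, so G is 3 x 4.
    r1, r2 = np.array([0.2, 0.0]), np.array([-0.2, 0.0])   # contact points (m)

    def grasp_column(r):
        # Maps a planar contact force to [Fx, Fy, Mz] about the object frame.
        return np.array([[1, 0], [0, 1], [-r[1], r[0]]])

    G = np.hstack([grasp_column(r1), grasp_column(r2)])    # 3 x 4
    w_des = np.array([0.0, 9.81 * 2.0, 0.0])               # support a 2 kg object

    f = np.linalg.pinv(G) @ w_des                          # minimum-norm tip forces
    print(f.reshape(2, 2))   # each row: one arm's [fx, fy]
    # Each arm's joint torques then follow from its own Jacobian: tau_i = J_i^T f_i.
    ```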

  5. Psychometric characteristics of Clinical Reasoning Problems (CRPs) and its correlation with routine multiple choice question (MCQ) in Cardiology department.

    PubMed

    Derakhshandeh, Zahra; Amini, Mitra; Kojuri, Javad; Dehbozorgian, Marziyeh

    2018-01-01

    Clinical reasoning is one of the most important skills in the process of training a medical student to become an efficient physician. Assessment of reasoning skills in a medical school program is important to direct students' learning. One of the tests for measuring clinical reasoning ability is Clinical Reasoning Problems (CRPs). The major aim of this study is to measure the psychometric qualities of CRPs and to define the correlation between this test and the routine MCQ in the cardiology department of Shiraz medical school. This descriptive study was conducted on all cardiology residents of Shiraz Medical School; the study population consisted of 40 residents in 2014. The routine CRPs and MCQ tests were designed based on similar objectives and were carried out simultaneously. Reliability, item difficulty, item discrimination, and the correlation between each item and the total score of the CRPs were measured with Excel and SPSS software to check the psychometric properties of the CRPs test. Furthermore, we calculated the correlation between the CRPs test and the MCQ test. The mean differences in CRPs test scores between residents' academic years (second, third and fourth year) were also evaluated by analysis of variance (one-way ANOVA) using SPSS software (version 20) (α=0.05). The mean and standard deviation of scores in the CRPs was 10.19 ± 3.39 out of 20; in the MCQ, it was 13.15 ± 3.81 out of 20. Item difficulty was in the range of 0.27-0.72; item discrimination was 0.30-0.75, with question No. 3 being the exception (0.24). The correlation between each item and the total score of the CRPs was 0.26-0.87; the correlation between the CRPs test and the MCQ test was 0.68 (p<0.001). The reliability of the CRPs, calculated using Cronbach's alpha, was 0.72. The mean score of the CRPs differed among residents based on their academic year, and this difference was statistically significant (p<0.001). The results of this investigation revealed that CRPs can be a reliable test for measuring clinical reasoning in residents. It can be included in cardiology residency assessment programs.
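
    The item statistics named above are standard and easy to reproduce; the sketch below computes item difficulty, corrected item-total discrimination, and Cronbach's alpha on random placeholder data, so the printed values are meaningless in themselves.

    ```python
    # Classical item statistics on a residents x items score matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    scores = rng.integers(0, 2, size=(40, 15)).astype(float)   # placeholder 0/1 scores

    def cronbach_alpha(x):
        k = x.shape[1]
        item_var = x.var(axis=0, ddof=1).sum()
        total_var = x.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    difficulty = scores.mean(axis=0)            # proportion correct per item
    total = scores.sum(axis=1)
    # Discrimination as the corrected (item-removed) item-total correlation.
    discrimination = np.array([np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
                               for j in range(scores.shape[1])])

    print(round(cronbach_alpha(scores), 2))
    print(difficulty.round(2))
    print(discrimination.round(2))
    ```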

  6. Genetic Particle Swarm Optimization-Based Feature Selection for Very-High-Resolution Remotely Sensed Imagery Object Change Detection.

    PubMed

    Chen, Qiang; Chen, Yunhao; Jiang, Weiguo

    2016-07-30

    In the field of multiple-feature Object-Based Change Detection (OBCD) for very-high-resolution remotely sensed images, image objects have abundant features, and feature selection affects the precision and efficiency of OBCD. Through object-based image analysis, this paper proposes a Genetic Particle Swarm Optimization (GPSO)-based feature selection algorithm to solve the optimization problem of feature selection in multiple-feature OBCD. We select the Ratio of Mean to Variance (RMV) as the fitness function of GPSO, and apply the proposed algorithm to the object-based hybrid multivariate alternative detection model. Two experiment cases on Worldview-2/3 images confirm that GPSO can significantly improve the speed of convergence and effectively avoid the problem of premature convergence, relative to other feature selection algorithms. According to the accuracy evaluation of OBCD, GPSO achieves higher overall accuracy (84.17% and 83.59%) and Kappa coefficients (0.6771 and 0.6314) than the other algorithms. Moreover, the sensitivity analysis results show that the proposed algorithm is not easily influenced by the initial parameters, but is affected by the number of features to be selected and the size of the particle swarm. The comparison experiment results reveal that RMV is more suitable than other functions as the fitness function of the GPSO-based feature selection algorithm.
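
    A compact binary-PSO feature-selection loop in the spirit of the description above; the fitness below is only a stand-in for the paper's RMV (here, mean over variance of a class-separation score on the selected features), and GPSO's additional genetic operators are not reproduced.

    ```python
    # Binary PSO for feature selection with a ratio-of-mean-to-variance fitness.
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 12))
    y = rng.integers(0, 2, size=200)            # toy two-class data

    def fitness(mask):
        if mask.sum() < 2:
            return -np.inf
        sel = mask == 1
        sep = np.abs(X[y == 0][:, sel].mean(0) - X[y == 1][:, sel].mean(0))
        return sep.mean() / (sep.var() + 1e-9)  # ratio of mean to variance

    n_particles, n_feat = 20, X.shape[1]
    pos = rng.integers(0, 2, size=(n_particles, n_feat))
    vel = rng.normal(scale=0.1, size=(n_particles, n_feat))
    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmax()].copy()

    for _ in range(50):
        r1, r2 = rng.random((2, n_particles, n_feat))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = (rng.random((n_particles, n_feat)) < 1 / (1 + np.exp(-vel))).astype(int)
        f = np.array([fitness(p) for p in pos])
        improved = f > pbest_f
        pbest[improved] = pos[improved]
        pbest_f[improved] = f[improved]
        gbest = pbest[pbest_f.argmax()].copy()

    print(gbest, round(fitness(gbest), 3))
    ```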

  7. Development of a decision support system for analysis and solutions of prolonged standing in the workplace.

    PubMed

    Halim, Isa; Arep, Hambali; Kamat, Seri Rahayu; Abdullah, Rohana; Omar, Abdul Rahman; Ismail, Ahmad Rasdan

    2014-06-01

    Prolonged standing has been hypothesized as a vital contributor to discomfort and muscle fatigue in the workplace. The objective of this study was to develop a decision support system that could provide systematic analysis and solutions to minimize the discomfort and muscle fatigue associated with prolonged standing. The integration of object-oriented programming and a Model Oriented Simultaneous Engineering System was used to design the architecture of the decision support system. Validation of the decision support system was carried out in two manufacturing companies. The validation process showed that the decision support system produced reliable results. The decision support system is a reliable advisory tool for providing analysis and solutions to problems related to the discomfort and muscle fatigue associated with prolonged standing. Further testing of the decision support system is suggested before it is used commercially.

  8. On the reliability of self-reported health: evidence from Albanian data.

    PubMed

    Vaillant, Nicolas; Wolff, François-Charles

    2012-06-01

    This paper investigates the reliability of self-assessed measures of health using panel data collected in Albania by the World Bank in 2002, 2003 and 2004 through the Living Standard Measurement Study project. As the survey includes questions on a self-assessed measure of health and on more objective health problems, both types of information are combined with a view to understanding how respondents change their answers to the self-reported measures over time. Estimates from random effects ordered Probit models show that differences in self-reported subjective health between individuals are much more marked than those over time, suggesting a strong state dependence in subjective health status. The empirical analysis also reveals respondent consistency, from both a subjective and an objective viewpoint. Self-reported health is much more influenced by permanent shocks than by more transitory illness or injury. Copyright © 2012 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  9. Development of a Decision Support System for Analysis and Solutions of Prolonged Standing in the Workplace

    PubMed Central

    Halim, Isa; Arep, Hambali; Kamat, Seri Rahayu; Abdullah, Rohana; Omar, Abdul Rahman; Ismail, Ahmad Rasdan

    2014-01-01

    Background Prolonged standing has been hypothesized as a vital contributor to discomfort and muscle fatigue in the workplace. The objective of this study was to develop a decision support system that could provide systematic analysis and solutions to minimize the discomfort and muscle fatigue associated with prolonged standing. Methods The integration of object-oriented programming and a Model Oriented Simultaneous Engineering System was used to design the architecture of the decision support system. Results Validation of the decision support system was carried out in two manufacturing companies. The validation process showed that the decision support system produced reliable results. Conclusion The decision support system is a reliable advisory tool for providing analysis and solutions to problems related to the discomfort and muscle fatigue associated with prolonged standing. Further testing of the decision support system is suggested before it is used commercially. PMID:25180141

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Efroymson, Rebecca Ann; Dale, Virginia H; Kline, Keith L

    Indicators of the environmental sustainability of biofuel production, distribution, and use should be selected, measured, and interpreted with respect to the context in which they are used. These indicators include measures of soil quality, water quality and quantity, greenhouse-gas emissions, biodiversity, air quality, and vegetation productivity. Contextual considerations include the purpose for the sustainability analysis, the particular biofuel production and distribution system (including supply chain, management aspects, and system viability), policy conditions, stakeholder values, location, temporal influences, spatial scale, baselines, and reference scenarios. Recommendations presented in this paper include formulating the problem for particular analyses, selecting appropriate context-specific indicators of environmental sustainability, and developing indicators that can reflect multiple environmental properties at low cost within a defined context. In addition, contextual considerations such as technical objectives, varying values and perspectives of stakeholder groups, and availability and reliability of data need to be understood and considered. Sustainability indicators for biofuels are most useful if adequate historical data are available, information can be collected at appropriate spatial and temporal scales, organizations are committed to use indicator information in the decision-making process, and indicators can effectively guide behavior toward more sustainable practices.

  11. Dynamic modelling and parameter estimation of a hydraulic robot manipulator using a multi-objective genetic algorithm

    NASA Astrophysics Data System (ADS)

    Montazeri, A.; West, C.; Monk, S. D.; Taylor, C. J.

    2017-04-01

    This paper concerns the problem of dynamic modelling and parameter estimation for a seven degree of freedom hydraulic manipulator. The laboratory example is a dual-manipulator mobile robotic platform used for research into nuclear decommissioning. In contrast to earlier control model-orientated research using the same machine, the paper develops a nonlinear, mechanistic simulation model that can subsequently be used to investigate physically meaningful disturbances. The second contribution is to optimise the parameters of the new model, i.e. to determine reliable estimates of the physical parameters of a complex robotic arm which are not known in advance. To address the nonlinear and non-convex nature of the problem, the research relies on the multi-objectivisation of an output error single-performance index. The developed algorithm utilises a multi-objective genetic algorithm (GA) in order to find a proper solution. The performance of the model and the GA is evaluated using both simulated (i.e. with a known set of 'true' parameters) and experimental data. Both simulation and experimental results show that multi-objectivisation has improved convergence of the estimated parameters compared to the single-objective output error problem formulation. This is achieved by integrating the validation phase inside the algorithm implicitly and exploiting the inherent structure of the multi-objective GA for this specific system identification problem.

  12. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossed random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
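
    As a worked illustration of a G coefficient for a persons-by-stations design (with made-up variance components, not the study's estimates): the coefficient is the ratio of universe-score variance to itself plus the error variance averaged over stations.

    ```python
    # G = var_p / (var_p + var_residual / n_stations), with illustrative numbers.
    var_person = 4.0      # universe-score variance between students
    var_residual = 5.5    # station-by-student interaction plus error
    n_stations = 18

    G = var_person / (var_person + var_residual / n_stations)
    print(round(G, 3))    # 0.929 with these illustrative components
    ```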

  13. Reliability analysis of the objective structured clinical examination using generalizability theory.

    PubMed

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián

    2016-01-01

    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossed random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.

  14. Model selection with multiple regression on distance matrices leads to incorrect inferences.

    PubMed

    Franckowiak, Ryan P; Panasci, Michael; Jarvis, Karl J; Acuña-Rodriguez, Ian S; Landguth, Erin L; Fortin, Marie-Josée; Wagner, Helene H

    2017-01-01

    In landscape genetics, model selection procedures based on Information Theoretic and Bayesian principles have been used with multiple regression on distance matrices (MRM) to test the relationship between multiple vectors of pairwise genetic, geographic, and environmental distance. Using Monte Carlo simulations, we examined the ability of model selection criteria based on Akaike's information criterion (AIC), its small-sample correction (AICc), and the Bayesian information criterion (BIC) to reliably rank candidate models when applied with MRM while varying the sample size. The results showed a serious problem: all three criteria exhibit a systematic bias toward selecting unnecessarily complex models containing spurious random variables and erroneously suggest a high level of support for the incorrectly ranked best model. These problems became more pronounced with increasing sample size. The failure of AIC, AICc, and BIC was likely driven by the inflated sample size and different sum-of-squares partitioned by MRM, and the resulting effect on delta values. Based on these findings, we strongly discourage the continued application of AIC, AICc, and BIC for model selection with MRM.

  15. A pragmatic decision model for inventory management with heterogeneous suppliers

    NASA Astrophysics Data System (ADS)

    Nakandala, Dilupa; Lau, Henry; Zhang, Jingjing; Gunasekaran, Angappa

    2018-05-01

    For enterprises, it is imperative that the trade-off between the cost of inventory and risk implications is managed in the most efficient manner. To explore this, we use the common example of a wholesaler operating in an environment where suppliers demonstrate heterogeneous reliability. The wholesaler has partial orders with dual suppliers and uses lateral transshipments. While supplier reliability is a key concern in inventory management, reliable suppliers are more expensive and investment in strategic approaches that improve supplier performance carries a high cost. Here we consider the operational strategy of dual sourcing with reliable and unreliable suppliers and model the total inventory cost for the likely scenario in which the lead-time of the unreliable suppliers extends beyond the scheduling period. We then develop a Customized Integer Programming Optimization Model to determine the optimum size of partial orders with multiple suppliers. In addition to the objective of total cost optimization, this study takes into account the volatility of the cost associated with the uncertainty of an inventory system.
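
    A hypothetical integer-programming sketch in the spirit of the model described above, splitting an order between a reliable (expensive) and an unreliable (cheap) supplier; the data, the expected-fill treatment of unreliability, and the shortage penalty are illustrative simplifications of the paper's customized model.

    ```python
    # Dual-sourcing order split as a small integer program (PuLP).
    from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

    demand = 100
    suppliers = {"reliable":   {"cost": 12.0, "expected_fill": 1.00},
                 "unreliable": {"cost":  9.0, "expected_fill": 0.80}}
    shortage_penalty = 25.0    # cost per expected unit short

    q = {s: LpVariable(f"q_{s}", lowBound=0, cat="Integer") for s in suppliers}
    short = LpVariable("expected_shortage", lowBound=0)

    prob = LpProblem("dual_sourcing", LpMinimize)
    prob += lpSum(suppliers[s]["cost"] * q[s] for s in suppliers) + shortage_penalty * short
    # Expected delivered quantity plus any shortage must cover demand.
    prob += lpSum(suppliers[s]["expected_fill"] * q[s] for s in suppliers) + short >= demand

    prob.solve()
    print({s: value(q[s]) for s in suppliers}, value(short))
    ```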

  16. Enhanced Decision Analysis Support System.

    DTIC Science & Technology

    1981-03-01

    ... the method for determining preferences when multiple and competing attributes are involved. Worth assessment is used as the model which... 1967 as a method for determining preference when multiple and competing attributes are involved (Ref 10). The term worth can be equated to other... competing objectives. After some discussion, the group decided that the problem could best be decided using the worth assessment procedure. They

  17. Associating optical measurements of MEO and GEO objects using Population-Based Meta-Heuristic methods

    NASA Astrophysics Data System (ADS)

    Zittersteijn, M.; Vananti, A.; Schildknecht, T.; Dolado Perez, J. C.; Martinot, V.

    2016-11-01

    Currently, several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). The MTT problem quickly becomes an NP-hard combinatorial optimization problem. This means that the effort required to solve the MTT problem increases exponentially with the number of tracked objects. In an attempt to find an approximate solution of sufficient quality, several Population-Based Meta-Heuristic (PBMH) algorithms are implemented and tested on simulated optical measurements. These first results show that one of the tested algorithms, namely the Elitist Genetic Algorithm (EGA), consistently displays the desired behavior of finding good approximate solutions before reaching the optimum. The results further suggest that the algorithm possesses a polynomial time complexity, as the computation times are consistent with a polynomial model. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the association and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.

  18. Single- and Multiple-Objective Optimization with Differential Evolution and Neural Networks

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2006-01-01

    Genetic and evolutionary algorithms have been applied to solve numerous problems in engineering design where they have been used primarily as optimization procedures. These methods have an advantage over conventional gradient-based search procedures because they are capable of finding global optima of multi-modal functions and searching design spaces with disjoint feasible regions. They are also robust in the presence of noisy data. Another desirable feature of these methods is that they can efficiently use distributed and parallel computing resources since multiple function evaluations (flow simulations in aerodynamic design) can be performed simultaneously and independently on multiple processors. For these reasons genetic and evolutionary algorithms are being used more frequently in design optimization. Examples include airfoil and wing design and compressor and turbine airfoil design. They are also finding increasing use in multiple-objective and multidisciplinary optimization. This lecture will focus on an evolutionary method that is a relatively new member of the general class of evolutionary methods, called differential evolution (DE). This method is easy to use and program and it requires relatively few user-specified constants. These constants are easily determined for a wide class of problems. Fine-tuning the constants will of course yield the solution to the optimization problem at hand more rapidly. DE can be efficiently implemented on parallel computers and can be used for continuous, discrete and mixed discrete/continuous optimization problems. It does not require the objective function to be continuous and is noise tolerant. DE and applications to single and multiple-objective optimization will be included in the presentation and lecture notes. A method for aerodynamic design optimization that is based on neural networks will also be included as a part of this lecture. The method offers advantages over traditional optimization methods. It is more flexible than other methods in dealing with design in the context of both steady and unsteady flows, partial and complete data sets, combined experimental and numerical data, inclusion of various constraints and rules of thumb, and other issues that characterize the aerodynamic design process. Neural networks provide a natural framework within which a succession of numerical solutions of increasing fidelity, incorporating more realistic flow physics, can be represented and utilized for optimization. Neural networks also offer an excellent framework for multiple-objective and multi-disciplinary design optimization. Simulation tools from various disciplines can be integrated within this framework and rapid trade-off studies involving one or many disciplines can be performed. The prospect of combining neural network based optimization methods and evolutionary algorithms to obtain a hybrid method with the best properties of both methods will be included in this presentation. Achieving solution diversity and accurate convergence to the exact Pareto front in multiple objective optimization usually requires a significant computational effort with evolutionary algorithms. In this lecture we will also explore the possibility of using neural networks to obtain estimates of the Pareto optimal front using non-dominated solutions generated by DE as training data. Neural network estimators have the potential advantage of reducing the number of function evaluations required to obtain solution accuracy and diversity, thus reducing design cost.
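
    A minimal DE/rand/1/bin implementation illustrates how few user-specified constants the method needs: population size NP, differential weight F, and crossover rate CR. This is a generic textbook sketch, not the lecture's own code.

    ```python
    # Differential evolution (DE/rand/1/bin) on the multi-modal Rastrigin function.
    import numpy as np

    def de(objective, bounds, NP=30, F=0.8, CR=0.9, iters=200, seed=0):
        rng = np.random.default_rng(seed)
        dim = len(bounds)
        lo, hi = np.array(bounds, float).T
        pop = lo + rng.random((NP, dim)) * (hi - lo)
        fit = np.array([objective(x) for x in pop])
        for _ in range(iters):
            for i in range(NP):
                idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
                a, b, c = pop[idx]
                mutant = np.clip(a + F * (b - c), lo, hi)
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True       # at least one gene from mutant
                trial = np.where(cross, mutant, pop[i])
                f = objective(trial)
                if f < fit[i]:                        # greedy one-to-one selection
                    pop[i], fit[i] = trial, f
        return pop[fit.argmin()], fit.min()

    rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    print(de(rastrigin, [(-5.12, 5.12)] * 5))
    ```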

  19. Reliability of psychophysiological responses across multiple motion sickness stimulation tests

    NASA Technical Reports Server (NTRS)

    Stout, C. S.; Toscano, W. B.; Cowings, P. S.

    1995-01-01

    Although there is general agreement that a high degree of variability exists between subjects in their autonomic nervous system responses to motion sickness stimulation, very little evidence exists that examines the reproducibility of autonomic responses within subjects during motion sickness stimulation. Our objectives were to examine the reliability of autonomic responses and symptom levels across five testing occasions using the (1) final minute of testing, (2) change in autonomic response and the change in symptom level, and (3) strength of the relationship between the change in symptom level and the change in autonomic responses across the entire motion sickness test. The results indicate that, based on the final minute of testing, the autonomic responses of heart rate, blood volume pulse, and respiration rate are moderately stable across multiple tests. Changes in heart rate, blood volume pulse, respiration rate, and symptoms throughout the test duration are less stable across the tests. Finally, autonomic responses and symptom levels are significantly related across the entire motion sickness test.

  20. Graphical correlation of gaging-station records

    USGS Publications Warehouse

    Searcy, James K.

    1960-01-01

    A gaging-station record is a sample of the rate of flow of a stream at a given site. This sample can be used to estimate the magnitude and distribution of future flows if the record is long enough to be representative of the long-term flow of the stream. The reliability of a short-term record for estimating future flow characteristics can be improved through correlation with a long-term record. Correlation can be either numerical or graphical, but graphical correlation of gaging-station records has several advantages. The graphical correlation method is described in a step-by-step procedure with an illustrative problem of simple correlation, three examples of multiple correlation (one removing a seasonal effect), and two examples of correlating one record with two other records. Except in the problem on removal of seasonal effect, the same group of stations is used in the illustrative problems. The purpose of the problems is to illustrate the method--not to show the improvement that can result from multiple correlation as compared with simple correlation. Hydrologic factors determine whether a usable relation exists between gaging-station records. Statistics is only a tool for evaluating and using an existing relation, and the investigator must be guided by a knowledge of hydrology.

  1. Resolving occlusion and segmentation errors in multiple video object tracking

    NASA Astrophysics Data System (ADS)

    Cheng, Hsu-Yung; Hwang, Jenq-Neng

    2009-02-01

    In this work, we propose a method to integrate the Kalman filter and adaptive particle sampling for multiple video object tracking. The proposed framework is able to detect occlusion and segmentation error cases and perform adaptive particle sampling for accurate measurement selection. Compared with traditional particle filter based tracking methods, the proposed method generates particles only when necessary. With the concept of adaptive particle sampling, we can avoid the degeneracy problem because the sampling position and range are dynamically determined by parameters that are updated by Kalman filters. There is no need to spend time on processing particles with very small weights. The adaptive appearance for the occluded object refers to the prediction results of Kalman filters to determine the region that should be updated and avoids the problem of using inadequate information to update the appearance in occlusion cases. The experimental results have shown that a small number of particles is sufficient to achieve high positioning and scaling accuracy. Also, the employment of adaptive appearance substantially improves the positioning and scaling accuracy of the tracking results.
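
    The Kalman-filter backbone of such a tracker can be sketched as follows; the constant-velocity model, noise levels, and measurements are illustrative, and the link to particle sampling is only indicated in a comment.

    ```python
    # Constant-velocity Kalman filter; state = [x, y, vx, vy], observe position.
    import numpy as np

    dt = 1.0
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
    Q, R = np.eye(4) * 0.01, np.eye(2) * 1.0

    x, P = np.zeros(4), np.eye(4) * 10.0

    def predict(x, P):
        return F @ x, F @ P @ F.T + Q

    def update(x, P, z):
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P

    for z in [np.array([1.0, 0.9]), np.array([2.1, 2.0]), np.array([2.9, 3.2])]:
        x, P = predict(x, P)
        # The predicted position and covariance P[:2, :2] could set the
        # particle-sampling center and range before this measurement update.
        x, P = update(x, P, z)
    print(x.round(2))
    ```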

  2. Real-time electroholography using a multiple-graphics processing unit cluster system with a single spatial light modulator and the InfiniBand network

    NASA Astrophysics Data System (ADS)

    Niwase, Hiroaki; Takada, Naoki; Araki, Hiromitsu; Maeda, Yuki; Fujiwara, Masato; Nakayama, Hirotaka; Kakue, Takashi; Shimobaba, Tomoyoshi; Ito, Tomoyoshi

    2016-09-01

    Parallel calculations of large-pixel-count computer-generated holograms (CGHs) are suitable for multiple-graphics processing unit (multi-GPU) cluster systems. However, it is not easy for a multi-GPU cluster system to accomplish fast CGH calculations when CGH transfers between PCs are required. In these cases, the CGH transfer between the PCs becomes a bottleneck. Usually, this problem occurs only in multi-GPU cluster systems with a single spatial light modulator. To overcome this problem, we propose a simple method using the InfiniBand network. The computational speed of the proposed method using 13 GPUs (NVIDIA GeForce GTX TITAN X) was more than 3000 times faster than that of a CPU (Intel Core i7 4770) when the number of three-dimensional (3-D) object points exceeded 20,480. In practice, we achieved ~40 tera floating-point operations per second (TFLOPS) when the number of 3-D object points exceeded 40,960. Our proposed method was able to reconstruct a real-time movie of a 3-D object comprising 95,949 points.

  3. A new assessment tool for patients with multiple sclerosis from Spanish-speaking countries: validation of the Brief International Cognitive Assessment for MS (BICAMS) in Argentina.

    PubMed

    Vanotti, Sandra; Smerbeck, Audrey; Benedict, Ralph H B; Caceres, Fernando

    2016-10-01

    The Brief International Cognitive Assessment for Multiple Sclerosis (BICAMS) is an international assessment tool for monitoring cognitive function in multiple sclerosis (MS) patients. BICAMS comprises the Symbol Digit Modalities Test (SDMT), the California Verbal Learning Test - Second Edition (CVLT II) and the Brief Visuospatial Memory Test - Revised (BVMT-R). Our objective was to validate and assess the reliability of BICAMS as applied in Argentina and to obtain normative data in Spanish for this population. The sample was composed of 50 MS patients and 100 healthy controls (HC). In order to test its reliability, BICAMS was re-administered in a subset of 25 patients. The sample's average age was 43.42 ± 10.17 years, and average years of schooling were 14.86 ± 2.78. About 74% of the participants were women. The groups did not differ in age, years of schooling, or gender. The MS group performed significantly worse than the HC group across the three neuropsychological tests, yielding the following Cohen's d values: SDMT: .85; CVLT I: .87; and BVMT-R: .40. The mean raw scores for the Argentina normative data were as follows: SDMT: 56.71 ± 10.85; CVLT I: 60.88 ± 10.46; and BVMT-R: 23.44 ± 5.84. Finally, test-retest reliability coefficients for each test were as follows: SDMT: r = .95; CVLT I: r = .87; and BVMT-R: r = .82. This BICAMS version is reliable and useful as a monitoring tool for identifying MS patients with cognitive impairment.

  4. Sensitivity-Informed De Novo Programming for Many-Objective Water Portfolio Planning Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Kasprzyk, J. R.; Reed, P. M.; Kirsch, B. R.; Characklis, G. W.

    2009-12-01

    Risk-based water supply management presents severe cognitive, computational, and social challenges to planning in a changing world. Decision aiding frameworks must confront the cognitive biases implicit to risk, the severe uncertainties associated with long term planning horizons, and the consequent ambiguities that shape how we define and solve water resources planning and management problems. This paper proposes and demonstrates a new interactive framework for sensitivity informed de novo programming. The theoretical focus of our many-objective de novo programming is to promote learning and evolving problem formulations to enhance risk-based decision making. We have demonstrated our proposed de novo programming framework using a case study for a single city’s water supply in the Lower Rio Grande Valley (LRGV) in Texas. Key decisions in this case study include the purchase of permanent rights to reservoir inflows and anticipatory thresholds for acquiring transfers of water through optioning and spot leases. A 10-year Monte Carlo simulation driven by historical data is used to provide performance metrics for the supply portfolios. The three major components of our methodology include Sobol global sensitivity analysis, many-objective evolutionary optimization and interactive tradeoff visualization. The interplay between these components allows us to evaluate alternative design metrics, their decision variable controls and the consequent system vulnerabilities. Our LRGV case study measures water supply portfolios’ efficiency, reliability, and utilization of transfers in the water supply market. The sensitivity analysis is used interactively over interannual, annual, and monthly time scales to indicate how the problem controls change as a function of the timescale of interest. These results have then been used to improve our exploration and understanding of LRGV costs, vulnerabilities, and the water portfolios’ critical reliability constraints. These results demonstrate how we can adaptively improve the value and robustness of our problem formulations by evolving our definition of optimality to discover key tradeoffs.
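
    A sketch of the Sobol global sensitivity step using the SALib package; the three decision variables and the toy "reliability" model are placeholders, not the LRGV formulation.

    ```python
    # First- and total-order Sobol indices for a toy three-variable model.
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["permanent_rights", "option_threshold", "lease_threshold"],
        "bounds": [[0, 1], [0, 1], [0, 1]],
    }

    def toy_reliability(x):
        # Placeholder response surface standing in for the simulation model.
        return 0.6 * x[0] + 0.3 * x[1] * x[2] + 0.1 * np.sin(6 * x[2])

    X = saltelli.sample(problem, 1024)
    Y = np.apply_along_axis(toy_reliability, 1, X)
    Si = sobol.analyze(problem, Y)
    print(dict(zip(problem["names"], Si["S1"].round(2))))   # first-order indices
    print(dict(zip(problem["names"], Si["ST"].round(2))))   # total-order indices
    ```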

  5. Joint Geophysical Inversion With Multi-Objective Global Optimization Methods

    NASA Astrophysics Data System (ADS)

    Lelievre, P. G.; Bijani, R.; Farquharson, C. G.

    2015-12-01

    Pareto multi-objective global optimization (PMOGO) methods generate a suite of solutions that minimize multiple objectives (e.g. data misfits and regularization terms) in a Pareto-optimal sense. Providing a suite of models, as opposed to a single model that minimizes a weighted sum of objectives, allows a more complete assessment of the possibilities and avoids the often difficult choice of how to weight each objective. We are applying PMOGO methods to three classes of inverse problems. The first class are standard mesh-based problems where the physical property values in each cell are treated as continuous variables. The second class of problems are also mesh-based but cells can only take discrete physical property values corresponding to known or assumed rock units. In the third class we consider a fundamentally different type of inversion in which a model comprises wireframe surfaces representing contacts between rock units; the physical properties of each rock unit remain fixed while the inversion controls the position of the contact surfaces via control nodes. This third class of problem is essentially a geometry inversion, which can be used to recover the unknown geometry of a target body or to investigate the viability of a proposed Earth model. Joint inversion is greatly simplified for the latter two problem classes because no additional mathematical coupling measure is required in the objective function. PMOGO methods can solve numerically complicated problems that could not be solved with standard descent-based local minimization methods. This includes the latter two classes of problems mentioned above. There are significant increases in the computational requirements when PMOGO methods are used but these can be ameliorated using parallelization and problem dimension reduction strategies.

  6. Tag-to-Tag Interference Suppression Technique Based on Time Division for RFID.

    PubMed

    Khadka, Grishma; Hwang, Suk-Seung

    2017-01-01

    Radio-frequency identification (RFID) is a tracking technology that enables immediate automatic object identification and rapid data sharing for a wide variety of modern applications using radio waves for data transmission from a tag to a reader. RFID is already well established in technical areas, and many companies have developed corresponding standards and measurement techniques. In the construction industry, effective monitoring of materials and equipment is an important task, and RFID helps to improve monitoring and controlling capabilities, in addition to enabling automation for construction projects. However, on construction sites, there are many tagged objects and multiple RFID tags that may interfere with each other's communications. This reduces the reliability and efficiency of the RFID system. In this paper, we propose an anti-collision algorithm for communication between multiple tags and a reader. In order to suppress interference signals from multiple neighboring tags, the proposed algorithm employs the time-division (TD) technique, where tags in the interrogation zone are assigned a specific time slot so that at every instance in time, a reader communicates with tags using the specific time slot. We present representative computer simulation examples to illustrate the performance of the proposed anti-collision technique for multiple RFID tags.

  7. Web-Based Problem-Solving Assignment and Grading System

    NASA Astrophysics Data System (ADS)

    Brereton, Giles; Rosenberg, Ronald

    2014-11-01

    In engineering courses with very specific learning objectives, such as fluid mechanics and thermodynamics, it is conventional to reinforce concepts and principles with problem-solving assignments and to measure success in problem solving as an indicator of student achievement. While the modern-day ease of copying and searching for online solutions can undermine the value of traditional assignments, web-based technologies also provide opportunities to generate individualized well-posed problems with an infinite number of different combinations of initial/final/boundary conditions, so that the probability of any two students being assigned identical problems in a course is vanishingly small. Such problems can be designed and programmed to be: single or multiple-step, self-grading, allow students single or multiple attempts; provide feedback when incorrect; selectable according to difficulty; incorporated within gaming packages; etc. In this talk, we discuss the use of a homework/exam generating program of this kind in a single-semester course, within a web-based client-server system that ensures secure operation.

  8. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools to evaluate the reliability of systems. Although single failure mode issues can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definition of probability weight and FRPN, resulting in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in aeroengines is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
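
    The weighted-geometric-mean core of the method can be sketched for a single failure mode, with triangular fuzzy ratings handled vertex-wise; the weights and ratings are illustrative, and the Copula and minimum-cut-set machinery is not reproduced.

    ```python
    # Fuzzy weighted geometric mean RPN for one failure mode.
    import numpy as np

    # Triangular fuzzy ratings (low, mode, high) for occurrence, severity, detection.
    O, S, D = (4, 5, 6), (6, 7, 8), (3, 4, 5)
    w = np.array([0.4, 0.35, 0.25])       # importance weights, summing to 1

    def fuzzy_wgm(ratings, w):
        # Weighted geometric mean applied vertex-wise to (low, mode, high);
        # monotonicity keeps the result a valid triangular number.
        arr = np.array(ratings, float)    # shape: (factors, vertices)
        return np.prod(arr ** w[:, None], axis=0)

    def defuzzify(tri):
        return tri.mean()                 # centroid of a triangular number

    frpn = fuzzy_wgm([O, S, D], w)
    print(frpn.round(2), round(defuzzify(frpn), 2))
    ```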

  9. Multiobjective optimization in structural design with uncertain parameters and stochastic processes

    NASA Technical Reports Server (NTRS)

    Rao, S. S.

    1984-01-01

    The application of multiobjective optimization techniques to structural design problems involving uncertain parameters and random processes is studied. The design of a cantilever beam with a tip mass subjected to a stochastic base excitation is considered for illustration. Several of the problem parameters are assumed to be random variables and the structural mass, fatigue damage, and negative of natural frequency of vibration are considered for minimization. The solution of this three-criteria design problem is found by using global criterion, utility function, game theory, goal programming, goal attainment, bounded objective function, and lexicographic methods. It is observed that the game theory approach is superior in finding a better optimum solution, assuming the proper balance of the various objective functions. The procedures used in the present investigation are expected to be useful in the design of general dynamic systems involving uncertain parameters, stochastic process, and multiple objectives.

  10. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    PubMed

    Trianni, Vito; López-Ibáñez, Manuel

    2015-01-01

    The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.

  11. Hybrid Microgrid Configuration Optimization with Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Lopez, Nicolas

    This dissertation explores the Renewable Energy Integration Problem, and proposes a Genetic Algorithm embedded with a Monte Carlo simulation to solve large instances of the problem that are impractical to solve via full enumeration. The Renewable Energy Integration Problem is defined as finding the optimum set of components to supply the electric demand of a hybrid microgrid. The components considered are solar panels, wind turbines, diesel generators, electric batteries, connections to the power grid and converters, which can be inverters and/or rectifiers. The methodology developed is explained, as well as the combinatorial formulation. In addition, two case studies of a single-objective version of the problem are presented, one minimizing cost and one minimizing global warming potential (GWP), followed by a multi-objective implementation of the proposed methodology utilizing a non-dominated sorting Genetic Algorithm embedded with a Monte Carlo simulation. The method is validated by solving a small instance of the problem with a known solution via a full enumeration algorithm developed by NREL in their software HOMER. The dissertation concludes that evolutionary algorithms embedded with Monte Carlo simulation, namely modified Genetic Algorithms, are an efficient way of solving the problem, finding approximate solutions in the case of single-objective optimization, and approximating the true Pareto front in the case of multiple-objective optimization of the Renewable Energy Integration Problem.

  12. Image Tracking for the High Similarity Drug Tablets Based on Light Intensity Reflective Energy and Artificial Neural Network

    PubMed Central

    Liang, Zhongwei; Zhou, Liang; Liu, Xiaochu; Wang, Xiaogang

    2014-01-01

    Tablet image tracking exerts a notable influence on the efficiency and reliability of high-speed drug mass production, and it has also emerged as a difficult problem and a target of focus in production monitoring in recent years, due to the highly similar shapes and random position distribution of the objects to be searched for. For the purpose of tracking randomly distributed tablets accurately, a calibrated surface of light intensity reflective energy can be established using a surface fitting approach and transitional vector determination, describing the shape topology and topography details of the target tablet. On this basis, the mathematical properties of these established surfaces are derived, and an artificial neural network (ANN) is then employed to classify the moving target tablets by recognizing their different surface properties; the instantaneous coordinate positions of the drug tablets in one image frame can thereby be determined. By repeating the same pattern recognition on the next image frame, the real-time movements of the target tablet templates are successfully tracked in sequence. This paper provides reliable references and new research ideas for real-time object tracking in drug production practice. PMID:25143781

  13. Reliability Based Design for a Raked Wing Tip of an Airframe

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2011-01-01

    A reliability-based optimization methodology has been developed to design the raked wing tip of the Boeing 767-400 extended range airliner made of composite and metallic materials. Design is formulated for an accepted level of risk or reliability. The design variables, weight and the constraints became functions of reliability. Uncertainties in the load, strength and the material properties, as well as the design variables, were modeled as random parameters with specified distributions, like normal, Weibull or Gumbel functions. The objective function and constraint, or a failure mode, became derived functions of the risk-level. Solution to the problem produced the optimum design with weight, variables and constraints as a function of the risk-level. Optimum weight versus reliability traced out an inverted-S shaped graph. The center of the graph corresponded to a 50 percent probability of success, or one failure in two samples. Under some assumptions, this design would be quite close to the deterministic optimum solution. The weight increased when reliability exceeded 50 percent, and decreased when the reliability was compromised. A design could be selected depending on the level of risk acceptable to a situation. The optimization process achieved up to a 20-percent reduction in weight over traditional design.

  14. A modified NSGA-II solution for a new multi-objective hub maximal covering problem under uncertain shipments

    NASA Astrophysics Data System (ADS)

    Ebrahimi Zade, Amir; Sadegheih, Ahmad; Lotfi, Mohammad Mehdi

    2014-07-01

    Hubs are centers for collection, rearrangement, and redistribution of commodities in transportation networks. In this paper, non-linear multi-objective formulations for single and multiple allocation hub maximal covering problems as well as the linearized versions are proposed. The formulations substantially mitigate the complexity of the existing models due to the fewer number of constraints and variables. Also, uncertain shipments are studied in the context of hub maximal covering problems. In many real-world applications, any link on the path from origin to destination may fail to work due to disruption. Therefore, in the proposed bi-objective model, maximizing safety of the weakest path in the network is considered as the second objective together with the traditional maximum coverage goal. Furthermore, to solve the bi-objective model, a modified version of NSGA-II with a new dynamic immigration operator is developed in which the exact number of immigrants depends on the results of the other two common NSGA-II operators, i.e. mutation and crossover. Besides validating the proposed models, computational results confirm a better performance of the modified NSGA-II versus the traditional one.

  15. Multi-objects recognition for distributed intelligent sensor networks

    NASA Astrophysics Data System (ADS)

    He, Haibo; Chen, Sheng; Cao, Yuan; Desai, Sachi; Hohil, Myron E.

    2008-04-01

    This paper proposes an innovative approach for multi-object recognition in homeland security and defense based intelligent sensor networks. Unlike the conventional way of information analysis, data mining in such networks is typically characterized by high information ambiguity/uncertainty, data redundancy, high dimensionality and real-time constraints. Furthermore, since a typical military based network normally includes multiple mobile sensor platforms, ground forces, fortified tanks, combat flights, and other resources, it is critical to develop intelligent data mining approaches to fuse different information resources to understand dynamic environments, to support decision making processes, and finally to achieve the goals. This paper aims to address these issues with a focus on multi-object recognition. Instead of classifying a single object as in traditional image classification problems, the proposed method can automatically learn multiple objects simultaneously. Image segmentation techniques are used to identify the interesting regions in the field, which correspond to multiple objects such as soldiers or tanks. Since different objects will come with different feature sizes, we propose a feature scaling method to represent each object in the same number of dimensions. This is achieved by linear/nonlinear scaling and sampling techniques. Finally, support vector machine (SVM) based learning algorithms are developed to learn and build the associations for different objects, and such knowledge will be adaptively accumulated for object recognition in the testing stage. We test the effectiveness of the proposed method in different simulated military environments.

  16. Validity of an Observation Method for Assessing Pain Behavior in Individuals With Multiple Sclerosis

    PubMed Central

    Cook, Karon F.; Roddey, Toni S.; Bamer, Alyssa M.; Amtmann, Dagmar; Keefe, Francis J

    2012-01-01

    Context Pain is a common and complex experience for individuals who live with multiple sclerosis (MS) that interferes with physical, psychological and social function. A valid and reliable tool for quantifying observed pain behaviors in MS is critical to understanding how pain behaviors contribute to pain-related disability in this clinical population. Objectives To evaluate the reliability and validity of a pain behavioral observation protocol in individuals who have MS. Methods Community-dwelling volunteers with multiple sclerosis (N=30), back pain (N=5), or arthritis (N=8) were recruited based on clinician referrals, advertisements, fliers, web postings, and participation in previous research. Participants completed measures of pain severity, pain interference, and self-reported pain behaviors and were videotaped doing typical activities (e.g., walking, sitting). Two coders independently recorded frequencies of pain behaviors by category (e.g., guarding, bracing) and inter-rater reliability statistics were calculated. Naïve observers reviewed videotapes of individuals with MS and rated their pain. Spearman correlations were calculated between pain behavior frequencies and self-reported pain and pain ratings by naïve observers. Results Inter-rater reliability estimates indicated the reliability of pain codes in the MS sample. Kappa coefficients ranged from moderate agreement (sighing = 0.40) to substantial agreement (guarding = 0.83). These values were comparable to those obtained in the combined back pain and arthritis sample. Concurrent validity was supported by correlations with self-reported pain (0.46-0.53) and with self-reports of pain behaviors (0.58). Construct validity was supported by the finding of a 0.87 correlation between total pain behaviors observed by coders and mean pain ratings by naïve observers. Conclusion Results support use of the pain behavior observation protocol for assessing pain behaviors of individuals with MS. Valid assessment of the pain behaviors of individuals with MS could lead to creative interventions in the management of chronic pain in this population. PMID:23159684

  17. Biobjective planning of GEO debris removal mission with multiple servicing spacecrafts

    NASA Astrophysics Data System (ADS)

    Jing, Yu; Chen, Xiao-qian; Chen, Li-hu

    2014-12-01

    The mission planning of GEO debris removal with multiple servicing spacecrafts (SScs) is studied in this paper. Specifically, the SScs are considered to be initially on the GEO belt, and they should rendezvous with debris of different orbital slots and different inclinations, remove them to the graveyard orbit and finally return to their initial locations. Three key problems should be resolved here: task assignment, mission sequence planning and transfer trajectory optimization for each SSc. The minimum-cost, two-impulse phasing maneuver is used for each rendezvous. The objective is to find a set of optimal planning schemes with minimum fuel cost and travel duration. Considering this mission as a hybrid optimal control problem, a mathematical model is proposed. A modified multi-objective particle swarm optimization is employed to address the model. Numerous examples are carried out to demonstrate the effectiveness of the model and solution method. In this paper, single-SSc and multiple-SSc scenarios with the same amount of fuel are compared. Numerous experiments indicate that, for a given GEO debris removal mission, which alternative (single-SSc or multiple-SSc) is better (i.e., costs less fuel and consumes less travel time) is determined by many factors. Although in some cases multiple-SSc scenarios may perform worse than single-SSc scenarios, the extra costs are considered worth the gain in mission safety and robustness.

  18. Mango: multiple alignment with N gapped oligos.

    PubMed

    Zhang, Zefeng; Lin, Hao; Li, Ming

    2008-06-01

    Multiple sequence alignment is a classical and challenging task. The problem is NP-hard. The full dynamic programming takes too much time. The progressive alignment heuristics adopted by most state-of-the-art works suffer from the "once a gap, always a gap" phenomenon. Is there a radically new way to do multiple sequence alignment? In this paper, we introduce a novel and orthogonal multiple sequence alignment method, using both multiple optimized spaced seeds and new algorithms to handle these seeds efficiently. Our new algorithm processes information of all sequences as a whole and tries to build the alignment vertically, avoiding problems caused by the popular progressive approaches. Because the optimized spaced seeds have proved significantly more sensitive than the consecutive k-mers, the new approach promises to be more accurate and reliable. To validate our new approach, we have implemented MANGO: Multiple Alignment with N Gapped Oligos. Experiments were carried out on large 16S RNA benchmarks, showing that MANGO compares favorably, in both accuracy and speed, against state-of-the-art multiple sequence alignment methods, including ClustalW 1.83, MUSCLE 3.6, MAFFT 5.861, ProbConsRNA 1.11, Dialign 2.2.1, DIALIGN-T 0.2.1, T-Coffee 4.85, POA 2.0, and Kalign 2.0. We have further demonstrated the scalability of MANGO on very large datasets of repeat elements. MANGO can be downloaded at http://www.bioinfo.org.cn/mango/ and is free for academic usage.

  19. Practical system for generating digital mixed reality video holograms.

    PubMed

    Song, Joongseok; Kim, Changseob; Park, Hanhoon; Park, Jong-Il

    2016-07-10

    We propose a practical system that can effectively mix the depth data of real and virtual objects by using a Z buffer and can quickly generate digital mixed reality video holograms by using multiple graphic processing units (GPUs). In an experiment, we verify that real objects and virtual objects can be merged naturally in free viewing angles, and the occlusion problem is well handled. Furthermore, we demonstrate that the proposed system can generate mixed reality video holograms at 7.6 frames per second. Finally, the system performance is objectively verified by users' subjective evaluations.
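
    A minimal sketch of the Z-buffer mixing step described above: for each pixel, the layer with the smaller depth value wins, which is how occlusion between real and virtual objects is resolved. Array shapes and the smaller-is-nearer depth convention are assumptions; the GPU hologram synthesis itself is not shown.

        import numpy as np

        def z_merge(color_real, depth_real, color_virt, depth_virt):
            """Per-pixel occlusion test: keep whichever surface is nearer."""
            nearer_real = depth_real < depth_virt            # H x W mask
            color = np.where(nearer_real[..., None], color_real, color_virt)
            depth = np.minimum(depth_real, depth_virt)
            return color, depth

        # Toy 2x2 RGB frames with arbitrary depth units:
        cr, cv = np.zeros((2, 2, 3)), np.ones((2, 2, 3))
        dr = np.array([[1.0, 5.0], [2.0, 9.0]])
        dv = np.full((2, 2), 3.0)
        color, depth = z_merge(cr, dr, cv, dv)  # real layer wins where dr < 3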

  20. Analysis of the Multi Strategy Goal Programming for Micro-Grid Based on Dynamic ant Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Qiu, J. P.; Niu, D. X.

    The micro-grid is one of the key technologies for future energy supply. Taking the economics, reliability, and environmental impact of the micro-grid as a basis, we analyze a multi-strategy goal programming problem for a micro-grid that contains wind power, solar power, a battery, and a micro gas turbine. A mathematical model of the generation characteristics and energy dissipation of each source is established, and the multi-objective micro-grid planning function under different operating strategies is converted to a single-objective model based on the AHP method. Example analysis shows that the dynamic ant mixed genetic algorithm can obtain the optimal power output of this model.
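
    A sketch of the AHP step that collapses the multiple criteria into a single objective: weights come from the principal eigenvector of a pairwise comparison matrix. The comparison values for economy, reliability, and environment below are invented for illustration.

        import numpy as np

        # Hypothetical Saaty-scale pairwise comparisons:
        # economy vs reliability vs environment.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # AHP weights = principal eigenvector of A, normalized to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        w = w / w.sum()

        def single_objective(cost, outage_rate, emissions):
            """Weighted scalarization of one candidate dispatch (minimize)."""
            return w @ np.array([cost, outage_rate, emissions])

        print(w.round(3))  # roughly [0.648, 0.230, 0.122]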

  1. A mission-oriented orbit design method of remote sensing satellite for region monitoring mission based on evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Xin; Zhang, Jing; Yao, Huang

    2015-12-01

    Remote sensing satellites play an increasingly prominent role in environmental monitoring and disaster rescue. Taking advantage of nearly identical illumination conditions over a given site and of global coverage, most of these satellites are operated on sun-synchronous orbits. However, this choice inevitably brings some problems; the most significant is that the temporal resolution of a sun-synchronous orbit satellite cannot satisfy the demands of specific region-monitoring missions. To overcome these disadvantages, two approaches are used: the first is to build a satellite constellation containing multiple sun-synchronous satellites, as the CHARTER mechanism has done; the second is to design a non-predetermined orbit based on the concrete mission demand. An effective method for remote sensing satellite orbit design based on a multi-objective evolutionary algorithm is presented in this paper. The orbit design problem is converted into a multi-objective optimization problem, and a fast and elitist multi-objective genetic algorithm is utilized to solve it. First, the demand of the mission is transformed into multiple objective functions, and the six orbit elements of the satellite are taken as genes in the design space; then a simulated evolution process is performed. An optimal solution can be obtained after a specified number of generations via the evolutionary operations (selection, crossover, and mutation). To examine the validity of the proposed method, a case study is introduced: orbit design of an optical satellite for regional disaster monitoring, where the mission demand includes two objectives, among them minimizing the average revisit time interval. The simulation results show that the solution obtained by our method meets the users' demand. We conclude that the method presented in this paper is efficient for remote sensing orbit design.
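
    At the core of the fast elitist multi-objective GA mentioned above is non-dominated (Pareto) filtering; a minimal sketch follows, with hypothetical objective vectors (say, average revisit interval and a fuel-like cost, both minimized) standing in for evaluated orbit designs.

        def dominates(u, v):
            """u dominates v (minimization): no worse everywhere, better somewhere."""
            return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

        def pareto_front(points):
            """Indices of the non-dominated objective vectors."""
            return [i for i, p in enumerate(points)
                    if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

        # Hypothetical (revisit interval [h], delta-v cost [km/s]) per candidate:
        pts = [(12.0, 1.1), (9.0, 1.6), (15.0, 0.9), (9.5, 1.5), (20.0, 1.2)]
        print(pareto_front(pts))  # [0, 1, 2, 3]; (20.0, 1.2) is dominated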

  2. Cooperative angle-only orbit initialization via fusion of admissible areas

    NASA Astrophysics Data System (ADS)

    Jia, Bin; Pham, Khanh; Blasch, Erik; Chen, Genshe; Shen, Dan; Wang, Zhonghai

    2017-05-01

    For the short-arc, angle-only orbit initialization problem, the admissible area is often used. However, the accuracy achievable with a single sensor is often limited. For high-value space objects, it is desirable to achieve more accurate results. Fortunately, multiple sensors dedicated to space situational awareness are available. The work in this paper uses the information from multiple sensors to cooperatively initialize the orbit based on the fusion of multiple admissible areas. Both centralized and decentralized fusion are discussed. Simulation results verify the expectation that the orbit initialization accuracy is improved by using information from multiple sensors.

  3. Multicriteria decision analysis: Overview and implications for environmental decision making

    USGS Publications Warehouse

    Hermans, Caroline M.; Erickson, Jon D.; Erickson, Jon D.; Messner, Frank; Ring, Irene

    2007-01-01

    Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
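
    A compact sketch of the PROMETHEE II computation referenced above, using the simplest "usual" preference function (any positive difference counts as full preference); the alternatives, criteria, and weights are invented, and real applications would pick richer preference functions per criterion.

        import numpy as np

        def promethee_net_flows(X, weights, maximize):
            """PROMETHEE II net flows with the 'usual' preference function.

            X: alternatives x criteria matrix; maximize: direction per criterion.
            """
            n = len(X)
            signs = np.where(maximize, 1.0, -1.0)
            pi = np.zeros((n, n))                 # pairwise preference degrees
            for a in range(n):
                for b in range(n):
                    d = signs * (X[a] - X[b])     # oriented differences
                    pi[a, b] = weights @ (d > 0)  # usual criterion: step at 0
            phi_plus = pi.sum(axis=1) / (n - 1)   # how strongly a outranks others
            phi_minus = pi.sum(axis=0) / (n - 1)  # how strongly a is outranked
            return phi_plus - phi_minus           # rank by descending net flow

        X = np.array([[3.0, 40.0], [5.0, 55.0], [4.0, 30.0]])  # quality, cost
        print(promethee_net_flows(X, np.array([0.6, 0.4]),
                                  maximize=np.array([True, False])))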

  4. Research pressure instrumentation for NASA space shuttle main engine

    NASA Technical Reports Server (NTRS)

    Anderson, P. J.; Nussbaum, P.; Gustafson, G.

    1985-01-01

    The breadboard feasibility model of a silicon piezoresistive pressure transducer suitable for space shuttle main engine (SSME) applications was demonstrated. The development of pressure instrumentation for the SSME was examined. The objective is to develop prototype pressure transducers which are targeted to meet the SSME performance design goals and to fabricate, test and deliver a total of 10 prototype units. Effective utilization of the many advantages of silicon piezoresistive strain sensing technology to achieve the objectives of advanced state-of-the-art pressure sensors for reliability, accuracy and ease of manufacture is analyzed. Integration of multiple functions on a single chip is the key attribute of the technology.

  5. Value Focused Thinking in Developing Aerobatic Aircraft Selection Model for Turkish Air Force

    DTIC Science & Technology

    2012-03-22

    Most problems in decision-making involve multiple objectives and uncertainties, and the number of alternatives can be significant. This report applies value-focused thinking to develop an aerobatic aircraft selection model for the Turkish Air Force. Hwang, C.-L. (1995). Multiple Attribute Decision Making: An Introduction. California: Sage Publications.

  6. Reliable inference of light curve parameters in the presence of systematics

    NASA Astrophysics Data System (ADS)

    Gibson, Neale P.

    2016-10-01

    Time-series photometry and spectroscopy of transiting exoplanets allow us to study their atmospheres. Unfortunately, the required precision to extract atmospheric information surpasses the design specifications of most general purpose instrumentation. This results in instrumental systematics in the light curves that are typically larger than the target precision. Systematics must therefore be modelled, leaving the inference of light-curve parameters conditioned on the subjective choice of systematics models and model-selection criteria. Here, I briefly review the use of systematics models commonly used for transmission and emission spectroscopy, including model selection, marginalisation over models, and stochastic processes. These form a hierarchy of models with increasing degree of objectivity. I argue that marginalisation over many systematics models is a minimal requirement for robust inference. Stochastic models provide even more flexibility and objectivity, and therefore produce the most reliable results. However, no systematics models are perfect, and the best strategy is to compare multiple methods and repeat observations where possible.

  7. Efficient RNA structure comparison algorithms.

    PubMed

    Arslan, Abdullah N; Anandan, Jithendar; Fry, Eric; Monschke, Keith; Ganneboina, Nitin; Bowerman, Jason

    2017-12-01

    The recently proposed relative addressing-based ([Formula: see text]) RNA secondary structure representation has important features that allow an RNA structure database to be stored in a suffix array. A fast substructure search algorithm has been proposed based on binary search on this suffix array. Using this substructure search algorithm, we present a fast algorithm that finds the largest common substructure of given multiple RNA structures in [Formula: see text] format. The multiple RNA structure comparison problem is NP-hard in its general formulation. We introduce a new problem for comparing multiple RNA structures. This problem has a stricter similarity definition and objective, and we propose an algorithm that solves it efficiently. We also develop another comparison algorithm that iteratively calls this algorithm to locate nonoverlapping large common substructures in the compared RNAs. With the new resulting tools, we improved the RNASSAC website (linked from http://faculty.tamuc.edu/aarslan ). This website now also includes two drawing tools: one specialized for preparing RNA substructures that can be used as input by the search tool, and another for automatically drawing the entire RNA structure from a given structure sequence.
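
    The suffix-array lookup that the search tool builds on can be sketched in a few lines: binary search over the sorted suffixes finds the range that begins with the query substructure. The structure strings are placeholders, not actual [Formula: see text] encodings, and the quadratic toy construction stands in for a proper suffix-array builder.

        from bisect import bisect_left, bisect_right  # key= needs Python 3.10+

        def build_suffix_array(s):
            """Sorted start positions of all suffixes (toy O(n^2 log n) build)."""
            return sorted(range(len(s)), key=lambda i: s[i:])

        def find_starts(s, sa, query):
            """Start positions of query in s, via binary search on the suffix array."""
            key = lambda i: s[i:i + len(query)]
            lo = bisect_left(sa, query, key=key)
            hi = bisect_right(sa, query, key=key)
            return sorted(sa[lo:hi])

        s = "((..))((.))"                 # placeholder structure string
        sa = build_suffix_array(s)
        print(find_starts(s, sa, "(("))   # [0, 6]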

  8. A two-stage approach to the depot shunting driver assignment problem with workload balance considerations.

    PubMed

    Wang, Jiaxi; Gronalt, Manfred; Sun, Yan

    2017-01-01

    Due to its environmentally sustainable and energy-saving characteristics, railway transportation nowadays plays a fundamental role in delivering passengers and goods. Having emerged in the area of transportation planning, the crew (workforce) sizing problem and the crew scheduling problem have received great attention from the railway industry and the scientific community. In this paper, we aim to solve the two problems by proposing a novel two-stage optimization approach in the context of the electric multiple units (EMU) depot shunting driver assignment problem. Given a predefined depot shunting schedule, the first stage of the approach focuses on determining an optimal size of shunting drivers. The second stage is formulated as a bi-objective optimization model in which we comprehensively consider the objectives of minimizing the total walking distance and maximizing the workload balance. We then combine the normalized normal constraint method with a modified Pareto filter algorithm to obtain Pareto solutions for the bi-objective optimization problem. Furthermore, we conduct a series of numerical experiments to demonstrate the proposed approach. Based on the computational results, the regression analysis yields a driver-size predictor and the sensitivity analysis gives some interesting insights that are useful for decision makers.

  9. A two-stage approach to the depot shunting driver assignment problem with workload balance considerations

    PubMed Central

    Gronalt, Manfred; Sun, Yan

    2017-01-01

    Due to its environmentally sustainable and energy-saving characteristics, railway transportation nowadays plays a fundamental role in delivering passengers and goods. Having emerged in the area of transportation planning, the crew (workforce) sizing problem and the crew scheduling problem have received great attention from the railway industry and the scientific community. In this paper, we aim to solve the two problems by proposing a novel two-stage optimization approach in the context of the electric multiple units (EMU) depot shunting driver assignment problem. Given a predefined depot shunting schedule, the first stage of the approach focuses on determining an optimal size of shunting drivers. The second stage is formulated as a bi-objective optimization model in which we comprehensively consider the objectives of minimizing the total walking distance and maximizing the workload balance. We then combine the normalized normal constraint method with a modified Pareto filter algorithm to obtain Pareto solutions for the bi-objective optimization problem. Furthermore, we conduct a series of numerical experiments to demonstrate the proposed approach. Based on the computational results, the regression analysis yields a driver-size predictor and the sensitivity analysis gives some interesting insights that are useful for decision makers. PMID:28704489

  10. Multiple Revolution Solutions for the Perturbed Lambert Problem using the Method of Particular Solutions and Picard Iteration

    NASA Astrophysics Data System (ADS)

    Woollands, Robyn M.; Read, Julie L.; Probe, Austin B.; Junkins, John L.

    2017-12-01

    We present a new method for solving the multiple revolution perturbed Lambert problem using the method of particular solutions and modified Chebyshev-Picard iteration. The method of particular solutions differs from the well-known Newton-shooting method in that integration of the state transition matrix (36 additional differential equations) is not required, and instead it makes use of a reference trajectory and a set of n particular solutions. Any numerical integrator can be used for solving two-point boundary problems with the method of particular solutions, however we show that using modified Chebyshev-Picard iteration affords an avenue for increased efficiency that is not available with other step-by-step integrators. We take advantage of the path approximation nature of modified Chebyshev-Picard iteration (nodes iteratively converge to fixed points in space) and utilize a variable fidelity force model for propagating the reference trajectory. Remarkably, we demonstrate that computing the particular solutions with only low fidelity function evaluations greatly increases the efficiency of the algorithm while maintaining machine precision accuracy. Our study reveals that solving the perturbed Lambert's problem using the method of particular solutions with modified Chebyshev-Picard iteration is about an order of magnitude faster compared with the classical shooting method and a tenth-twelfth order Runge-Kutta integrator. It is well known that the solution to Lambert's problem over multiple revolutions is not unique and to ensure that all possible solutions are considered we make use of a reliable preexisting Keplerian Lambert solver to warm start our perturbed algorithm.
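
    For intuition about the Picard-iteration backbone of the method: the entire trajectory is refined as one object, with all nodes updated simultaneously, rather than marched forward step by step. The scalar example below (x' = x, x(0) = 1) uses plain trapezoid quadrature and omits the Chebyshev machinery and the orbit dynamics.

        import numpy as np

        def picard(f, t, x0, sweeps=20):
            r"""Iterate x_{k+1}(t) = x0 + \int_0^t f(s, x_k(s)) ds on all nodes at once."""
            x = np.full_like(t, x0)          # initial guess: constant path
            dt = np.diff(t)
            for _ in range(sweeps):
                g = f(t, x)
                # cumulative trapezoid integral over the whole trajectory
                x = x0 + np.concatenate(([0.0], np.cumsum(0.5 * dt * (g[1:] + g[:-1]))))
            return x

        t = np.linspace(0.0, 1.0, 101)
        x = picard(lambda s, y: y, t, 1.0)   # solves x' = x
        print(abs(x[-1] - np.e))             # small, quadrature-level error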

  11. Validation of the German prostate-specific module.

    PubMed

    Bestmann, Beate; Rohde, Volker; Siebmann, Jens-Ulrich; Galalae, Razvan; Weidner, Wolfgang; Küchler, Thomas

    2006-02-01

    Theoretically, all patients newly diagnosed with prostate cancer are faced with a choice of treatment options: radical prostatectomy or radiotherapy. Although these different treatments may show no differences in terms of survival, they may have very different consequences for the subsequent quality of life (QoL). A prerequisite for analyzing QoL is a reliable and valid instrument that assesses these differences not only in terms of general QoL (EORTC QLQ-C30) but also in terms of prostate-specific symptoms, via a prostate-specific module. Therefore, the aim of this study was a psychometric evaluation (validation) of the prostate-specific module (PSM). Five historical cohort studies were combined for an empirical meta-analysis. The main objective was to analyze the module's psychometric properties. The total sample consisted of 1,185 patients, of whom 950 completed the QoL questionnaires (EORTC QLQ-C30 and a prostate-specific module developed by Kuechler et al.). The first step of the analysis was a principal component analysis, which revealed the following scales: urinary problems, incontinence, erectile dysfunction, sexual problems, problems with partner, pain, heat, nutrition, and psychic strain. The module showed good reliability and concurrent validity and very good construct validity, since the module is able to discriminate between different treatment regimes, tumor stages, and age groups. The German PSM is a reliable, valid and applicable tool for assessing QoL in patients with prostate cancer.

  12. The Relationship Between Non-Symbolic Multiplication and Division in Childhood

    PubMed Central

    McCrink, Koleen; Shafto, Patrick; Barth, Hilary

    2016-01-01

    Children without formal education in addition and subtraction are able to perform multi-step operations over an approximate number of objects. Further, their performance improves when solving approximate (but not exact) addition and subtraction problems that allow for inversion as a shortcut (e.g., a + b − b = a). The current study examines children’s ability to perform multi-step operations, and the potential for an inversion benefit, for the operations of approximate, non-symbolic multiplication and division. Children were trained to compute a multiplication and division scaling factor (*2 or /2, *4 or /4), and then tested on problems that combined two of these factors in a way that either allowed for an inversion shortcut (e.g., 8 * 4 / 4) or did not (e.g., 8 * 4 / 2). Children’s performance was significantly better than chance for all scaling factors during training, and they successfully computed the outcomes of the multi-step testing problems. They did not exhibit a performance benefit for problems with the a * b / b structure, suggesting they did not draw upon inversion reasoning as a logical shortcut to help them solve the multi-step test problems. PMID:26880261

  13. Convergent and invariant object representations for sight, sound, and touch.

    PubMed

    Man, Kingson; Damasio, Antonio; Meyer, Kaspar; Kaplan, Jonas T

    2015-09-01

    We continuously perceive objects in the world through multiple sensory channels. In this study, we investigated the convergence of information from different sensory streams within the cerebral cortex. We presented volunteers with three common objects via three different modalities-sight, sound, and touch-and used multivariate pattern analysis of functional magnetic resonance imaging data to map the cortical regions containing information about the identity of the objects. We could reliably predict which of the three stimuli a subject had seen, heard, or touched from the pattern of neural activity in the corresponding early sensory cortices. Intramodal classification was also successful in large portions of the cerebral cortex beyond the primary areas, with multiple regions showing convergence of information from two or all three modalities. Using crossmodal classification, we also searched for brain regions that would represent objects in a similar fashion across different modalities of presentation. We trained a classifier to distinguish objects presented in one modality and then tested it on the same objects presented in a different modality. We detected audiovisual invariance in the right temporo-occipital junction, audiotactile invariance in the left postcentral gyrus and parietal operculum, and visuotactile invariance in the right postcentral and supramarginal gyri. Our maps of multisensory convergence and crossmodal generalization reveal the underlying organization of the association cortices, and may be related to the neural basis for mental concepts. © 2015 Wiley Periodicals, Inc.

  14. A method of solving tilt illumination for multiple distance phase retrieval

    NASA Astrophysics Data System (ADS)

    Guo, Cheng; Li, Qiang; Tan, Jiubin; Liu, Shutian; Liu, Zhengjun

    2018-07-01

    Multiple distance phase retrieval is a technique that uses a series of intensity patterns to reconstruct a complex-valued image of an object. However, tilt illumination originating from the off-axis displacement of the incident light significantly impairs its imaging quality. To eliminate this effect, we use cross-correlation calibration to estimate the oblique angle of the incident light and a Fourier-based strategy to correct the tilted-illumination effect. Compared to other methods, binary and biological objects are both stably reconstructed in simulation and experiment. This work provides a simple but beneficial method to solve the problem of tilt illumination for lens-free multi-distance systems.
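
    A sketch of the cross-correlation calibration idea: the peak of the phase correlation between two intensity patterns gives their relative shift, from which an oblique-illumination angle could be inferred given the recording distance. The images below are synthetic, and sub-pixel refinement is omitted.

        import numpy as np

        def xcorr_shift(a, b):
            """Integer (dy, dx) displacement of b relative to a via phase correlation."""
            F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
            corr = np.fft.ifft2(F / (np.abs(F) + 1e-12))   # normalized spectrum
            peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
            # Wrap indices beyond the half-size to negative shifts.
            return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

        rng = np.random.default_rng(0)
        a = rng.random((64, 64))
        b = np.roll(a, shift=(5, -3), axis=(0, 1))  # known displacement
        print(xcorr_shift(a, b))                    # (-5, 3) under this sign convention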

  15. Dynamic Network Selection for Multicast Services in Wireless Cooperative Networks

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Jin, Le; He, Feng; Cheng, Hanwen; Wu, Lenan

    In next generation mobile multimedia communications, different wireless access networks are expected to cooperate. However, it is a challenging task to choose an optimal transmission path in this scenario. This paper focuses on the problem of selecting the optimal access network for multicast services in cooperative mobile and broadcasting networks. An algorithm is proposed which considers multiple decision factors and multiple optimization objectives. An analytic hierarchy process (AHP) method is applied to schedule the service queue, and an artificial neural network (ANN) is used to improve the flexibility of the algorithm. Simulation results show that by applying the AHP method, a group of weight ratios can be obtained that improves the performance of multiple objectives. The ANN method is effective in adaptively adjusting the weight ratios when a user's new waiting threshold is generated.

  16. Assessing the Problem Formulation in an Integrated Assessment Model: Implications for Climate Policy Decision-Support

    NASA Astrophysics Data System (ADS)

    Garner, G. G.; Reed, P. M.; Keller, K.

    2014-12-01

    Integrated assessment models (IAMs) are often used with the intent to aid in climate change decisionmaking. Numerous studies have analyzed the effects of parametric and/or structural uncertainties in IAMs, but uncertainties regarding the problem formulation are often overlooked. Here we use the Dynamic Integrated model of Climate and the Economy (DICE) to analyze the effects of uncertainty surrounding the problem formulation. The standard DICE model adopts a single objective to maximize a weighted sum of utilities of per-capita consumption. Decisionmakers, however, may be concerned with a broader range of values and preferences that are not captured by this a priori definition of utility. We reformulate the problem by introducing three additional objectives that represent values such as (i) reliably limiting global average warming to two degrees Celsius and minimizing both (ii) the costs of abatement and (iii) the damages due to climate change. We derive a set of Pareto-optimal solutions over which decisionmakers can trade-off and assess performance criteria a posteriori. We illustrate the potential for myopia in the traditional problem formulation and discuss the capability of this multiobjective formulation to provide decision support.

  17. A Scalable and Robust Multi-Agent Approach to Distributed Optimization

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan

    2005-01-01

    Modularizing a large optimization problem so that the solutions to the subproblems provide a good overall solution is a challenging problem. In this paper we present a multi-agent approach to this problem based on aligning the agent objectives with the system objectives, obviating the need to impose external mechanisms to achieve collaboration among the agents. This approach naturally addresses scaling and robustness issues by ensuring that the agents do not rely on the reliable operation of other agents. We test this approach on the difficult distributed optimization problem of imperfect device subset selection [Challet and Johnson, 2002]. In this problem, there are n devices, each of which has a "distortion", and the task is to find the subset of those n devices that minimizes the average distortion. Our results show that in large systems (1000 agents) the proposed approach provides improvements of over an order of magnitude over both traditional optimization methods and traditional multi-agent methods. Furthermore, the results show that even in extreme cases of agent failures (i.e., half the agents fail midway through the simulation) the system remains coordinated and still outperforms a failure-free and centralized optimization algorithm.
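
    For context on the benchmark: in the imperfect-device problem of Challet and Johnson (2002), devices have signed distortions, so a well-chosen subset lets errors cancel. The brute-force sketch below, on hypothetical distortions, shows the objective being decomposed; its 2^n cost is precisely why a scalable multi-agent decomposition is attractive.

        from itertools import combinations

        def best_subset(distortions):
            """Exhaustively find the subset whose mean signed distortion is nearest zero."""
            best, best_err = None, float("inf")
            n = len(distortions)
            for k in range(1, n + 1):
                for subset in combinations(range(n), k):
                    err = abs(sum(distortions[i] for i in subset) / k)
                    if err < best_err:
                        best, best_err = subset, err
            return best, best_err

        # Hypothetical signed distortions of 8 imperfect devices:
        d = [0.41, -0.27, 0.13, -0.55, 0.30, -0.08, 0.22, -0.17]
        print(best_subset(d))  # errors nearly cancel in the chosen subset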

  18. Multi-trip vehicle routing and scheduling problem with time window in real life

    NASA Astrophysics Data System (ADS)

    Sze, San-Nah; Chiew, Kang-Leng; Sze, Jeeu-Fong

    2012-09-01

    This paper studies a manpower scheduling problem with multiple maintenance operations and vehicle routing considerations. Service teams located at a common service centre are required to travel to different customer sites. All customers must be served within given time windows, which are known in advance. The scheduling process must take into consideration complex constraints such as a meal break during the team's shift, multiple travelling trips, synchronisation of service teams, and working shifts. The main objective of this study is to develop a heuristic that can generate high-quality solutions in a short time for large problem instances. A Two-stage Scheduling Heuristic is developed for different variants of the problem. Empirical results show that the proposed solution performs effectively and efficiently. In addition, our proposed approximation algorithm is very flexible and can be easily adapted to different scheduling environments and operational requirements.

  19. Local Approximation and Hierarchical Methods for Stochastic Optimization

    NASA Astrophysics Data System (ADS)

    Cheng, Bolong

    In this thesis, we present local and hierarchical approximation methods for two classes of stochastic optimization problems: optimal learning and Markov decision processes. For the optimal learning problem class, we introduce a locally linear model with radial basis functions for estimating the posterior mean of the unknown objective function. The method uses a compact representation of the function which avoids storing the entire history, as is typically required by nonparametric methods. We derive a knowledge gradient policy with the locally parametric model, which maximizes the expected value of information. We show the policy is asymptotically optimal in theory, and experimental work suggests that the method can reliably find the optimal solution on a range of test functions. For the Markov decision process problem class, we are motivated by an application in which we want to co-optimize a battery for multiple revenue streams, in particular energy arbitrage and frequency regulation. The nature of this problem requires the battery to make charging and discharging decisions at different time scales while accounting for stochastic information such as load demand, electricity prices, and regulation signals. Computing the exact optimal policy becomes intractable due to the large state space and the number of time steps. We propose two methods to circumvent this computational bottleneck. First, we propose a nested MDP model that structures the co-optimization problem into smaller sub-problems with reduced state spaces. This new model allows us to understand how the battery behaves down to the two-second dynamics (that of the frequency regulation market). Second, we introduce a low-rank value function approximation for backward dynamic programming. This new method only requires computing the exact value function for a small subset of the state space and approximates the entire value function via low-rank matrix completion. We test these methods on historical price data from the PJM Interconnect and show that they outperform the baseline approach used in the industry.
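
    The low-rank idea in the second method can be previewed with a truncated SVD: if the value function over a two-dimensional state grid is approximately low rank, a handful of singular vectors reproduces it from a small fraction of the storage. Completing the matrix from a sampled subset of entries needs more machinery; the synthetic example below shows only the compression step.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic "value function" on a 200 x 150 state grid, rank-3 by construction:
        V = rng.random((200, 3)) @ rng.random((3, 150))

        U, s, Vt = np.linalg.svd(V, full_matrices=False)
        r = 3
        V_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]   # rank-r reconstruction

        storage_ratio = (U[:, :r].size + r + Vt[:r, :].size) / V.size
        print(np.max(np.abs(V - V_hat)), storage_ratio)  # ~1e-14 error, ~3.5% storage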

  20. The relationship between interpersonal problems and occupational stress in physicians.

    PubMed

    Falkum, Erik; Vaglum, Per

    2005-01-01

    This article examined the associations between occupational stress and interpersonal problems in physicians. A nationwide representative sample of Norwegian physicians received the 64-item version of the Inventory of Interpersonal Problems (IIP-64) (N=862, response rate=70%) and six instruments measuring occupational stress. Comparison of means, correlation and reliability statistics and multiple regression analyses were applied. The IIP-64 total score had a significant impact on job satisfaction, perceived unrealistic expectancies, communication with colleagues and nurses and on stress from interaction with patients. Being overly subassertive was related to low job satisfaction. Being overly expressive was linked to the experience of unrealistic expectancies from others and lack of positive feedback, whereas overly competitive physicians tended to have poorer relationships with both colleagues and nurses. Addressing interpersonal problems in medical school and postgraduate training may be a valuable measure to prevent job stress and promote quality of care.

  1. NDE reliability and probability of detection (POD) evolution and paradigm shift

    NASA Astrophysics Data System (ADS)

    Singh, Surendra

    2014-02-01

    The subject of NDE reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed "Have Cracks - Will Travel", or in short "Have Cracks", by Lockheed Georgia Company for the US Air Force during 1974-1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework of Berens and Hovey in 1981 for POD estimation, to MIL-STD HDBK 1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective, i.e., improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. Therefore, it is essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offer no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantification of human factors. Furthermore, reliability and POD have often been treated as synonymous, but POD is not NDE reliability. POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME) including Gage Repeatability & Reproducibility (Gage R&R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones of the last several decades and discusses the rationale for using Integrated Computational Materials Engineering (ICME), MAPOD, SAPOD, and Bayesian statistics for studying controllable and non-controllable variables, including human factors, for estimating POD. Another objective is to list the gaps between "hoped-for" capability and validated performance or fielded hardware failures.
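
    For concreteness, the POD-versus-flaw-size curve at the center of this literature is commonly modeled as log-normal, POD(a) = Phi((ln a - mu) / sigma), as in MIL-HDBK-1823A; the parameter values below are illustrative only.

        import math

        def pod(a, mu=math.log(1.5), sigma=0.4):
            """Probability of detecting a flaw of size a (log-normal POD model)."""
            z = (math.log(a) - mu) / sigma
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

        # a90: the flaw size detected with 90% probability (z_0.90 ~ 1.2816)
        mu, sigma = math.log(1.5), 0.4
        a90 = math.exp(mu + sigma * 1.2816)
        print(pod(1.5), pod(a90))  # 0.5 and ~0.90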

  2. New mathematical modeling for a location-routing-inventory problem in a multi-period closed-loop supply chain in a car industry

    NASA Astrophysics Data System (ADS)

    Forouzanfar, F.; Tavakkoli-Moghaddam, R.; Bashiri, M.; Baboli, A.; Hadji Molana, S. M.

    2017-11-01

    This paper studies a location-routing-inventory problem in a multi-period closed-loop supply chain with multiple suppliers, producers, distribution centers, customers, collection centers, recovery, and recycling centers. In this supply chain, the centers span multiple levels, a price increase factor is considered for operational costs at the centers, inventory and shortage (including lost sales and backlog) are allowed at production centers, and the arrival times of each plant's vehicles at its dedicated distribution centers, as well as their departures, are considered, such that the sum of system costs and the sum of the maximum times at each level are minimized. The aforementioned problem is formulated as a bi-objective nonlinear integer programming model. Due to the NP-hard nature of the problem, two meta-heuristics, namely the non-dominated sorting genetic algorithm (NSGA-II) and multi-objective particle swarm optimization (MOPSO), are used for large-sized instances. In addition, a Taguchi method is used to set the parameters of these algorithms to enhance their performance. To evaluate the efficiency of the proposed algorithms, the results for small-sized problems are compared with the results of the ɛ-constraint method. Finally, four measuring metrics, namely the number of Pareto solutions, mean ideal distance, spacing metric, and quality metric, are used to compare NSGA-II and MOPSO.

  3. Analysis instrument test on mathematical power the material geometry of space flat side for grade 8

    NASA Astrophysics Data System (ADS)

    Kusmaryono, Imam; Suyitno, Hardi; Dwijanto, Karomah, Nur

    2017-08-01

    The main problem of this research is to determine the quality of test items on the material of flat-sided geometric solids for assessing students' mathematical power. The method used is quantitative descriptive. The subjects were 20 grade-8 students. The object of the research is the quality of the test items in terms of mathematical power: validity, reliability, level of difficulty, and discriminating power. The mathematical power instruments tested include written tests and questionnaires on the disposition of mathematical power. Data were obtained in the field in the form of test results on the flat-sided solid geometry material and questionnaires. The results show that the reliability of the test items is influenced by many factors: the number of items, the homogeneity of the test questions, the time required, the uniformity of test-taking conditions, the homogeneity of the group, the variability of the problems, and the motivation of the individual taking the test. Overall, the evaluation results of this study indicate that the test instrument can be used as a tool to measure students' mathematical power.
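
    A sketch of the classical item-analysis indices behind "level of difficulty" and "discriminating power": difficulty is the proportion of students answering an item correctly, and discrimination contrasts the upper and lower scoring groups. The response matrix below is hypothetical.

        import numpy as np

        def item_analysis(responses, group_frac=0.27):
            """Per-item difficulty (p) and discrimination (D); rows = students, 1 = correct."""
            R = np.asarray(responses)
            p = R.mean(axis=0)                           # difficulty index
            order = np.argsort(R.sum(axis=1))            # rank students by total score
            g = max(1, int(group_frac * len(R)))         # upper/lower 27% convention
            lower, upper = R[order[:g]], R[order[-g:]]
            D = upper.mean(axis=0) - lower.mean(axis=0)  # discrimination index
            return p, D

        R = [[1, 1, 0, 1], [1, 0, 0, 0], [1, 1, 1, 1],
             [0, 0, 0, 1], [1, 1, 0, 1], [1, 0, 1, 1]]
        p, D = item_analysis(R)
        print(p.round(2), D.round(2))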

  4. Factor structure of the General Health Questionnaire-28 (GHQ-28) from infertile women attending the Yazd Research and Clinical Center for Infertility.

    PubMed

    Shayan, Zahra; Pourmovahed, Zahra; Najafipour, Fatemeh; Abdoli, Ali Mohammad; Mohebpour, Fatemeh; Najafipour, Sedighe

    2015-12-01

    Nowadays, infertility has become a social concern and is associated with multiple psychological and social problems. It also affects interpersonal communication and individual, familial, and social functioning. Since women are exposed to physical, mental, and social stressors as well as the treatment of infertility, a psychometric screening tool for disorders in this group is necessary. The aim of this study was to determine the factor structure of the General Health Questionnaire-28 (GHQ-28) for detecting mental disorders in infertile women. In this study, 220 infertile women undergoing treatment for infertility were selected from the Yazd Research and Clinical Center for Infertility by convenience sampling in 2011. After the questionnaires were completed, the validity and reliability of the questionnaire were assessed by confirmatory factor analysis and Cronbach's alpha, respectively. Four factors were extracted from the factor structure: anxiety and insomnia, social dysfunction, depression, and physical symptoms. These four factors explained 50.12% of the total variance. The reliability coefficient of the questionnaire was 0.90. Analysis of the factor structure and reliability of the GHQ-28 showed that it is suitable as a screening instrument for assessing the general health of infertile women.
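
    A minimal sketch of the Cronbach's alpha computation used for the reliability coefficient above, on a hypothetical item-response matrix (rows = respondents, columns = items):

        import numpy as np

        def cronbach_alpha(scores):
            """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
            X = np.asarray(scores, dtype=float)
            k = X.shape[1]
            item_vars = X.var(axis=0, ddof=1)
            total_var = X.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars.sum() / total_var)

        # 6 hypothetical respondents x 4 items (0-3 symptom ratings):
        X = [[0, 1, 1, 0], [2, 2, 3, 2], [1, 1, 2, 1],
             [3, 2, 3, 3], [0, 0, 1, 0], [2, 3, 3, 2]]
        print(round(cronbach_alpha(X), 2))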

  5. Hip range of motion and provocative physical examination tests reliability and agreement in asymptomatic volunteers

    PubMed Central

    Prather, H; Harris-Hayes, M; Hunt, D; Steger-May, K; Mathew, V; Clohisy, JC

    2012-01-01

    Objective The objectives of this study are the following: 1) report passive hip ROM in asymptomatic young adults, 2) report the intra-tester and inter-tester reliability of hip ROM measurements among testers of multiple disciplines, 3) report the results of provocative hip tests and tester agreement. Design descriptive epidemiology study Setting tertiary university Participants Twenty-eight young adult volunteers without musculoskeletal symptoms, history of disorder or surgery involving the lumbar spine or lower extremities were enrolled and completed the study. Methods Asymptomatic young adult volunteers completed questionnaires and were examined by two blinded examiners during a single session. The testers were physical therapists and physicians. Hip range of motion and provocative tests were completed by both examiners on each hip. Main Outcome Measurements Inter- and intra-rater reliability for ROM and agreement for provocative tests were determined. Results Twenty-eight asymptomatic adults with a mean age of 31 years (range 18–51 years), a mean modified Harris Hip Score of 99.5 ± 1.5, and a UCLA Activity score of 8.8 ± 1.2 completed the study. Intra-rater agreement was excellent for all hip range of motion measurements, with intraclass correlation coefficients (ICCs) ranging from 0.76 to 0.97, with similar agreement whether the examiner was a physical therapist or a physician. Excellent inter-rater reliability was found for hip flexion, ICC 0.87 (95% CI 0.78 to 0.92), supine internal rotation, ICC 0.75 (95% CI 0.60 to 0.84), and prone internal rotation, ICC 0.79 (95% CI 0.66 to 0.87). The least reliable measurements were supine hip abduction (ICC 0.34) and supine external rotation (ICC 0.18). Agreement between examiners ranged from 96–100% for provocative hip tests, which included the hip impingement, resisted straight leg raise, FABER/Patrick’s and log roll tests. Conclusions Specific hip ROM measures show excellent inter-rater reliability, and provocative hip tests show good agreement among multiple examiners and medical disciplines. Further studies are needed to assess the utilization of these measurements and tests as part of a hip screening examination for young adults at risk of intra-articular hip disorders prior to the onset of degenerative changes. PMID:20970757
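
    A sketch of the two-way random-effects, absolute-agreement, single-rater ICC(2,1) of Shrout and Fleiss, the kind of intraclass correlation typically reported for inter-rater ROM agreement; the ratings matrix below is hypothetical, and the study does not state which ICC form was used.

        import numpy as np

        def icc_2_1(X):
            """ICC(2,1): two-way random effects, absolute agreement, single rater."""
            X = np.asarray(X, dtype=float)
            n, k = X.shape
            m = X.mean()
            rows, cols = X.mean(axis=1), X.mean(axis=0)
            msr = k * ((rows - m) ** 2).sum() / (n - 1)    # between-subject mean square
            msc = n * ((cols - m) ** 2).sum() / (k - 1)    # between-rater mean square
            sse = ((X - rows[:, None] - cols[None, :] + m) ** 2).sum()
            mse = sse / ((n - 1) * (k - 1))                # residual mean square
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Hypothetical hip-flexion angles (degrees): 5 subjects x 2 examiners
        X = [[120, 122], [105, 103], [131, 129], [110, 114], [125, 124]]
        print(round(icc_2_1(X), 2))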

  6. GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions

    USGS Publications Warehouse

    Banta, Edward R.; Ahlfeld, David P.

    2013-01-01

    Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.

  7. School readiness among children with behavior problems at entrance into kindergarten: results from a US national study.

    PubMed

    Montes, Guillermo; Lotyczewski, Bohdan S; Halterman, Jill S; Hightower, Alan D

    2012-03-01

    The impact of behavior problems on kindergarten readiness is not known. Our objective was to estimate the association between behavior problems and kindergarten readiness in a US national sample. In the US educational system, kindergarten is a natural point of entry into formal schooling at age 5 because fewer than half of the children enter kindergarten with prior formal preschool education. Parents of 1,200 children who were scheduled to enter kindergarten for the first time and were members of the Harris Interactive online national panel were surveyed. We defined behavior problems as an affirmative response to the question, "Has your child ever had behavior problems?" We validated this against attention deficit hyperactivity disorder diagnosis, scores on a reliable socioemotional scale, and child's receipt of early intervention services. We used linear, tobit, and logistic regression analyses to estimate the association between having behavior problems and scores in reliable scales of motor, play, speech and language, and school skills and an overall kindergarten readiness indicator. The sample included 176 children with behavior problems for a national prevalence of 14% (confidence interval, 11.5-17.5). Children with behavior problems were more likely to be male and live in households with lower income and parental education. We found that children with behavior problems entered kindergarten with lower speech and language, motor, play, and school skills, even after controlling for demographics and region. Delays were 0.6-1 SD below scores of comparable children without behavior problems. Parents of children with behavior problems were 5.2 times more likely to report their child was not ready for kindergarten. Childhood behavior problems are associated with substantial delays in motor, language, play, school, and socioemotional skills before entrance into kindergarten. Early screening and intervention are recommended.

  8. A Diagnostic Assessment of Evolutionary Multiobjective Optimization for Water Resources Systems

    NASA Astrophysics Data System (ADS)

    Reed, P.; Hadka, D.; Herman, J.; Kasprzyk, J.; Kollat, J.

    2012-04-01

    This study contributes a rigorous diagnostic assessment of state-of-the-art multiobjective evolutionary algorithms (MOEAs) and highlights key advances that the water resources field can exploit to better discover the critical tradeoffs constraining our systems. This study provides the most comprehensive diagnostic assessment of MOEAs for water resources to date, exploiting more than 100,000 MOEA runs and trillions of design evaluations. The diagnostic assessment measures the effectiveness, efficiency, reliability, and controllability of ten benchmark MOEAs for a representative suite of water resources applications addressing rainfall-runoff calibration, long-term groundwater monitoring (LTM), and risk-based water supply portfolio planning. The suite of problems encompasses a range of challenging problem properties including (1) many-objective formulations with 4 or more objectives, (2) multi-modality (or false optima), (3) nonlinearity, (4) discreteness, (5) severe constraints, (6) stochastic objectives, and (7) non-separability (also called epistasis). The applications are representative of the dominant problem classes that have shaped the history of MOEAs in water resources and that will be dominant foci in the future. Recommendations are provided for which modern MOEAs should serve as tools and benchmarks in the future water resources literature.

  9. Inference of emission rates from multiple sources using Bayesian probability theory.

    PubMed

    Yee, Eugene; Flesch, Thomas K

    2010-03-01

    The determination of atmospheric emission rates from multiple sources using inversion (regularized least-squares or best-fit technique) is known to be very susceptible to measurement and model errors in the problem, rendering the solution unusable. In this paper, a new perspective is offered for this problem: namely, it is argued that the problem should be addressed as one of inference rather than inversion. Towards this objective, Bayesian probability theory is used to estimate the emission rates from multiple sources. The posterior probability distribution for the emission rates is derived, accounting fully for the measurement errors in the concentration data and the model errors in the dispersion model used to interpret the data. The Bayesian inferential methodology for emission rate recovery is validated against real dispersion data, obtained from a field experiment involving various source-sensor geometries (scenarios) consisting of four synthetic area sources and eight concentration sensors. The recovery of discrete emission rates from three different scenarios obtained using Bayesian inference and singular value decomposition inversion are compared and contrasted.
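
    For the simplest linear-Gaussian version of this inference: with a source-receptor matrix A, Gaussian measurement noise, and a Gaussian prior on the emission rates q, the posterior is available in closed form. The dispersion matrix and data below are synthetic, and the paper's full treatment additionally models dispersion-model error.

        import numpy as np

        rng = np.random.default_rng(2)
        n_sensors, n_sources = 8, 4
        A = rng.random((n_sensors, n_sources))   # source-receptor dispersion matrix
        q_true = np.array([2.0, 0.5, 1.2, 3.1])  # true emission rates
        sigma = 0.05                             # measurement noise std
        y = A @ q_true + sigma * rng.standard_normal(n_sensors)

        # Prior q ~ N(0, tau^2 I) gives a Gaussian posterior:
        tau = 10.0
        P = np.linalg.inv(A.T @ A / sigma**2 + np.eye(n_sources) / tau**2)
        q_mean = P @ A.T @ y / sigma**2          # posterior mean
        q_sd = np.sqrt(np.diag(P))               # posterior std per source
        print(q_mean.round(2), q_sd.round(3))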

  10. Dense fog on the highway: Visual range monitoring in cars?

    NASA Technical Reports Server (NTRS)

    Hahn, W.; Krichbaumer, W.; Streicher, J.; Werner, CH.

    1992-01-01

    This paper reports on the development of a new sensor. Laser range-finders are currently installed in cars and trucks to measure the distance to a preceding car (LEICA). A modification of such a sensor to measure visibility was made. The problems that had to be solved were: (1) choice of wavelength in relation to the human eye for visibility measurements; (2) dependency of the wavelength on atmospheric turbidity; (3) laser eye-safety; and (4) influence of multiple scattering at visibilities smaller than 200 m. The near-infrared wavelength used for lidar sensors presents no real problems, because the object to be sensed, fog, appears white, which means that scattering from fog is wavelength-independent. There are, however, differences in the backscatter-to-extinction ratio for different fog and weather situations. The two solutions to these problems are polarization and multiple scattering. As known from airport operations of laser ceilometers, one can use this multiple-scattering contribution to determine the visibility.

  11. Multiple fault separation and detection by joint subspace learning for the health assessment of wind turbine gearboxes

    NASA Astrophysics Data System (ADS)

    Du, Zhaohui; Chen, Xuefeng; Zhang, Han; Zi, Yanyang; Yan, Ruqiang

    2017-09-01

    The gearbox of a wind turbine (WT) has dominant failure rates and the highest downtime loss among all WT subsystems. Thus, gearbox health assessment for maintenance cost reduction is of paramount importance. The concurrence of multiple faults in gearbox components is a common phenomenon due to the fault induction mechanism. This problem should be considered before planning to replace the components of the WT gearbox. Therefore, the key fault patterns should be reliably identified from noisy observation data for the development of an effective maintenance strategy. However, most of the existing studies focusing on multiple fault diagnosis suffer from inappropriate division of fault information in order to satisfy various rigorous decomposition principles or statistical assumptions, such as the smooth envelope principle of ensemble empirical mode decomposition and the mutual independence assumption of independent component analysis. Thus, this paper presents a joint subspace learning-based multiple fault detection (JSL-MFD) technique to construct different subspaces adaptively for different fault patterns. Its main advantage is its capability to learn multiple fault subspaces directly from the observation signal itself. It can also sparsely concentrate the feature information into a few dominant subspace coefficients. Furthermore, it can eliminate noise by simply performing coefficient shrinkage operations. Consequently, multiple fault patterns are reliably identified by utilizing the maximum fault information criterion. The superiority of JSL-MFD in multiple fault separation and detection is comprehensively investigated and verified by the analysis of a data set from a 750 kW WT gearbox. Results show that JSL-MFD is superior to a state-of-the-art technique in detecting hidden fault patterns and enhancing detection accuracy.
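
    The coefficient-shrinkage step can be illustrated with plain soft thresholding: once fault energy is concentrated in a few dominant subspace coefficients, shrinking everything below a threshold suppresses the noise floor. The sparse coefficients and threshold below are synthetic, not the paper's learned subspaces.

        import numpy as np

        def soft_threshold(c, thr):
            """Shrink coefficients toward zero; magnitudes below thr are zeroed as noise."""
            return np.sign(c) * np.maximum(np.abs(c) - thr, 0.0)

        rng = np.random.default_rng(3)
        coeffs = np.zeros(100)
        coeffs[[7, 23, 61]] = [5.0, -4.0, 6.5]           # sparse "fault" coefficients
        noisy = coeffs + 0.3 * rng.standard_normal(100)  # noise floor
        denoised = soft_threshold(noisy, thr=1.0)
        print(np.flatnonzero(denoised))                  # typically only [7, 23, 61] survive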

  12. Worst-Case Energy Efficiency Maximization in a 5G Massive MIMO-NOMA System.

    PubMed

    Chinnadurai, Sunil; Selvaprabhu, Poongundran; Jeong, Yongchae; Jiang, Xueqin; Lee, Moon Ho

    2017-09-18

    In this paper, we examine robust beamforming design to tackle the energy efficiency (EE) maximization problem in a 5G massive multiple-input multiple-output (MIMO)-non-orthogonal multiple access (NOMA) downlink system with imperfect channel state information (CSI) at the base station. A novel joint user pairing and dynamic power allocation (JUPDPA) algorithm is proposed to minimize the inter-user interference and also to enhance the fairness between the users. This work assumes imperfect CSI by adding uncertainties to the channel matrices with a worst-case model, i.e., the ellipsoidal uncertainty model (EUM). A fractional non-convex optimization problem is formulated to maximize the EE subject to the transmit power constraints and the minimum rate requirement for the cell-edge user. The designed problem is difficult to solve due to its nonlinear fractional objective function. We first employ the properties of fractional programming to transform the non-convex problem into its equivalent parametric form. Then, an efficient iterative algorithm based on the constrained concave-convex procedure (CCCP) is proposed that solves the problem and achieves convergence to a stationary point. Finally, Dinkelbach's algorithm is employed to determine the maximum energy efficiency. Comprehensive numerical results illustrate that the proposed scheme attains higher worst-case energy efficiency than the existing NOMA schemes and the conventional orthogonal multiple access (OMA) scheme.
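
    A scalar sketch of the Dinkelbach step in the last stage: a ratio f(x)/g(x) is maximized by repeatedly solving the parametric subproblem max f(x) - lambda*g(x) and updating lambda with the achieved ratio. The toy rate and power functions stand in for the transformed EE objective.

        import numpy as np

        def dinkelbach(f, g, xs, tol=1e-9, max_iter=50):
            """Maximize f(x)/g(x) over candidate points xs (requires g > 0 on xs)."""
            lam = 0.0
            for _ in range(max_iter):
                vals = f(xs) - lam * g(xs)   # parametric subproblem
                x = xs[np.argmax(vals)]
                if vals.max() < tol:         # F(lambda) = 0 at the optimum
                    return x, lam
                lam = f(x) / g(x)            # Dinkelbach update
            return x, lam

        xs = np.linspace(0.01, 5.0, 5000)
        rate = lambda p: np.log2(1.0 + 2.0 * p)  # toy achievable rate, f
        power = lambda p: p + 0.5                # toy total power, g
        print(dinkelbach(rate, power, xs))       # (best power level, max "EE")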

  13. Worst-Case Energy Efficiency Maximization in a 5G Massive MIMO-NOMA System

    PubMed Central

    Jeong, Yongchae; Jiang, Xueqin; Lee, Moon Ho

    2017-01-01

    In this paper, we examine robust beamforming design to tackle the energy efficiency (EE) maximization problem in a 5G massive multiple-input multiple-output (MIMO)-non-orthogonal multiple access (NOMA) downlink system with imperfect channel state information (CSI) at the base station. A novel joint user pairing and dynamic power allocation (JUPDPA) algorithm is proposed to minimize the inter-user interference and also to enhance the fairness between the users. This work assumes imperfect CSI by adding uncertainties to the channel matrices with a worst-case model, i.e., the ellipsoidal uncertainty model (EUM). A fractional non-convex optimization problem is formulated to maximize the EE subject to the transmit power constraints and the minimum rate requirement for the cell-edge user. The designed problem is difficult to solve due to its nonlinear fractional objective function. We first employ the properties of fractional programming to transform the non-convex problem into its equivalent parametric form. Then, an efficient iterative algorithm based on the constrained concave-convex procedure (CCCP) is proposed that solves the problem and achieves convergence to a stationary point. Finally, Dinkelbach’s algorithm is employed to determine the maximum energy efficiency. Comprehensive numerical results illustrate that the proposed scheme attains higher worst-case energy efficiency than the existing NOMA schemes and the conventional orthogonal multiple access (OMA) scheme. PMID:28927019

  14. Research pressure instrumentation for NASA Space Shuttle main engine

    NASA Technical Reports Server (NTRS)

    Anderson, P. J.; Nussbaum, P.; Gustafson, G.

    1984-01-01

    The development of prototype pressure transducers which are targeted to meet the Space Shuttle Main Engine SSME performance design goals is discussed. The fabrication, testing and delivery of 10 prototype units is examined. Silicon piezoresistive strain sensing technology is used to achieve the objectives of advanced state-of-the-art pressure sensors in terms of reliability, accuracy and ease of manufacture. Integration of multiple functions on a single chip is the key attribute of this technology.

  15. Women's 'inner-balance': a comparison of stressors, personality traits and health problems by age groups.

    PubMed

    Kenney, J W

    2000-03-01

    Women's 'inner-balance': a comparison of stressors, personality traits and health problems by age groups The purposes of this descriptive study were to identify differences in women's stressors, personality mediating traits and symptoms of health problems by age groups, and to guide revisions for development of a shorter, reliable questionnaire to measure women's health and risks for stress-related illnesses. A convenience sample of 299 women aged between 18 and 66 years who resided in the south-western United States and could read English completed a lengthy questionnaire. ANOVAs were used to compare women by three age groups. Young women (18-29 years) reported high stressors, less healthy personality traits, and significantly more physical and emotional symptoms of health problems than middle-age and older women. Middle-age women (30-45 years) had significantly more stressors than other women, but their healthy personality traits may have contributed to fewer health problems. Older women (46-66 years) had the fewest stressors, highest healthy personality traits, and fewest symptoms of problems compared to other age groups. In their roles and relationships as wives, mothers and employees, women experienced multiple stressors such as inadequate physical and emotional support from their spouse/partner, along with parenting and employee difficulties that contributed to their health problems. Young and middle-aged women were more stressed, juggling the multiple responsibilities and demands of their spouse, children, ageing parents, and their occupation, while trying to maintain their own 'inner balance'.

  16. Neural Dynamics of Multiple Object Processing in Mild Cognitive Impairment and Alzheimer's Disease: Future Early Diagnostic Biomarkers?

    PubMed

    Bagattini, Chiara; Mazza, Veronica; Panizza, Laura; Ferrari, Clarissa; Bonomini, Cristina; Brignani, Debora

    2017-01-01

    The aim of this study was to investigate the behavioral and electrophysiological dynamics of multiple object processing (MOP) in mild cognitive impairment (MCI) and Alzheimer's disease (AD), and to test whether its neural signatures may represent reliable diagnostic biomarkers. Behavioral performance and event-related potentials [N2pc and contralateral delay activity (CDA)] were measured in AD, MCI, and healthy controls during a MOP task, which consisted of enumerating a variable number of targets presented among distractors. AD patients showed an overall decline in accuracy for both small and large target quantities, whereas in MCI patients, only enumeration of large quantities was impaired. N2pc, a neural marker of attentive individuation, was spared in both AD and MCI patients. In contrast, CDA, which indexes visual short-term memory abilities, was altered in both groups of patients, with a non-linear pattern of amplitude modulation along the continuum of the disease: a reduction in AD and an increase in MCI. These results indicate that AD pathology involves a progressive decline in MOP, which is associated with the decay of visual short-term memory mechanisms. Crucially, CDA may be considered a useful neural signature both to distinguish between healthy and pathological aging and to characterize the different stages along the AD continuum, possibly becoming a reliable candidate for an early diagnostic biomarker of AD pathology.

  17. Monolithic ceramic analysis using the SCARE program

    NASA Technical Reports Server (NTRS)

    Manderscheid, Jane M.

    1988-01-01

    The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
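
    As an illustration of the reliability model underlying such codes, the following sketch (not from the report; all stresses, volumes and Weibull parameters are assumed toy values) evaluates the weakest-link, two-parameter Weibull volume-flaw reliability for a handful of finite elements, the uniaxial form to which the principle of independent action reduces:

      import numpy as np

      # Element data as a SCARE-style post-processor would read from a finite
      # element run (toy values): max principal stress [MPa] and volume [mm^3]
      stress = np.array([120.0, 250.0, 180.0, 90.0])
      volume = np.array([2.0, 1.5, 3.0, 2.5])

      m = 10.0        # assumed Weibull modulus from fracture data
      sigma0 = 300.0  # assumed characteristic strength

      # Weakest-link, two-parameter Weibull volume-flaw model:
      # R = exp(-sum_i V_i * (sigma_i / sigma0)^m)
      risk_of_rupture = np.sum(volume * (stress / sigma0) ** m)
      reliability = np.exp(-risk_of_rupture)
      print(f"fast-fracture reliability: {reliability:.4f}")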

  18. Detection of single bacteria - causative agents of meningitis using Raman microscopy

    NASA Astrophysics Data System (ADS)

    Baikova, T. V.; Minaeva, S. A.; Sundukov, A. V.; Svistunova, T. S.; Bagratashvili, V. N.; Alushin, M. V.; Gonchukov, S. A.

    2015-03-01

    Early diagnosis of meningitis is a highly topical problem, as it is a fulminant disease with a high mortality rate. The progress of this disease is, as a rule, accompanied by the appearance of bacteria in the cerebrospinal fluid (CSF). Examination of the CSF is well known to be the only reliable approach to the identification of meningitis. However, the traditional biochemical analyses are time consuming and not always reliable, simple, or inexpensive, whereas optical methods are poorly developed. This work is devoted to the study of the Raman spectra of several bacterial cultures that are mainly present during meningitis. Raman microscopy is a rapid and noninvasive technique capable of providing reliable information about molecular-level alterations of biological objects, even at minimal quantity and size. It was shown that there are characteristic lines in the Raman spectra that can serve as reliable markers for determining the bacterial form of meningitis at the level of a single bacterium.

  19. The Aviation Paradox: Why We Can 'Know' Jetliners But Not Reactors.

    PubMed

    Downer, John

    2017-01-01

    Publics and policymakers increasingly have to contend with the risks of complex, safety-critical technologies, such as airframes and reactors. As such, 'technological risk' has become an important object of modern governance, with state regulators as core agents, and 'reliability assessment' as the most essential metric. The Science and Technology Studies (STS) literature casts doubt on whether we should place our faith in these assessments, because predictively calculating the ultra-high reliability required of such systems poses seemingly insurmountable epistemological problems. This paper argues that these misgivings are warranted in the nuclear sphere, despite evidence from the aviation sphere suggesting that such calculations can be accurate. It explains why regulatory calculations that predict the reliability of new airframes cannot work in principle, and then it explains why those calculations work in practice. It then builds on this explanation to argue that the means by which engineers manage reliability in aviation is highly domain-specific, and to suggest how a more nuanced understanding of jetliners could inform debates about nuclear energy.

  20. Assessment of system reliability for a stochastic-flow distribution network with the spoilage property

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Huang, Cheng-Fu; Yeh, Cheng-Ta

    2016-04-01

    In supply chain management, satisfying customer demand is the manager's foremost concern. However, goods may rot or be spoilt during delivery owing to natural disasters, inclement weather, traffic accidents, collisions, and so on, so that the intact goods may not meet market demand. This paper concentrates on a stochastic-flow distribution network (SFDN), in which a node denotes a supplier, a transfer station, or a market, while a route denotes a carrier providing the delivery service for a pair of nodes. The available capacity of a carrier is stochastic because the capacity may be partially reserved by other customers. The addressed problem is to evaluate the system reliability: the probability that the SFDN can satisfy the market demand from multiple suppliers, allowing for the spoilage rate, under the budget constraint. An algorithm is developed in terms of minimal paths to evaluate the system reliability, along with a numerical example to illustrate the solution procedure. A practical case of fruit distribution is presented accordingly to emphasise the management implications of the system reliability.
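
    The paper's exact algorithm enumerates minimal paths; as a rough, hedged illustration of the reliability definition itself, the following sketch substitutes a plain Monte Carlo estimate on a hypothetical three-route network (all capacities, costs, rates and the greedy routing rule are assumptions, not the authors' method):

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical SFDN with three carrier routes from suppliers to one market.
      # Each route's available capacity is stochastic because carriers may be
      # partially reserved by other customers (all numbers are illustrative).
      levels = np.array([0, 1, 2, 3])           # capacity states per route
      probs = np.array([0.1, 0.2, 0.3, 0.4])    # state probabilities
      unit_cost = [2.0, 3.0, 2.5]               # delivery cost per unit per route
      spoilage = 0.05                           # fraction spoilt in transit
      demand, budget = 6.0, 18.0

      def can_satisfy(caps):
          # ship enough that the intact goods still meet demand after spoilage
          needed = demand / (1.0 - spoilage)
          if sum(caps) < needed:
              return False
          cost = 0.0
          # cheapest routes first is optimal for linear unit costs
          for c, u in sorted(zip(caps, unit_cost), key=lambda t: t[1]):
              take = min(c, needed)
              cost += take * u
              needed -= take
              if needed <= 1e-12:
                  break
          return cost <= budget

      hits = [can_satisfy(rng.choice(levels, size=3, p=probs)) for _ in range(20000)]
      print(f"estimated system reliability: {np.mean(hits):.3f}")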

  1. Environmental mycobacteriosis and drinking water: an emerging problem for developed countries

    EPA Science Inventory

    Background and Objective: Rates of pulmonary environmental mycobacteriosis (EM) appear to be increasing among developed countries during the past 20 years. EM is caused by multiple species of pathogenic mycobacteria that have been recovered from soil, water, water aerosols, biofil...

  2. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371
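
    A minimal sketch of the coarse-to-fine idea follows (not the authors' photoacoustic model; the smoothing forward operator, relaxation constant and grid-transfer rules are assumptions). A fixed-point iteration is run on subsampled data first, and its result is prolonged to initialize the next finer level:

      import numpy as np

      def forward(mu):
          # toy forward model: a smoothing of the unknown map mu (assumption;
          # the paper's model is optical fluence in photoacoustic tomography)
          return np.convolve(mu, np.array([0.25, 0.5, 0.25]), mode="same")

      def fixed_point(mu, y, steps, relax=0.5):
          # residual-driven fixed-point iteration: mu <- mu + relax * (y - A(mu))
          for _ in range(steps):
              mu = mu + relax * (y - forward(mu))
          return mu

      def multigrid_invert(y, levels=3, steps=40):
          ys = [y]
          for _ in range(levels - 1):
              ys.append(ys[-1][::2])                         # restriction (subsample)
          mu = np.zeros_like(ys[-1])
          for lvl in reversed(range(levels)):
              mu = fixed_point(mu, ys[lvl], steps)           # solve at this resolution
              if lvl > 0:
                  mu = np.repeat(mu, 2)[: ys[lvl - 1].size]  # prolongation
          return mu

      mu_true = np.zeros(64)
      mu_true[20:40] = 1.0
      mu_rec = multigrid_invert(forward(mu_true))
      print(f"relative error: {np.linalg.norm(mu_rec - mu_true) / np.linalg.norm(mu_true):.3f}")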

  3. Object Oriented Modeling and Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    The Object Oriented Modeling and Design seminar is intended for software professionals and students. It covers the concepts and a language-independent graphical notation that can be used to analyze problem requirements and design a solution to the problem. The seminar discusses the three kinds of object-oriented models: class, state, and interaction. The class model represents the static structure of a system; the state model describes the aspects of a system that change over time, as well as control behavior; and the interaction model describes how objects collaborate to achieve overall results. Existing knowledge of object-oriented programming may benefit the learning of modeling and good design. Specific expectations are: create a class model; read, recognize, and describe a class model; describe association and link; show abstract classes used with multiple inheritance; explain metadata, reification and constraints; group classes into a package; read, recognize, and describe a state model; explain states and transitions; read, recognize, and describe an interaction model; explain use cases and use case relationships; show concurrency in an activity diagram; and show object interactions in a sequence diagram.

  4. Genetic Particle Swarm Optimization–Based Feature Selection for Very-High-Resolution Remotely Sensed Imagery Object Change Detection

    PubMed Central

    Chen, Qiang; Chen, Yunhao; Jiang, Weiguo

    2016-01-01

    In the field of multiple-feature Object-Based Change Detection (OBCD) for very-high-resolution remotely sensed images, image objects have abundant features, and feature selection affects the precision and efficiency of OBCD. Through object-based image analysis, this paper proposes a Genetic Particle Swarm Optimization (GPSO)-based feature selection algorithm to solve the optimization problem of feature selection in multiple-feature OBCD. We select the Ratio of Mean to Variance (RMV) as the fitness function of GPSO and apply the proposed algorithm to an object-based hybrid multivariate alternative detection model. Two experimental cases on Worldview-2/3 images confirm that GPSO can significantly improve the speed of convergence and effectively avoid premature convergence, relative to other feature selection algorithms. According to the accuracy evaluation of OBCD, GPSO achieves higher overall accuracy (84.17% and 83.59%) and Kappa coefficients (0.6771 and 0.6314) than the other algorithms. Moreover, the sensitivity analysis results show that the proposed algorithm is not easily influenced by the initial parameters, but the number of features to be selected and the size of the particle swarm do affect the algorithm. The comparison experiments reveal that RMV is more suitable than other functions as the fitness function of a GPSO-based feature selection algorithm. PMID:27483285
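
    For illustration, a minimal binary PSO feature-selection sketch with an RMV-style fitness is given below; it is hedged, not the authors' GPSO (which additionally applies genetic crossover and mutation), and the toy data, coefficients and exact fitness formula are assumptions:

      import numpy as np

      rng = np.random.default_rng(1)
      n_objects, n_features = 200, 12
      X = rng.normal(size=(n_objects, n_features))
      labels = rng.integers(0, 2, n_objects)
      X[labels == 1, :4] += 1.5   # toy data: only the first 4 features separate classes

      def rmv_fitness(mask):
          # RMV-style score: between-class mean separation over within-class variance
          sel = mask.astype(bool)
          if not sel.any():
              return -np.inf
          Z = X[:, sel]
          d = Z[labels == 1].mean(0) - Z[labels == 0].mean(0)
          v = Z[labels == 1].var(0) + Z[labels == 0].var(0) + 1e-9
          return float((np.abs(d) / v).mean())

      n_particles, iters, w, c1, c2 = 20, 60, 0.7, 1.5, 1.5
      pos = rng.integers(0, 2, (n_particles, n_features)).astype(float)
      vel = rng.normal(scale=0.1, size=pos.shape)
      pbest, pbest_fit = pos.copy(), np.array([rmv_fitness(p) for p in pos])
      gbest = pbest[pbest_fit.argmax()].copy()

      for _ in range(iters):
          r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
          vel = np.clip(w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos), -4, 4)
          pos = (rng.random(pos.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(float)  # sigmoid transfer
          fit = np.array([rmv_fitness(p) for p in pos])
          better = fit > pbest_fit
          pbest[better], pbest_fit[better] = pos[better], fit[better]
          gbest = pbest[pbest_fit.argmax()].copy()

      print("selected features:", np.flatnonzero(gbest))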

  5. Approximation of reliabilities for multiple-trait model with maternal effects.

    PubMed

    Strabel, T; Misztal, I; Bertrand, J K

    2001-04-01

    Reliabilities for a multiple-trait maternal model were obtained by combining reliabilities obtained from single-trait models. Single-trait reliabilities were obtained using an approximation that supported models with additive and permanent environmental effects. For the direct effect, the maternal and permanent environmental variances were assigned to the residual. For the maternal effect, variance of the direct effect was assigned to the residual. Data included 10,550 birth weight, 11,819 weaning weight, and 3,617 postweaning gain records of Senepol cattle. Reliabilities were obtained by generalized inversion and by using single-trait and multiple-trait approximation methods. Some reliabilities obtained by inversion were negative because inbreeding was ignored in calculating the inverse of the relationship matrix. The multiple-trait approximation method reduced the bias of approximation when compared with the single-trait method. The correlations between reliabilities obtained by inversion and by multiple-trait procedures for the direct effect were 0.85 for birth weight, 0.94 for weaning weight, and 0.96 for postweaning gain. Correlations for maternal effects for birth weight and weaning weight were 0.96 to 0.98 for both approximations. Further improvements can be achieved by refining the single-trait procedures.

  6. The embedded operating system project

    NASA Technical Reports Server (NTRS)

    Campbell, R. H.

    1985-01-01

    The design and construction of embedded operating systems for real-time advanced aerospace applications was investigated. The applications require reliable operating system support that must accommodate computer networks. Problems that arise in the construction of such operating systems (reconfiguration, consistency and recovery in a distributed system) and the issues of real-time processing are reported. A thesis that provides theoretical foundations for the use of atomic actions to support fault tolerance and data consistency in real-time object-based systems is included. The following items are addressed: (1) atomic actions and fault-tolerance issues; (2) operating system structure; (3) program development; (4) a reliable compiler for Path Pascal; and (5) mediators, a mechanism for scheduling distributed system processes.

  7. Conditional nonlinear optimal perturbations based on the particle swarm optimization and their applications to the predictability problems

    NASA Astrophysics Data System (ADS)

    Zheng, Qin; Yang, Zubin; Sha, Jianxin; Yan, Jun

    2017-02-01

    In predictability research, the conditional nonlinear optimal perturbation (CNOP) describes the initial perturbation that satisfies a certain constraint condition and causes the largest prediction error at the prediction time. The CNOP has been successfully applied in estimating the lower bound of maximum predictable time (LBMPT). Generally, CNOPs are calculated by a gradient descent algorithm based on the adjoint model, which is called ADJ-CNOP. This study, using the two-dimensional Ikeda model, investigates the impact of nonlinearity on ADJ-CNOP and the corresponding precision problems when using ADJ-CNOP to estimate the LBMPT. Our conclusions are that (1) when the initial perturbation is large or the prediction time is long, the strong nonlinearity of the dynamical model in the prediction variable will lead to failure of the ADJ-CNOP method, and (2) when the objective function has multiple extreme values, ADJ-CNOP has a large probability of producing local CNOPs, hence making a false estimation of the LBMPT. Furthermore, the particle swarm optimization (PSO) algorithm, a kind of intelligent algorithm, is introduced to solve this problem; the method using PSO to compute the CNOP is called PSO-CNOP. The results of numerical experiments show that even with a large initial perturbation and a long prediction time, or when the objective function has multiple extreme values, PSO-CNOP can always obtain the global CNOP. Since the PSO algorithm is a population-based heuristic search algorithm, it can overcome the impact of nonlinearity and the disturbance from multiple extremes of the objective function. In addition, to check the estimation accuracy of the LBMPT presented by PSO-CNOP and ADJ-CNOP, we partition the constraint domain of initial perturbations into sufficiently fine grid meshes and take the LBMPT obtained by the filtering method as a benchmark. The results show that the estimation presented by PSO-CNOP is closer to the true value than that of ADJ-CNOP as the forecast time increases.
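
    A minimal sketch of computing a CNOP with plain global-best PSO on the two-dimensional Ikeda map follows; the base state, constraint radius, prediction steps and PSO coefficients are assumptions chosen for illustration, not values from the study:

      import numpy as np

      rng = np.random.default_rng(2)
      u = 0.9                          # common Ikeda-map parameter

      def ikeda(state, steps):
          x, y = state
          for _ in range(steps):
              t = 0.4 - 6.0 / (1.0 + x * x + y * y)
              x, y = 1.0 + u * (x * np.cos(t) - y * np.sin(t)), u * (x * np.sin(t) + y * np.cos(t))
          return np.array([x, y])

      x0 = np.array([0.5, 0.5])        # hypothetical base state
      eps, T = 0.1, 10                 # constraint radius and prediction steps (assumed)
      ref = ikeda(x0, T)

      def error_growth(delta):         # prediction error caused by the perturbation
          return np.linalg.norm(ikeda(x0 + delta, T) - ref)

      def project(delta):              # keep the perturbation inside the constraint ball
          n = np.linalg.norm(delta)
          return delta if n <= eps else delta * (eps / n)

      n_particles, iters, w, c1, c2 = 30, 150, 0.7, 1.5, 1.5
      pos = np.array([project(rng.normal(size=2) * eps) for _ in range(n_particles)])
      vel = np.zeros_like(pos)
      pbest, pbest_val = pos.copy(), np.array([error_growth(p) for p in pos])
      gbest = pbest[pbest_val.argmax()].copy()

      for _ in range(iters):
          r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos = np.array([project(p) for p in pos + vel])
          val = np.array([error_growth(p) for p in pos])
          better = val > pbest_val
          pbest[better], pbest_val[better] = pos[better], val[better]
          gbest = pbest[pbest_val.argmax()].copy()

      print("CNOP estimate:", gbest, "with error growth", error_growth(gbest))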

  8. Multi-target detection and positioning in crowds using multiple camera surveillance

    NASA Astrophysics Data System (ADS)

    Huang, Jiahu; Zhu, Qiuyu; Xing, Yufeng

    2018-04-01

    In this study, we propose a pixel correspondence algorithm for positioning in crowds based on constraints on the distance between lines of sight, grayscale differences, and height in the world coordinate system. First, a Gaussian mixture model is used to obtain the background and foreground from multi-camera videos. Second, the hair and skin regions are extracted as regions of interest. Finally, the correspondences between pixels in the regions of interest are found under multiple constraints, and the targets are positioned by pixel clustering. The algorithm can provide appropriate redundancy information for each target, which decreases the risk of losing targets due to a large viewing angle and wide baseline. To address the correspondence problem for multiple pixels, we construct a pixel-based correspondence model based on a similar permutation matrix, which converts the correspondence problem into a linear programming problem in which a similar permutation matrix is found by minimizing an objective function. The correct pixel correspondences can be obtained by determining the optimal solution of this linear programming problem, and the three-dimensional positions of the targets can then be obtained by pixel clustering. Finally, we verified the algorithm with multiple cameras in experiments, which showed that the algorithm has high accuracy and robustness.
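
    When the correspondence is strictly one-to-one, minimizing a linear objective over permutation matrices is the linear assignment problem, which can be solved exactly; the sketch below (toy data and assumed weights, not the authors' similar-permutation-matrix formulation) illustrates the idea with SciPy's Hungarian solver:

      import numpy as np
      from scipy.optimize import linear_sum_assignment

      rng = np.random.default_rng(3)
      n = 40  # region-of-interest pixels per view (toy size)

      # Illustrative constraint terms; in the paper these come from
      # back-projecting pixels through the calibrated cameras:
      sight_dist = rng.random((n, n))    # distance between the two lines of sight
      gray_a, gray_b = rng.random(n), rng.random(n)
      gray_diff = np.abs(gray_a[:, None] - gray_b[None, :])
      height_diff = rng.random((n, n))   # world-height disagreement of each pair

      # weighted sum of the constraints as the matching cost (weights assumed)
      cost = 1.0 * sight_dist + 0.5 * gray_diff + 0.5 * height_diff

      rows, cols = linear_sum_assignment(cost)  # optimal one-to-one correspondence
      print("total matching cost:", cost[rows, cols].sum())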

  9. Evaluating the Effect of Minimizing Screws on Stabilization of Symphysis Mandibular Fracture by 3D Finite Element Analysis.

    PubMed

    Kharmanda, Ghias; Kharma, Mohamed-Yaser

    2017-06-01

    The objective of this work is to integrate structural optimization and reliability concepts into the mini-plate fixation strategy used in symphysis mandibular fractures. A 3-dimensional finite element model is developed in order to evaluate the ability to reduce the negative effects of stabilizing the fracture. A topology optimization process is used in the conceptual design stage to predict possible fixation layouts. In the detailed design stage, suitable mini-plates are selected taking into account the resulting topology and different anatomical considerations. Several muscle forces are considered in order to obtain realistic predictions. Since some muscles can be cut or harmed during surgery and cannot operate at their maximum capacity, there is a strong motivation to introduce loading uncertainties in order to obtain reliable designs. The structural reliability analysis is then carried out for a single failure mode and for multiple failure modes. The results are validated against a clinical case of a male patient with a symphysis fracture in which, although the upper fixation plate had four holes, only two screws were applied in order to protect an adjacent vital structure; this did not affect the stability of the fracture. The proposed strategy for optimizing bone plates leads to fewer complications and second surgeries, less patient discomfort, and shorter healing times.

  10. Investigation of Content and Face Validity and Reliability of Sociocultural Attitude towards Appearance Questionnaire-3 (SATAQ-3) among Female Adolescents

    PubMed Central

    Mousazadeh, Somayeh; Rakhshan, Mahnaz; Mohammadi, Fateme

    2017-01-01

    Objective: This study aimed to determine the psychometric properties of the sociocultural attitude towards appearance questionnaire in female adolescents. Method: This was a methodological study. The English version of the questionnaire was translated into Persian using the forward-backward method. Then the face validity, content validity and reliability were checked. To ensure face validity, the questionnaire was given to 25 female adolescents, a psychologist and three nurses, who were asked to evaluate the items with respect to problems, ambiguity, relativity, proper terms and grammar, and understandability. For content validity, 15 experts in psychology and nursing who met the inclusion criteria were recruited and asked to assess the qualitative content validity. To determine the quantitative content validity, the content validity index and content validity ratio were calculated. Finally, the internal consistency of the items was assessed using Cronbach's alpha. Results: According to the expert judgments, the content validity ratio was 0.81 and the content validity index was 0.91. Besides, the reliability of the questionnaire was confirmed with Cronbach's alpha = 0.91, and the physical and developmental areas showed the highest reliability indices. Conclusion: The questionnaire can be used in research to assess female adolescents' self-concept. This can be a stepping-stone towards identification of problems and improvement of adolescents' body image. PMID:28496497

  11. Unsupervised motion-based object segmentation refined by color

    NASA Astrophysics Data System (ADS)

    Piek, Matthijs C.; Braspenning, Ralph; Varekamp, Chris

    2003-06-01

    For various applications, such as data compression, structure from motion, medical imaging and video enhancement, there is a need for an algorithm that divides video sequences into independently moving objects. Because our focus is on video enhancement and structure from motion for consumer electronics, we strive for a low-complexity solution. For still images, several approaches exist based on colour, but these lack both speed and segmentation quality. For instance, colour-based watershed algorithms produce a so-called oversegmentation, with many segments covering each single physical object. Other colour segmentation approaches limit the number of segments to reduce this oversegmentation problem, but this often results in inaccurate edges or even missed objects. Most likely, colour is an inherently insufficient cue for real-world object segmentation, because real-world objects can display complex combinations of colours. For video sequences, however, an additional cue is available, namely the motion of objects. When different objects in a scene have different motion, the motion cue alone is often enough to reliably distinguish objects from one another and from the background. However, because efficient motion estimators such as the 3DRS block matcher lack sufficient resolution, the resulting segmentation is not at pixel resolution but at block resolution. Existing pixel-resolution motion estimators are more sensitive to noise, suffer more from aperture problems, correspond less closely to the true motion of objects than block-based approaches, or are too computationally expensive. From its tendency to oversegmentation it is apparent that colour segmentation is particularly effective near the edges of homogeneously coloured areas. On the other hand, block-based true motion estimation is particularly effective in heterogeneous areas, because heterogeneity improves the chance that a block is unique and thus decreases the chance of a wrong position producing a good match. Consequently, a number of methods exist which combine motion and colour segmentation, either using colour segmentation as a base for the motion segmentation and estimation, or performing an independent colour segmentation in parallel that is in some way combined with the motion segmentation. The presented method uses both techniques to complement each other by first segmenting on motion cues and then refining the segmentation with colour. To our knowledge, few methods adopt this approach. One example is [meshrefine], which uses an irregular mesh that hinders efficient implementation in consumer electronics devices, and which produces a foreground/background segmentation, while our applications call for the segmentation of multiple objects.

    NEW METHOD: We start with motion segmentation and afterwards refine the edges of this segmentation with a pixel-resolution colour segmentation method. There are several reasons for this approach:
    + Motion segmentation does not produce the oversegmentation that colour segmentation methods normally produce, because objects are more likely to have colour discontinuities than motion discontinuities. The colour segmentation therefore only has to be done at the edges of segments, confining it to a smaller part of the image, in which the colour of an object is more likely to be homogeneous.
    + This approach restricts the computationally expensive pixel-resolution colour segmentation to a subset of the image. Together with the very efficient 3DRS motion estimation algorithm, this helps to reduce the computational complexity.
    + The motion cue alone is often enough to reliably distinguish objects from one another and from the background.
    To obtain the motion vector fields, a variant of the 3DRS block-based motion estimator which analyses three frames of input was used. The 3DRS motion estimator is known for its ability to estimate motion vectors which closely resemble the true motion.

    BLOCK-BASED MOTION SEGMENTATION: We start with a block-resolution segmentation based on motion vectors. The presented method is inspired by the well-known K-means segmentation method [K-means]. Several other methods (e.g. [kmeansc]) adapt K-means for connectedness by adding a weighted shape error, which introduces the additional difficulty of finding the correct weights for the shape parameters and often biases one particular pre-defined shape. The presented method, which we call K-regions, encourages connectedness because only blocks at the edges of segments may be assigned to another segment. This constrains the segmentation to such a degree that least squares can be used for the robust fitting of an affine motion model for each segment (see the sketch after this entry). Contrary to [parmkm], the segmentation step still operates on vectors instead of model parameters. To make the segmentation temporally consistent, the segmentation of the previous frame is used as the initialisation for every new frame. We also present a scheme which makes the algorithm independent of the initially chosen number of segments.

    COLOUR-BASED INTRA-BLOCK SEGMENTATION: The block-resolution motion-based segmentation forms the starting point for the pixel-resolution segmentation, which is obtained by reclassifying pixels only at the edges of clusters. We assume that an edge between two objects can be found in one of the two neighbouring blocks that belong to different clusters. This assumption allows us to perform the pixel-resolution segmentation on each pair of such neighbouring blocks separately. Because of the local nature of the segmentation, it largely avoids problems with heterogeneously coloured areas, and because no new segments are introduced in this step, it does not suffer from oversegmentation problems. The presented method also has no problems with bifurcations. For the pixel-resolution segmentation itself, we reclassify pixels so as to optimize an error norm that favours similarly coloured regions and straight edges.

    SEGMENTATION MEASURE: To assist in the evaluation of the proposed algorithm, we developed a quality metric. Because the problem has no exact specification, we define a ground-truth output that we find desirable for a given input and measure segmentation quality as how different the segmentation is from this ground truth. The measure enables us to evaluate oversegmentation and undersegmentation separately, and to determine which parts of a frame suffer from either. The proposed algorithm has been tested on several typical sequences.

    CONCLUSIONS: We presented a new video segmentation method which performs well in segmenting multiple independently moving foreground objects from each other and from the background. It combines the strong points of both colour and motion segmentation in the way we expected. One weak point is that the method suffers from undersegmentation when adjacent objects display similar motion, and in sequences with detailed backgrounds the segmentation will sometimes display noisy edges. Apart from these results, we think that some of the techniques, in particular the K-regions technique, may be useful for other two-dimensional data segmentation problems.
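
    The least-squares fitting of an affine motion model per segment, on which the K-regions step relies, can be sketched as follows (a hedged toy example; the function names and the residual-based reassignment note are illustrative assumptions):

      import numpy as np

      def fit_affine_motion(centers, vectors):
          # Least-squares fit of an affine motion model v = A @ [x, y, 1]^T.
          # centers : (N, 2) block-centre coordinates of one segment
          # vectors : (N, 2) motion vectors estimated at those blocks
          X = np.hstack([centers, np.ones((len(centers), 1))])
          A, *_ = np.linalg.lstsq(X, vectors, rcond=None)
          return A.T                                        # (2, 3) parameter matrix

      def model_residual(A, centers, vectors):
          # disagreement between measured vectors and the segment's motion model;
          # K-regions can reassign an edge block to the segment with lowest residual
          X = np.hstack([centers, np.ones((len(centers), 1))])
          return np.linalg.norm(X @ A.T - vectors, axis=1)

      # toy check: blocks rotating about the origin follow an affine model exactly
      centers = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [2.0, 1.0]])
      vectors = np.stack([-centers[:, 1], centers[:, 0]], axis=1) * 0.1
      A = fit_affine_motion(centers, vectors)
      print(model_residual(A, centers, vectors))  # ~0 for a consistent segment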

  12. End-to-End Network QoS via Scheduling of Flexible Resource Reservation Requests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, S.; Katramatos, D.; Yu, D.

    2011-11-14

    Modern data-intensive applications move vast amounts of data between multiple locations around the world. To enable predictable and reliable data transfer, next generation networks allow such applications to reserve network resources for exclusive use. In this paper, we solve an important problem (called SMR3) to accommodate multiple and concurrent network reservation requests between a pair of end-sites. Given the varying availability of bandwidth within the network, our goal is to accommodate as many reservation requests as possible while minimizing the total time needed to complete the data transfers. We first prove that SMR3 is an NP-hard problem. Then we solve it by developing a polynomial-time heuristic, called RRA. The RRA algorithm hinges on an efficient mechanism to accommodate a large number of requests by minimizing the bandwidth wastage. Finally, via numerical results, we show that RRA constructs schedules that accommodate a significantly larger number of requests compared to other, seemingly efficient, heuristics.

  13. Structured decision making for managing pneumonia epizootics in bighorn sheep

    USGS Publications Warehouse

    Sells, Sarah N.; Mitchell, Michael S.; Edwards, Victoria L.; Gude, Justin A.; Anderson, Neil J.

    2016-01-01

    Good decision-making is essential to conserving wildlife populations. Although there may be multiple ways to address a problem, perfect solutions rarely exist. Managers are therefore tasked with identifying decisions that will best achieve desired outcomes. Structured decision making (SDM) is a method of decision analysis used to identify the most effective, efficient, and realistic decisions while accounting for values and priorities of the decision maker. The stepwise process includes identifying the management problem, defining objectives for solving the problem, developing alternative approaches to achieve the objectives, and formally evaluating which alternative is most likely to accomplish the objectives. The SDM process can be more effective than informal decision-making because it provides a transparent way to quantitatively evaluate decisions for addressing multiple management objectives while incorporating science, uncertainty, and risk tolerance. To illustrate the application of this process to a management need, we present an SDM-based decision tool developed to identify optimal decisions for proactively managing risk of pneumonia epizootics in bighorn sheep (Ovis canadensis) in Montana. Pneumonia epizootics are a major challenge for managers due to long-term impacts to herds, epistemic uncertainty in timing and location of future epizootics, and consequent difficulty knowing how or when to manage risk. The decision tool facilitates analysis of alternative decisions for how to manage herds based on predictions from a risk model, herd-specific objectives, and predicted costs and benefits of each alternative. Decision analyses for 2 example herds revealed that meeting management objectives necessitates specific approaches unique to each herd. The analyses showed how and under what circumstances the alternatives are optimal compared to other approaches and current management. Managers can be confident that these decisions are effective, efficient, and realistic because they explicitly account for important considerations managers implicitly weigh when making decisions, including competing management objectives, uncertainty in potential outcomes, and risk tolerance.

  14. Conformational Space Annealing explained: A general optimization algorithm, with diverse applications

    NASA Astrophysics Data System (ADS)

    Joung, InSuk; Kim, Jong Yun; Gross, Steven P.; Joo, Keehyoung; Lee, Jooyoung

    2018-02-01

    Many problems in science and engineering can be formulated as optimization problems. One way to solve these problems is to develop tailored problem-specific approaches. As such development is challenging, an alternative is to develop good generally applicable algorithms. Such algorithms are easy to apply, typically function robustly, and reduce development time. Here we provide a description of one such algorithm, called Conformational Space Annealing (CSA), along with its Python version, PyCSA. We previously applied it to many optimization problems, including protein structure prediction and graph community detection. To demonstrate its utility, we have applied PyCSA to two continuous test functions, namely the Ackley and Eggholder functions. In addition, to demonstrate the complete generality of PyCSA with respect to the type of objective function, we show how PyCSA can be applied to a discrete objective function, namely a parameter optimization problem. Based on the benchmarking results for the three problems, the performance of CSA is shown to be better than or similar to that of the most popular optimization method, simulated annealing. For the continuous objective functions we found that L-BFGS-B was the best-performing local optimization method, while for the discrete objective function Nelder-Mead was the best. The current version of PyCSA can be run in parallel at a coarse-grained level by calculating multiple independent local optimizations separately. The source code of PyCSA is available from http://lee.kias.re.kr.
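
    The benchmark setup can be sketched as follows; this is not PyCSA itself (whose source is at the URL above) but a hedged illustration using the Ackley function, with SciPy's simulated-annealing variant as the global baseline and L-BFGS-B as the local refiner:

      import numpy as np
      from scipy.optimize import dual_annealing, minimize

      def ackley(p):
          x, y = p
          return (-20.0 * np.exp(-0.2 * np.sqrt(0.5 * (x * x + y * y)))
                  - np.exp(0.5 * (np.cos(2 * np.pi * x) + np.cos(2 * np.pi * y)))
                  + np.e + 20.0)

      bounds = [(-5.0, 5.0), (-5.0, 5.0)]
      sa = dual_annealing(ackley, bounds, seed=0)                   # global search phase
      refined = minimize(ackley, sa.x, method="L-BFGS-B", bounds=bounds)
      print(refined.x, refined.fun)                                 # near (0, 0), value ~0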

  15. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics

    PubMed Central

    Trianni, Vito; López-Ibáñez, Manuel

    2015-01-01

    The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics. PMID:26295151

  16. Development and validation of brief scales to measure emotional and behavioural problems among Chinese adolescents

    PubMed Central

    Shen, Minxue; Hu, Ming; Sun, Zhenqiu

    2017-01-01

    Objectives To develop and validate brief scales to measure common emotional and behavioural problems among adolescents in the examination-oriented education system and collectivistic culture of China. Setting Middle schools in Hunan province. Participants 5442 middle school students aged 11–19 years were sampled. 4727 valid questionnaires were collected and used for validation of the scales. The final sample included 2408 boys and 2319 girls. Primary and secondary outcome measures The tools were assessed using item response theory, classical test theory (reliability and construct validity) and differential item functioning. Results Four scales measuring anxiety, depression, study problems and sociality problems were established. Exploratory factor analysis showed a two-factor solution for each scale. Confirmatory factor analysis showed acceptable to good model fit for each scale. Internal consistency and test–retest reliability of all scales were above 0.7. Item response theory showed that all items had acceptable discrimination parameters and most items had appropriate difficulty parameters. 10 items demonstrated differential item functioning with respect to gender. Conclusions Four brief scales were developed and validated among adolescents in middle schools of China. The scales have good psychometric properties with minor differential item functioning. They can be used in middle school settings and will help school officials to assess students' emotional/behavioural problems. PMID:28062469

  17. Comparing multiple statistical methods for inverse prediction in nuclear forensics applications

    DOE PAGES

    Lewis, John R.; Zhang, Adah; Anderson-Cook, Christine Michaela

    2017-10-29

    Forensic science seeks to predict source characteristics using measured observables. Statistically, this objective can be thought of as an inverse problem, where interest is in the unknown source characteristics or factors (X) of some underlying causal model producing the observables or responses (Y = g(X) + error). This paper reviews several statistical methods for use in inverse problems and demonstrates that comparing results from multiple methods can be used to assess predictive capability. Motivation for assessing inverse predictions comes from the desired application to historical and future experiments involving nuclear material production for forensics research, in which inverse predictions, along with an assessment of predictive capability, are desired.
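
    A minimal sketch of inverse prediction under an assumed linear forward model follows (illustrative only; the paper compares several statistical methods on nuclear forensics data, and the ridge regularization toward a prior guess is just one of several reasonable choices):

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(4)

      # toy causal model y = g(x) + error with g linear (an assumption)
      beta_true = np.array([2.0, -1.0])
      X = rng.uniform(0.0, 1.0, size=(50, 2))              # known source factors
      Y = X @ beta_true + rng.normal(0.0, 0.05, size=50)   # measured observables

      beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)     # calibrate forward model

      # inverse prediction: given a new observable, find source factors that best
      # reproduce it; a small ridge term toward the prior guess x0 picks one
      # solution out of the under-determined family
      y_new, x0 = 1.2, np.array([0.5, 0.5])
      objective = lambda x: (x @ beta_hat - y_new) ** 2 + 0.01 * np.sum((x - x0) ** 2)
      x_pred = minimize(objective, x0).x
      print("predicted source factors:", x_pred)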

  18. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.

  19. Reliable use of determinants to solve nonlinear structural eigenvalue problems efficiently

    NASA Technical Reports Server (NTRS)

    Williams, F. W.; Kennedy, D.

    1988-01-01

    The analytical derivation, numerical implementation, and performance of a multiple-determinant parabolic interpolation method (MDPIM) for use in solving transcendental eigenvalue (critical buckling or undamped free vibration) problems in structural mechanics are presented. The overall bounding, eigenvalue-separation, qualified parabolic interpolation, accuracy-confirmation, and convergence-recovery stages of the MDPIM are described in detail, and the numbers of iterations required to solve sample plane-frame problems using the MDPIM are compared with those for a conventional bisection method and for the Newtonian method of Simpson (1984) in extensive tables. The MDPIM is shown to use 31 percent less computation time than bisection when accuracy of 10^-4 is required, and 62 percent less when accuracy of 10^-8 is required; the time savings over the Newtonian method are about 10 percent.
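
    The core idea of locating an eigenvalue as a root of a determinant via parabolic interpolation can be sketched on a toy linear pencil (the MDPIM targets transcendental stiffness matrices and adds bounding and recovery stages omitted here; the matrices and starting points below are assumptions):

      import numpy as np

      # toy structural pencil det(K - lam*M) = 0; the same interpolation idea
      # applies when K is a transcendental function of lam
      K = np.array([[4.0, -1.0], [-1.0, 3.0]])
      M = np.eye(2)

      def det_fn(lam):
          return np.linalg.det(K - lam * M)

      def parabolic_root(f, xs, tol=1e-10, max_iter=50):
          # successive parabolic interpolation through three determinant samples;
          # assumes the fitted parabola has a real root near the sought eigenvalue
          for _ in range(max_iter):
              coeffs = np.polyfit(xs, [f(x) for x in xs], 2)
              roots = np.roots(coeffs)
              roots = roots[np.abs(roots.imag) < 1e-12].real
              x_new = roots[np.argmin(np.abs(roots - xs[-1]))]
              if abs(f(x_new)) < tol:
                  return x_new
              xs = [xs[1], xs[2], x_new]
          return xs[-1]

      print(parabolic_root(det_fn, [1.0, 2.0, 3.0]))   # ~2.3820, an eigenvalue of K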

  1. Forest management and economics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buongiorno, J.; Gilless, J.K.

    1987-01-01

    This volume provides a survey of quantitative methods, guiding the reader through formulation and analysis of models that address forest management problems. The authors use simple mathematics, graphics, and short computer programs to explain each method. Emphasizing applications, they discuss linear, integer, dynamic, and goal programming; simulation; network modeling; and econometrics, as these relate to problems of determining economic harvest schedules in even-aged and uneven-aged forests, the evaluation of forest policies, multiple-objective decision making, and more.

  2. Consistent Steering System using SCTP for Bluetooth Scatternet Sensor Network

    NASA Astrophysics Data System (ADS)

    Dhaya, R.; Sadasivam, V.; Kanthavel, R.

    2012-12-01

    Wireless communication is the best way to convey information from source to destination with flexibility and mobility, and Bluetooth is a wireless technology suitable for short distances. A wireless sensor network (WSN), on the other hand, consists of spatially distributed autonomous sensors that cooperatively monitor physical or environmental conditions such as temperature, sound, vibration, pressure, motion or pollutants. Using the Bluetooth piconet technique in sensor nodes creates limitations in network depth and placement. The introduction of the scatternet removes these network restrictions, but at the cost of reliability in data transmission; as the depth of the network increases, routing becomes more difficult. No authors have so far focused on the reliability factors of scatternet sensor network routing. This paper illustrates the proposed system architecture and a routing mechanism to increase reliability. Another objective is to use a reliable transport protocol (SCTP) that exploits the multi-homing concept and supports multiple streams to prevent head-of-line blocking. The results show that the scatternet sensor network has lower packet loss than the existing system, even in a congested environment, making it suitable for all surveillance applications.

  3. Airplane detection based on fusion framework by combining saliency model with Deep Convolutional Neural Networks

    NASA Astrophysics Data System (ADS)

    Dou, Hao; Sun, Xiao; Li, Bin; Deng, Qianqian; Yang, Xubo; Liu, Di; Tian, Jinwen

    2018-03-01

    Aircraft detection from very-high-resolution remote sensing images has gained increasing interest in recent years due to successful civil and military applications. However, several problems still exist: 1) how to extract the high-level features of aircraft; 2) locating objects within such large images is difficult and time consuming; and 3) the common problem of multiple resolutions of satellite images. In this paper, inspired by the biological visual mechanism, a fusion detection framework is proposed which fuses a top-down visual mechanism (a deep CNN model) and a bottom-up visual mechanism (GBVS) to detect aircraft. In addition, we use a multi-scale training method for the deep CNN model to address the problem of multiple resolutions. Experimental results demonstrate that our method achieves better detection results than the other methods.

  4. Multiple-attribute group decision making with different formats of preference information on attributes.

    PubMed

    Xu, Zeshui

    2007-12-01

    Interval utility values, interval fuzzy preference relations, and interval multiplicative preference relations are three common uncertain-preference formats used by decision-makers to provide their preference information in the process of decision making under fuzziness. This paper is devoted to investigating multiple-attribute group-decision-making problems in which the attribute values are not precisely known but value ranges can be obtained, and in which the decision-makers provide their preference information over attributes in three different uncertain-preference formats, i.e., 1) interval utility values; 2) interval fuzzy preference relations; and 3) interval multiplicative preference relations. We first utilize some functions to normalize the uncertain decision matrix and then transform it into an expected decision matrix. We establish a goal-programming model to integrate the expected decision matrix and all three different uncertain-preference formats, from which the attribute weights and the overall attribute values of the alternatives can be obtained. We then use the derived overall attribute values to rank the given alternatives and select the best one(s). The model not only reflects both the subjective considerations of all decision-makers and the objective information, but also avoids losing or distorting the given objective and subjective decision information in the process of information integration. Furthermore, we establish models to solve multiple-attribute group-decision-making problems with three different preference formats: 1) utility values; 2) fuzzy preference relations; and 3) multiplicative preference relations. Finally, we illustrate the applicability and effectiveness of the developed models with two practical examples.

  5. The sizing of hamstring grafts for anterior cruciate reconstruction: intra- and inter-observer reliability.

    PubMed

    Dwyer, Tim; Whelan, Daniel B; Khoshbin, Amir; Wasserstein, David; Dold, Andrew; Chahal, Jaskarndip; Nauth, Aaron; Murnaghan, M Lucas; Ogilvie-Harris, Darrell J; Theodoropoulos, John S

    2015-04-01

    The objective of this study was to establish the intra- and inter-observer reliability of hamstring graft measurement using cylindrical sizing tubes. Hamstring tendons (gracilis and semitendinosus) were harvested from ten cadavers by a single surgeon and whip-stitched together to create ten 4-strand hamstring grafts. Ten sports medicine surgeons and fellows sized each graft independently using either hollow cylindrical sizers or block sizers in 0.5-mm increments; the sizing technique used was applied consistently to each graft. Surgeons moved sequentially from graft to graft and measured each hamstring graft twice. Surgeons were asked to state the measured proximal (femoral) and distal (tibial) diameter of each graft, as well as the diameter of the tibial and femoral tunnels that they would drill if performing an anterior cruciate ligament (ACL) reconstruction using that graft. Reliability was established using intra-class correlation coefficients. Overall, both the inter-observer and intra-observer agreement were >0.9, demonstrating excellent reliability. The inter-observer reliability for drill sizes was also excellent (>0.9), and excellent correlation was seen between cylindrical sizing and drill sizes (>0.9). Sizing of hamstring grafts by multiple surgeons demonstrated excellent inter-observer and intra-observer reliability, potentially validating clinical studies exploring ACL reconstruction outcomes by hamstring graft diameter when standard techniques are used. Level of evidence: III.

  6. Load Balancing in Distributed Web Caching: A Novel Clustering Approach

    NASA Astrophysics Data System (ADS)

    Tiwari, R.; Kumar, K.; Khan, G.

    2010-11-01

    The World Wide Web suffers from scaling and reliability problems due to overloaded and congested proxy servers. Caching at local proxy servers helps, but cannot satisfy more than a third to half of requests; the remaining requests are still sent to the original remote origin servers. In this paper we develop an algorithm for a distributed Web cache that incorporates cooperation among the proxy servers of one cluster. The algorithm combines distributed Web cache concepts with a static hierarchy of geographically based clusters of level-one proxy servers, together with a dynamic proxy-server mechanism that operates during congestion of a cluster. Congestion and scalability problems are addressed by the clustering concept used in our approach, resulting in a higher cache hit ratio and lower latency for requested pages. The algorithm also guarantees data consistency between the original server objects and the proxy cache objects.

  7. FPFH-based graph matching for 3D point cloud registration

    NASA Astrophysics Data System (ADS)

    Zhao, Jiapeng; Li, Chen; Tian, Lihua; Zhu, Jihua

    2018-04-01

    Correspondence detection is a vital step in point cloud registration, as it can help obtain a reliable initial alignment. In this paper, we put forward an advanced point-feature-based graph matching algorithm to solve the initial alignment problem of rigid 3D point cloud registration with partial overlap. Specifically, Fast Point Feature Histograms are first used to determine the initial possible correspondences. Next, a new objective function is provided to make graph matching more suitable for partially overlapping point clouds. The objective function is optimized by a simulated annealing algorithm to obtain the final group of correct correspondences. Finally, we present a novel set partitioning method which transforms the NP-hard optimization problem into an O(n^3)-solvable one. Experiments on the Stanford and UWA public data sets indicate that our method obtains better results in terms of both accuracy and time cost compared with other point cloud registration methods.

  8. 3-Dimensional Root Cause Diagnosis via Co-analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Ziming; Lan, Zhiling; Yu, Li

    2012-01-01

    With the growth of system size and complexity, reliability has become a major concern for large-scale systems. Upon the occurrence of a failure, system administrators typically trace the events in Reliability, Availability, and Serviceability (RAS) logs for root cause diagnosis. However, RAS logs contain only limited diagnostic information, and manual processing is time-consuming, error-prone, and not scalable. To address this problem, we present an automated root cause diagnosis mechanism for large-scale HPC systems. Our mechanism examines multiple logs to provide a fine-grained 3-D root cause analysis, where 3-D means that the analysis pinpoints the failure layer, the time, and the location of the event that causes the problem. We evaluate our mechanism by means of real logs collected from a production IBM Blue Gene/P system at Oak Ridge National Laboratory. It successfully identifies failure-layer information for 219 failures over a 23-month period. Furthermore, it effectively identifies the triggering events with time and location information, even when the triggering events occur hundreds of hours before the resulting failures.

  9. The admissions process of a bachelor of science in nursing program: initial reliability and validity of the personal interview.

    PubMed

    Carpio, B; Brown, B

    1993-01-01

    The undergraduate nursing degree program (B.Sc.N.) at McMaster University School of Nursing uses small groups and is learner-centered and problem-based. A study was conducted during the 1991 admissions cycle to determine the initial reliability and validity of the semi-structured personal interview which constitutes the final component of candidate selection for this program. During the interview, three-member teams assess applicant suitability for the program based on six dimensions: applicant motivation, awareness of the program, problem-solving abilities, ability to relate to others, self-appraisal skills, and career goals. Each interviewer assigns the applicant a global rating using a seven-point scale. For the purposes of this study, four interviewer teams were randomly selected from the pool of 31 teams to interview four simulated (preprogrammed) applicants. Interview ratings were analyzed with two-factor repeated-measures ANOVA, and inter-rater and inter-team intraclass correlation coefficients (ICC) were calculated. Inter-team reliability ranged from .64 to .97 for the individual dimensions, and from .66 to .89 for global ratings. Inter-rater ICCs for the six dimensions ranged from .81 to .99, and from .96 to .99 for the global ratings. The item-to-total correlation coefficients between individual dimensions and global ratings ranged from .8 to 1.0, and Pearson correlations between items ranged from .77 to 1.0. The ICCs were then calculated for the interview scores of 108 actual applicants to the program. Inter-rater reliability based on global ratings was .79 for the single (1-rater) observation and .91 for the multiple (3-rater) observation. These findings support the continued use of the interview as a reliable instrument with face validity. Studies of predictive validity will be undertaken.

  10. Development of the Seasonal Migrant Agricultural Worker Stress Scale in Sanliurfa, Southeast Turkey.

    PubMed

    Simsek, Zeynep; Ersin, Fatma; Kirmizitoprak, Evin

    2016-01-01

    Stress is one of the main causes of health problems, especially mental disorders. These health problems cause a significant amount of ability loss and increase costs. It is estimated that by 2020, mental disorders will constitute 15% of the total disease burden, and depression will rank second only to ischemic heart disease. Environmental experiences are paramount in increasing the liability to mental disorders in those who constantly face sustained high levels of stress. The objective of this study was to develop a stress scale for seasonal migrant agricultural workers aged 18 years and older. The sample consisted of 270 randomly selected seasonal migrant agricultural workers. The average age of the participants was 33.1 ± 14 years, and 50.7% were male. The Cronbach alpha coefficient and test-retest methods were used for the reliability analyses. Factor analysis was performed to assess the structural validity of the scale, and the Kaiser-Meyer-Olkin coefficient and Bartlett test were used to determine the suitability of the data for factor analysis. In the reliability analyses, the Cronbach alpha coefficient of internal consistency was .96, and the test-retest reliability coefficient was .81. In the exploratory factor analysis, four factors were obtained, representing workplace physical conditions (25.7% of the total variance), workplace psychosocial and economic factors (19.3%), workplace health problems (15.2%), and school problems (10.1%). The four factors together explained 70.3% of the total variance. As a result of the expert opinions and analyses, a stress scale with 48 items was developed. The highest score obtainable from the scale is 144 and the lowest is 0; an increase in the score indicates an increase in stress levels. The findings show that the scale is a valid and reliable assessment instrument that can be used in epidemiological research and in planning interventions.

  13. Strategic planning decision making using fuzzy SWOT-TOPSIS with reliability factor

    NASA Astrophysics Data System (ADS)

    Mohamad, Daud; Afandi, Nur Syamimi; Kamis, Nor Hanimah

    2015-10-01

    Strategic planning is a process of decision making and action for long-term activities in an organization. The Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis has been commonly used to help organizations set their future direction by analyzing the internal and external environment. However, SWOT analysis has a notable limitation: it cannot appropriately prioritize multiple alternative strategic decisions. Some efforts have been made to solve this problem by incorporating Multi Criteria Decision Making (MCDM) methods. Nevertheless, another important aspect of reaching a decision raises concern: the reliability of the information, since decision makers evaluate differently depending on their level of confidence or sureness in the evaluation. This study proposes a decision making procedure for strategic planning using the SWOT-TOPSIS method, incorporating the reliability of the evaluation based on Z-numbers. An example using a local authority in the east coast of Malaysia is illustrated to rank the strategic options and to prioritize factors in each SWOT category.
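
    The ranking step in SWOT-TOPSIS reduces to the classical TOPSIS closeness coefficient. A minimal crisp sketch follows; the fuzzy Z-number evaluation layer described in the paper is not reproduced, and the strategy scores and weights are hypothetical.

        import numpy as np

        def topsis(matrix, weights, benefit):
            """matrix: (alternatives, criteria); weights sum to 1;
            benefit[j] is True if criterion j is to be maximized."""
            m = matrix / np.linalg.norm(matrix, axis=0)   # vector normalization
            v = m * weights                               # weighted normalized matrix
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_best = np.linalg.norm(v - ideal, axis=1)
            d_worst = np.linalg.norm(v - anti, axis=1)
            return d_worst / (d_best + d_worst)           # closeness: higher is better

        # Four candidate strategies scored on three criteria (illustrative numbers)
        scores = np.array([[7., 0.6, 3.], [5., 0.9, 2.], [8., 0.4, 4.], [6., 0.7, 1.]])
        cc = topsis(scores, np.array([0.5, 0.3, 0.2]), np.array([True, True, False]))
        print(np.argsort(cc)[::-1])                       # strategy ranking, best first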

  14. Space shuttle hypergolic bipropellant RCS engine design study, Bell model 8701

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research program was conducted to define the level of the current technology base for reaction control system rocket engines suitable for space shuttle applications. The project consisted of engine analyses, design, fabrication, and tests. The specific objectives were: (1) to extrapolate current engine design experience to the design of an RCS engine with the required safety, reliability, performance, and operational capability; (2) to demonstrate multiple-reuse capability; and (3) to identify current design and technology deficiencies and critical areas for future effort.

  15. Development and validation of a visual grading scale for assessing image quality of AP pelvis radiographic images.

    PubMed

    Mraity, Hussien A A B; England, Andrew; Cassidy, Simon; Eachus, Peter; Dominguez, Alejandro; Hogg, Peter

    2016-01-01

    The aim of this article was to apply psychometric theory to develop and validate a visual grading scale for assessing the perceived quality of digital anteroposterior (AP) pelvis radiographic images. Psychometric theory was used to guide scale development. Seven phantom and seven cadaver images of visually and objectively predetermined quality were used to help assess scale reliability and validity. 151 volunteers scored phantom images, and 184 volunteers scored cadaver images. Factor analysis and Cronbach's alpha were used to assess scale validity and reliability. A 24-item scale was produced. Aggregated mean volunteer scores for each image correlated with the rank order of the visually and objectively predetermined image qualities. Scale items had good interitem correlation (≥0.2) and high factor loadings (≥0.3). Cronbach's alpha (reliability) revealed that the scale has acceptable internal reliability for both phantom and cadaver images (α = 0.8 and 0.9, respectively). Factor analysis suggested that the scale is multidimensional (assessing multiple quality themes). This study represents the first full development and validation of a visual image quality scale using psychometric theory. It is likely that this scale will have clinical, training and research applications. This article presents data to create and validate visual grading scales for radiographic examinations. The visual grading scale for AP pelvis examinations can act as a validated tool for future research, teaching and clinical evaluations of image quality.
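
    Item retention criteria like the ≥0.2 figure above are commonly checked with corrected item-total correlations. A minimal sketch under that assumption follows, using simulated unidimensional ratings rather than the study's data.

        import numpy as np

        def corrected_item_total(items):
            """For each item, Pearson r with the total of the remaining items.
            items: (n_observers, k_items) array of scale scores."""
            total = items.sum(axis=1)
            r = np.empty(items.shape[1])
            for j in range(items.shape[1]):
                rest = total - items[:, j]                # exclude the item itself
                r[j] = np.corrcoef(items[:, j], rest)[0, 1]
            return r

        rng = np.random.default_rng(2)
        latent = rng.normal(size=200)                     # one underlying quality factor
        scale = latent[:, None] + 0.8 * rng.normal(size=(200, 24))  # 24 items
        print(corrected_item_total(scale).round(2))       # items with low r would be dropped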

  16. Psychiatric diagnostic dilemmas in the medical setting.

    PubMed

    Strain, James J

    2005-09-01

    To review the problems posed for doctors by the failure of existing taxonomies to provide a satisfactory method for deriving diagnoses in cases of physical/psychiatric comorbidity, and of relating diagnoses on multiple axes. Review of existing taxonomies and key criticisms. The author was guided in selection by his experience as a member of the working parties involved in the creation of the American Psychiatric Association's DSM-IV. The attempts of the two major taxonomies, the ICD-10 and the American Psychiatric Association's DSM-IV, to address the problem by use of glossaries and multiple axes are described, and found wanting. Novel approaches, including McHugh and Slavney's perspectives of disease, dimensions, behaviour and life story, are described and evaluated. The problem of developing valid and reliable measures of physical/psychiatric comorbidity is addressed, including a discussion of genetic factors, neurobiological variables, target markers and other pathophysiological indicators. Finally, the concept of depression as a systemic illness involving brain, mind and body is raised and its implications discussed. Taxonomies require major revision in order to provide a useful basis for communication and research about one of the most frequent presentations in the community, physical/psychiatric comorbidity.

  17. Joint Sparse Recovery With Semisupervised MUSIC

    NASA Astrophysics Data System (ADS)

    Wen, Zaidao; Hou, Biao; Jiao, Licheng

    2017-05-01

    Discrete multiple signal classification (MUSIC), with its low computational cost and mild condition requirements, has become a significant noniterative algorithm for joint sparse recovery (JSR). However, it fails in rank-defective problems caused by coherent sources or a limited number of multiple measurement vectors (MMVs). In this letter, we offer a novel perspective on this problem by interpreting JSR as a binary classification problem with respect to atoms. In this view, MUSIC essentially constructs a supervised classifier based on the labeled MMVs, so its performance depends heavily on the quality and quantity of these training samples. From this viewpoint, we develop a semisupervised MUSIC (SS-MUSIC) in the spirit of machine learning, which holds that the insufficient supervised information in the training samples can be compensated from the unlabeled atoms. Instead of constructing a classifier in a fully supervised manner, we iteratively refine a semisupervised classifier by exploiting the labeled MMVs and some reliable unlabeled atoms simultaneously. In this way, the required conditions and iterations can be greatly relaxed and reduced. Numerical experimental results demonstrate that SS-MUSIC achieves much better recovery performance than other extended MUSIC algorithms, as well as some typical greedy algorithms for JSR, in terms of iterations and recovery probability.
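
    For reference, the fully supervised baseline that SS-MUSIC extends is classical MMV MUSIC: project each atom onto the signal subspace spanned by the measurements and keep the k best-aligned atoms. A noiseless numpy sketch follows; the paper's semisupervised refinement loop is not reproduced.

        import numpy as np

        def mmv_music(A, Y, k):
            """Support estimate for joint sparse recovery.
            A: (m, n) dictionary, unit-norm columns; Y: (m, L) MMVs; k: sparsity."""
            U, s, _ = np.linalg.svd(Y, full_matrices=False)
            Us = U[:, :k]                                 # signal subspace basis
            metric = np.linalg.norm(Us.T @ A, axis=0)     # atom-subspace correlation
            return np.sort(np.argsort(metric)[-k:])

        m, n, k, L = 32, 128, 4, 16
        rng = np.random.default_rng(3)
        A = rng.standard_normal((m, n)); A /= np.linalg.norm(A, axis=0)
        support = rng.choice(n, k, replace=False)
        Y = A[:, support] @ rng.standard_normal((k, L))   # noiseless, full-row-rank MMVs
        print(np.sort(support), mmv_music(A, Y, k))       # the two should match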

  18. Exploring the feasibility of traditional image querying tasks for industrial radiographs

    NASA Astrophysics Data System (ADS)

    Bray, Iliana E.; Tsai, Stephany J.; Jimenez, Edward S.

    2015-08-01

    Although there have been great strides in object recognition with optical images (photographs), there has been comparatively little research into object recognition for X-ray radiographs. Our exploratory work contributes to this area by creating an object recognition system designed to recognize components from a related database of radiographs. Object recognition for radiographs must be approached differently than for optical images, because radiographs have much less color-based information to distinguish objects, and they exhibit transmission overlap that alters perceived object shapes. The dataset used in this work contained more than 55,000 intermixed radiographs and photographs, all in compressed JPEG form and with multiple ways of describing pixel information. For this work, a robust and efficient system is needed to combat problems presented by the properties of the X-ray imaging modality, the large size of the given database, and the quality of the images it contains. We have explored various pre-processing techniques to clean the cluttered and low-quality images in the database, and we have developed our object recognition system by combining multiple object detection and feature extraction methods. We present the preliminary results of the still-evolving hybrid object recognition system.

  19. Multidisciplinary design optimization of vehicle instrument panel based on multi-objective genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Wu, Guangqiang

    2013-03-01

    Multidisciplinary design optimization (MDO) has gradually been adopted to balance lightweight, noise, vibration and harshness (NVH), and safety performance of instrument panel (IP) structures in automotive development. Nevertheless, the plastic constitutive relation of polypropylene (PP) under different strain rates has not been taken into consideration in current reliability-based and collaborative IP MDO design. In this paper, the constitutive relation of polypropylene is studied on the basis of tensile tests at different strain rates. Impact simulations for the head and knee bolster are carried out to meet the requirements of FMVSS 201 and FMVSS 208, respectively. NVH analysis is performed to obtain the natural frequencies and corresponding mode shapes, while crashworthiness analysis is employed to examine the crash behavior of the IP structure. With lightweight, NVH, and head and knee bolster impact performance taken into account, design of experiments (DOE), response surface modeling (RSM), and collaborative optimization (CO) are applied to realize the deterministic and reliability-based optimizations, respectively. Furthermore, based on a multi-objective genetic algorithm (MOGA), the optimal Pareto sets are computed to solve the multi-objective optimization (MOO) problem. The proposed approach ensures the smoothness of the Pareto set, enhances the ability of engineers to make a comprehensive decision about multiple objectives and choose the optimal design, and improves the quality and efficiency of MDO.
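
    The Pareto set produced by a MOGA is built from dominance comparisons. As a minimal sketch of that building block only (not the paper's NSGA-style machinery), the following filter returns the non-dominated designs when all objectives are minimized; the mass/impact-severity numbers are hypothetical.

        import numpy as np

        def pareto_front(F):
            """Indices of non-dominated rows of F; all objectives are minimized."""
            n = F.shape[0]
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                if not keep[i]:
                    continue
                # j dominates i if j is <= everywhere and < somewhere
                dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
                if dominated.any():
                    keep[i] = False
            return np.where(keep)[0]

        # Toy trade-off: structural mass (kg) vs. peak head-impact severity
        F = np.array([[4.2, 78.], [3.9, 85.], [4.6, 70.], [4.0, 80.], [4.4, 90.]])
        print(pareto_front(F))                            # the last design is dominated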

  20. A Hybrid Lifetime Extended Directional Approach for WBANs

    PubMed Central

    Li, Changle; Yuan, Xiaoming; Yang, Li; Song, Yueyang

    2015-01-01

    Wireless Body Area Networks (WBANs) can provide real-time and reliable health monitoring, owing to their human-centered and sensor-interoperability properties. WBANs have become a key component of the ubiquitous eHealth (electronic health) revolution that prospers on the basis of information and communication technologies. The prime consideration in WBANs is how to maximize the network lifetime with battery-powered sensor nodes under energy constraints. Novel solutions in Medium Access Control (MAC) protocols are imperative to satisfy the particular BAN scenario and the need for excellent energy efficiency in healthcare applications. In this paper, we propose a hybrid Lifetime Extended Directional Approach (LEDA) MAC protocol based on IEEE 802.15.6 to reduce energy consumption and prolong network lifetime. The LEDA MAC protocol takes full advantage of the energy-saving superiority of directional transmission, employing multi-beam directional mode in Carrier Sense Multiple Access/Collision Avoidance (CSMA/CA) and single-beam directional mode in Time Division Multiple Access (TDMA), used alternately for data reservation and transmission according to traffic variations. Moreover, the impact of some inherent problems of directional antennas, such as deafness and the hidden terminal problem, is reduced because every node generates an individual beam according to its designated user priority. Furthermore, LEDA MAC employs a Dynamic Polled Allocation Period (DPAP) for burst data transmissions to increase network reliability and adaptability. Extensive analysis and simulation results show that the proposed LEDA MAC protocol achieves extended network lifetime with improved performance compared with IEEE 802.15.6. PMID:26556357

  1. Track Everything: Limiting Prior Knowledge in Online Multi-Object Recognition.

    PubMed

    Wong, Sebastien C; Stamatescu, Victor; Gatt, Adam; Kearney, David; Lee, Ivan; McDonnell, Mark D

    2017-10-01

    This paper addresses the problem of online tracking and classification of multiple objects in an image sequence. Our proposed solution is to first track all objects in the scene without relying on object-specific prior knowledge, which in other systems can take the form of hand-crafted features or user-based track initialization. We then classify the tracked objects with a fast-learning image classifier that is based on a shallow convolutional neural network architecture, and demonstrate that object recognition improves when this is combined with object state information from the tracking algorithm. We argue that by transferring the use of prior knowledge from the detection and tracking stages to the classification stage, we can design a robust, general-purpose object recognition system with the ability to detect and track a variety of object types. We describe our biologically inspired implementation, which adaptively learns the shape and motion of tracked objects, and apply it to the Neovision2 Tower benchmark data set, which contains multiple object types. An experimental evaluation demonstrates that our approach is competitive with state-of-the-art video object recognition systems that do make use of object-specific prior knowledge in detection and tracking, while providing additional practical advantages by virtue of its generality.

  2. Study of flow over object problems by a nodal discontinuous Galerkin-lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Wu, Jie; Shen, Meng; Liu, Chen

    2018-04-01

    Flow-over-object problems are studied by a nodal discontinuous Galerkin-lattice Boltzmann method (NDG-LBM) in this work. Different from the standard lattice Boltzmann method, the current method applies the nodal discontinuous Galerkin method to the streaming process in LBM to solve the resulting pure convection equation, in which the spatial discretization is completed on unstructured grids and a low-storage explicit Runge-Kutta scheme is used for time marching. The present method thereby overcomes the standard LBM's dependence on uniform meshes. Moreover, the collision process in the LBM is completed using the multiple-relaxation-time scheme. After validating the NDG-LBM by simulating the lid-driven cavity flow, simulations of flow over a fixed circular cylinder, a stationary airfoil and rotating-stationary cylinders are performed. Good agreement of the present results with previous results is achieved, which indicates that the current NDG-LBM is accurate and effective for flow-over-object problems.
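
    For readers unfamiliar with the collide-and-stream structure that NDG-LBM modifies, a minimal single-relaxation-time (BGK) D2Q9 solver on a uniform periodic grid follows. It is the standard scheme whose streaming step the paper replaces with a nodal DG solve on unstructured grids, not the paper's MRT method; the decaying shear wave is a stock validation case.

        import numpy as np

        # D2Q9 lattice: discrete velocities and weights
        e = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
        w = np.array([4/9] + [1/9]*4 + [1/36]*4)

        def feq(rho, u):
            """Second-order equilibrium distribution."""
            eu = np.einsum('qd,dxy->qxy', e, u)
            uu = (u**2).sum(axis=0)
            return w[:, None, None] * rho * (1 + 3*eu + 4.5*eu**2 - 1.5*uu)

        N, tau, U0 = 64, 0.8, 0.05
        y = np.arange(N)
        u = np.zeros((2, N, N))
        u[0] = U0 * np.sin(2*np.pi*y/N)[None, :]          # shear wave u_x(y)
        rho = np.ones((N, N))
        f = feq(rho, u)

        for step in range(400):
            f += -(f - feq(rho, u)) / tau                 # BGK collision
            for q in range(9):                            # streaming (periodic)
                f[q] = np.roll(f[q], shift=tuple(e[q]), axis=(0, 1))
            rho = f.sum(axis=0)                           # macroscopic moments
            u = np.einsum('qd,qxy->dxy', e, f) / rho

        nu = (tau - 0.5) / 3                              # lattice viscosity
        decay = U0 * np.exp(-nu * (2*np.pi/N)**2 * 400)
        print(round(float(u[0].max()), 4), round(decay, 4))  # simulated vs. analytic amplitude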

  3. Removing Barriers for Effective Deployment of Intermittent Renewable Generation

    NASA Astrophysics Data System (ADS)

    Arabali, Amirsaman

    The stochastic nature of intermittent renewable resources is the main barrier to effective integration of renewable generation. This problem can be studied from feeder-scale and grid-scale perspectives. Two new stochastic methods are proposed to meet the feeder-scale controllable load with a hybrid renewable generation (including wind and PV) and energy storage system. For the first method, an optimization problem is developed whose objective function is the cost of the hybrid system, including the cost of renewable generation and storage, subject to constraints on energy storage and shifted load. A smart-grid strategy is developed to shift the load and match the renewable energy generation and controllable load. Minimizing the cost function guarantees minimum PV and wind generation installation, as well as storage capacity selection for supplying the controllable load. A confidence coefficient is allocated to each stochastic constraint, which shows to what degree the constraint is satisfied. In the second method, a stochastic framework is developed for optimal sizing and reliability analysis of a hybrid power system including renewable resources (PV and wind) and an energy storage system. The hybrid power system is optimally sized to satisfy the controllable load with a specified reliability level. A load-shifting strategy is added to provide more flexibility for the system and decrease the installation cost. Load-shifting strategies and their potential impacts on the hybrid system reliability/cost analysis are evaluated through different scenarios. Using a compromise-solution method, the best compromise between reliability and cost is realized for the hybrid system. For the second problem, a grid-scale stochastic framework is developed to examine the storage application and its optimal placement for the social cost and transmission congestion relief of wind integration. Storage systems are optimally placed and adequately sized to minimize the sum of operation and congestion costs over a scheduling period. A technical assessment framework is developed to enhance the efficiency of wind integration and evaluate the economics of storage technologies and conventional gas-fired alternatives. The proposed method is used to carry out a cost-benefit analysis for the IEEE 24-bus system and determine the most economical technology. In order to mitigate the financial and technical concerns of renewable energy integration into the power system, a stochastic framework is proposed for transmission grid reinforcement studies in a power system with wind generation. A multi-stage multi-objective transmission network expansion planning (TNEP) methodology is developed which considers the investment cost, absorption of private investment and reliability of the system as the objective functions. A Non-dominated Sorting Genetic Algorithm (NSGA-II) optimization approach is used in combination with a probabilistic optimal power flow (POPF) to determine the Pareto-optimal solutions considering the power system uncertainties. Using a compromise-solution method, the best final plan is then realized based on the decision maker's preferences. The proposed methodology is applied to the IEEE 24-bus Reliability Test System (RTS) to evaluate the feasibility and practicality of the developed planning strategy.

  4. Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)

    DTIC Science & Technology

    2016-05-01

    ... case of cognitive radio applications. Modulation classification is part of a broader problem known as blind or uncooperative demodulation, the goal of ...

  5. Computational Methods for Sparse Solution of Linear Inverse Problems

    DTIC Science & Technology

    2009-03-01

    ... of this approach is that the algorithms take advantage of fast matrix–vector multiplications. An implementation is available as pdco and SolveBP in ... (M. A. Saunders, "PDCO: primal-dual interior-point method for convex objectives," Systems Optimization Laboratory, Stanford University, Tech. Rep.)

  6. Item Analysis in Introductory Economics Testing.

    ERIC Educational Resources Information Center

    Tinari, Frank D.

    1979-01-01

    Computerized analysis of multiple choice test items is explained. Examples of item analysis applications in the introductory economics course are discussed with respect to three objectives: to evaluate learning; to improve test items; and to help improve classroom instruction. Problems, costs and benefits of the procedures are identified. (JMD)

  7. The fertility quality of life (FertiQoL) tool: development and general psychometric properties†

    PubMed Central

    Boivin, Jacky; Takefman, Janet; Braverman, Andrea

    2011-01-01

    BACKGROUND To develop the first international instrument to measure fertility quality of life (FertiQoL) in men and women experiencing fertility problems, to evaluate the preliminary psychometric properties of this new tool and to translate FertiQoL into multiple languages. METHOD We conducted a survey, both online and in fertility clinics in the USA, Australia/New Zealand, Canada and the UK. A total of 1414 people with fertility problems participated. The main outcome measure was the FertiQoL tool. RESULTS FertiQoL consists of 36 items that assess core (24 items) and treatment-related (10 items) quality of life (QoL) and overall life and physical health (2 items). Cronbach reliability statistics for the Core and Treatment FertiQoL (and subscales) were satisfactory, ranging from 0.72 to 0.92. Sensitivity analyses showed that FertiQoL detected expected relations between QoL and gender, parity and support-seeking. FertiQoL was translated into 20 languages by the same translation team, with each translation verified by local bilingual fertility experts. CONCLUSIONS FertiQoL is a reliable measure of the impact of fertility problems and their treatment on QoL. Future research should establish its use in cross-cultural research and clinical work. PMID:21665875

  8. The James Supportive Care Screening: integrating science and practice to meet the NCCN guidelines for distress management at a Comprehensive Cancer Center.

    PubMed

    Wells-Di Gregorio, Sharla; Porensky, Emily K; Minotti, Matthew; Brown, Susan; Snapp, Janet; Taylor, Robert M; Adolph, Michael D; Everett, Sherman; Lowther, Kenneth; Callahan, Kelly; Streva, Devita; Heinke, Vicki; Leno, Debra; Flower, Courtney; McVey, Anne; Andersen, Barbara Lee

    2013-09-01

    Selecting a measure for oncology distress screening can be challenging. The measure must be brief, but comprehensive, capturing patients' most distressing concerns. The measure must provide meaningful coverage of multiple domains, assess symptom and problem-related distress, and ideally be suited for both clinical and research purposes. From March 2006 to August 2012, the James Supportive Care Screening (SCS) was developed and validated in three phases including content validation, factor analysis, and measure validation. Exploratory factor analyses were completed with 596 oncology patients followed by a confirmatory factor analysis with 477 patients. Six factors were identified and confirmed including (i) emotional concerns; (ii) physical symptoms; (iii) social/practical problems; (iv) spiritual problems; (v) cognitive concerns; and (vi) healthcare decision making/communication issues. Subscale evaluation reveals good to excellent internal consistency, test-retest reliability, and convergent, divergent, and predictive validity. Specificity of individual items was 0.90 and 0.87, respectively, for identifying patients with DSM-IV-TR diagnoses of major depression and generalized anxiety disorder. Results support use of the James SCS to quickly detect the most frequent and distressing symptoms and concerns of cancer patients. The James SCS is an efficient, reliable, and valid clinical and research outcomes measure. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Bee Inspired Novel Optimization Algorithm and Mathematical Model for Effective and Efficient Route Planning in Railway System

    PubMed Central

    Leong, Kah Huo; Abdul-Rahman, Hamzah; Wang, Chen; Onn, Chiu Chuen; Loo, Siaw-Chuing

    2016-01-01

    Railway and metro transport systems (RS) are becoming one of the popular choices of transportation, especially among people who live in urban areas. Urbanization and increasing population, driven by rapid economic development in many cities, are leading to greater demand for urban rail transit. Although route planning in RS is a popular variant of the Traveling Salesman Problem (TSP), a universal formula or technique to solve the problem has yet to be found. This paper aims to develop an optimization algorithm for optimum route selection to multiple destinations in RS before returning to the starting point. Bee foraging behaviour is examined to generate a reliable algorithm for the railway TSP. The algorithm is then verified by comparing its results with the exact solutions in 10 test cases, and a numerical case study is designed to demonstrate the application with a large sample. It is shown to be efficient and effective in railway route planning, as the tour can be completed within a certain period of time using minimal resources. The findings further support the reliability of the algorithm and its capability to solve problems of different complexity. This algorithm can be used as a method to assist business practitioners in making better route-planning decisions. PMID:27930659
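
    The abstract does not specify the bee-foraging operators, so the sketch below shows only the generic baseline such algorithms are verified against: a nearest-neighbor tour improved by 2-opt, for a closed tour that returns to the starting point. The ten station coordinates are hypothetical.

        import numpy as np

        def tour_length(order, D):
            """Length of the closed tour visiting cities in the given order."""
            return D[order, np.roll(order, -1)].sum()

        def nearest_neighbor(D):
            n = len(D); order = [0]; left = set(range(1, n))
            while left:
                nxt = min(left, key=lambda j: D[order[-1], j])
                order.append(nxt); left.remove(nxt)
            return np.array(order)

        def two_opt(order, D):
            """Repeatedly reverse segments while doing so shortens the tour."""
            improved = True
            while improved:
                improved = False
                for i in range(1, len(order) - 1):
                    for j in range(i + 1, len(order)):
                        new = np.concatenate([order[:i], order[i:j+1][::-1], order[j+1:]])
                        if tour_length(new, D) < tour_length(order, D):
                            order, improved = new, True
            return order

        rng = np.random.default_rng(4)
        pts = rng.random((10, 2))                         # 10 stations in the unit square
        D = np.linalg.norm(pts[:, None] - pts[None], axis=-1)
        tour = two_opt(nearest_neighbor(D), D)
        print(tour, round(float(tour_length(tour, D)), 3))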

  11. Learning to rank image tags with limited training examples.

    PubMed

    Songhe Feng; Zheyun Feng; Rong Jin

    2015-04-01

    With an increasing number of images available on social media, image annotation has emerged as an important research topic due to its application in image matching and retrieval. Most studies cast image annotation as a multilabel classification problem. The main shortcoming of this approach is that it requires a large number of training images with clean and complete annotations in order to learn a reliable model for tag prediction. We address this limitation by developing a novel approach that combines the strength of tag ranking with the power of matrix recovery. Instead of having to make a binary decision for each tag, our approach ranks tags in descending order of their relevance to the given image, significantly simplifying the problem. In addition, the proposed method aggregates the prediction models for different tags into a matrix, and casts tag ranking as a matrix recovery problem. It introduces the matrix trace norm to explicitly control the model complexity, so that a reliable prediction model can be learned for tag ranking even when the tag space is large and the number of training images is limited. Experiments on multiple well-known image data sets demonstrate the effectiveness of the proposed framework for tag ranking compared with the state-of-the-art approaches for image annotation and tag ranking.
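
    Trace-norm (nuclear-norm) control of the kind described above is typically implemented with singular value thresholding inside a proximal-gradient loop. A sketch under that assumption follows, with hypothetical feature and tag dimensions; it is not the paper's exact formulation.

        import numpy as np

        def svt(W, tau):
            """Proximal operator of tau*||W||_* (singular value soft-thresholding)."""
            U, s, Vt = np.linalg.svd(W, full_matrices=False)
            return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

        # Proximal gradient for: min_W 0.5*||X W - Y||_F^2 + lam*||W||_*
        rng = np.random.default_rng(5)
        X = rng.standard_normal((100, 20))                # 100 images, 20 features
        W_true = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 50))  # rank-3 tag model
        Y = X @ W_true + 0.1 * rng.standard_normal((100, 50))  # noisy scores over 50 tags
        lam, W = 30.0, np.zeros((20, 50))
        step = 1.0 / np.linalg.norm(X, 2) ** 2            # 1 / Lipschitz constant
        for _ in range(200):
            W = svt(W - step * X.T @ (X @ W - Y), step * lam)
        print(np.linalg.matrix_rank(W))                   # trace norm recovers the low rank (3)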

  12. Parana Basin Structure from Multi-Objective Inversion of Surface Wave and Receiver Function by Competent Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    An, M.; Assumpcao, M.

    2003-12-01

    The joint inversion of receiver function and surface wave data is an effective way to diminish the influence of the strong tradeoff among parameters and of the differing sensitivities to the model parameters in their respective inversions, but the inversion problem becomes more complex. Multi-objective problems can be much more complicated than single-objective inversions in model selection and optimization. When conflicting objectives are involved, models can be ordered only partially; in this case, Pareto-optimal preference should be used to select solutions. On the other hand, an inversion that yields only a few optimal solutions cannot deal properly with the strong tradeoff between parameters, the uncertainties in the observations, the geophysical complexities, and even the incompetency of the inversion technique. The effective way is to retrieve the geophysical information statistically from many acceptable solutions, which requires more competent global algorithms. Recently proposed competent genetic algorithms are far superior to the conventional genetic algorithm and can solve hard problems quickly, reliably and accurately. In this work we used one such competent genetic algorithm, the Bayesian Optimization Algorithm, as the main inverse procedure. This algorithm uses Bayesian networks to draw out inherited information and can use Pareto-optimal preference in the inversion. With this algorithm, the lithospheric structure of the Paraná basin is inverted to fit both the observations of inter-station surface wave dispersion and receiver function.

  13. Inter-rater Reliability of Sustained Aberrant Movement Patterns as a Clinical Assessment of Muscular Fatigue

    PubMed Central

    Aerts, Frank; Carrier, Kathy; Alwood, Becky

    2016-01-01

    Background: The assessment of the clinical manifestation of muscle fatigue is an effective procedure in establishing therapeutic exercise dose. Few studies have evaluated physical therapist reliability in establishing muscle fatigue through detection of changes in the quality of movement patterns in a live setting. Objective: The purpose of this study is to evaluate the inter-rater reliability of physical therapists' ability to detect altered movement patterns due to muscle fatigue. Design: A reliability study in a live setting with multiple raters. Participants: Forty-four healthy individuals (ages 19-35) were evaluated by six physical therapists in a live setting. Methods: Participants were evaluated by physical therapists for altered movement patterns during resisted shoulder rotation. Each participant completed a total of four tests: right shoulder internal rotation, right shoulder external rotation, left shoulder internal rotation and left shoulder external rotation. Results: For all tests combined, the inter-rater reliability for a single rater, ICC(2,1), was .65 (95% CI .60-.71). This corresponds to moderate inter-rater reliability between physical therapists. Limitations: The results of this study apply only to healthy participants and therefore cannot be generalized to a symptomatic population. Conclusion: Moderate inter-rater reliability was found between physical therapists in establishing muscle fatigue through the observation of sustained altered movement patterns during dynamic resistive shoulder internal and external rotation. PMID:27347241

  14. Research requirements to improve reliability of civil helicopters

    NASA Technical Reports Server (NTRS)

    Dougherty, J. J., III; Barrett, L. D.

    1978-01-01

    The major reliability problems of the civil helicopter fleet, as reported by helicopter operational and maintenance personnel, are documented. An assessment of each problem is made to determine whether reliability can be improved by application of present technology or whether additional research and development are required. The reliability impact is measured in three ways: (1) the relative frequency of each problem in the fleet; (2) the relative on-aircraft manhours to repair associated with each fleet problem; and (3) the relative cost of repair materials or replacement parts associated with each fleet problem. The data reviewed covered the period 1971 through 1976 and included only turbine-engine aircraft.

  15. "Unreliability as a threat to understanding psychopathology: The cautionary tale of attentional bias": Correction to Rodebaugh et al. (2016).

    PubMed

    2016-10-01

    Reports an error in "Unreliability as a threat to understanding psychopathology: The cautionary tale of attentional bias" by Thomas L. Rodebaugh, Rachel B. Scullin, Julia K. Langer, David J. Dixon, Jonathan D. Huppert, Amit Bernstein, Ariel Zvielli and Eric J. Lenze (Journal of Abnormal Psychology, 2016[Aug], Vol 125[6], 840-851). There was an error in the Author Note concerning the support of the MacBrain Face Stimulus Set. The correct statement is provided. (The following abstract of the original article appeared in record 2016-30117-001.) The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically oriented measures can only be certain if such measurements are reliable. Two pillars of the National Institute of Mental Health's portfolio, the Research Domain Criteria (RDoC) initiative for psychopathology and the target engagement initiative in clinical trials, cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time. We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Validity and reliability of the International Cooperative Ataxia Rating Scale (ICARS) and the Scale for the Assessment and Rating of Ataxia (SARA) in multiple sclerosis patients with ataxia.

    PubMed

    Salcı, Yeliz; Fil, Ayla; Keklicek, Hilal; Çetin, Barış; Armutlu, Kadriye; Dolgun, Anıl; Tuncer, Aslı; Karabudak, Rana

    2017-11-01

    Ataxia is an extremely common problem in multiple sclerosis (MS) patients. Thus, appropriate scales are required for detailed assessment of this issue. The aim of our study was to investigate the reliability and validity of the Turkish version of the International Cooperative Ataxia Rating Scale (ICARS) and the Scale for the Assessment and Rating of Ataxia (SARA), which are widely used in ataxia evaluation in the context of other cerebellar diseases. This cross-sectional study included 80 MS patients with a Kurtzke cerebellar functional system score (C-FSS) greater than zero and slight pyramidal involvement. The Expanded Disability Status Scale (EDSS), C-FSS, and Berg Balance Scale (BBS) were administered. SARA and ICARS were assessed on first admission by two physical therapists. Seven days later, second assessments were repeated in the same way for reliability. Intra-rater and inter-rater reliability were found to be high for both ICARS and SARA (p < 0.001). The Cronbach's α coefficients were 0.922 and 0.921 for SARA (reviewer 1 and reviewer 2, respectively) and 0.952 and 0.952 for ICARS (reviewer 1 and reviewer 2, respectively). There were no floor or ceiling effects for either scale except for item 17 of ICARS (p = 0.055). The EDSS total score had significant correlations with both SARA and ICARS (rho: 0.557 and 0.707, respectively). C-FSS had a moderate correlation with SARA and a high correlation with ICARS (rho: 0.469 and 0.653, respectively). BBS had no significant correlation with SARA or ICARS (rho: -0.048 and -0.008, respectively). According to the area under the curve (AUC) value, ICARS is the best scale to discriminate mild from moderate ataxia (AUC: 0.875). Factor analyses of ICARS showed that the rating results were determined by five different factors that did not coincide with the ICARS sub-scales. Our study demonstrated that ICARS and SARA are both reliable in MS patients with ataxia. Although ICARS has some structural problems, it seems to be more valid given its high correlations with EDSS and C-FSS. SARA can also be preferred as a brief assessment. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Multi-model ensemble hydrologic prediction using Bayesian model averaging

    NASA Astrophysics Data System (ADS)

    Duan, Qingyun; Ajami, Newsha K.; Gao, Xiaogang; Sorooshian, Soroosh

    2007-05-01

    Multi-model ensemble strategy is a means to exploit the diversity of skillful predictions from different models. This paper studies the use of the Bayesian model averaging (BMA) scheme to develop more skillful and reliable probabilistic hydrologic predictions from multiple competing predictions made by several hydrologic models. BMA is a statistical procedure that infers consensus predictions by weighing individual predictions based on their probabilistic likelihood measures, with the better performing predictions receiving higher weights than the worse performing ones. Furthermore, BMA provides a more reliable description of the total predictive uncertainty than the original ensemble, leading to a sharper and better calibrated probability density function (PDF) for the probabilistic predictions. In this study, a nine-member ensemble of hydrologic predictions was used to test and evaluate the BMA scheme. This ensemble was generated by calibrating three different hydrologic models using three distinct objective functions. These objective functions were chosen in a way that forces the models to capture certain aspects of the hydrograph well (e.g., peaks, mid-flows and low flows). Two sets of numerical experiments were carried out on three test basins in the US to explore the best way of using the BMA scheme. In the first set, a single set of BMA weights was computed to obtain BMA predictions, while the second set employed multiple sets of weights, with distinct sets corresponding to different flow intervals. In both sets, the streamflow values were transformed using the Box-Cox transformation to ensure that the probability distribution of the prediction errors is approximately Gaussian. A split-sample approach was used to obtain and validate the BMA predictions. The test results showed that the BMA scheme has the advantage of generating more skillful and equally reliable probabilistic predictions than the original ensemble. The performance of the expected BMA predictions in terms of daily root mean square error (DRMS) and daily absolute mean error (DABS) is generally superior to that of the best individual predictions. Furthermore, the BMA predictions employing multiple sets of weights are generally better than those using a single set of weights.
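
    A minimal version of the BMA weight estimation can be written as an EM iteration over Gaussian member densities. The sketch below uses a single shared error variance, a simplification of the usual per-member scheme, with a synthetic three-member ensemble.

        import numpy as np

        def bma_weights(preds, obs, iters=200):
            """EM for Gaussian BMA with a common variance (simplified scheme).
            preds: (T, K) member predictions; obs: (T,) observations."""
            T, K = preds.shape
            w = np.full(K, 1.0 / K)
            var = np.var(obs - preds.mean(axis=1))
            for _ in range(iters):
                # E-step: responsibility of member k for observation t
                dens = np.exp(-0.5 * (obs[:, None] - preds) ** 2 / var) / np.sqrt(2 * np.pi * var)
                z = w * dens
                z /= z.sum(axis=1, keepdims=True)
                # M-step: update weights and the common error variance
                w = z.mean(axis=0)
                var = (z * (obs[:, None] - preds) ** 2).sum() / T
            return w, var

        rng = np.random.default_rng(6)
        truth = rng.normal(size=500)
        members = truth[:, None] + rng.normal(0, [0.3, 0.6, 1.2], size=(500, 3))
        w, var = bma_weights(members, truth)
        print(w.round(3))      # the sharpest member receives the largest weight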

  18. Solving the problem of comparing whole bacterial genomes across different sequencing platforms.

    PubMed

    Kaas, Rolf S; Leekitcharoenphon, Pimlapas; Aarestrup, Frank M; Lund, Ole

    2014-01-01

    Whole genome sequencing (WGS) shows great potential for real-time monitoring and identification of infectious disease outbreaks. However, rapid and reliable comparison of data generated in multiple laboratories and using multiple technologies is essential. So far, studies have focused on using one technology, because each technology has a systematic bias that makes integration of data generated from different platforms difficult. We developed two different procedures for identifying variable sites and inferring phylogenies in WGS data across multiple platforms. The methods were evaluated on three bacterial data sets, each sequenced on three different platforms (Illumina, 454, Ion Torrent). We show that the methods are able to overcome the systematic biases caused by the sequencers and infer the expected phylogenies. We conclude that these new procedures succeed because every informative site included in the analysis is validated. The procedures are available as web tools.
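
    The core of such cross-platform procedures is the validation of informative sites. As a toy sketch of that idea only (the published pipelines also validate depth and base quality per platform), the function below keeps positions where every sample has an unambiguous base and at least two bases differ; the three isolates are hypothetical.

        def variable_sites(alignment):
            """alignment: dict of sample -> equal-length sequence (e.g. mapped consensus).
            Returns positions where all samples have an unambiguous base and they differ."""
            seqs = list(alignment.values())
            sites = []
            for i in range(len(seqs[0])):
                column = {s[i] for s in seqs}
                if column <= set("ACGT") and len(column) > 1:  # validated in every sample
                    sites.append(i)
            return sites

        genomes = {
            "isolate_A": "ACGTACGTAC",
            "isolate_B": "ACGTACGAAC",
            "isolate_C": "ACTTACGANC",   # N marks a low-confidence call
        }
        print(variable_sites(genomes))   # [2, 7]; position 8 is excluded by the N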

  19. Multiple-scanning-probe tunneling microscope with nanoscale positional recognition function.

    PubMed

    Higuchi, Seiji; Kuramochi, Hiromi; Laurent, Olivier; Komatsubara, Takashi; Machida, Shinichi; Aono, Masakazu; Obori, Kenichi; Nakayama, Tomonobu

    2010-07-01

    Over the past decade, multiple-scanning-probe microscope systems with independently controlled probes have been developed for nanoscale electrical measurements. We developed a quadruple-scanning-probe tunneling microscope (QSPTM) that can determine and control probe positions through scanning-probe imaging. The difficulty of operating multiple probes with submicrometer precision increases drastically with the number of probes. To solve problems such as determining the relative positions of the probes and avoiding contact between them, we adopted sample-scanning methods to obtain four images simultaneously and developed an original control system for QSPTM operation with a function of automatic positional recognition. These improvements make the QSPTM a more practical and useful instrument, since four images can now be reliably produced and, consequently, positioning of the four probes becomes easier owing to the reduced chance of accidental contact between the probes.

  20. Three-dimensional sensing methodology combining stereo vision and phase-measuring profilometry based on dynamic programming

    NASA Astrophysics Data System (ADS)

    Lee, Hyunki; Kim, Min Young; Moon, Jeon Il

    2017-12-01

    Phase-measuring profilometry (PMP) and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from an inherent limitation known as the correspondence, or 2π-ambiguity, problem. Although a sensing method that combines well-known stereo vision with the PMP technique has been developed to overcome this problem, it still requires definite improvement in sensing speed and measurement accuracy. We propose a dynamic programming-based stereo PMP method to acquire more reliable depth information in a relatively short time. The proposed method efficiently fuses information from the two stereo sensors in terms of phase and intensity simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters on measurement performance, and to verify the method's efficiency, accuracy, and sensing speed, a series of experimental tests was performed with various objects and sensor configurations.
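
    The dynamic-programming step can be sketched for a single scanline: a matching cost per (pixel, disparity) plus a smoothness penalty between neighboring pixels, minimized by a Viterbi-style recursion. The version below uses an intensity-only cost (a comment marks where a stereo-PMP system would add the phase term), with a synthetic constant-disparity pair; it is a generic illustration, not the paper's cost function.

        import numpy as np

        def dp_scanline(left, right, max_d=16, lam=0.5):
            """Disparity for one scanline pair by dynamic programming.
            left, right: 1D intensity arrays; smoothness penalty lam*|d - d'|."""
            n = len(left)
            d_range = np.arange(max_d)
            # Matching cost: intensity difference only; a stereo-PMP system would
            # add a phase-difference term to this same cost matrix.
            x = np.arange(n)[:, None]
            xr = np.clip(x - d_range[None, :], 0, n - 1)
            C = np.abs(left[:, None] - right[xr])
            D = np.zeros_like(C)
            D[0] = C[0]
            for i in range(1, n):                         # forward (Viterbi) pass
                trans = D[i - 1][:, None] + lam * np.abs(d_range[:, None] - d_range[None, :])
                D[i] = C[i] + trans.min(axis=0)
            disp = np.empty(n, dtype=int)                 # backtrack the optimal path
            disp[-1] = D[-1].argmin()
            for i in range(n - 2, -1, -1):
                disp[i] = (D[i] + lam * np.abs(d_range - disp[i + 1])).argmin()
            return disp

        rng = np.random.default_rng(7)
        right_line = rng.random(128)
        left_line = np.roll(right_line, 5)                # constant true disparity of 5
        print(np.bincount(dp_scanline(left_line, right_line)).argmax())  # ~5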
