Sample records for multi-method case study

  1. Using Case Study Multi-Methods to Investigate Close(r) Collaboration: Course-Based Tutoring and the Directive/Nondirective Instructional Continuum

    ERIC Educational Resources Information Center

    Corbett, Steven J.

    2011-01-01

    This essay presents case studies of "course-based tutoring" (CBT) and one-to-one tutorials in two sections of developmental first-year composition (FYC) at a large West Coast research university. The author's study uses a combination of rhetorical and discourse analyses and ethnographic and case study multi-methods to investigate both…

  2. A Three-Dimensional Target Depth-Resolution Method with a Single-Vector Sensor

    PubMed Central

    Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin

    2018-01-01

    This paper mainly studies and verifies the target number category-resolution method in multi-target cases and the target depth-resolution method of aerial targets. Firstly, target depth resolution is performed by using the sign distribution of the reactive component of the vertical complex acoustic intensity; the target category and the number resolution in multi-target cases is realized with a combination of the bearing-time recording information; and the corresponding simulation verification is carried out. The algorithm proposed in this paper can distinguish between the single-target multi-line spectrum case and the multi-target multi-line spectrum case. This paper presents an improved azimuth-estimation method for multi-target cases, which makes the estimation results more accurate. Using the Monte Carlo simulation, the feasibility of the proposed target number and category-resolution algorithm in multi-target cases is verified. In addition, by studying the field characteristics of the aerial and surface targets, the simulation results verify that there is only amplitude difference between the aerial target field and the surface target field under the same environmental parameters, and an aerial target can be treated as a special case of a surface target; the aerial target category resolution can then be realized based on the sign distribution of the reactive component of the vertical acoustic intensity so as to realize three-dimensional target depth resolution. By processing data from a sea experiment, the feasibility of the proposed aerial target three-dimensional depth-resolution algorithm is verified. PMID:29649173
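
    A common definition of the quantity this abstract relies on, the vertical complex acoustic intensity of a single vector sensor, is sketched below in LaTeX. The symbols are assumed here (P for the pressure spectrum, V_z for the vertical particle-velocity spectrum) and are not taken from the paper; the depth decision described above uses the sign of the reactive (imaginary) part.

    ```latex
    \[
      I_z(f) = \tfrac{1}{2}\, P(f)\, V_z^{*}(f), \qquad
      I_z^{\mathrm{act}}(f) = \operatorname{Re}\{ I_z(f) \}, \qquad
      I_z^{\mathrm{rea}}(f) = \operatorname{Im}\{ I_z(f) \}.
    \]
    ```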

  3. A Three-Dimensional Target Depth-Resolution Method with a Single-Vector Sensor.

    PubMed

    Zhao, Anbang; Bi, Xuejie; Hui, Juan; Zeng, Caigao; Ma, Lin

    2018-04-12

    This paper mainly studies and verifies the target number category-resolution method in multi-target cases and the target depth-resolution method of aerial targets. Firstly, target depth resolution is performed by using the sign distribution of the reactive component of the vertical complex acoustic intensity; the target category and the number resolution in multi-target cases is realized with a combination of the bearing-time recording information; and the corresponding simulation verification is carried out. The algorithm proposed in this paper can distinguish between the single-target multi-line spectrum case and the multi-target multi-line spectrum case. This paper presents an improved azimuth-estimation method for multi-target cases, which makes the estimation results more accurate. Using the Monte Carlo simulation, the feasibility of the proposed target number and category-resolution algorithm in multi-target cases is verified. In addition, by studying the field characteristics of the aerial and surface targets, the simulation results verify that there is only amplitude difference between the aerial target field and the surface target field under the same environmental parameters, and an aerial target can be treated as a special case of a surface target; the aerial target category resolution can then be realized based on the sign distribution of the reactive component of the vertical acoustic intensity so as to realize three-dimensional target depth resolution. By processing data from a sea experiment, the feasibility of the proposed aerial target three-dimensional depth-resolution algorithm is verified.

  4. A Homogenization Approach for Design and Simulation of Blast Resistant Composites

    NASA Astrophysics Data System (ADS)

    Sheyka, Michael

    Structural composites have been used in aerospace and structural engineering due to their high strength-to-weight ratio, and composite laminates have been used successfully and extensively in blast mitigation. This dissertation examines the use of the homogenization approach to design and simulate blast-resistant composites. Three case studies are performed to examine the usefulness of different methods that may be used in designing and optimizing composite plates for blast resistance. The first case study utilizes a single-degree-of-freedom system to simulate the blast together with a reliability-based approach; it examines homogeneous plates, and the optimal stacking sequence and plate thicknesses are determined. The second and third case studies use the homogenization method to calculate the properties of a composite unit cell made of two different materials, and the methods are integrated with dynamic simulation environments and advanced optimization algorithms. The second case study is 2-D and uses an implicit blast simulation, while the third case study is 3-D and simulates the blast using an explicit blast method. Both case studies 2 and 3 rely on multi-objective genetic algorithms for the optimization process, and Pareto optimal solutions are determined for both. Case study 3 is an integrative method for determining the optimal stacking sequence, microstructure and plate thicknesses. The validity of the different methods, such as homogenization, reliability analysis, explicit blast modeling and multi-objective genetic algorithms, is discussed. A possible extension of the methods to include strain-rate effects and parallel computation is also examined.

  5. Generalizing DTW to the multi-dimensional case requires an adaptive approach

    PubMed Central

    Hu, Bing; Jin, Hongxia; Wang, Jun; Keogh, Eamonn

    2017-01-01

    In recent years Dynamic Time Warping (DTW) has emerged as the distance measure of choice for virtually all time series data mining applications. For example, virtually all applications that process data from wearable devices use DTW as a core sub-routine. This is the result of significant progress in improving DTW’s efficiency, together with multiple empirical studies showing that DTW-based classifiers at least equal (and generally surpass) the accuracy of all their rivals across dozens of datasets. Thus far, most of the research has considered only the one-dimensional case, with practitioners generalizing to the multi-dimensional case in one of two ways, dependent or independent warping. In general, it appears the community believes either that the two ways are equivalent, or that the choice is irrelevant. In this work, we show that this is not the case. The two most commonly used multi-dimensional DTW methods can produce different classifications, and neither one dominates over the other. This seems to suggest that one should learn the best method for a particular application. However, we will show that this is not necessary; a simple, principled rule can be used on a case-by-case basis to predict which of the two methods we should trust at the time of classification. Our method allows us to ensure that classification results are at least as accurate as the better of the two rival methods, and, in many cases, our method is significantly more accurate. We demonstrate our ideas with the most extensive set of multi-dimensional time series classification experiments ever attempted. PMID:29104448
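
    The distinction between dependent and independent warping discussed above is easy to make concrete. The following is a minimal sketch, not the authors' code, assuming numpy arrays of shape (length, dimensions); DTW_D shares one warping path across dimensions, while DTW_I warps each dimension separately and sums the distances.

    ```python
    # Minimal sketch (not the authors' code): dependent vs. independent DTW for
    # multi-dimensional series, assuming numpy arrays of shape (length, dims).
    import numpy as np

    def dtw(cost):
        """Classic O(n*m) dynamic-programming alignment over a pairwise cost matrix."""
        n, m = cost.shape
        acc = np.full((n + 1, m + 1), np.inf)
        acc[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                acc[i, j] = cost[i - 1, j - 1] + min(acc[i - 1, j],
                                                     acc[i, j - 1],
                                                     acc[i - 1, j - 1])
        return acc[n, m]

    def dtw_dependent(x, y):
        # DTW_D: one warping path shared by all dimensions; local cost is the
        # squared Euclidean distance across dimensions.
        return dtw(((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=2))

    def dtw_independent(x, y):
        # DTW_I: each dimension is warped on its own and the distances are summed.
        return sum(dtw((x[:, d, None] - y[None, :, d]) ** 2) for d in range(x.shape[1]))

    rng = np.random.default_rng(0)
    x, y = rng.normal(size=(30, 2)), rng.normal(size=(40, 2))
    print(dtw_dependent(x, y), dtw_independent(x, y))   # the two values generally differ
    ```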

  6. Non-monetary valuation using Multi-Criteria Decision Analysis: Sensitivity of additive aggregation methods to scaling and compensation assumptions

    EPA Science Inventory

    Analytical methods for Multi-Criteria Decision Analysis (MCDA) support the non-monetary valuation of ecosystem services for environmental decision making. Many published case studies transform ecosystem service outcomes into a common metric and aggregate the outcomes to set land ...

  7. Approximate static condensation algorithm for solving multi-material diffusion problems on meshes non-aligned with material interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kikinzon, Evgeny; Kuznetsov, Yuri; Lipnikov, Konstantin

    In this study, we describe a new algorithm for solving the multi-material diffusion problem when material interfaces are not aligned with the mesh. In this case, interface reconstruction methods are used to construct an approximate representation of the interfaces between materials. They produce so-called multi-material cells, in which materials are represented by material polygons that each contain only one material. The reconstructed interface is not continuous between cells. Finally, we suggest a new method for solving multi-material diffusion problems on such meshes and compare its performance with known homogenization methods.

  8. Approximate static condensation algorithm for solving multi-material diffusion problems on meshes non-aligned with material interfaces

    DOE PAGES

    Kikinzon, Evgeny; Kuznetsov, Yuri; Lipnikov, Konstantin; ...

    2017-07-08

    In this study, we describe a new algorithm for solving the multi-material diffusion problem when material interfaces are not aligned with the mesh. In this case, interface reconstruction methods are used to construct an approximate representation of the interfaces between materials. They produce so-called multi-material cells, in which materials are represented by material polygons that each contain only one material. The reconstructed interface is not continuous between cells. Finally, we suggest a new method for solving multi-material diffusion problems on such meshes and compare its performance with known homogenization methods.

  9. Perceptions of Organizational Culture of a Multi-Campus Community College District: Mixed Methods in Concert

    ERIC Educational Resources Information Center

    Kuster Dale, Kimberly

    2012-01-01

    This concurrent, mixed-methods case study analyzed perceptions of current and preferred organizational culture within a rural, multi-campus community college district. This phenomenon was examined by analyzing and comparing data collected by surveying all full-time employees utilizing the Organizational Culture Assessment Instrument (OCAI) and…

  10. Structural damage detection-oriented multi-type sensor placement with multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Lin, Jian-Fu; Xu, You-Lin; Law, Siu-Seong

    2018-05-01

    A structural damage detection-oriented multi-type sensor placement method with multi-objective optimization is developed in this study. The multi-type response covariance sensitivity-based damage detection method is first introduced. Two objective functions for optimal sensor placement are then introduced in terms of the response covariance sensitivity and the response independence. The multi-objective optimization problem is formed by using the two objective functions, and the non-dominated sorting genetic algorithm (NSGA)-II is adopted to find the solution for the optimal multi-type sensor placement to achieve the best structural damage detection. The proposed method is finally applied to a nine-bay three-dimensional frame structure. Numerical results show that the optimal multi-type sensor placement determined by the proposed method can avoid redundant sensors and provide satisfactory results for structural damage detection. Restricting the number of each type of sensor in the optimization reduces the search space and makes the proposed method more effective. Moreover, the case study demonstrates how to select the most suitable sensor placement from the Pareto solutions via the utility function and the knee-point method.
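
    For the knee-point selection step mentioned at the end of the abstract, a minimal sketch (assumed, not the paper's procedure) is to pick the Pareto solution farthest from the line joining the two extreme solutions of a two-objective front; the data below are illustrative.

    ```python
    # Minimal sketch (assumed): choose a "knee" from a two-objective Pareto front as
    # the point farthest from the straight line joining the two extreme solutions.
    import numpy as np

    def knee_point(front):
        """front: (n, 2) array of minimized objective values, already non-dominated."""
        f = front[np.argsort(front[:, 0])]       # order along the first objective
        a, b = f[0], f[-1]                        # extreme solutions
        ab = b - a
        # perpendicular distance of every point to the line through a and b
        d = np.abs(ab[0] * (f[:, 1] - a[1]) - ab[1] * (f[:, 0] - a[0])) / np.linalg.norm(ab)
        return f[np.argmax(d)]

    front = np.array([[1.0, 9.0], [2.0, 5.0], [3.0, 3.5], [5.0, 3.0], [9.0, 2.5]])
    print(knee_point(front))   # the solution at the sharpest bend of the trade-off curve
    ```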

  11. An Exploration of Mathematics Graduate Teaching Assistants' Teaching Philosophies: A Case Study

    ERIC Educational Resources Information Center

    Nepal, Kedar Mani

    2014-01-01

    This multi-case study is an exploration of mathematics graduate teaching assistants' teaching philosophies. It focused on the cases of four purposefully selected beginning mathematics graduate teaching assistants (MGTAs) including two domestic and two international MGTAs. Using qualitative research methods, this dissertation study focused on the…

  12. Computer Applications with the Related Facts in Multi-Grade: Teachers Opinions

    ERIC Educational Resources Information Center

    Öztürk, Mesut; Yilmaz, Gül Kaleli; Akkan, Yasar; Kaplan, Abdullah

    2015-01-01

    The study was conducted to examine the views of teachers in multi-grade schools on the use of computers in mathematics courses. The case study method of qualitative research design was used in this study. Ten teachers in Bayburt, Turkey, participated in the study. Interviews were conducted with the participating teachers, and the observations…

  13. Application of multi response optimization with grey relational analysis and fuzzy logic method

    NASA Astrophysics Data System (ADS)

    Winarni, Sri; Wahyu Indratno, Sapto

    2018-01-01

    Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is a conversion of the signal-to-noise ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. The combination of treatments resulting in optimum MRR and SR was found to be a gap voltage of 70 V, a peak current of 9 A and a duty factor of 0.8.
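
    A minimal sketch of the grey relational grade (GRG) computation is shown below, assuming larger-the-better signal-to-noise ratios and equal response weights; the fuzzy conversion to Fuzzy-GRG used in the paper is not shown, and the numbers are illustrative.

    ```python
    # Minimal sketch (assumed): grey relational grade (GRG) from a matrix of
    # signal-to-noise ratios, larger-the-better convention and equal weights.
    import numpy as np

    def grey_relational_grade(sn, zeta=0.5, weights=None):
        """sn: (runs, responses) array of S/N ratios; returns one grade per run."""
        norm = (sn - sn.min(axis=0)) / (sn.max(axis=0) - sn.min(axis=0))  # to [0, 1]
        delta = 1.0 - norm                                   # deviation from the ideal
        coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        w = np.full(sn.shape[1], 1.0 / sn.shape[1]) if weights is None else weights
        return coeff @ w

    sn = np.array([[30.1, -12.4],          # illustrative S/N ratios for (MRR, SR)
                   [28.7, -10.9],
                   [31.5, -13.2]])
    grg = grey_relational_grade(sn)
    print(grg, grg.argmax())               # the run closest to the ideal has the top grade
    ```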

  14. A Case Study of Enabling Factors in the Technology Integration Change Process

    ERIC Educational Resources Information Center

    Hsu, Pi-Sui; Sharma, Priya

    2008-01-01

    The purpose of this qualitative case study was to analyze enabling factors in the technology integration change process in a multi-section science methods course, SCIED 408 (pseudonym), from 1997 to 2003 at a large northeastern university in the United States. We used two major data collection methods, in-depth interviewing and document reviews.…

  15. Application of the multi-disciplinary thematic seminar method in two homecare cases - a comparative study.

    PubMed

    Scandurra, Isabella; Hägglund, Maria; Koch, Sabine

    2008-01-01

    A significant problem with current health information technologies is that they poorly support collaborative work of healthcare professionals, sometimes leading to a fragmentation of workflow and disruption of healthcare processes. This paper presents two homecare cases, both applying multi-disciplinary thematic seminars (MdTS) as a collaborative method for user needs elicitation and requirements specification. This study describes the MdTS application to elicit user needs from different perspectives to coincide with collaborative professions' work practices in two cases. Despite different objectives, the two cases validated that MdTS emphasized the "points of intersection" in cooperative work. Different user groups with similar, yet distinct needs reached a common understanding of the entire work process, agreed upon requirements and participated in the design of prototypes supporting cooperative work. MdTS was applicable in both exploratory and normative studies aiming to elicit the specific requirements in a cooperative environment.

  16. Multidisciplinary Approaches to Educational Research: Case Studies from Europe and the Developing World. Routledge Research in Education

    ERIC Educational Resources Information Center

    Rizvi, Sadaf, Ed.

    2011-01-01

    This book provides an original perspective on a range of controversial issues in educational and social research through case studies of multi-disciplinary and mixed-method research involving children, teachers, schools and communities in Europe and the developing world. These case studies from researchers "across continents" and…

  17. Digital case-based learning system in school.

    PubMed

    Gu, Peipei; Guo, Jiayang

    2017-01-01

    With the continuing growth of multi-media learning resources, it is important to offer methods that help learners explore and acquire relevant learning information effectively. A digital case-based learning system is needed as a service that organizes multi-media learning materials to support programming learning. In order to create a case-oriented e-learning system, this paper concentrates on the digital case study of multi-media resources and learning processes within an integrated framework. An integration of multi-media resources, testing and learning strategies recommendation is proposed as the learning unit in the digital case-based learning framework. The learning mechanism of learning guidance, multi-media materials learning and testing feedback is supported in our project. An improved personalized genetic algorithm, which incorporates preference information and usage degree into the crossover and mutation process, is proposed to assemble a personalized test sheet for each learner. A learning strategies recommendation solution is proposed to recommend learning strategies that help learners learn. Experiments are conducted to show that the proposed approaches are capable of constructing personalized test sheets and to demonstrate the effectiveness of the framework.

  18. Digital case-based learning system in school

    PubMed Central

    Gu, Peipei

    2017-01-01

    With the continuing growth of multi-media learning resources, it is important to offer methods that help learners explore and acquire relevant learning information effectively. A digital case-based learning system is needed as a service that organizes multi-media learning materials to support programming learning. In order to create a case-oriented e-learning system, this paper concentrates on the digital case study of multi-media resources and learning processes within an integrated framework. An integration of multi-media resources, testing and learning strategies recommendation is proposed as the learning unit in the digital case-based learning framework. The learning mechanism of learning guidance, multi-media materials learning and testing feedback is supported in our project. An improved personalized genetic algorithm, which incorporates preference information and usage degree into the crossover and mutation process, is proposed to assemble a personalized test sheet for each learner. A learning strategies recommendation solution is proposed to recommend learning strategies that help learners learn. Experiments are conducted to show that the proposed approaches are capable of constructing personalized test sheets and to demonstrate the effectiveness of the framework. PMID:29107965

  19. SU-D-206-01: Employing a Novel Consensus Optimization Strategy to Achieve Iterative Cone Beam CT Reconstruction On a Multi-GPU Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, B; Southern Medical University, Guangzhou, Guangdong; Tian, Z

    Purpose: While compressed sensing-based cone-beam CT (CBCT) iterative reconstruction techniques have demonstrated a tremendous capability of reconstructing high-quality images from undersampled noisy data, the long computation time still hinders wide application in routine clinics. The purpose of this study is to develop a reconstruction framework that employs modern consensus optimization techniques to achieve CBCT reconstruction on a multi-GPU platform for improved computational efficiency. Methods: Total projection data were evenly distributed to multiple GPUs. Each GPU performed reconstruction using its own projection data with a conventional total variation regularization approach to ensure image quality. In addition, the solutions from the GPUs were subject to a consistency constraint that they should be identical. We solved the optimization problem, with all the constraints considered rigorously, using an alternating direction method of multipliers (ADMM) algorithm. The reconstruction framework was implemented using OpenCL on a platform with two Nvidia GTX590 GPU cards, each with two GPUs. We studied the performance of our method and demonstrated its advantages through a simulation case with an NCAT phantom and an experimental case with a Catphan phantom. Results: Compared with the CBCT images reconstructed using the conventional FDK method with full projection datasets, our proposed method achieved comparable image quality with about one third of the projection numbers. The computation time on the multi-GPU platform was ∼55 s and ∼35 s in the two cases, respectively, achieving a speedup factor of ∼3.0 compared with single-GPU reconstruction. Conclusion: We have developed a consensus ADMM-based CBCT reconstruction method which enables performing reconstruction on a multi-GPU platform. The achieved efficiency makes this method clinically attractive.
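
    The consensus idea in this abstract, with each GPU solving its own subproblem while all copies are driven to agree, can be illustrated with a small CPU-only consensus ADMM loop for a toy least-squares problem. This is an assumed sketch, not the authors' OpenCL implementation, and it omits the total variation regularization.

    ```python
    # Minimal CPU sketch (assumed): consensus ADMM in which each "worker" holds a
    # slice of the data, solves its own least-squares subproblem, and all copies are
    # pulled toward a shared consensus variable z.
    import numpy as np

    def consensus_admm(A_parts, b_parts, rho=1.0, iters=300):
        n = A_parts[0].shape[1]
        x = [np.zeros(n) for _ in A_parts]    # per-worker estimates
        u = [np.zeros(n) for _ in A_parts]    # scaled dual variables
        z = np.zeros(n)                       # consensus variable
        for _ in range(iters):
            for k, (A, b) in enumerate(zip(A_parts, b_parts)):
                # x-update: argmin 0.5*||A x - b||^2 + (rho/2)*||x - z + u_k||^2
                x[k] = np.linalg.solve(A.T @ A + rho * np.eye(n),
                                       A.T @ b + rho * (z - u[k]))
            z = np.mean([xk + uk for xk, uk in zip(x, u)], axis=0)   # averaging step
            u = [uk + xk - z for xk, uk in zip(x, u)]                # dual update
        return z

    rng = np.random.default_rng(1)
    A, x_true = rng.normal(size=(40, 8)), rng.normal(size=8)
    b = A @ x_true
    z = consensus_admm(np.split(A, 4), np.split(b, 4))
    print(np.linalg.norm(z - x_true))   # should be small: the workers agree on the solution
    ```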

  20. Improving multi-objective reservoir operation optimization with sensitivity-informed dimension reduction

    NASA Astrophysics Data System (ADS)

    Chu, J.; Zhang, C.; Fu, G.; Li, Y.; Zhou, H.

    2015-08-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed method dramatically reduces the computational demands required for attaining high-quality approximations of optimal trade-off relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed dimension reduction and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform dimension reduction of optimization problems when solving complex multi-objective reservoir operation problems.
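
    A minimal sketch of the variable-screening step is shown below, assuming the SALib package is available and using a hypothetical simulate() function in place of the reservoir operation model; variables whose total-order Sobol index falls below a cutoff would be fixed before the multi-objective search.

    ```python
    # Minimal sketch (assumed): Sobol screening of decision variables with SALib;
    # simulate() is a hypothetical stand-in for running the reservoir operation model.
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    def simulate(decisions):
        # placeholder objective; the real study would evaluate the reservoir system here
        return decisions[0] ** 2 + 0.1 * decisions[1] + 0.001 * decisions[2]

    problem = {
        "num_vars": 3,
        "names": ["release_1", "release_2", "release_3"],
        "bounds": [[0.0, 1.0]] * 3,
    }
    X = saltelli.sample(problem, 1024)              # Saltelli sampling design
    Y = np.array([simulate(x) for x in X])
    Si = sobol.analyze(problem, Y)
    # keep only the decision variables whose total-order index exceeds a cutoff
    keep = [name for name, st in zip(problem["names"], Si["ST"]) if st > 0.05]
    print(keep)
    ```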

  1. Correlations of stock price fluctuations under multi-scale and multi-threshold scenarios

    NASA Astrophysics Data System (ADS)

    Sui, Guo; Li, Huajiao; Feng, Sida; Liu, Xueyong; Jiang, Meihui

    2018-01-01

    The multi-scale method is widely used in analyzing time series of financial markets, and it can provide market information for different economic entities who focus on different periods. Through constructing multi-scale networks of price fluctuation correlation in the stock market, we can detect the topological relationships between the time series. Previous research has not addressed the problem that the original fluctuation correlation networks are fully connected networks and that more information exists within these networks than is currently being utilized. Here we use listed coal companies as a case study. First, we decompose the original stock price fluctuation series into different time scales. Second, we construct the stock price fluctuation correlation networks at different time scales. Third, we delete the edges of the network based on thresholds and analyze the network indicators. Through combining the multi-scale method with the multi-threshold method, we bring to light the implicit information of the fully connected networks.
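
    The thresholding step in the third stage can be sketched as follows (assumed, not the paper's code); the multi-scale decomposition is not shown, so the toy return series below stands in for one decomposed scale.

    ```python
    # Minimal sketch (assumed): correlation network of return series at one time
    # scale, keeping only edges whose |correlation| reaches a threshold.
    import numpy as np
    import networkx as nx

    def correlation_network(returns, names, threshold=0.6):
        """returns: (time, stocks) array; names: one label per stock column."""
        corr = np.corrcoef(returns.T)
        g = nx.Graph()
        g.add_nodes_from(names)
        for i in range(len(names)):
            for j in range(i + 1, len(names)):
                if abs(corr[i, j]) >= threshold:
                    g.add_edge(names[i], names[j], weight=corr[i, j])
        return g

    rng = np.random.default_rng(2)
    rets = rng.normal(size=(250, 5))          # stand-in for one decomposed scale
    g = correlation_network(rets, [f"coal_{i}" for i in range(5)], threshold=0.1)
    print(g.number_of_edges(), nx.density(g))  # sparser than the fully connected graph
    ```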

  2. Optimizing the construction of devices to control inaccessible surfaces - case study

    NASA Astrophysics Data System (ADS)

    Niţu, E. L.; Costea, A.; Iordache, M. D.; Rizea, A. D.; Babă, Al

    2017-10-01

    The modern concept for the evolution of manufacturing systems requires multi-criteria optimization of technological processes and equipment, prioritizing the associated criteria according to their importance. Technological preparation of manufacturing can be developed, depending on the volume of production, up to the limit of favourable economic effects related to recovering the costs of designing and building the technological equipment. Devices, as subsystems of the technological system, in the general context of the modernization and diversification of machines, tools, semi-finished products and drives, are made in a multitude of constructive variants, which in many cases do not allow their identification, study and improvement. This paper presents a case study in which a multi-criteria analysis of some structures, based on a general optimization method of a novel character, is used to determine the optimal construction variant of a control device. The rational construction of the control device confirms that the optimization method and the proposed calculation methods are correct and determine a different system configuration, new features and functions, and a specific working method for controlling inaccessible surfaces.

  3. A Case Study of Knowledge Management in the "Back Office" of Two English Football Clubs

    ERIC Educational Resources Information Center

    Doloriert, Clair; Whitworth, Kieran

    2011-01-01

    Purpose: This study aims to explore knowledge management (KM) practice in the "back office" of two English football clubs. Design/methodology/approach: The paper takes the form of a comparative case study of two medium-sized businesses using multi-method data including unstructured interviews, structured questionnaires and document…

  4. Navigating a Divided Society: Educational Research Strategies for Post-Conflict Bosnia and Herzegovina

    ERIC Educational Resources Information Center

    Komatsu, Taro

    2012-01-01

    This article discusses methodological issues associated with education research in Bosnia and Herzegovina (BiH) and describes strategies taken to address them. Within a case study, mixed methods allowed the author to examine school leaders' perceptions multi-dimensionally. Multi-level analysis was essential to the understanding of policy-making…

  5. A new web-based framework development for fuzzy multi-criteria group decision-making.

    PubMed

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    Fuzzy multi-criteria group decision making (FMCGDM) is usually used when a group of decision-makers faces imprecise data or linguistic variables in solving a problem. However, the process involves many methods that require time-consuming calculations, depending on the number of criteria, alternatives and decision-makers, in order to reach the optimal solution. In this study, a web-based FMCGDM framework that offers decision-makers a fast and reliable response service is proposed. The proposed framework includes commonly used tools for multi-criteria decision-making problems, such as the fuzzy Delphi, fuzzy AHP and fuzzy TOPSIS methods. The integration of these methods makes it possible to exploit the strengths of each method and to compensate for its weaknesses. Finally, a case study of location selection for landfill waste in Morocco is performed to demonstrate how this framework can facilitate the decision-making process. The results demonstrate that the proposed framework can successfully accomplish the goal of this study.
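
    As a simplified stand-in for the fuzzy TOPSIS step of the framework, a minimal crisp TOPSIS sketch is shown below; the rows, weights and benefit/cost flags are illustrative assumptions, not data from the case study.

    ```python
    # Minimal sketch (assumed): crisp TOPSIS, a simplified stand-in for the fuzzy
    # TOPSIS step; rows are candidate sites, columns are criteria.
    import numpy as np

    def topsis(matrix, weights, benefit):
        norm = matrix / np.linalg.norm(matrix, axis=0)        # vector normalization
        v = norm * weights                                    # weighted normalized matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_plus = np.linalg.norm(v - ideal, axis=1)
        d_minus = np.linalg.norm(v - anti, axis=1)
        return d_minus / (d_plus + d_minus)                   # closeness coefficients

    scores = topsis(np.array([[7.0, 3.0, 120.0],
                              [5.0, 4.0, 90.0],
                              [8.0, 2.0, 150.0]]),
                    weights=np.array([0.5, 0.3, 0.2]),
                    benefit=np.array([True, True, False]))    # third criterion is a cost
    print(np.argsort(scores)[::-1])                           # alternatives, best to worst
    ```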

  6. Classification algorithm of lung lobe for lung disease cases based on multislice CT images

    NASA Astrophysics Data System (ADS)

    Matsuhiro, M.; Kawata, Y.; Niki, N.; Nakano, Y.; Mishima, M.; Ohmatsu, H.; Tsuchida, T.; Eguchi, K.; Kaneko, M.; Moriyama, N.

    2011-03-01

    With the development of multi-slice CT technology, it has become possible to obtain an accurate 3D image of the lung field in a short time. To support this, many image processing methods need to be developed. In the clinical setting for the diagnosis of lung cancer, it is important to study and analyse lung structure; therefore, classification of lung lobes provides useful information for lung cancer analysis. In this report, we describe an algorithm that classifies the lungs into lung lobes for lung disease cases from multi-slice CT images. The classification of lung lobes is efficiently carried out using information on the lung blood vessels, bronchi, and interlobar fissures. Applying the classification algorithm to multi-slice CT images of 20 normal cases and 5 lung disease cases, we demonstrate the usefulness of the proposed algorithm.

  7. Optimized production planning model for a multi-plant cultivation system under uncertainty

    NASA Astrophysics Data System (ADS)

    Ke, Shunkui; Guo, Doudou; Niu, Qingliang; Huang, Danfeng

    2015-02-01

    An inexact multi-constraint programming model under uncertainty was developed by incorporating a production plan algorithm into the crop production optimization framework under the multi-plant collaborative cultivation system. In the production plan, orders from the customers are assigned to a suitable plant under the constraints of plant capabilities and uncertainty parameters to maximize profit and achieve customer satisfaction. The developed model and solution method were applied to a case study of a multi-plant collaborative cultivation system to verify its applicability. As determined in the case analysis involving different orders from customers, the period of plant production planning and the interval between orders can significantly affect system benefits. Through the analysis of uncertain parameters, reliable and practical decisions can be generated using the suggested model of a multi-plant collaborative cultivation system.

  8. Improving multi-objective reservoir operation optimization with sensitivity-informed problem decomposition

    NASA Astrophysics Data System (ADS)

    Chu, J. G.; Zhang, C.; Fu, G. T.; Li, Y.; Zhou, H. C.

    2015-04-01

    This study investigates the effectiveness of a sensitivity-informed method for multi-objective operation of reservoir systems, which uses global sensitivity analysis as a screening tool to reduce the computational demands. Sobol's method is used to screen insensitive decision variables and guide the formulation of the optimization problems with a significantly reduced number of decision variables. This sensitivity-informed problem decomposition dramatically reduces the computational demands required for attaining high quality approximations of optimal tradeoff relationships between conflicting design objectives. The search results obtained from the reduced complexity multi-objective reservoir operation problems are then used to pre-condition the full search of the original optimization problem. In two case studies, the Dahuofang reservoir and the inter-basin multi-reservoir system in Liaoning province, China, sensitivity analysis results show that reservoir performance is strongly controlled by a small proportion of decision variables. Sensitivity-informed problem decomposition and pre-conditioning are evaluated in their ability to improve the efficiency and effectiveness of multi-objective evolutionary optimization. Overall, this study illustrates the efficiency and effectiveness of the sensitivity-informed method and the use of global sensitivity analysis to inform problem decomposition when solving the complex multi-objective reservoir operation problems.

  9. The "Russian Doll" Approach: Developing Nested Case-Studies to Support International Comparative Research in Education

    ERIC Educational Resources Information Center

    Chong, Pei Wen; Graham, Linda J.

    2013-01-01

    International comparison is complicated by the use of different terms, classification methods, policy frameworks and system structures, not to mention different languages and terminology. Multi-case studies can assist in the understanding of the influence wielded by cultural, social, economic, historical and political forces upon educational…

  10. Multi-stage 3D-2D registration for correction of anatomical deformation in image-guided spine surgery

    NASA Astrophysics Data System (ADS)

    Ketcha, M. D.; De Silva, T.; Uneri, A.; Jacobson, M. W.; Goerres, J.; Kleinszig, G.; Vogt, S.; Wolinsky, J.-P.; Siewerdsen, J. H.

    2017-06-01

    A multi-stage image-based 3D-2D registration method is presented that maps annotations in a 3D image (e.g. point labels annotating individual vertebrae in preoperative CT) to an intraoperative radiograph in which the patient has undergone non-rigid anatomical deformation due to changes in patient positioning or due to the intervention itself. The proposed method (termed msLevelCheck) extends a previous rigid registration solution (LevelCheck) to provide an accurate mapping of vertebral labels in the presence of spinal deformation. The method employs a multi-stage series of rigid 3D-2D registrations performed on sets of automatically determined and increasingly localized sub-images, with the final stage achieving a rigid mapping for each label to yield a locally rigid yet globally deformable solution. The method was evaluated first in a phantom study in which a CT image of the spine was acquired followed by a series of 7 mobile radiographs with increasing degree of deformation applied. Second, the method was validated using a clinical data set of patients exhibiting strong spinal deformation during thoracolumbar spine surgery. Registration accuracy was assessed using projection distance error (PDE) and failure rate (PDE  >  20 mm—i.e. label registered outside vertebra). The msLevelCheck method was able to register all vertebrae accurately for all cases of deformation in the phantom study, improving the maximum PDE of the rigid method from 22.4 mm to 3.9 mm. The clinical study demonstrated the feasibility of the approach in real patient data by accurately registering all vertebral labels in each case, eliminating all instances of failure encountered in the conventional rigid method. The multi-stage approach demonstrated accurate mapping of vertebral labels in the presence of strong spinal deformation. The msLevelCheck method maintains other advantageous aspects of the original LevelCheck method (e.g. compatibility with standard clinical workflow, large capture range, and robustness against mismatch in image content) and extends capability to cases exhibiting strong changes in spinal curvature.

  11. A Multiple Case Study: The Effect of Professional Development on Teachers' Perceived Ability to Foster Resilience

    ERIC Educational Resources Information Center

    Klinger, Mary A.

    2012-01-01

    The purpose of this mixed methods study was to determine the effect of professional development on teachers' perceptions of their ability to foster resilience. Secondary questions investigated the effects of school level and socioeconomic status. An exploratory multi-site case study was designed to compare the perceptions of educators from…

  12. Multi-Entity Bayesian Networks Learning in Predictive Situation Awareness

    DTIC Science & Technology

    2013-06-01

    …algorithm for MEBN. The methods are evaluated on a case study from PROGNOS. … Over the past two decades, machine learning has … the MFrag of the child node. Lastly, in the third For-Loop, for all resident nodes in the MTheory, LPDs are generated by MLE.

  13. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in multi-criteria decision making are often imprecise and changeable. Therefore, it is important to carry out a sensitivity analysis for the multi-criteria decision making problem. The paper aims to present a sensitivity analysis for some ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainty are considered for the sensitivity analysis. The first uncertainty is related to the input data, while the second concerns the Decision Maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance and the trade-off ranking methods. TOPSIS and the relative distance method measure a distance from an alternative to the ideal and anti-ideal solutions. In turn, the trade-off ranking calculates a distance of an alternative to the extreme solutions and other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainty.
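
    One crude way to probe the second type of uncertainty, the decision-maker weights, is sketched below: perturb the weights and count how often the top-ranked alternative changes under a simple distance-to-ideal ranking. This is an assumed illustration, not the paper's procedure or one of its three ranking methods.

    ```python
    # Minimal sketch (assumed): perturb the decision-maker weights and count how
    # often the top alternative changes under a simple distance-to-ideal ranking.
    import numpy as np

    def rank_by_distance(matrix, weights):
        ideal = matrix.max(axis=0)                    # all criteria taken as larger-is-better
        d = np.linalg.norm((matrix - ideal) * weights, axis=1)
        return np.argsort(d)                          # closest to the ideal comes first

    rng = np.random.default_rng(3)
    m = rng.uniform(size=(6, 4))                      # 6 alternatives, 4 criteria
    w0 = np.array([0.4, 0.3, 0.2, 0.1])
    base_top = rank_by_distance(m, w0)[0]
    flips = 0
    for _ in range(1000):
        w = np.clip(w0 + rng.normal(scale=0.05, size=4), 1e-6, None)
        flips += rank_by_distance(m, w / w.sum())[0] != base_top
    print(flips / 1000)   # fraction of weight perturbations that change the winner
    ```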

  14. Data fusion of multi-scale representations for structural damage detection

    NASA Astrophysics Data System (ADS)

    Guo, Tian; Xu, Zili

    2018-01-01

    Despite extensive research into structural health monitoring (SHM) in the past decades, few methods can detect multiple instances of slight damage in noisy environments. Here, we introduce a new hybrid method that utilizes multi-scale space theory and a data fusion approach for multiple-damage detection in beams and plates. A cascade filtering approach provides a multi-scale space for noisy mode shapes and filters the fluctuations caused by measurement noise. In multi-scale space, a series of amplification and data fusion algorithms is utilized to search for damage features across all possible scales. We verify the effectiveness of the method by numerical simulation using damaged beams and plates with various types of boundary conditions. Monte Carlo simulations are conducted to illustrate the effectiveness and noise immunity of the proposed method. The applicability is further validated via laboratory case studies focusing on different damage scenarios. Both results demonstrate that the proposed method has a superior noise tolerance, as well as damage sensitivity, without knowledge of material properties or boundary conditions.
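
    The multi-scale-space idea can be caricatured as follows (an assumed sketch, not the authors' method): smooth a noisy mode shape at several Gaussian scales and keep only the points whose curvature remains anomalous at every scale.

    ```python
    # Minimal sketch (assumed): smooth a noisy mode shape at several Gaussian scales
    # and keep the points whose curvature is anomalous at every scale.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def persistent_curvature_peaks(mode_shape, scales=(1.0, 2.0, 4.0), z_thresh=3.0):
        votes = np.zeros(len(mode_shape))
        for s in scales:
            smooth = gaussian_filter1d(mode_shape, sigma=s)
            curv = np.abs(np.gradient(np.gradient(smooth)))   # second spatial difference
            z = (curv - curv.mean()) / curv.std()
            votes += z > z_thresh                              # anomalous at this scale?
        return np.where(votes == len(scales))[0]               # anomalous at every scale

    x = np.linspace(0.0, np.pi, 200)
    shape = np.sin(x) + 0.005 * np.random.default_rng(4).normal(size=200)
    shape[120] -= 0.05                                         # small local defect
    print(persistent_curvature_peaks(shape))                   # ideally flags points near index 120
    ```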

  15. Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework

    NASA Astrophysics Data System (ADS)

    Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.

    2016-03-01

    A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic-label assessment that is able to address uncertainty and deal with different levels of precision. The method is based on qualitative reasoning, an artificial intelligence technique, for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. It is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors and involve a high level of complexity and uncertainty. The method is compared with an existing approach, an outranking method based on Condorcet's original method, which has previously been applied to the wind farm location problem. The results obtained by both approaches are analysed, and their performance in the selection of the wind farm location is compared with respect to the aggregation procedures. Although the results show that both methods lead to similar rankings of the alternatives, the study highlights both their advantages and drawbacks.

  16. Family members of older persons with multi-morbidity and their experiences of case managers in Sweden: an interpretive phenomenological approach

    PubMed Central

    Hjelm, Markus; Holmgren, Ann-Charlotte; Willman, Ania; Bohman, Doris; Holst, Göran

    2015-01-01

    Background Family members of older persons (75+) with multi-morbidity are likely to benefit from utilising case management services performed by case managers. However, research has not yet explored their experiences of case managers. Objectives The aim of the study was to deepen the understanding of the importance of case managers to family members of older persons (75+) with multi-morbidity. Design The study design was based on an interpretive phenomenological approach. Method Data were collected through individual interviews with 16 family members in Sweden. The interviews were analysed by means of an interpretive phenomenological approach. Results The findings revealed one overarching theme: “Helps to fulfil my unmet needs”, based on three sub-themes: (1) “Helps me feel secure – Experiencing a trusting relationship”, (2) “Confirms and strengthens me – Challenging my sense of being alone” and (3) “Being my personal guide – Increasing my competence”. Conclusion and discussion The findings indicate that case managers were able to fulfil unmet needs of family members. The latter recognised the importance of case managers providing them with professional services tailored to their individual needs. The findings can contribute to the improvement of case management models not only for older persons but also for their family members. PMID:25918497

  17. Multi-GPU implementation of a VMAT treatment plan optimization algorithm.

    PubMed

    Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B

    2015-06-01

    Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory size cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for the GPU implementation. The authors also would like to utilize this particular problem as an example to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of the beamlet price, the first step in PP, is accomplished using multi-GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and MP problems are implemented on the CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai-Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality, although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
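
    The data layout described in the methods, a COO host copy of the sparse DDC matrix split into per-GPU CSR blocks, can be mimicked with SciPy sparse matrices; this is an assumed sketch in which the column grouping by beam angle is simply an even split and the "devices" are just a Python list.

    ```python
    # Minimal sketch (assumed): hold a sparse DDC-like matrix in COO on the host and
    # split it into per-GPU column blocks stored in CSR; the grouping here is an
    # even split standing in for the beam-angle grouping.
    import numpy as np
    from scipy import sparse

    ddc = sparse.random(10000, 800, density=0.01, format="coo", random_state=5)  # host copy

    n_gpus = 4
    beamlet_groups = np.array_split(np.arange(ddc.shape[1]), n_gpus)  # stand-in for beam angles
    per_gpu = [ddc.tocsc()[:, cols].tocsr() for cols in beamlet_groups]
    print([block.shape for block in per_gpu], [block.format for block in per_gpu])
    ```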

  18. Peripleural lung disease detection based on multi-slice CT images

    NASA Astrophysics Data System (ADS)

    Matsuhiro, M.; Suzuki, H.; Kawata, Y.; Niki, N.; Nakano, Y.; Ohmatsu, H.; Kusumoto, M.; Tsuchida, T.; Eguchi, K.; Kaneko, M.

    2015-03-01

    With the development of multi-slice CT technology, it has become possible to obtain accurate 3D images of the lung field in a short time. To support this, many image processing methods need to be developed. Detecting peripleural lung disease is difficult because it lies outside the lung region, since lung extraction is often performed by threshold processing. The proposed method uses the thoracic inner region, extracted from the inner cavity of the bones as well as the air region, and covers peripleural lung disease cases such as lung nodules, calcification, pleural effusion and pleural plaques. We applied this method to 50 cases, including 39 peripleural lung disease cases. The method was able to detect the 39 peripleural lung diseases with 2.9 false positives per case.

  19. An Examination of Physical Education Data Sources and Collection Procedures during a Federally Funded Grant

    ERIC Educational Resources Information Center

    Dauenhauer, Brian D.; Keating, Xiaofen D.; Lambdin, Dolly

    2018-01-01

    Purpose: This study aimed to conduct an in-depth investigation into physical education data sources and collection procedures in a district that was awarded a Physical Education Program (PEP) grant. Method: A qualitative, multi-site case study was conducted in which a single school district was the overarching case and eight schools served as…

  20. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high-speed train, it is relatively difficult to effectively simulate the entire system's dynamic behavior because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal unsynchronization among multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulation of a given complex mechatronic system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism that defines the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm that realizes the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can be used to simulate the subsystems' interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high-speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.

  1. Improvement of Coordination in the Multi-National Military Coordination Center of the Nepal Army in Respond to Disasters

    DTIC Science & Technology

    2017-06-09

    …primary question. This thesis used the case study research methodology with a Capability-Based Assessment (CBA) approach. … an effective coordinating mechanism. The research follows the case study method, utilizing the Capability-Based Analysis (CBA) approach, to scrutinize the…

  2. Coronary CT angiography using 64 detector rows: methods and design of the multi-centre trial CORE-64.

    PubMed

    Miller, Julie M; Dewey, Marc; Vavere, Andrea L; Rochitte, Carlos E; Niinuma, Hiroyuki; Arbab-Zadeh, Armin; Paul, Narinder; Hoe, John; de Roos, Albert; Yoshioka, Kunihiro; Lemos, Pedro A; Bush, David E; Lardo, Albert C; Texter, John; Brinker, Jeffery; Cox, Christopher; Clouse, Melvin E; Lima, João A C

    2009-04-01

    Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and high sensitivity and negative predictive value as found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios suggestive of small study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography in nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental trial CORE-64 has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.

  3. THE FUNDAMENTAL SOLUTIONS FOR MULTI-TERM MODIFIED POWER LAW WAVE EQUATIONS IN A FINITE DOMAIN.

    PubMed

    Jiang, H; Liu, F; Meerschaert, M M; McGough, R J

    2013-01-01

    Fractional partial differential equations with more than one fractional derivative term in time, such as the Szabo wave equation, or the power law wave equation, describe important physical phenomena. However, studies of these multi-term time-space or time fractional wave equations are still under development. In this paper, multi-term modified power law wave equations in a finite domain are considered. The multi-term time fractional derivatives are defined in the Caputo sense, whose orders belong to the intervals (1, 2], [2, 3), [2, 4) or (0, n ) ( n > 2), respectively. Analytical solutions of the multi-term modified power law wave equations are derived. These new techniques are based on Luchko's Theorem, a spectral representation of the Laplacian operator, a method of separating variables and fractional derivative techniques. Then these general methods are applied to the special cases of the Szabo wave equation and the power law wave equation. These methods and techniques can also be extended to other kinds of the multi-term time-space fractional models including fractional Laplacian.
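
    A generic form of the multi-term time-fractional wave equation of the kind discussed is written below with assumed symbols (coefficients a_k, Caputo orders alpha_k, wave speed c) rather than the paper's exact notation.

    ```latex
    % Generic multi-term time-fractional wave equation with Caputo derivatives
    % (assumed notation): sum_k a_k D_t^{alpha_k} u = c^2 Laplacian(u) on a finite domain.
    \[
      \sum_{k=1}^{m} a_k \, {}^{C}\!D_t^{\alpha_k} u(x,t) \;=\; c^{2}\, \Delta u(x,t),
      \qquad x \in \Omega, \; t > 0,
    \]
    \[
      {}^{C}\!D_t^{\alpha} u(x,t) \;=\;
      \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} (t-s)^{\,n-\alpha-1}\,
      \frac{\partial^{n} u(x,s)}{\partial s^{n}} \, ds,
      \qquad n-1 < \alpha < n .
    \]
    ```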

  4. The Ethnic/Racial Variations of Intracerebral Hemorrhage (ERICH) Study Protocol

    PubMed Central

    Woo, Daniel; Rosand, Jonathan; Kidwell, Chelsea; McCauley, Jacob L.; Osborne, Jennifer; Brown, Mark W.; West, Sandra E.; Rademacher, Eric W.; Waddy, Salina; Roberts, Jamie N.; Koch, Sebastian; Gonzales, Nicole R.; Sung, Gene; Kittner, Steven J.; Birnbaum, Lee; Frankel, Michael; Daniel Testai, Fernando; Hall, Christiana E.; Elkind, Mitchell S. V.; Flaherty, Matthew; Coull, Bruce; Chong, Ji Y.; Warwick, Tanya; Malkoff, Marc; James, Michael L.; Ali, Latisha K.; Worrall, Bradford B.; Jones, Floyd; Watson, Tiffany; Leonard, Anne; Martinez, Rebecca; Sacco, Ralph I; Langefeld, Carl D.

    2013-01-01

    Background and Purpose Epidemiologic studies of intracerebral hemorrhage (ICH) have consistently demonstrated variation in incidence, location, age at presentation, and outcomes among non-Hispanic white, black, and Hispanic populations. We report here the design and methods for this large, prospective, multi-center case-control study of ICH. Methods The ERICH study is a multi-center, prospective case-control study of ICH. Cases are identified by hot pursuit and enrolled using standard phenotype and risk factor information, and enrollment includes neuroimaging and blood sample collection. Controls are centrally identified by random digit dialing to match cases by age (+/−5 years), race, ethnicity, gender and metropolitan region. Results As of March 22, 2013, 1,655 cases of ICH had been recruited into the study, which is 101.5% of the target for that date, and 851 controls had been recruited, which is 67.2% of the target for that date (1,267 controls), for a total of 2,506 subjects, which is 86.5% of the target for that date (2,897 subjects). Of the 1,655 cases enrolled, 1,640 cases had the case interview entered into the database, of which 628 (38%) were non-Hispanic black, 458 (28%) were non-Hispanic white and 554 (34%) were Hispanic. Of the 1,197 cases with imaging submitted, 876 (73.2%) had a 24-hour follow-up CT available. In addition to CT imaging, 607 cases have had MRI evaluation. Conclusion The ERICH study is a large case-control study of ICH with particular emphasis on recruitment of minority populations for the identification of genetic and epidemiologic risk factors for ICH and outcomes after ICH. PMID:24021679

  5. [Applications of meta-analysis in multi-omics].

    PubMed

    Han, Mingfei; Zhu, Yunping

    2014-07-01

    As a statistical method integrating multi-features and multi-data, meta-analysis was introduced to the field of life science in the 1990s. With the rapid advances in high-throughput technologies, life omics, the core of which are genomics, transcriptomics and proteomics, is becoming the new hot spot of life science. Although the fast output of massive data has promoted the development of omics study, it results in excessive data that are difficult to integrate systematically. In this case, meta-analysis is frequently applied to analyze different types of data and is improved continuously. Here, we first summarize the representative meta-analysis methods systematically, and then study the current applications of meta-analysis in various omics fields, finally we discuss the still-existing problems and the future development of meta-analysis.
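
    For the statistical core of meta-analysis mentioned above, a minimal fixed-effect inverse-variance pooling sketch is shown below; the effect sizes and standard errors are illustrative, not drawn from any omics study.

    ```python
    # Minimal sketch (assumed): fixed-effect inverse-variance meta-analysis pooling
    # one effect size per study; the numbers are illustrative.
    import numpy as np

    def fixed_effect_meta(effects, std_errors):
        w = 1.0 / np.asarray(std_errors) ** 2        # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        return pooled, pooled_se

    effects = np.array([0.30, 0.45, 0.10])           # e.g. log fold changes from 3 studies
    ses = np.array([0.10, 0.20, 0.15])
    est, se = fixed_effect_meta(effects, ses)
    print(est, (est - 1.96 * se, est + 1.96 * se))   # pooled estimate and 95% CI
    ```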

  6. Multi-GPU implementation of a VMAT treatment plan optimization algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Zhen, E-mail: Zhen.Tian@UTSouthwestern.edu, E-mail: Xun.Jia@UTSouthwestern.edu, E-mail: Steve.Jiang@UTSouthwestern.edu; Folkerts, Michael; Tan, Jun

    Purpose: Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, GPU’s relatively small memory size cannot handle cases with a large dose-deposition coefficient (DDC) matrix in cases of, e.g., those with a large target size, multiple targets, multiple arcs, and/or small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors’ group, on a multi-GPU platform tomore » solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. Methods: The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors’ method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multi-GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of PP and MP problems are implemented on CPU or a single GPU due to their modest problem scale and computational loads. Barzilai and Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. A head and neck (H and N) cancer case is then used to validate the authors’ method. The authors also compare their multi-GPU implementation with three different single GPU implementation strategies, i.e., truncating DDC matrix (S1), repeatedly transferring DDC matrix between CPU and GPU (S2), and porting computations involving DDC matrix to CPU (S3), in terms of both plan quality and computational efficiency. Two more H and N patient cases and three prostate cases are used to demonstrate the advantages of the authors’ method. Results: The authors’ multi-GPU implementation can finish the optimization process within ∼1 min for the H and N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23–46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be in an order of several minutes. Conclusions: The results demonstrate that the multi-GPU implementation of the authors’ column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. 
The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
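
    The multi-GPU scheme above hinges on partitioning the sparse DDC matrix by beam angle before the pieces are copied to the GPUs. The Python/SciPy sketch below illustrates only that CPU-side partitioning step (COO storage split into per-angle CSR submatrices); the GPU transfers, peer-to-peer access and column-generation steps are specific to the authors' implementation and are not reproduced here, and the matrix contents and angle grouping are hypothetical.

        import numpy as np
        from scipy.sparse import coo_matrix

        # Hypothetical sparse dose-deposition-coefficient (DDC) matrix in COO format:
        # rows = voxels, columns = beamlets, values = dose per unit beamlet intensity.
        rows = np.array([0, 0, 1, 2, 3, 3])
        cols = np.array([0, 5, 2, 7, 4, 9])
        vals = np.array([0.8, 0.1, 0.5, 0.3, 0.9, 0.2])
        ddc = coo_matrix((vals, (rows, cols)), shape=(4, 12))

        # Assume beamlets are numbered contiguously by beam angle; these group
        # boundaries are illustrative only (4 angular groups of 3 beamlets each).
        angle_groups = [(0, 3), (3, 6), (6, 9), (9, 12)]

        # One CSR submatrix per angular group; in the multi-GPU scheme each piece
        # would be transferred to a different GPU for the beamlet-price computation.
        ddc_csr = ddc.tocsr()
        submatrices = [ddc_csr[:, lo:hi] for lo, hi in angle_groups]

        for k, sub in enumerate(submatrices):
            print(f"group {k}: shape={sub.shape}, nonzeros={sub.nnz}")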

  7. Using classification and NDVI differencing methods for monitoring sparse vegetation coverage: a case study of saltcedar in Nevada, USA.

    USDA-ARS?s Scientific Manuscript database

    A change detection experiment for an invasive species, saltcedar, near Lovelock, Nevada, was conducted with multi-date Compact Airborne Spectrographic Imager (CASI) hyperspectral datasets. Classification and NDVI differencing change detection methods were tested. In the classification strategy, a p...

  8. A comparison of single- and multi-site calibration and validation: a case study of SWAT in the Miyun Reservoir watershed, China

    NASA Astrophysics Data System (ADS)

    Bai, Jianwen; Shen, Zhenyao; Yan, Tiezhu

    2017-09-01

    An essential task in evaluating global water resource and pollution problems is to obtain the optimum set of parameters in hydrological models through calibration and validation. For a large-scale watershed, single-site calibration and validation may ignore spatial heterogeneity and may not meet the needs of the entire watershed. The goal of this study is to apply a multi-site calibration and validation of the Soil and Water Assessment Tool (SWAT), using the observed flow data at three monitoring sites within the Baihe watershed of the Miyun Reservoir watershed, China. Our results indicate that the multi-site calibration parameter values are more reasonable than those obtained from single-site calibrations. These results are mainly due to significant differences in the topographic factors over the large-scale area, human activities and climate variability. The multi-site method involves dividing the large watershed into smaller watersheds and applying the calibrated parameters of the multi-site calibration to the entire watershed. It is anticipated that this case study can provide experience of multi-site calibration in a large-scale basin and a good foundation for the simulation of other pollutants in follow-up work in the Miyun Reservoir watershed and other similar large areas.

  9. The Readiness of Lecturers in Embedding Soft Skills in the Bachelor's Degree Program in Malaysian Institutes of Teacher Education

    ERIC Educational Resources Information Center

    Hassan, Aminuddin; Maharoff, Marina; Abiddin, Norhasni Zainal

    2014-01-01

    This is preliminary research to obtain information to formulate a problem statement for an overall study of the embedding of soft skills in the program courses in higher learning institutions. This research was conducted in the form of single-case and multi-case studies. The research data were obtained through mixed methods; the quantitative…

  10. 77 FR 12598 - Notice Correction; A Multi-Center International Hospital-Based Case-Control Study of Lymphoma in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Notice Correction; A Multi-Center International Hospital-Based Case-Control Study of Lymphoma in Asia (AsiaLymph) (NCI) The Federal... project titled, ``A multi-center international hospital-based case-control study of lymphoma in Asia (Asia...

  11. Evaluation of a Cubature Kalman Filtering-Based Phase Unwrapping Method for Differential Interferograms with High Noise in Coal Mining Areas

    PubMed Central

    Liu, Wanli; Bian, Zhengfu; Liu, Zhenguo; Zhang, Qiuzhao

    2015-01-01

    Differential interferometric synthetic aperture radar has been shown to be effective for monitoring subsidence in coal mining areas. Phase unwrapping can have a dramatic influence on the monitoring result. In this paper, a filtering-based phase unwrapping algorithm in combination with path-following is introduced to unwrap differential interferograms with high noise in mining areas. It can perform simultaneous noise filtering and phase unwrapping so that the pre-filtering steps can be omitted, thus usually retaining more details and improving the detectable deformation. For the method, the nonlinear measurement model of phase unwrapping is processed using a simplified Cubature Kalman filtering, which is an effective and efficient tool used in many nonlinear fields. Three case studies are designed to evaluate the performance of the method. In Case 1, two tests are designed to evaluate the performance of the method under different factors including the number of multi-looks and path-guiding indexes. The result demonstrates that the unwrapped results are sensitive to the number of multi-looks and that the Fisher Distance is the most suitable path-guiding index for our study. Two case studies are then designed to evaluate the feasibility of the proposed phase unwrapping method based on Cubature Kalman filtering. The results indicate that, compared with the popular Minimum Cost Flow method, the Cubature Kalman filtering-based phase unwrapping can achieve promising results without pre-filtering and is an appropriate method for coal mining areas with high noise. PMID:26153776
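
    The record above deals with 2D unwrapping of noisy interferograms via a Cubature-Kalman-filter formulation; purely as background, the sketch below shows the elementary 1D unwrapping idea (Itoh's method: wrap the successive phase differences into [-pi, pi) and integrate them), which is what path-following 2D methods generalise. It is not the authors' algorithm, and the signal is synthetic.

        import numpy as np

        def unwrap_1d(wrapped):
            """Itoh's 1D phase unwrapping: wrap the first differences back into
            [-pi, pi), then cumulatively sum them onto the first sample."""
            d = np.diff(wrapped)
            d_wrapped = (d + np.pi) % (2 * np.pi) - np.pi   # wrap the differences
            return np.concatenate([[wrapped[0]], wrapped[0] + np.cumsum(d_wrapped)])

        # Synthetic example: a smooth phase ramp wrapped into [-pi, pi).
        true_phase = np.linspace(0, 12 * np.pi, 200)
        wrapped = (true_phase + np.pi) % (2 * np.pi) - np.pi

        recovered = unwrap_1d(wrapped)
        print(np.allclose(recovered, true_phase))          # True: ramp is recovered
        print(np.allclose(recovered, np.unwrap(wrapped)))  # matches numpy's unwrap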

  12. A multi-criteria decision analysis perspective on the health economic evaluation of medical interventions.

    PubMed

    Postmus, Douwe; Tervonen, Tommi; van Valkenhoef, Gert; Hillege, Hans L; Buskens, Erik

    2014-09-01

    A standard practice in health economic evaluation is to monetize health effects by assuming a certain societal willingness-to-pay per unit of health gain. Although the resulting net monetary benefit (NMB) is easy to compute, the use of a single willingness-to-pay threshold assumes expressibility of the health effects on a single non-monetary scale. To relax this assumption, this article proves that the NMB framework is a special case of the more general stochastic multi-criteria acceptability analysis (SMAA) method. Specifically, as SMAA does not restrict the number of criteria to two and also does not require the marginal rates of substitution to be constant, there are problem instances for which the use of this more general method may result in a better understanding of the trade-offs underlying the reimbursement decision-making problem. This is illustrated by applying both methods in a case study related to infertility treatment.
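
    For reference, the incremental net monetary benefit criterion that the article generalises is usually written as follows (LaTeX; lambda is the willingness-to-pay threshold, Delta E and Delta C the incremental health effect and cost):

        \mathrm{NMB} = \lambda \, \Delta E - \Delta C, \qquad \text{fund the intervention if } \mathrm{NMB} > 0 .

    SMAA relaxes this by allowing more than two criteria and by treating the criterion weights as uncertain or only partially specified, exploring the whole feasible weight space; the exact SMAA formulation used in the article is not reproduced here.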

  13. Multi-frame image processing with panning cameras and moving subjects

    NASA Astrophysics Data System (ADS)

    Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric

    2014-06-01

    Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition to this, we evaluated algorithm efficacy with demonstrated benefits using field test video, which has been processed using our commercially available surveillance product. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.

  14. An adaptive multi-moment FVM approach for incompressible flows

    NASA Astrophysics Data System (ADS)

    Liu, Cheng; Hu, Changhong

    2018-04-01

    In this study, a multi-moment finite volume method (FVM) based on block-structured adaptive Cartesian mesh is proposed for simulating incompressible flows. A conservative interpolation scheme following the idea of the constrained interpolation profile (CIP) method is proposed for the prolongation operation of the newly created mesh. A sharp immersed boundary (IB) method is used to model the immersed rigid body. A moving least squares (MLS) interpolation approach is applied for reconstruction of the velocity field around the solid surface. An efficient method for discretization of Laplacian operators on adaptive meshes is proposed. Numerical simulations on several test cases are carried out for validation of the proposed method. For the case of viscous flow past an impulsively started cylinder (Re = 3000, 9500), the computed surface vorticity coincides with the result of the body-fitted method. For the case of a fast pitching NACA 0015 airfoil at moderate Reynolds numbers (Re = 10000, 45000), the predicted drag coefficient (CD) and lift coefficient (CL) agree well with other numerical or experimental results. For 2D and 3D simulations of viscous flow past a pitching plate with prescribed motions (Re = 5000, 40000), the predicted CD, CL and CM (moment coefficient) are in good agreement with those obtained by other numerical methods.

  15. The role of economics in the QUERI program: QUERI Series

    PubMed Central

    Smith, Mark W; Barnett, Paul G

    2008-01-01

    Background The United States (U.S.) Department of Veterans Affairs (VA) Quality Enhancement Research Initiative (QUERI) has implemented economic analyses in single-site and multi-site clinical trials. To date, no one has reviewed whether the QUERI Centers are taking an optimal approach to doing so. Consistent with the continuous learning culture of the QUERI Program, this paper provides such a reflection. Methods We present a case study of QUERI as an example of how economic considerations can and should be integrated into implementation research within both single and multi-site studies. We review theoretical and applied cost research in implementation studies outside and within VA. We also present a critique of the use of economic research within the QUERI program. Results Economic evaluation is a key element of implementation research. QUERI has contributed many developments in the field of implementation but has only recently begun multi-site implementation trials across multiple regions within the national VA healthcare system. These trials are unusual in their emphasis on developing detailed costs of implementation, as well as in the use of business case analyses (budget impact analyses). Conclusion Economics appears to play an important role in QUERI implementation studies, only after implementation has reached the stage of multi-site trials. Economic analysis could better inform the choice of which clinical best practices to implement and the choice of implementation interventions to employ. QUERI economics also would benefit from research on costing methods and development of widely accepted international standards for implementation economics. PMID:18430199

  16. Multi-off-grid methods in multi-step integration of ordinary differential equations

    NASA Technical Reports Server (NTRS)

    Beaudet, P. R.

    1974-01-01

    Description of methods of solving first- and second-order systems of differential equations in which all derivatives are evaluated at off-grid locations in order to circumvent the Dahlquist stability limitation on the order of on-grid methods. The proposed multi-off-grid methods require off-grid state predictors for the evaluation of the n derivatives at each step. Progressing forward in time, the off-grid states are predicted using a linear combination of back on-grid state values and off-grid derivative evaluations. A comparison is made between the proposed multi-off-grid methods and the corresponding Adams and Cowell on-grid integration techniques in integrating systems of ordinary differential equations, showing a significant reduction in the error at larger step sizes in the case of the multi-off-grid integrator.
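
    The off-grid schemes in this record are benchmarked against standard on-grid multi-step integrators; purely as a point of reference, a minimal two-step Adams-Bashforth integrator (an on-grid method of the kind used in that comparison) is sketched below in Python, with an illustrative test problem.

        import numpy as np

        def adams_bashforth2(f, t0, y0, h, n_steps):
            """Two-step Adams-Bashforth: y_{n+1} = y_n + h*(3/2*f_n - 1/2*f_{n-1}).
            The first step is bootstrapped with a forward Euler step."""
            t = t0
            y_prev = np.asarray(y0, dtype=float)
            f_prev = f(t, y_prev)
            y_curr = y_prev + h * f_prev          # Euler bootstrap
            t += h
            ys = [y_prev, y_curr]
            for _ in range(n_steps - 1):
                f_curr = f(t, y_curr)
                y_next = y_curr + h * (1.5 * f_curr - 0.5 * f_prev)
                f_prev, y_curr = f_curr, y_next
                t += h
                ys.append(y_next)
            return np.array(ys)

        # Test problem dy/dt = -y with exact solution exp(-t).
        sol = adams_bashforth2(lambda t, y: -y, 0.0, 1.0, 0.1, 50)
        print(sol[-1], np.exp(-5.0))   # numerical vs exact value at t = 5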

  17. Production Task Queue Optimization Based on Multi-Attribute Evaluation for Complex Product Assembly Workshop.

    PubMed

    Li, Lian-Hui; Mo, Rong

    2015-01-01

    The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Man-made, qualitative queue optimization methods have a poor effect and are difficult to apply. A production task queue optimization method based on multi-attribute evaluation is therefore proposed. According to the task attributes, a hierarchical multi-attribute model is established and the indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judges' importance degrees, and a trapezoid fuzzy scale rough AHP that accounts for the judges' importance degrees is then put forward. The balanced weight, which integrates the objective weight and the subjective weight, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing the Euclidean distance with a relative entropy distance, is used to sequence the tasks and optimize the queue according to the weighted indicator values. A case study is given to illustrate the method's correctness and feasibility.

  18. Production Task Queue Optimization Based on Multi-Attribute Evaluation for Complex Product Assembly Workshop

    PubMed Central

    Li, Lian-hui; Mo, Rong

    2015-01-01

    The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Man-made, qualitative queue optimization methods have a poor effect and are difficult to apply. A production task queue optimization method based on multi-attribute evaluation is therefore proposed. According to the task attributes, a hierarchical multi-attribute model is established and the indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judges' importance degrees, and a trapezoid fuzzy scale rough AHP that accounts for the judges' importance degrees is then put forward. The balanced weight, which integrates the objective weight and the subjective weight, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing the Euclidean distance with a relative entropy distance, is used to sequence the tasks and optimize the queue according to the weighted indicator values. A case study is given to illustrate the method's correctness and feasibility. PMID:26414758
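
    Both copies of this record rely on the CRITIC scheme for the objective criterion weights; the sketch below shows one common formulation (min-max normalise each criterion, then weight by its standard deviation times its summed lack of correlation with the other criteria). The decision matrix is made up, and the exact variant used by the authors may differ in detail.

        import numpy as np

        def critic_weights(X):
            """CRITIC objective weights for a decision matrix X
            (rows = alternatives, columns = benefit-type criteria)."""
            # min-max normalisation per criterion
            Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
            sigma = Z.std(axis=0, ddof=1)          # contrast intensity
            R = np.corrcoef(Z, rowvar=False)       # criterion correlations
            C = sigma * (1.0 - R).sum(axis=0)      # information content per criterion
            return C / C.sum()

        # Hypothetical tasks evaluated on three criteria (larger is better).
        X = np.array([[7.0, 0.60, 3.0],
                      [5.0, 0.90, 4.0],
                      [9.0, 0.40, 2.0],
                      [6.0, 0.75, 5.0]])
        print(critic_weights(X))   # weights sum to 1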

  19. An approach for aerodynamic optimization of transonic fan blades

    NASA Astrophysics Data System (ADS)

    Khelghatibana, Maryam

    Aerodynamic design optimization of transonic fan blades is a highly challenging problem due to the complexity of the flow field inside the fan, the conflicting design requirements and the high-dimensional design space. In order to address all these challenges, an aerodynamic design optimization method is developed in this study. This method automates the design process by integrating a geometrical parameterization method, a CFD solver and numerical optimization methods that can be applied to both single- and multi-point optimization design problems. A multi-level blade parameterization is employed to modify the blade geometry. Numerical analyses are performed by solving 3D RANS equations combined with the SST turbulence model. Genetic algorithms and hybrid optimization methods are applied to solve the optimization problem. In order to verify the effectiveness and feasibility of the optimization method, a single-point optimization problem aiming to maximize design efficiency is formulated and applied to redesign a test case. However, transonic fan blade design is inherently a multi-faceted problem that deals with several objectives such as efficiency, stall margin, and choke margin. The proposed multi-point optimization method in the current study is formulated as a bi-objective problem to maximize design and near-stall efficiencies while maintaining the required design pressure ratio. Enhancing these objectives significantly deteriorates the choke margin, specifically at high rotational speeds. Therefore, another constraint is embedded in the optimization problem in order to prevent the reduction of choke margin at high speeds. Since capturing stall inception is numerically very expensive, stall margin has not been considered as an objective in the problem statement. However, improving near-stall efficiency results in better performance at the stall condition, which could enhance the stall margin. An investigation is therefore performed on the Pareto-optimal solutions to demonstrate the relation between near-stall efficiency and stall margin. The proposed method is applied to redesign NASA rotor 67 for single and multiple operating conditions. The single-point design optimization showed a +0.28-point improvement in isentropic efficiency at the design point, while the design pressure ratio and mass flow are, respectively, within 0.12% and 0.11% of the reference blade. Two cases of multi-point optimization are performed: first, the proposed multi-point optimization problem is relaxed by removing the choke margin constraint in order to demonstrate the relation between near-stall efficiency and stall margin. An investigation of the Pareto-optimal solutions of this optimization shows that the stall margin increases with improving near-stall efficiency. The second multi-point optimization case is performed considering all the objectives and constraints. One selected optimized design on the Pareto front presents improvements of +0.41, +0.56 and +0.9 points in near-peak efficiency, near-stall efficiency and stall margin, respectively. The design pressure ratio and mass flow are, respectively, within 0.3% and 0.26% of the reference blade. Moreover, the optimized design maintains the required choke margin. Detailed aerodynamic analyses are performed to investigate the effect of shape optimization on shock occurrence, secondary flows, tip leakage and shock/tip-leakage interactions in both the single- and multi-point optimizations.

  20. Diagnosis and treatment of unconsummated marriage in an Iranian couple.

    PubMed

    Bokaie, Mahshid; Khalesi, Zahra Bostani; Yasini-Ardekani, Seyed Mojtaba

    2017-09-01

    Unconsummated marriage is a problem of couples who are unable to achieve natural sexual intercourse with vaginal penetration. This disorder is more common in developing countries, and couples sometimes resort to non-technical, non-scientific methods to overcome their problem. A multi-dimensional approach and narrative exposure therapy were used in this case. This study reports a case of unconsummated marriage in a couple after 6 years. The main problems of this couple were vaginismus and post-traumatic stress. Treatment with the multi-dimensional approach included narrative exposure therapy, educating the couple about the anatomy of the female and male reproductive systems, correcting misconceptions, educating about foreplay, body exploration and non-sexual and sexual massage, and penetrating the vagina first with the woman's finger and then the man's after relaxation. The entire treatment lasted four sessions, and at the one-month follow-up the couple's satisfaction was desirable. Unconsummated marriage is one of the main sexual problems; it is more common in developing countries than in developed countries, and cultural factors contribute to intensifying this disorder. The use of the multi-dimensional approach in this study led to expedited diagnosis and treatment of vaginismus.

  1. Lost in the Labyrinthine Library: A Multi-Method Case Study Investigating Public Library User Wayfinding Behavior

    ERIC Educational Resources Information Center

    Mandel, Lauren Heather

    2012-01-01

    Wayfinding is the method by which humans orient and navigate in space, and particularly in built environments such as cities and complex buildings, including public libraries. In order to wayfind successfully in the built environment, humans need information provided by wayfinding systems and tools, for instance architectural cues, signs, and…

  2. Multi-Stakeholder Dynamic Optimization Framework for System-of-Systems Development and Evolution

    NASA Astrophysics Data System (ADS)

    Fang, Zhemei

    Architecture design for an "acknowledged" System-of-Systems (SoS), under performance uncertainty and constrained resources, remains a difficult problem. Composing an SoS via a proper mix of systems under the special control structure of an "acknowledged" SoS requires efficient distribution of the limited resources. However, due to the special traits of SoS, achieving an efficient distribution of the resources is not a trivial challenge. Currently, the major causes that lead to inefficient resource management for an "acknowledged" SoS include: 1) no central SoS managers with absolute authority to address conflict; 2) difficult balance between current and future decisions; 3) various uncertainties during development and operations (e.g., technology maturation, policy stability); 4) diverse sources of the resources; 5) high complexity in efficient formulation and computation due to the previous four factors. Although it is beyond the scope of this dissertation to simultaneously address all the five items, the thesis will focus on the first, second, and fifth points, and partially cover the third point. In a word, the dissertation aims to develop a generic framework for "acknowledged" SoS that leads to appropriate mathematical formulation and a solution approach that generates a near-optimal set of multi-stage architectural decisions with limited collaboration between conflicted and independent stakeholders. This dissertation proposes a multi-stakeholder dynamic optimization (MUSTDO) method, which integrates approximate dynamic programming and transfer contract coordination mechanism. The method solves a multi-stage architecture selection problem with an embedded formal, but simple, transfer contract coordination mechanism to address resource conflict. Once the values of transfer contract are calculated appropriately, even though the SoS participants make independent decisions, the aggregate solutions are close to the solutions from a hypothetical ideal centralized case where the top-level SoS managers have full authority. In addition, the thesis builds the bridge between a given SoS problem and the mathematical interpretations of the MUSTDO method using a three-phase approach for real world applications. The method is applied to two case studies: one in the defense realm and one in the commercial realm. The first application uses a naval warfare scenario to demonstrate that the aggregated capabilities in the decentralized case using MUSTDO method are close to the aggregated capabilities in a hypothetical centralized case. This evidence demonstrates that the MUSTDO method can help approach the SoS-level optimality with limited funding resource even if the participants make independent decisions. The solution also provides suggestions to the participants about the sequence of architecting decisions and the amount of transfer contract to be sent out to maximize individual capability over time. The suggested decisions incorporate the potential capability increase in the future, which differentiates itself from allocating all the resources to the current development. The quantified numbers of transfer contract in this case study are equivalent capabilities that are relevant to equipment loan or technology transfer. The second case study applies the MUSTDO-based framework to address a multi-airline fleet allocation problem with emissions allowances constraint provided by the regulators. 
Two representative airlines, a low-cost airline and a legacy airline, aim to maximize individual profit by allocating six types of aircraft to a given ten-route network under the emissions constraint. Both the deterministic and stochastic experiments verify the effectiveness of the MUSTDO method by comparing the profit in the decentralized case with the profit in a utopian centralized case. Meanwhile, sensitivity studies demonstrate that a higher minimum demand requirement and a lower discount factor can further improve the efficiency of emissions-allowance utilization in the MUSTDO method. Compared to an alternative grandfathering approach, the MUSTDO method can guarantee high efficiency of resource allocation by avoiding failed allocation decisions due to inaccurate information available to the regulators. In summary, the framework aids the SoS managers and participants in the selection of the best architecture over a period of time with limited resources; the framework helps the decision makers to understand how they can affect each other and cooperate to achieve a more efficient solution without sharing full information. The major contributions of this dissertation include: 1) a method to address multi-stage SoS composition decisions over time with resource constraints; 2) a method to manage resource conflict for stakeholders in an "acknowledged" system-of-systems; 3) a new perspective on long-term interactions between stakeholders in an SoS; 4) a procedural framework to implement the MUSTDO method; and 5) a comparison of different applications of the MUSTDO framework in distinct fields.

  3. Based on a multi-agent system for multi-scale simulation and application of household's LUCC: a case study for Mengcha village, Mizhi county, Shaanxi province.

    PubMed

    Chen, Hai; Liang, Xiaoying; Li, Rui

    2013-01-01

    Multi-Agent Systems (MAS) offer a conceptual approach for including multi-actor decision making in models of land use change. Through a simulation based on MAS, this paper seeks to show the application of MAS to micro-scale LUCC and to reveal the transformation mechanism between different scales. The paper starts with a description of the context of MAS research. It then adopts the Nested Spatial Choice (NSC) method to construct a multi-scale LUCC decision-making model, and a case study for Mengcha village, Mizhi County, Shaanxi Province is reported. Finally, the potential and drawbacks of this approach are discussed. From our design and implementation of the MAS in a multi-scale model, a number of observations and conclusions can be drawn about the implementation and future research directions. (1) The use of the LUCC decision-making and multi-scale transformation framework provides, in our view, a more realistic modeling of the multi-scale decision-making process. (2) Using a continuous function, rather than a discrete function, to construct household decision-making reflects its effects more realistically. (3) In this paper, attempts have been made to analyze household interaction quantitatively, which provides the premise and foundation for researching communication and learning among households. (4) The scale-transformation architecture constructed in this paper helps to accumulate theory and experience for research on the interaction between micro land use decision-making and the macro land use landscape pattern. Our future research will focus on: (1) how to rationally apply the risk-aversion principle and incorporate the rule on rotation between household parcels into the model; (2) exploring methods for researching household decision-making over a long period, which would allow us to build a bridge between long-term LUCC data and short-term household decision-making; and (3) researching quantitative methods and models, especially scenario analysis models that can reflect the interaction among different household types.

  4. Teaching and Learning in Two iPad-Infused Classrooms: A Descriptive Case Study of A Dual Classroom, School-Based Pilot Project

    ERIC Educational Resources Information Center

    Maich, Kimberly; Hall, Carmen L.; van Rhijn, Tricia Marie; Henning, Megan

    2017-01-01

    This multi-methods, descriptive case study examines attitudes and practices of classroom-based iPad use. The site is one inner-city, urban, publicly funded school, focused on two iPad-infused classrooms (Grade 2/3 and Grade 4/5). Data were collected from 5 educators and 35 students to investigate two research questions: How are iPads being…

  5. How to use multi-criteria decision analysis methods for reimbursement decision-making in healthcare: a step-by-step guide.

    PubMed

    Diaby, Vakaramoko; Goeree, Ron

    2014-02-01

    In recent years, the quest for more comprehensiveness, structure and transparency in reimbursement decision-making in healthcare has prompted research into alternative decision-making frameworks. In this environment, multi-criteria decision analysis (MCDA) is emerging as a valuable tool to support healthcare decision-making. In this paper, we present the main MCDA decision support methods (elementary methods, value-based measurement models, goal programming models and outranking models) using a case study approach. For each family of methods, an example of how an MCDA model would operate in a real decision-making context is presented from a critical perspective, highlighting the parameter settings, the selection of the appropriate evaluation model and the role of sensitivity and robustness analyses. This study aims to provide a step-by-step guide on how to use MCDA methods for reimbursement decision-making in healthcare.
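
    As the simplest member of the "elementary methods" family covered by the guide, a weighted-sum (additive value) ranking can be sketched as follows; the options, criteria directions and weights are purely illustrative and not taken from the paper.

        import numpy as np

        # Hypothetical reimbursement options scored on three criteria.
        options = ["Treatment A", "Treatment B", "Treatment C"]
        X = np.array([[0.70, 120.0, 2.0],     # effectiveness, cost, adverse events
                      [0.55,  60.0, 1.0],
                      [0.80, 200.0, 4.0]])
        benefit = np.array([True, False, False])   # cost/adverse events: lower is better
        weights = np.array([0.5, 0.3, 0.2])        # assumed criterion weights

        # Min-max normalise, flipping the scale for cost-type criteria.
        Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
        Z[:, ~benefit] = 1.0 - Z[:, ~benefit]

        scores = Z @ weights
        for name, s in sorted(zip(options, scores), key=lambda p: -p[1]):
            print(f"{name}: {s:.3f}")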

  6. Demonstration of Multi- and Single-Reader Sample Size Program for Diagnostic Studies software.

    PubMed

    Hillis, Stephen L; Schartz, Kevin M

    2015-02-01

    The recently released software Multi- and Single-Reader Sample Size Program for Diagnostic Studies, written by Kevin Schartz and Stephen Hillis, performs sample size computations for diagnostic reader-performance studies. The program computes the sample size needed to detect a specified difference in a reader performance measure between two modalities, when using the analysis methods initially proposed by Dorfman, Berbaum, and Metz (DBM) and Obuchowski and Rockette (OR), and later unified and improved by Hillis and colleagues. A commonly used reader performance measure is the area under the receiver-operating-characteristic curve. The program can be used with typical reader-performance measures, which can be estimated parametrically or nonparametrically. The program has an easy-to-use, step-by-step, intuitive interface that walks the user through the entry of the needed information. Features of the software include the following: (1) choice of several study designs; (2) choice of inputs obtained from either OR or DBM analyses; (3) choice of three different inference situations: both readers and cases random, readers fixed and cases random, and readers random and cases fixed; (4) choice of two types of hypotheses: equivalence or noninferiority; (5) choice of two output formats: power for specified case and reader sample sizes, or a listing of case-reader combinations that provide a specified power; (6) choice of single- or multi-reader analyses; and (7) functionality in Windows, Mac OS, and Linux.

  7. a New Multi-Spectral Threshold Normalized Difference Water Index Mst-Ndwi Water Extraction Method - a Case Study in Yanhe Watershed

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Zhao, H.; Hao, H.; Wang, C.

    2018-05-01

    Accurate remote sensing water extraction is one of the primary tasks in watershed ecological environment studies. The Yanhe water system is characterized by a small water volume and narrow river channels, which makes water extraction difficult for conventional methods such as the Normalized Difference Water Index (NDWI). A new Multi-Spectral Threshold segmentation of the NDWI (MST-NDWI) water extraction method is proposed to achieve accurate water extraction in the Yanhe watershed. In the MST-NDWI method, the spectral characteristics of water bodies and typical backgrounds on the Landsat/TM images are evaluated for the Yanhe watershed. Multi-spectral thresholds (TM1, TM4, TM5) based on maximum likelihood are applied before the NDWI water extraction to realize a segmentation that separates built-up land and small linear rivers. With the proposed method, a water map is extracted from the 2010 Landsat/TM images of the study area in China. An accuracy assessment is conducted to compare the proposed method with conventional water indexes such as NDWI, the Modified NDWI (MNDWI), the Enhanced Water Index (EWI), and the Automated Water Extraction Index (AWEI). The results show that the MST-NDWI method achieves better water extraction accuracy in the Yanhe watershed and can effectively suppress confusing background objects compared with the conventional water indexes. The MST-NDWI method integrates NDWI with multi-spectral threshold segmentation, yielding richer information and markedly better accuracy for water extraction in the Yanhe watershed.
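
    A minimal numpy sketch of the general idea (simple per-band reflectance thresholds followed by the McFeeters NDWI test on the green and near-infrared bands) is given below; the actual MST-NDWI thresholds on TM1/TM4/TM5 were derived by maximum likelihood in the paper, so the threshold values and the tiny synthetic scene used here are placeholders only.

        import numpy as np

        def threshold_ndwi_mask(blue, green, nir, swir, ndwi_thresh=0.0,
                                blue_max=0.15, nir_max=0.12, swir_max=0.10):
            """Crude water mask: per-band thresholds (placeholder values) followed
            by the McFeeters NDWI test (green - NIR) / (green + NIR) > threshold."""
            ndwi = (green - nir) / (green + nir + 1e-12)   # avoid division by zero
            band_ok = (blue < blue_max) & (nir < nir_max) & (swir < swir_max)
            return band_ok & (ndwi > ndwi_thresh)

        # Tiny synthetic 2x2 scene (surface reflectance, 0-1); only the
        # top-left pixel should be classified as water.
        blue  = np.array([[0.05, 0.20], [0.18, 0.18]])
        green = np.array([[0.10, 0.25], [0.20, 0.22]])
        nir   = np.array([[0.04, 0.30], [0.26, 0.28]])
        swir  = np.array([[0.02, 0.25], [0.22, 0.24]])
        print(threshold_ndwi_mask(blue, green, nir, swir))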

  8. Community engagement to enhance trust between Gypsy/Travellers, and maternity, early years' and child dental health services: protocol for a multi-method exploratory study.

    PubMed

    McFadden, Alison; Atkin, Karl; Bell, Kerry; Innes, Nicola; Jackson, Cath; Jones, Helen; MacGillivray, Steve; Siebelt, Lindsay

    2016-11-14

    Gypsy/Travellers have poor health and experience discrimination alongside structural and cultural barriers when accessing health services and consequently may mistrust those services. Our study aims to investigate which approaches to community engagement are most likely to be effective at enhancing trust between Gypsy/Travellers and mainstream health services. This multi-method 30-month study commenced in June 2015 and comprises four stages. 1. Three related reviews: a) systematic review of Gypsy/Travellers' access to health services; b) systematic review of reviews of how trust has been conceptualised within healthcare; c) realist synthesis of community engagement approaches to enhance trust and increase Gypsy/Travellers' participation in health services. The reviews will consider any economic literature; 2. Online consultation with health and social care practitioners, and civil society organisations on existing engagement activities, including perceptions of barriers and good practice; 3. Four in-depth case studies of different Gypsy/Traveller communities, focusing on maternity, early years and child dental health services. The case studies include the views of 32-48 mothers of pre-school children, 32-40 healthcare providers and 8-12 informants from third sector organisations. 4. Two stakeholder workshops exploring whether policy options are realistic, sustainable and replicable. Case study data will be analysed thematically, informed by the evaluative framework derived from the realist synthesis in stage one. The main outputs will be: a) an evaluative framework of Gypsy/Travellers' engagement with health services; b) recommendations for policy and practice; c) evidence on which to base future implementation strategies including estimation of costs. Our novel multi-method study seeks to provide recommendations for policy and practice that have potential to improve uptake and delivery of health services, and to reduce lifetime health inequalities for Gypsy/Travellers. The findings may have wider resonance for other marginalised populations. Strengths and limitations of the study are discussed. Prospero registration for literature reviews: CRD42015021955 and CRD42015021950. UKCRN reference: 20036.

  9. Implementation of the Geological Hazard Monitoring and Early Warning System Based on Multi-source Data - A Case Study of Deqin Tibetan County, Yunnan Province

    NASA Astrophysics Data System (ADS)

    Zhao, Junsan; Chen, Guoping; Yuan, Lei

    2017-04-01

    New technologies such as 3D laser scanning, InSAR, GNSS, unmanned aerial vehicles and the Internet of Things provide much richer data resources for surveying and monitoring, as well as for the development of Early Warning Systems (EWS). This paper presents the design and implementation of a geological disaster monitoring and early warning system (GDMEWS) covering landslide and debris-flow hazards, based on multi-source data acquired with the technologies mentioned above. The complex and changeable characteristics of the GDMEWS are described. The architecture of the system, the composition of the multi-source database, the development mode and service logic, and the methods and key technologies of system development are also analyzed. To elaborate the implementation process of the GDMEWS, Deqin Tibetan County, with its unique terrain and diverse types of typical landslides and debris flows, is selected as the case study area. Firstly, the functional requirements and the monitoring and forecasting models of the system are discussed. Secondly, the logical relationships of the whole disaster process, including pre-disaster preparation, disaster rescue and post-disaster reconstruction, are studied, and support tools for disaster prevention, disaster reduction and geological disaster management are developed. Thirdly, the methods for multi-source monitoring data integration and for generating and simulating geological hazard mechanism models are described. Finally, the construction of the GDMEWS is presented; the system will be applied to the management, monitoring and forecasting of the whole disaster process in real time and dynamically in Deqin Tibetan County. Keywords: multi-source spatial data; geological disaster; monitoring and warning system; Deqin Tibetan County

  10. Leadership and the Design of Data-Driven Professional Networks in Schools

    ERIC Educational Resources Information Center

    Liou, Yi-Hwa; Grigg, Jeffrey; Halverson, Richard

    2014-01-01

    Using data from a multi-method comparative case study of two matched schools, this paper adds to the growing body of applications of social network analysis to the study of distributed leadership and accountability. We contrast two approaches to instructional leadership, prescriptive and discretionary, to investigate how leaders design…

  11. Impactful Student Learning Outcomes of One-to-One Student Laptop Programs in Low Socioeconomic Schools

    ERIC Educational Resources Information Center

    Harris, Matthew Joseph

    2010-01-01

    At present, a majority of one-to-one student laptop programs exist in schools that serve affluent communities, which denies low socioeconomic students the learning benefits of ubiquitous access to technology. Using a "Studying Up-Studying Down" paradigm, this multi-site case study collected mixed method data from program participants at five…

  12. An enhanced SOCP-based method for feeder load balancing using the multi-terminal soft open point in active distribution networks

    DOE PAGES

    Ji, Haoran; Wang, Chengshan; Li, Peng; ...

    2017-09-20

    The integration of distributed generators (DGs) exacerbates the feeder power flow fluctuation and load unbalanced condition in active distribution networks (ADNs). The unbalanced feeder load causes inefficient use of network assets and network congestion during system operation. The flexible interconnection based on the multi-terminal soft open point (SOP) significantly benefits the operation of ADNs. The multi-terminal SOP, which is a controllable power electronic device installed to replace the normally open point, provides accurate active and reactive power flow control to enable the flexible connection of feeders. An enhanced SOCP-based method for feeder load balancing using the multi-terminal SOP is proposed in this paper. Furthermore, by regulating the operation of the multi-terminal SOP, the proposed method can mitigate the unbalanced condition of feeder load and simultaneously reduce the power losses of ADNs. Then, the original non-convex model is converted into a second-order cone programming (SOCP) model using convex relaxation. In order to tighten the SOCP relaxation and improve the computation efficiency, an enhanced SOCP-based approach is developed to solve the proposed model. Finally, case studies are performed on the modified IEEE 33-node system to verify the effectiveness and efficiency of the proposed method.
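
    The core modelling step in this record (and in the OSTI copy that follows) is the second-order cone relaxation of the branch power-flow equations. The cvxpy sketch below shows that relaxation for a single generic branch only; it is not the authors' multi-terminal-SOP feeder-balancing model, and all network data are invented.

        import cvxpy as cp

        # One branch with impedance r + jx feeding a load P_L + jQ_L (per unit).
        r, x = 0.02, 0.04
        v0 = 1.0                  # squared voltage magnitude at the sending end
        P_L, Q_L = 0.8, 0.3

        P = cp.Variable()               # sending-end active power flow
        Q = cp.Variable()               # sending-end reactive power flow
        l = cp.Variable(nonneg=True)    # squared branch current magnitude
        v1 = cp.Variable(nonneg=True)   # squared receiving-end voltage magnitude

        constraints = [
            P == P_L + r * l,                                    # active power balance
            Q == Q_L + x * l,                                    # reactive power balance
            v1 == v0 - 2 * (r * P + x * Q) + (r**2 + x**2) * l,  # voltage drop
            # SOC relaxation of the nonconvex equality l * v0 == P^2 + Q^2:
            cp.norm(cp.hstack([2 * P, 2 * Q, l - v0])) <= l + v0,
            v1 >= 0.9**2,                                        # lower voltage limit
        ]

        prob = cp.Problem(cp.Minimize(r * l), constraints)       # minimise branch losses
        prob.solve()
        print(prob.status, float(l.value), float(v1.value))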

  13. An enhanced SOCP-based method for feeder load balancing using the multi-terminal soft open point in active distribution networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ji, Haoran; Wang, Chengshan; Li, Peng

    The integration of distributed generators (DGs) exacerbates the feeder power flow fluctuation and load unbalanced condition in active distribution networks (ADNs). The unbalanced feeder load causes inefficient use of network assets and network congestion during system operation. The flexible interconnection based on the multi-terminal soft open point (SOP) significantly benefits the operation of ADNs. The multi-terminal SOP, which is a controllable power electronic device installed to replace the normally open point, provides accurate active and reactive power flow control to enable the flexible connection of feeders. An enhanced SOCP-based method for feeder load balancing using the multi-terminal SOP is proposed in this paper. Furthermore, by regulating the operation of the multi-terminal SOP, the proposed method can mitigate the unbalanced condition of feeder load and simultaneously reduce the power losses of ADNs. Then, the original non-convex model is converted into a second-order cone programming (SOCP) model using convex relaxation. In order to tighten the SOCP relaxation and improve the computation efficiency, an enhanced SOCP-based approach is developed to solve the proposed model. Finally, case studies are performed on the modified IEEE 33-node system to verify the effectiveness and efficiency of the proposed method.

  14. On controllability of homogeneous and inhomogeneous discrete-time multi-input bilinear systems in dimension two

    NASA Astrophysics Data System (ADS)

    Tie, Lin

    2017-08-01

    In this paper, the controllability problem of two-dimensional discrete-time multi-input bilinear systems is completely solved. The homogeneous and the inhomogeneous cases are studied separately and necessary and sufficient conditions for controllability are established by using a linear algebraic method, which are easy to apply. Moreover, for the uncontrollable systems, near-controllability is considered and similar necessary and sufficient conditions are also obtained. Finally, examples are provided to demonstrate the results of this paper.

  15. Somatic Consequences and Symptomatic Responses to Stress: Directions for Future Research

    DTIC Science & Technology

    1999-07-01

    endeavors, some early work in developing multi-method, multi-source assessment approaches for identifying cases of PTSD; some clinical studies ... research dealing with the entire concept of the cultural shaping of what he calls the illness narrative and the way in which this tends to control the ... talk for five to ten minutes about the pattern of the research you've been doing and the directions it's been going in and the directions you think it

  16. A Multi-center Milestone Study of Clinical Vertebral CT Segmentation

    PubMed Central

    Yao, Jianhua; Burns, Joseph E.; Forsberg, Daniel; Seitel, Alexander; Rasoulian, Abtin; Abolmaesumi, Purang; Hammernik, Kerstin; Urschler, Martin; Ibragimov, Bulat; Korez, Robert; Vrtovec, Tomaž; Castro-Mateos, Isaac; Pozo, Jose M.; Frangi, Alejandro F.; Summers, Ronald M.; Li, Shuo

    2017-01-01

    A multiple center milestone study of clinical vertebra segmentation is presented in this paper. Vertebra segmentation is a fundamental step for spinal image analysis and intervention. The first half of the study was conducted as the spine segmentation challenge at the 2014 International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) Workshop on Computational Spine Imaging (CSI 2014). The objective was to evaluate the performance of several state-of-the-art vertebra segmentation algorithms on computed tomography (CT) scans using ten training and five testing datasets, all healthy cases; the second half of the study was conducted after the challenge, where an additional five abnormal cases were used for testing to evaluate performance on abnormal cases. Dice coefficients and absolute surface distances were used as evaluation metrics. Segmentation of each vertebra as a single geometric unit, as well as separate segmentation of vertebra substructures, was evaluated. Five teams participated in the comparative study. The top performers achieved Dice coefficients of 0.93 in the upper thoracic, 0.95 in the lower thoracic and 0.96 in the lumbar spine for healthy cases, and 0.88 in the upper thoracic, 0.89 in the lower thoracic and 0.92 in the lumbar spine for osteoporotic and fractured cases. The strengths and weaknesses of each method as well as suggestions for future improvement are discussed. This is the first multi-center comparative study of vertebra segmentation methods and provides an up-to-date performance milestone for the fast-growing field of spinal image analysis and intervention. PMID:26878138
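
    The headline metric in this record, the Dice coefficient, is straightforward to compute from binary masks; a minimal numpy version is given below (the absolute surface-distance metric is omitted, and the toy masks are invented).

        import numpy as np

        def dice_coefficient(seg, ref):
            """Dice overlap 2*|A and B| / (|A| + |B|) between two binary masks."""
            seg = seg.astype(bool)
            ref = ref.astype(bool)
            denom = seg.sum() + ref.sum()
            if denom == 0:
                return 1.0   # both masks empty: define as perfect agreement
            return 2.0 * np.logical_and(seg, ref).sum() / denom

        # Toy 2D example: two slightly shifted "vertebra" masks.
        a = np.zeros((8, 8), dtype=bool); a[2:6, 2:6] = True
        b = np.zeros((8, 8), dtype=bool); b[3:7, 2:6] = True
        print(round(dice_coefficient(a, b), 3))   # 0.75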

  17. Technology selection for ballast water treatment by multi-stakeholders: A multi-attribute decision analysis approach based on the combined weights and extension theory.

    PubMed

    Ren, Jingzheng

    2018-01-01

    The objective of this study is to develop a generic multi-attribute decision analysis framework for ranking technologies for ballast water treatment and determining their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies for ballast water treatment. The Best-Worst method, a subjective weighting method, and the Criteria Importance Through Inter-criteria Correlation (CRITIC) method, an objective weighting method, were combined to determine the weights of the evaluation criteria. Extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case including four technologies for ballast water treatment, i.e. Alfa Laval (T1), Hyde (T2), Unitor (T3), and NaOH (T4), was studied with the proposed method, and Hyde (T2) was recognized as the best technology. Sensitivity analysis was also carried out to investigate the effects of the combination coefficients and the weights of the evaluation criteria on the final priority order of the four technologies for ballast water treatment. The weighted sum method and TOPSIS were also employed to rank the four technologies, and the results determined by these two methods are consistent with those determined by the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Pedagogical Practices: The Case of Multi-Class Teaching in Fiji Primary School

    ERIC Educational Resources Information Center

    Lingam, Govinda I.

    2007-01-01

    Multi-class teaching is a common phenomenon in small schools not only in Fiji, but also in many countries. The aim of the present study was to determine the teaching styles adopted by teachers in the context of multi-class teaching. A qualitative case study research design was adopted. This included a school with multi-class teaching as the norm.…

  19. Practical aspects and applications of the biological effective dose three-dimensional calculation for multi-phase radiotherapy treatment plans

    NASA Astrophysics Data System (ADS)

    Kauweloa, Kevin Ikaika

    The approximate BED (BEDA) is calculated for multi-phase cases because current treatment planning systems (TPSs) are incapable of performing BED calculations. There has been no study of the mathematical accuracy and precision of BEDA relative to the true BED (BEDT), and of how that might negatively impact patient care. The purpose of the first aim was to study the mathematical accuracy and precision in both hypothetical and clinical situations, while the next two aims were to create multi-phase BED optimization ideas for multi-target liver stereotactic body radiation therapy (SBRT) cases and for gynecological cases where patients are treated with high-dose-rate (HDR) brachytherapy along with external beam radiotherapy (EBRT). MATLAB algorithms created for this work were used to mathematically analyze the accuracy and precision of BEDA relative to BEDT in both hypothetical and clinical situations on a 3D basis. The organs-at-risk (OARs) of ten head & neck and ten prostate cancer patients were studied for the clinical situations. The accuracy of BEDA was shown to vary between OARs as well as between patients. The percentages of patients with an overall BEDA percent error less than 1% were 50% for the Optic Chiasm and Brainstem, 70% for the Left and Right Optic Nerves as well as the Rectum and Bladder, and 80% for the Normal Brain and Spinal Cord. For each OAR, there were always patients for whom the percent error was greater than 1%. This is a cause for concern since the goal of radiation therapy is to reduce the overall uncertainty of treatment, and calculating BEDA distributions increases the treatment uncertainty when percent errors exceed 1%. The revealed inaccuracy and imprecision of BEDA support the argument for using BEDT. The multi-target liver study involved applying BEDT in order to reduce the number of dose limits to one, rather than having one for each fractionation scheme in multi-target liver SBRT treatments. A BEDT limit was found using the current, clinically accepted dose limits, allowing the BEDT distributions to be calculated, which could then be used to determine whether at least 700 cc of the healthy liver stayed below the BEDT limit. Three previously treated multi-target liver cancer patients were studied. For each case, it was shown that the conventional treatment plans were relatively conservative and that more than 700 cc of the healthy liver received less than the BEDT limit. These results show that greater doses can be delivered to the targets without exceeding the BEDT limit for the healthy tissue, above which radiation toxicity typically occurs. When applying BEDT to gynecological cases, the BEDT can reveal the relative effect each treatment would have individually; hence, the cumulative BEDT would better inform the physician of the potential outcome of the patient's treatment. The problem presented by these cases, however, is how to sum the dose distributions when there is significant motion between treatments and applicators are present during the HDR phase. One way to calculate the cumulative BEDT is to use structure-guided deformable image registration (SG-DIR) that focuses only on the anatomical contours, to avoid errors introduced by the applicators. Eighteen gynecological patients were studied, and VelocityAI was used to perform this SG-DIR. In addition, a formalism was developed to assess and characterize the remnant dose-mapping error of this approach based on the shortest distance between contour points (SDBP).
The results revealed that the warping errors resulted in relatively large normal tissue complication probability (NTCP) values, which are certainly non-negligible and do render this method clinically non-viable. However, a more accurate SG-DIR algorithm could improve the accuracy of BEDT distributions in these multi-phase cases.
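
    For orientation, the linear-quadratic biologically effective dose underlying both BEDA and BEDT can be written per voxel as follows (LaTeX; n fractions of dose d per fraction, tissue-specific alpha/beta ratio):

        \mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right),
        \qquad
        \mathrm{BED}_T = \sum_{p=1}^{P} n_p\, d_p \left(1 + \frac{d_p}{\alpha/\beta}\right),

    where the second expression sums the contributions of the P treatment phases using each phase's own dose per fraction d_p; the precise definition of the approximate quantity BEDA follows the dissertation and is not restated here.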

  20. Undergraduate Psychology Students' Experiences with Creative Drama: A Multi-Case Study

    ERIC Educational Resources Information Center

    Wilcox, Ruth A.

    2015-01-01

    This qualitative multi-case study explored undergraduate psychology students' experiences participating in creative drama activities the instructor/researcher developed to teach psychological concepts. The study was conducted in three introductory and developmental courses in a mid-western community college setting. Participants (cases) included…

  1. Earth's rotation irregularities derived from UTIBLI by method of multi-composing of ordinates

    NASA Astrophysics Data System (ADS)

    Segan, S.; Damjanov, I.; Surlan, B.

    Using the method of multi-composing of ordinates we have identified in Earth's rotation a long-periodic term with a period similar to the relaxation time of Chandler nutation. There was not enough information to assess its origin. We demonstrate that the method can be used even in the case when the data time span is comparable to the period of harmonic component.

  2. Integrated seismic stochastic inversion and multi-attributes to delineate reservoir distribution: Case study MZ fields, Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.

    2017-07-01

    This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and the Top Pematang. The method used is stochastic inversion integrated with seismic multi-attribute analysis applying a Probabilistic Neural Network (PNN). The stochastic method is used to predict the sandstone probability map, with the impedance varied over 50 realizations to produce a robust probability estimate. Stochastic seismic inversion is more interpretive because it directly gives the value of the rock property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion captures more diverse uncertainty, so that the probability values are closer to the actual values. The resulting AI is then used as an input to a multi-attribute analysis, which is used to predict the gamma ray, density and porosity logs. A stepwise regression algorithm is applied to select the attributes to be used, and the selected attributes enter the PNN process. The PNN method is chosen because it gives the best correlation among the neural network methods tested. Finally, we interpret the products of the multi-attribute analysis, in the form of pseudo-gamma-ray, density and pseudo-porosity volumes, to delineate the reservoir distribution. Our interpretation shows that the structural trap is identified in the southeastern part of the study area, along the anticline.

  3. Healthcare managers' decision making: findings of a small scale exploratory study.

    PubMed

    Macdonald, Jackie; Bath, Peter A; Booth, Andrew

    2008-12-01

    Managers who work in publicly funded healthcare organizations are an understudied group. Some of the influences on their decisions may be unique to healthcare. This study considers how to integrate research knowledge effectively into healthcare managers' decision making, and how to manage and integrate information that will include community data. This first phase in a two-phase mixed-methods research study used a qualitative, multiple case study design. Nineteen semi-structured interviews were undertaken using the critical incident technique. Interview transcripts were analysed using the NatCen Framework. One theme represented 'information and decisions'. Cases were determined to involve complex multi-level, multi-situational decisions with participants in practical rather than ceremonial work roles. Most considered organizational knowledge in the first two decision phases and external knowledge, including research, in the third phase. All participants engaged in satisficing to some degree.

  4. Boundary value problems for multi-term fractional differential equations

    NASA Astrophysics Data System (ADS)

    Daftardar-Gejji, Varsha; Bhalekar, Sachin

    2008-09-01

    The multi-term fractional diffusion-wave equation, along with homogeneous/non-homogeneous boundary conditions, has been solved using the method of separation of variables. It is observed that, unlike in the one-term case, the solution of the multi-term fractional diffusion-wave equation is not necessarily non-negative, and hence does not represent anomalous diffusion of any kind.
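
    A representative form of the equation discussed here, written in LaTeX with Caputo time-fractional derivatives of orders 0 < alpha_m <= 2, is

        \sum_{m=1}^{M} \lambda_m \, {}^{C}D_t^{\alpha_m} u(x,t) = \frac{\partial^2 u}{\partial x^2}(x,t), \qquad \lambda_m > 0,

    subject to homogeneous or non-homogeneous boundary conditions and solved by separation of variables; the exact notation and normalisation used in the paper may differ.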

  5. Multislice spiral CT simulator for dynamic cardiopulmonary studies

    NASA Astrophysics Data System (ADS)

    De Francesco, Silvia; Ferreira da Silva, Augusto M.

    2002-04-01

    We have developed a multi-slice spiral CT simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardio-pulmonary studies (with single-/multi-slice and ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, the reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single- and multi-slice); the reconstruction of the section is then performed through filtered back-projection (FBP). The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object. The simulating environment allows us to investigate the nature of this distortion and to characterize it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on the determination of adequate temporal sampling and sinogram regularization techniques. At the moment, the simulator is limited to the multi-slice tomograph case; extension to cone-beam or area detectors is planned as the next development step.

  6. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  7. MRMC analysis of agreement studies

    NASA Astrophysics Data System (ADS)

    Gallas, Brandon D.; Anam, Amrita; Chen, Weijie; Wunderlich, Adam; Zhang, Zhiwei

    2016-03-01

    The purpose of this work is to present and evaluate methods based on U-statistics to compare intra- or inter-reader agreement across different imaging modalities. We apply these methods to multi-reader multi-case (MRMC) studies. We measure reader-averaged agreement and estimate its variance accounting for the variability from readers and cases (an MRMC analysis). In our application, pathologists (readers) evaluate patient tissue mounted on glass slides (cases) in two ways. They evaluate the slides on a microscope (reference modality) and they evaluate digital scans of the slides on a computer display (new modality). In the current work, we consider concordance as the agreement measure, but many of the concepts outlined here apply to other agreement measures. Concordance is the probability that two readers rank two cases in the same order. Concordance can be estimated with a U-statistic and thus it has some nice properties: it is unbiased, asymptotically normal, and its variance is given by an explicit formula. Another property of a U-statistic is that it is symmetric in its inputs; it doesn't matter which reader is listed first or which case is listed first, the result is the same. Using this property and a few tricks while building the U-statistic kernel for concordance, we get a mathematically tractable problem and efficient software. Simulations show that our variance and covariance estimates are unbiased.
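
    As a concrete, hedged illustration of the concordance estimate described above, the following Python sketch builds the U-statistic over all case pairs for two readers and averages it over reader pairs; the score matrix is hypothetical, and ties are simply discarded here, which the full MRMC analysis of the paper does not necessarily do.

      import itertools
      import numpy as np

      def concordance(scores_a, scores_b):
          # U-statistic estimate of concordance between two readers: the fraction
          # of case pairs that both readers rank in the same order (ties dropped).
          agree, total = 0, 0
          for i, j in itertools.combinations(range(len(scores_a)), 2):
              da = np.sign(scores_a[i] - scores_a[j])
              db = np.sign(scores_b[i] - scores_b[j])
              if da != 0 and db != 0:
                  total += 1
                  agree += (da == db)
          return agree / total if total else np.nan

      def reader_averaged_concordance(score_matrix):
          # Average concordance over all reader pairs; rows = readers, columns = cases.
          pairs = itertools.combinations(range(score_matrix.shape[0]), 2)
          return np.mean([concordance(score_matrix[r], score_matrix[s]) for r, s in pairs])

      # Hypothetical MRMC data: 4 readers scoring 10 cases
      rng = np.random.default_rng(1)
      scores = rng.normal(size=(4, 10))
      print(reader_averaged_concordance(scores))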

  8. Layering, interface and edge effects in multi-layered composite medium

    NASA Technical Reports Server (NTRS)

    Datta, S. K.; Shah, A. H.; Karunesena, W.

    1990-01-01

    Guided waves in a cross-ply laminated plate are studied. Because of the complexity of the exact dispersion equation that governs the wave propagation in a multi-layered fiber-reinforced plate, a stiffness method that can be applied to any number of layers is presented. It is shown that, for a sufficiently large number of layers, the plate can be modeled as a homogeneous anisotropic plate. Also studied is the reflection of guided waves from the edge of a multilayered plate. These results are quite different than in the case of a single homogeneous plate.

  9. The Worst-Case Weighted Multi-Objective Game with an Application to Supply Chain Competitions.

    PubMed

    Qu, Shaojian; Ji, Ying

    2016-01-01

    In this paper, we propose a worst-case weighted approach to the multi-objective n-person non-zero sum game model where each player has more than one competing objective. Our "worst-case weighted multi-objective game" model supposes that each player has a set of weights for its objectives and wishes to minimize its maximum weighted sum of objectives, where the maximization is with respect to the set of weights. This new model gives rise to a new Pareto Nash equilibrium concept, which we call "robust-weighted Nash equilibrium". We prove that robust-weighted Nash equilibria are guaranteed to exist even when the weight sets are unbounded. For the worst-case weighted multi-objective game with the weight sets of all players given as polytopes, we show that a robust-weighted Nash equilibrium can be obtained by solving a mathematical program with equilibrium constraints (MPEC). As an application, we illustrate the usefulness of the worst-case weighted multi-objective game on a supply chain risk management problem under demand uncertainty. By comparison with the existing weighted approach, we show that our method is more robust and can be used more efficiently for real-world applications.
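
    In symbols, the worst-case weighted objective of player i described above can be sketched as follows, with X_i the strategy set, W_i the weight set and f_{ik} the k-th objective of player i (notation assumed here for illustration):

      \min_{x_i \in X_i} \; \max_{w_i \in W_i} \; \sum_{k=1}^{K_i} w_{ik} \, f_{ik}(x_i, x_{-i}),

    and a robust-weighted Nash equilibrium is a strategy profile at which no player can lower its worst-case weighted sum by unilaterally changing x_i.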

  10. Advanced technology development multi-color holography

    NASA Technical Reports Server (NTRS)

    Vikram, Chandra S.

    1993-01-01

    This is the final report of the Multi-color Holography project. The comprehensive study considers some strategic aspects of multi-color holography. First, the available techniques for accurate fringe counting are reviewed: heterodyne interferometry, quasi-heterodyne interferometry, and phase-shifting interferometry. Phase-shifting interferometry was found to be the most suitable for multi-color holography. Details of experimentation with a sugar solution are also reported, where a measurement capability of better than 1/200 of a fringe order was established. A rotating glass-plate phase shifter was used for the experimentation. The report then describes the possible role of using more than two wavelengths, with special attention to the reference-to-object beam intensity ratio needs in multi-color holography. Some specific two- and three-color cases are also described in detail. Next, some new analysis methods for the reconstructed wavefront are considered: deflectometry, speckle metrology, confocal optical signal processing, and applications related to the phase-shifting technique. Finally, design aspects of an experimental breadboard are presented.
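
    As a reminder of how phase-shifting interferometry recovers fractional fringe orders, a generic four-step formula is given below; it is a standard textbook form offered as a hedged illustration, not necessarily the exact variant used in the project:

      \varphi(x, y) = \arctan\!\left( \frac{I_4(x,y) - I_2(x,y)}{I_1(x,y) - I_3(x,y)} \right),

    where I_1, ..., I_4 are intensity frames recorded with reference-phase shifts of 0, \pi/2, \pi and 3\pi/2 (introduced, for example, by stepping a rotating glass-plate phase shifter).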

  11. Lean Manufacturing Principles Improving the Targeting Process

    DTIC Science & Technology

    2012-06-08

    author has familiarity with Lean manufacturing principles. Third, Lean methods have been used in different industries and have proven adaptable to the... The case study also demonstrates the multi-organizational application of VSM, JIT and the 5S method... new members not knowing the process, this will serve as a start point for the developing of understanding. Within the Food industry we observed “the

  12. Decision Rules Used in Academic Program Closure: Where the Rubber Meets the Road.

    ERIC Educational Resources Information Center

    Eckel, Peter D.

    This study examines, from an organizational perspective, decision rules guiding program discontinuance, testing the framework of decision rule rationality versus action rationality. A multi-site case study method was used; interviews were conducted with 11-16 individuals at each of four research I or II universities that had discontinued at least…

  13. A Discursive Formation that Undermined Integration at a Historically Advantaged School in South Africa

    ERIC Educational Resources Information Center

    Naidoo, Devika

    2010-01-01

    This paper provides an analysis of the extent of integration at a historically advantaged school. A qualitative multi-method case study allowed for in-depth analysis of integration in the school. Bernstein's theory of code, classification, boundary and power framed the study. Data analysis showed that: racial desegregation was achieved at student…

  14. Frontline Leaders: The Entry Point for Leadership Development in the Manufacturing Industry

    ERIC Educational Resources Information Center

    Liu, Lucy; McMurray, Adela J.

    2004-01-01

    This multi-method case study examined the roles, functions, capabilities, job satisfaction, strengths, weaknesses and skill gaps of frontline team leaders working on the shopfloor in the Australian automobile industry. The study was conducted in a large automobile manufacturing company employing 4,500 employees and rated as one of the top 22…

  15. Large Margin Multi-Modal Multi-Task Feature Extraction for Image Classification.

    PubMed

    Yong Luo; Yonggang Wen; Dacheng Tao; Jie Gui; Chao Xu

    2016-01-01

    The features used in many image analysis-based applications are frequently of very high dimension. Feature extraction offers several advantages in high-dimensional cases, and many recent studies have used multi-task feature extraction approaches, which often outperform single-task feature extraction approaches. However, most of these methods are limited in that they only consider data represented by a single type of feature, even though features usually represent images from multiple modalities. We, therefore, propose a novel large margin multi-modal multi-task feature extraction (LM3FE) framework for handling multi-modal features for image classification. In particular, LM3FE simultaneously learns the feature extraction matrix for each modality and the modality combination coefficients. In this way, LM3FE not only handles correlated and noisy features, but also utilizes the complementarity of different modalities to further help reduce feature redundancy in each modality. The large margin principle employed also helps to extract strongly predictive features, so that they are more suitable for prediction (e.g., classification). An alternating algorithm is developed for problem optimization, and each subproblem can be efficiently solved. Experiments on two challenging real-world image data sets demonstrate the effectiveness and superiority of the proposed method.

  16. Mediating Factors in Literacy Instruction: How Novice Elementary Teachers Navigate New Teaching Contexts

    ERIC Educational Resources Information Center

    Scales, Roya Qualls; Wolsey, Thomas DeVere; Young, Janet; Smetana, Linda; Grisham, Dana L.; Lenski, Susan; Dobler, Elizabeth; Yoder, Karen Kreider; Chambers, Sandra A.

    2017-01-01

    This longitudinal study, framed by activity theory, examines what seven novice teachers' talk and actions reveal about their literacy teaching practices then delves into mediating influences of the teaching context. Utilizing collective, multi-case methods, data sources included interviews, observations, and artifacts. Findings indicate novices…

  17. Multi-objective game-theory models for conflict analysis in reservoir watershed management.

    PubMed

    Lee, Chih-Sheng

    2012-05-01

    This study focuses on the development of a multi-objective game-theory model (MOGM) for balancing economic and environmental concerns in reservoir watershed management and for assisting decision making. Game theory is used as an alternative tool for analyzing the strategic interaction between economic development (land use and development) and environmental protection (water-quality protection and eutrophication control). A geographic information system is used to concisely illustrate and calculate the areas of various land use types. The MOGM methodology is illustrated in a case study of multi-objective watershed management in the Tseng-Wen reservoir, Taiwan. The innovation and advantages of MOGM can be seen in the results, which balance economic and environmental concerns in watershed management and which can be interpreted easily by decision makers. For comparison, the decision-making process using a conventional multi-objective method to produce many alternatives was found to be more difficult. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Application of dragonfly algorithm for optimal performance analysis of process parameters in turn-mill operations- A case study

    NASA Astrophysics Data System (ADS)

    Vikram, K. Arun; Ratnam, Ch; Lakshmi, VVK; Kumar, A. Sunny; Ramakanth, RT

    2018-02-01

    Meta-heuristic multi-response optimization methods are widely used to solve multi-objective problems and obtain Pareto optimal solutions. This work focuses on the multi-response evaluation of process parameters in generating responses such as surface roughness (Ra), surface hardness (H) and tool vibration displacement amplitude (Vib) while performing tangential and orthogonal turn-mill operations on an A-axis Computer Numerical Control vertical milling center. Tool speed, feed rate and depth of cut are considered as process parameters; machining is carried out on brass under dry conditions with high-speed steel end-mill cutters using a Taguchi design of experiments (DOE). A meta-heuristic, the dragonfly algorithm, is used to optimize the objectives ‘Ra’, ‘H’ and ‘Vib’ and to identify the optimal multi-response combination of process parameters. The results obtained from the multi-objective dragonfly algorithm (MODA) are then compared with another multi-response optimization technique, grey relational analysis (GRA).

  19. Methodological Considerations for an Evolving Model of Institutional Research.

    ERIC Educational Resources Information Center

    Jones, Timothy B.; Essien-Barrett, Barbara; Gill, Peggy B.

    A multi-case study was used in the self-study of three programs within an academic department of a mid-sized Southern university. Multi-case methodology as a form of self-study encourages a process of self-renewal and programmatic change as it defines an active stakeholder role. The participants in the three case studies were university faculty…

  20. Full multi grid method for electric field computation in point-to-plane streamer discharge in air at atmospheric pressure

    NASA Astrophysics Data System (ADS)

    Kacem, S.; Eichwald, O.; Ducasse, O.; Renon, N.; Yousfi, M.; Charrada, K.

    2012-01-01

    Streamer dynamics are characterized by the fast propagation of ionized shock waves at the nanosecond scale under very sharp space-charge variations. Streamer modelling requires the solution of charged-particle transport equations coupled to the elliptic Poisson equation, which has to be solved at each time step of the streamer evolution in order to follow the propagation of the resulting space-charge electric field. In the present paper, full multigrid (FMG) and multigrid (MG) methods have been adapted to solve Poisson's equation for streamer discharge simulations between asymmetric electrodes. The validity of the FMG method for the computation of the potential field is first shown by direct comparisons with the analytic solution of the Laplacian potential in a point-to-plane geometry. The efficiency of the method is also compared with the classical successive over-relaxation (SOR) method and the MUltifrontal Massively Parallel Solver (MUMPS). The MG method is then applied to the simulation of positive streamer propagation, and its efficiency is evaluated against the SOR and MUMPS methods in the chosen point-to-plane configuration. Very good agreement is obtained between the three methods for all electro-hydrodynamic characteristics of the streamer during its propagation in the inter-electrode gap. However, the MG method solves Poisson's equation at least two times faster in our simulation conditions.
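
    Since the record centers on multigrid solution of Poisson's equation, a compact Python sketch of a textbook geometric multigrid V-cycle for the 1D Poisson problem is given below (weighted-Jacobi smoothing, full-weighting restriction, linear prolongation). It is a hedged illustration of the general technique, not the 2D point-to-plane FMG solver of the paper.

      import numpy as np

      def smooth(u, f, h, sweeps=3, omega=2.0 / 3.0):
          # Weighted-Jacobi sweeps for -u'' = f with homogeneous Dirichlet BCs.
          for _ in range(sweeps):
              u[1:-1] = ((1 - omega) * u[1:-1]
                         + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]))
          return u

      def residual(u, f, h):
          r = np.zeros_like(u)
          r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
          return r

      def restrict(r):
          # Full weighting: coarse point i sits on fine point 2i.
          rc = np.zeros((len(r) - 1) // 2 + 1)
          rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
          return rc

      def prolong(ec):
          # Linear interpolation of the coarse-grid correction back to the fine grid.
          e = np.zeros(2 * (len(ec) - 1) + 1)
          e[::2] = ec
          e[1::2] = 0.5 * (ec[:-1] + ec[1:])
          return e

      def v_cycle(u, f, h):
          if len(u) <= 3:                  # coarsest grid: one interior unknown
              u[1] = 0.5 * h * h * f[1]
              return u
          u = smooth(u, f, h)
          ec = v_cycle(np.zeros((len(u) - 1) // 2 + 1), restrict(residual(u, f, h)), 2.0 * h)
          u += prolong(ec)
          return smooth(u, f, h)

      # Hypothetical test: -u'' = pi^2 sin(pi x) on (0, 1), exact solution sin(pi x)
      N = 128
      h = 1.0 / N
      x = np.linspace(0.0, 1.0, N + 1)
      f = np.pi ** 2 * np.sin(np.pi * x)
      u = np.zeros(N + 1)
      for _ in range(10):                  # a few V-cycles drive the error down
          u = v_cycle(u, f, h)
      print(np.max(np.abs(u - np.sin(np.pi * x))))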

  1. Economic Analysis of Centralized vs. Decentralized Electronic Data Capture in Multi-Center Clinical Studies

    PubMed Central

    Walden, Anita; Nahm, Meredith; Barnett, M. Edwina; Conde, Jose G.; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E.; Eisenstein, Eric L.

    2012-01-01

    Background: New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. Methods: We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. Main Outcome Measures: The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Results: Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Conclusion: Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs. PMID:21335692

  2. The work of case managers as experienced by older persons (75+) with multi-morbidity - a focused ethnography.

    PubMed

    Hjelm, Markus; Holst, Göran; Willman, Ania; Bohman, Doris; Kristensson, Jimmie

    2015-12-17

    Complex health systems make it difficult for older persons (75+) with multi-morbidity to achieve continuity of care. Case management could be one way to address this difficulty. Currently, there is a need to extend the knowledge regarding case management as experienced by those utilising the services, namely older persons (75+) with multi-morbidity. The study aimed to explore older persons' (75+) with multi-morbidity experiences of case managers. The study design was qualitative and used a focused ethnographic approach. Data was collected through individual interviews with 13 older persons and by participant observations with accompanying field notes, all conducted in 2012-2013. The data revealed four themes illustrating the older persons' experiences of case managers: 1) Someone providing me with a trusting relationship; 2) Someone assisting me; 3) Someone who is on my side; and 4) Someone I do not need at present. This study illustrates the importance of establishing trusting relationships between older persons and their case managers in order to truly provide assistance. The older persons valued the case managers acting as informed but unbiased facilitators. The findings could be of help in the development of case management interventions better designed for older persons with multi-morbidity.

  3. Pilot users in agile development processes: motivational factors.

    PubMed

    Johannessen, Liv Karen; Gammon, Deede

    2010-01-01

    Despite a wealth of research on user participation, few studies offer insights into how to involve multi-organizational users in agile development methods. This paper is a case study of user involvement in developing a system for electronic laboratory requisitions using agile methodologies in a multi-organizational context. Building on an interpretive approach, we illuminate questions such as: How does collaboration between users and developers evolve and how might it be improved? What key motivational aspects are at play when users volunteer and continue contributing in the face of considerable added burdens? The study highlights how agile methods in themselves appear to facilitate mutually motivating collaboration between user groups and developers. Lessons learned for leveraging the advantages of agile development processes include acknowledging the substantial and ongoing contributions of users and their roles as co-designers of the system.

  4. Multi-energy x-ray detectors to improve air-cargo security

    NASA Astrophysics Data System (ADS)

    Paulus, Caroline; Moulin, Vincent; Perion, Didier; Radisson, Patrick; Verger, Loïck

    2017-05-01

    X-ray based systems have been used for decades to screen luggage or cargo to detect illicit material. The advent of energy-sensitive photon-counting x-ray detectors, mainly based on Cd(Zn)Te semiconductor technology, makes it possible to improve discrimination between materials compared with single- or dual-energy technology. The presented work is part of the EUROSKY European project to develop a Single European Secure Air-Cargo Space. The "cargo" context implies the presence of relatively heavy objects, potentially with high atomic number. The whole study is conducted on simulations with three different detectors: a typical dual-energy sandwich detector, a realistic model of the commercial ME100 multi-energy detector marketed by MULTIX, and a ME100 "Cargo", a not-yet-existing modified multi-energy version of the ME100 better suited to air-freight cargo inspection. Firstly, a comparison on simulated measurements shows the performance improvement of the new multi-energy detectors over the current dual-energy one. The relative performances are evaluated according to different criteria of separability or contrast-to-noise ratio, and the impact of different parameters is studied (influence of channel number, type of materials and tube voltage). Secondly, the performance of multi-energy detectors for overlap processing in a dual-view system is assessed: the case of orthogonal projections has been studied, one view giving dimensional values, the other providing spectral data to estimate the effective atomic number. A method of overlap correction has been proposed and extended to the multi-layer object case. Calibration and processing based on bi-material decomposition have been adapted for this purpose.

  5. Optimized scheme in coal-fired boiler combustion based on information entropy and modified K-prototypes algorithm

    NASA Astrophysics Data System (ADS)

    Gu, Hui; Zhu, Hongxia; Cui, Yanfeng; Si, Fengqi; Xue, Rui; Xi, Han; Zhang, Jiayu

    2018-06-01

    An integrated combustion optimization scheme is proposed that jointly considers coal-fired boiler combustion efficiency and outlet NOx emissions. Continuous attribute discretization and reduction techniques are handled as optimization preparation by the E-Cluster and C_RED methods, in which the segmentation numbers do not need to be provided in advance and can adapt continuously to the data characteristics. To obtain multi-objective results with a clustering method for mixed data, a modified K-prototypes algorithm is then proposed. This algorithm can be divided into two stages: a K-prototypes stage for self-adaptation of the cluster number, and a clustering stage for multi-objective optimization. Field tests were carried out at a 660 MW coal-fired boiler to provide real data as a case study for controllable attribute discretization and reduction in the boiler system and for obtaining optimization parameters under the multi-objective rule [max ηb, min yNOx].
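
    The clustering stage relies on a mixed-type dissimilarity; a minimal Python sketch of the standard K-prototypes distance and assignment step is shown below for illustration. The boiler attributes, the weighting factor gamma and the prototypes are hypothetical, and the self-adaptive cluster-number logic of the modified algorithm is not reproduced.

      import numpy as np

      def mixed_distance(x_num, x_cat, proto_num, proto_cat, gamma=1.0):
          # K-prototypes dissimilarity: squared Euclidean distance on numeric
          # attributes plus gamma times the number of mismatched categorical ones.
          num_part = np.sum((x_num - proto_num) ** 2)
          cat_part = np.sum(x_cat != proto_cat)
          return num_part + gamma * cat_part

      def assign(points_num, points_cat, protos_num, protos_cat, gamma=1.0):
          # Assign every sample to its nearest prototype.
          labels = []
          for xn, xc in zip(points_num, points_cat):
              d = [mixed_distance(xn, xc, pn, pc, gamma)
                   for pn, pc in zip(protos_num, protos_cat)]
              labels.append(int(np.argmin(d)))
          return np.array(labels)

      # Hypothetical boiler samples: numeric (load, O2) and categorical (mill pattern)
      points_num = np.array([[660.0, 3.1], [655.0, 2.9], [480.0, 4.2]])
      points_cat = np.array([["A"], ["A"], ["B"]])
      protos_num = np.array([[650.0, 3.0], [500.0, 4.0]])
      protos_cat = np.array([["A"], ["B"]])
      print(assign(points_num, points_cat, protos_num, protos_cat))  # e.g. [0 0 1]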

  6. Assessment of health-care waste disposal methods using a VIKOR-based fuzzy multi-criteria decision making method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Hu-Chen; Department of Industrial Engineering and Management, Tokyo Institute of Technology, Tokyo 152-8552; Wu, Jing

    Highlights: • Propose a VIKOR-based fuzzy MCDM technique for evaluating HCW disposal methods. • Linguistic variables are used to assess the ratings and weights for the criteria. • The OWA operator is utilized to aggregate individual opinions of decision makers. • A case study is given to illustrate the procedure of the proposed framework. - Abstract: Nowadays the selection of an appropriate treatment method in health-care waste (HCW) management has become a challenging task for municipal authorities, especially in developing countries. Assessment of HCW disposal alternatives can be regarded as a complicated multi-criteria decision making (MCDM) problem which requires consideration of multiple alternative solutions and conflicting tangible and intangible criteria. The objective of this paper is to present a new MCDM technique based on fuzzy set theory and the VIKOR method for evaluating HCW disposal methods. Linguistic variables are used by decision makers to assess the ratings and weights for the established criteria. The ordered weighted averaging (OWA) operator is utilized to aggregate individual opinions of decision makers into a group assessment. The computational procedure of the proposed framework is illustrated through a case study in Shanghai, one of the largest cities of China. The HCW treatment alternatives considered in this study include “incineration”, “steam sterilization”, “microwave” and “landfill”. The results obtained using the proposed approach are analyzed in a comparative way.
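
    For readers unfamiliar with VIKOR, its standard crisp compromise-ranking indices are recalled below in generic notation; this is a hedged reminder of the underlying method, not the exact fuzzy formulation developed in the paper:

      S_i = \sum_{j=1}^{n} w_j \, \frac{f_j^{*} - f_{ij}}{f_j^{*} - f_j^{-}}, \qquad
      R_i = \max_{j} \left[ w_j \, \frac{f_j^{*} - f_{ij}}{f_j^{*} - f_j^{-}} \right], \qquad
      Q_i = v \, \frac{S_i - S^{*}}{S^{-} - S^{*}} + (1 - v) \, \frac{R_i - R^{*}}{R^{-} - R^{*}},

    where f_j^{*} and f_j^{-} are the best and worst values of criterion j, w_j are the criterion weights and v weights the group-utility strategy; alternatives are ranked by Q_i.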

  7. Multi-Frequency Signal Detection Based on Frequency Exchange and Re-Scaling Stochastic Resonance and Its Application to Weak Fault Diagnosis.

    PubMed

    Liu, Jinjun; Leng, Yonggang; Lai, Zhihui; Fan, Shengbo

    2018-04-25

    Mechanical fault diagnosis usually requires not only identification of the fault characteristic frequency, but also detection of its second and/or higher harmonics. However, it is difficult to detect a multi-frequency fault signal through the existing Stochastic Resonance (SR) methods, because the characteristic frequency of the fault signal, as well as its second and higher harmonic frequencies, tend to be large parameters. To solve this problem, this paper proposes a multi-frequency signal detection method based on Frequency Exchange and Re-scaling Stochastic Resonance (FERSR). In the method, frequency exchange is implemented using a filtering technique and Single SideBand (SSB) modulation. This new method can overcome the limitation of the "sampling ratio", which is the ratio of the sampling frequency to the frequency of the target signal. It also ensures that the multi-frequency target signals can be processed to meet the small-parameter conditions. Simulation results demonstrate that the method shows good performance for detecting a multi-frequency signal with a low sampling ratio. Two practical cases are employed to further validate the effectiveness and applicability of this method.

  8. The Effectiveness of Simulated Robots for Supporting the Learning of Introductory Programming: A Multi-Case Case Study

    ERIC Educational Resources Information Center

    Major, Louis; Kyriacou, Theocharis; Brereton, Pearl

    2014-01-01

    This work investigates the effectiveness of simulated robots as tools to support the learning of programming. After the completion of a systematic review and exploratory research, a multi-case case study was undertaken. A simulator, named Kebot, was developed and used to run four 10-hour programming workshops. Twenty-three student participants…

  9. A Multi-Dimensional Approach to Gradient Change in Phonological Acquisition: A Case Study of Disordered Speech Development

    ERIC Educational Resources Information Center

    Glaspey, Amy M.; MacLeod, Andrea A. N.

    2010-01-01

    The purpose of the current study is to document phonological change from a multidimensional perspective for a 3-year-old boy with phonological disorder by comparing three measures: (1) accuracy of consonant productions, (2) dynamic assessment, and (3) acoustic analysis. The methods included collecting a sample of the targets /s, [image omitted],…

  10. Unpacking the "Value Added" Impact of Continuing Professional Education: A Multi-Method Case Study Approach.

    ERIC Educational Resources Information Center

    Smith, Jo; Topping, Annie

    2001-01-01

    A study of 14 nurses who completed a children's neuroscience course found evidence of improved knowledge and increased ability to care for neurology patients. Although the direct impact of continuing education on patient care is difficult to assess, participants' assessment of their learning and its potential to affect patient care is a valid…

  11. Multi-angle Indicators System of Non-point Pollution Source Assessment in Rural Areas: A Case Study Near Taihu Lake

    NASA Astrophysics Data System (ADS)

    Huang, Lei; Ban, Jie; Han, Yu Ting; Yang, Jie; Bi, Jun

    2013-04-01

    This study aims to identify key environmental risk sources contributing to water eutrophication and to suggest risk management strategies for rural areas. The multi-angle indicators included in the risk source assessment system were non-point source pollution, deficient waste treatment, and public awareness of environmental risk; the assessment combined psychometric paradigm methods, the contingent valuation method, and personal interviews to describe the environmental sensitivity of local residents. Total risk values of different villages near Taihu Lake were calculated in the case study, resulting in a geographic risk map showing which village was the critical risk source of Taihu eutrophication. The increased application of phosphorus (P) and nitrogen (N), the loss vulnerability of pollutants, and a lack of environmental risk awareness led to more serious non-point pollution, especially in rural China. The quotient between the scores of objective and subjective risk sources revealed what should be improved in each study village. More environmental investment, control of agricultural activities, and promotion of environmental education are critical considerations for rural environmental management. These findings are helpful for developing targeted and effective risk management strategies in rural areas.

  12. Shifting Preservice Teachers' Beliefs and Understandings to Support Pedagogical Change in Mathematics

    ERIC Educational Resources Information Center

    Letwinsky, Karim Medico; Cavender, Monica

    2018-01-01

    Many preservice teacher (PST) programs throughout the world are preparing students to implement the Core Standards, which require deeper conceptual understandings of mathematics and an informed approach for teaching. In this qualitative multi-case study, researchers explored the teaching methods for two university instructors and changes in PSTs…

  13. Individual Differences in Written Corrective Feedback: A Multi-Case Study

    ERIC Educational Resources Information Center

    Li, Su; Li, Pengjing

    2012-01-01

    Written corrective feedback (WCF) has been a long time practice in L2 writing instruction. However, in many cases, the effects are not satisfactory. There have been controversies about it both theoretically and empirically. This paper reports a multi-case study exploring individual differences that impact learners' responses to WCF. Four students'…

  14. A method for optimizing multi-objective reservoir operation upon human and riverine ecosystem demands

    NASA Astrophysics Data System (ADS)

    Ai, Xueshan; Dong, Zuo; Mo, Mingzhu

    2017-04-01

    Optimal reservoir operation is in general a multi-objective problem. In real life, most reservoir operation optimization problems involve conflicting objectives, for which there is no single optimal solution that can simultaneously give an optimal result for all purposes; rather, a set of well-distributed non-inferior solutions, the Pareto frontier, exists. On the other hand, most reservoir operation rules aim at greater social and economic benefits at the expense of the ecological environment, resulting in the destruction of riverine ecology and a reduction of aquatic biodiversity. To overcome these drawbacks, this study developed a multi-objective model for reservoir operation with the conflicting functions of hydroelectric energy generation, irrigation and ecological protection. To solve the model, whose objectives are to maximize energy production and to maximize the water-demand satisfaction rate of irrigation and ecology, we propose a multi-objective optimization method with variable penalty coefficients (VPC), based on integrating dynamic programming (DP) with discrete differential dynamic programming (DDDP), which generates well-distributed non-inferior solutions along the Pareto front by changing the penalty coefficients of the different objectives. The method was applied over the course of a year to an existing Chinese reservoir named Donggu, a multi-annual storage reservoir with multiple purposes. The case study results showed a good relationship between any two of the objectives and a good set of Pareto optimal solutions, which provides a reference for reservoir decision makers.
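
    Schematically, the variable-penalty-coefficient idea scalarizes the three objectives before the DP/DDDP search; one hedged way to write such a scalarized objective (notation assumed here, not taken from the paper) is

      \max_{\{q_t\}} \; E - \lambda_{\mathrm{irr}} \sum_{t} \big( D^{\mathrm{irr}}_t - S^{\mathrm{irr}}_t \big)_{+} - \lambda_{\mathrm{eco}} \sum_{t} \big( D^{\mathrm{eco}}_t - S^{\mathrm{eco}}_t \big)_{+},

    where E is the energy production over the releases q_t, D_t and S_t are the demanded and supplied water for irrigation and ecology, (\cdot)_{+} is the positive part, and sweeping the penalty coefficients \lambda_{\mathrm{irr}} and \lambda_{\mathrm{eco}} traces out different non-inferior points along the Pareto front.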

  15. Contaminant source and release history identification in groundwater: A multi-step approach

    NASA Astrophysics Data System (ADS)

    Gzyl, G.; Zanini, A.; Frączek, R.; Kura, K.

    2014-02-01

    The paper presents a new multi-step approach aiming at source identification and release history estimation. The new approach consists of three steps: performing integral pumping tests, identifying sources, and recovering the release history by means of a geostatistical approach. The present paper shows the results obtained from the application of the approach within a complex case study in Poland in which several areal sources were identified. The investigated site is situated in the vicinity of a former chemical plant in southern Poland, in the city of Jaworzno in the valley of the Wąwolnica River; the plant has been in operation since the First World War, producing various chemicals. From an environmental point of view, the most relevant activity was the production of pesticides, especially lindane. The application of the multi-step approach enabled a significant increase in the knowledge of contamination at the site. Some suspected contamination sources have been shown to have a minor effect on the overall contamination, while others have been shown to be of key significance. Some areas not previously taken into consideration have now been identified as key sources. The method also enabled estimation of the magnitude of the sources, and a list of priority reclamation actions will be drawn up as a result. The multi-step approach has proven to be effective and may be applied to other complicated contamination cases. Moreover, the paper shows the capability of the geostatistical approach to manage a complex real case study.

  16. The Worst-Case Weighted Multi-Objective Game with an Application to Supply Chain Competitions

    PubMed Central

    Qu, Shaojian; Ji, Ying

    2016-01-01

    In this paper, we propose a worst-case weighted approach to the multi-objective n-person non-zero sum game model where each player has more than one competing objective. Our “worst-case weighted multi-objective game” model supposes that each player has a set of weights for its objectives and wishes to minimize its maximum weighted sum of objectives, where the maximization is with respect to the set of weights. This new model gives rise to a new Pareto Nash equilibrium concept, which we call “robust-weighted Nash equilibrium”. We prove that robust-weighted Nash equilibria are guaranteed to exist even when the weight sets are unbounded. For the worst-case weighted multi-objective game with the weight sets of all players given as polytopes, we show that a robust-weighted Nash equilibrium can be obtained by solving a mathematical program with equilibrium constraints (MPEC). As an application, we illustrate the usefulness of the worst-case weighted multi-objective game on a supply chain risk management problem under demand uncertainty. By comparison with the existing weighted approach, we show that our method is more robust and can be used more efficiently for real-world applications. PMID:26820512

  17. Evaluation of a Multi-Case Participatory Action Research Project: The Case of SOLINSA

    ERIC Educational Resources Information Center

    Home, Robert; Rump, Niels

    2015-01-01

    Purpose: Scholars agree that evaluation of participatory action research is inherently valuable; however there have been few attempts at evaluating across methods and across interventions because the perceived success of a method is affected by context, researcher skills and the aims of the participants. This paper describes the systematic…

  18. Signal Processing in Functional Near-Infrared Spectroscopy (fNIRS): Methodological Differences Lead to Different Statistical Results.

    PubMed

    Pfeifer, Mischa D; Scholkmann, Felix; Labruyère, Rob

    2017-01-01

    Even though research in the field of functional near-infrared spectroscopy (fNIRS) has been performed for more than 20 years, consensus on signal processing methods is still lacking. A significant knowledge gap exists between established researchers and those entering the field. One major issue regularly observed in publications from researchers new to the field is the failure to consider possible signal contamination by hemodynamic changes unrelated to neurovascular coupling (i.e., scalp blood flow and systemic blood flow). This might be due to the fact that these researchers use the signal processing methods provided by the manufacturers of their measurement device without an advanced understanding of the performed steps. The aim of the present study was to investigate how different signal processing approaches (including and excluding approaches that partially correct for the possible signal contamination) affect the results of a typical functional neuroimaging study performed with fNIRS. In particular, we evaluated one standard signal processing method provided by a commercial company and compared it to three customized approaches. We thereby investigated the influence of the chosen method on the statistical outcome of a clinical data set (task-evoked motor cortex activity). No short-channels were used in the present study and therefore two types of multi-channel corrections based on multiple long-channels were applied. The choice of the signal processing method had a considerable influence on the outcome of the study. While methods that ignored the contamination of the fNIRS signals by task-evoked physiological noise yielded several significant hemodynamic responses over the whole head, the statistical significance of these findings disappeared when accounting for part of the contamination using a multi-channel regression. We conclude that adopting signal processing methods that correct for physiological confounding effects might yield more realistic results in cases where multi-distance measurements are not possible. Furthermore, we recommend using manufacturers' standard signal processing methods only in case the user has an advanced understanding of every signal processing step performed.
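
    To make the multi-channel correction idea above concrete, here is a small, hypothetical Python sketch in which an estimated systemic component (taken here, for simplicity, as the mean of all long channels) is regressed out of each channel by ordinary least squares. The data and the choice of regressor are assumptions for illustration and do not reproduce the study's actual pipeline.

      import numpy as np

      def regress_out_systemic(signals):
          # Remove a global systemic component from each fNIRS channel.
          # signals: array of shape (n_channels, n_samples).
          systemic = signals.mean(axis=0)                      # crude systemic estimate
          X = np.column_stack([systemic, np.ones(signals.shape[1])])
          corrected = np.empty_like(signals)
          for ch in range(signals.shape[0]):
              beta, *_ = np.linalg.lstsq(X, signals[ch], rcond=None)
              corrected[ch] = signals[ch] - X @ beta           # residual after regression
          return corrected

      # Hypothetical data: 8 channels, 1000 samples, shared systemic drift plus noise
      rng = np.random.default_rng(2)
      drift = np.sin(np.linspace(0.0, 20.0, 1000))
      raw = 0.8 * drift + rng.normal(scale=0.1, size=(8, 1000))
      print(np.std(raw), np.std(regress_out_systemic(raw)))    # corrected std is smaller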

  19. A method based on the Jacobi tau approximation for solving multi-term time-space fractional partial differential equations

    NASA Astrophysics Data System (ADS)

    Bhrawy, A. H.; Zaky, M. A.

    2015-01-01

    In this paper, we propose and analyze an efficient operational formulation of the spectral tau method for the multi-term time-space fractional differential equation with Dirichlet boundary conditions. The shifted Jacobi operational matrices of the Riemann-Liouville fractional integral and of the left-sided and right-sided Caputo fractional derivatives are presented. By using these operational matrices, we propose a shifted Jacobi tau method for both the temporal and spatial discretizations, which allows us to present an efficient spectral method for solving such problems. Furthermore, the error is estimated, and the proposed method exhibits reasonable convergence rates in the spatial and temporal discretizations. In addition, some known spectral tau approximations can be derived as special cases from our algorithm by suitably choosing the corresponding special cases of the Jacobi parameters θ and ϑ. Finally, in order to demonstrate its accuracy, we compare our method with those reported in the literature.

  20. Additive Partial Least Squares for efficient modelling of independent variance sources demonstrated on practical case studies.

    PubMed

    Luoma, Pekka; Natschläger, Thomas; Malli, Birgit; Pawliczek, Marcin; Brandstetter, Markus

    2018-05-12

    A model recalibration method based on additive Partial Least Squares (PLS) regression is generalized for multi-adjustment scenarios of independent variance sources (referred to as additive PLS - aPLS). aPLS allows for effortless model readjustment under changing measurement conditions and the combination of independent variance sources with the initial model by means of additive modelling. We demonstrate these distinguishing features on two NIR spectroscopic case studies. In case study 1, aPLS was used as a readjustment method for an emerging offset. The achieved RMS error of prediction (1.91 a.u.) was of a similar level as before the offset occurred (2.11 a.u.). In case study 2, a calibration combining different variance sources was conducted. The achieved performance was of sufficient level, with an absolute error better than 0.8% of the mean concentration, thereby compensating for the negative effects of two independent variance sources. The presented results show the applicability of the aPLS approach. The main advantages of the method are that the original model stays unadjusted and that the modelling is conducted on concrete changes in the spectra, thus supporting efficient (in most cases straightforward) modelling. Additionally, the method is put into context with existing machine learning algorithms. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Penalty methods for the numerical solution of American multi-asset option problems

    NASA Astrophysics Data System (ADS)

    Nielsen, Bjørn Fredrik; Skavhaug, Ola; Tveito, Aslak

    2008-12-01

    We derive and analyze a penalty method for solving American multi-asset option problems. A small, non-linear penalty term is added to the Black-Scholes equation. This approach gives a fixed solution domain, removing the free and moving boundary imposed by the early exercise feature of the contract. Explicit, implicit and semi-implicit finite difference schemes are derived, and in the case of independent assets, we prove that the approximate option prices satisfy some basic properties of the American option problem. Several numerical experiments are carried out in order to investigate the performance of the schemes. We give examples indicating that our results are sharp. Finally, the experiments indicate that in the case of correlated underlying assets, the same properties are valid as in the independent case.
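
    Schematically, for a single-asset American put the penalty idea can be written as below; this generic formulation with penalty parameter \epsilon, constant C and payoff g is a hedged illustration of the approach rather than a restatement of the exact term analyzed in the paper:

      \frac{\partial V}{\partial t} + \tfrac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}} + r S \frac{\partial V}{\partial S} - r V + \frac{\epsilon C}{V + \epsilon - g(S)} = 0, \qquad g(S) = \max(K - S, 0),

    so the added term is negligible where V lies well above the payoff and keeps V approximately above g near the exercise region; the multi-asset case adds the corresponding diffusion, correlation and drift terms for each underlying.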

  2. [Diagnostic values of bronchoscopy and multi-slice spiral CT for congenital dysplasia of the respiratory system in infants: a comparative study].

    PubMed

    Wang, Xing-Lu; Huang, Ying; Li, Qu-Bei; Dai, Ji-Hong

    2013-09-01

    To investigate and compare the diagnostic values of bronchoscopy and multi-slice spiral computed tomography (CT) for congenital dysplasia of the respiratory system in infants. Analysis was performed on the clinical data, bronchoscopic findings and multi-slice spiral CT findings of 319 infants (≤1 year old) who underwent bronchoscopy and/or multi-slice spiral CT and were diagnosed with congenital dysplasia of the respiratory system. A total of 476 cases of congenital dysplasia of the respiratory system were found in the 319 infants, including primary dysplasia of the respiratory system (392 cases) and compressive dysplasia of the respiratory system (84 cases). Of the 392 cases of primary dysplasia of the respiratory system, 225 (57.4%) were diagnosed by bronchoscopy versus 167 (42.6%) by multi-slice spiral CT. There were significant differences in etiological diagnosis between bronchoscopy and multi-slice spiral CT in infants with congenital dysplasia of the respiratory system (P<0.05). All 76 cases of primary dysplasia of the respiratory system caused by tracheobronchomalacia were diagnosed by bronchoscopy and all 17 cases of primary dysplasia of the respiratory system caused by lung tissue dysplasia were diagnosed by multi-slice spiral CT. Of the 84 cases of compressive dysplasia of the respiratory system, 74 cases were diagnosed by multi-slice spiral CT and only 10 cases were diagnosed by bronchoscopy. Compared with multi-slice spiral CT, bronchoscopy can detect primary dysplasia of the respiratory system more directly. Bronchoscopy is valuable in the confirmed diagnosis of tracheobronchomalacia. Multi-slice spiral CT has a higher diagnostic value for lung tissue dysplasia than bronchoscopy.

  3. The Importance of Proximal Fusion Level Selection for Outcomes of Multi-Level Lumbar Posterolateral Fusion

    PubMed Central

    Nam, Woo Dong

    2015-01-01

    Background: There are few studies about risk factors for poor outcomes from multi-level lumbar posterolateral fusion limited to three or four level lumbar posterolateral fusions. The purpose of this study was to analyze the outcomes of multi-level lumbar posterolateral fusion and to search for possible risk factors for poor surgical outcomes. Methods: We retrospectively analyzed 37 consecutive patients who underwent multi-level lumbar or lumbosacral posterolateral fusion with posterior instrumentation. The outcomes were deemed either 'good' or 'bad' based on clinical and radiological results. Many demographic and radiological factors were analyzed to examine potential risk factors for poor outcomes. Student t-test, Fisher exact test, and the chi-square test were used based on the nature of the variables. Multiple logistic regression analysis was used to exclude confounding factors. Results: Twenty cases showed a good outcome (group A, 54.1%) and 17 cases showed a bad outcome (group B, 45.9%). The overall fusion rate was 70.3%. The revision procedures (group A: 1/20, 5.0%; group B: 4/17, 23.5%), proximal fusion to L2 (group A: 5/20, 25.0%; group B: 10/17, 58.8%), and severity of stenosis (group A: 12/19, 63.3%; group B: 3/11, 27.3%) were adopted as possible related factors to the outcome in univariate analysis. Multiple logistic regression analysis revealed that only the proximal fusion level (superior instrumented vertebra, SIV) was a significant risk factor. The cases in which SIV was L2 showed inferior outcomes than those in which SIV was L3. The odds ratio was 6.562 (95% confidence interval, 1.259 to 34.203). Conclusions: The overall outcome of multi-level lumbar or lumbosacral posterolateral fusion was not as high as we had hoped it would be. Whether the SIV was L2 or L3 was the only significant risk factor identified for poor outcomes in multi-level lumbar or lumbosacral posterolateral fusion in the current study. Thus, the authors recommend that proximal fusion levels be carefully determined when multi-level lumbar fusions are considered. PMID:25729522

  4. Adoption of the HPV vaccine: a case study of three emerging countries.

    PubMed

    Caro Martínez, Araceli; Espín Balbino, Jaime; Lemgruber, Alexandre; Martín Ruiz, Eva; Olry de Labry Lima, Antonio; García-Mochón, Leticia; Lessa, Fernanda

    2017-05-01

    The human papillomavirus (HPV) vaccine has recently attracted considerable attention in emerging countries, due to its potential to reduce the impact of HPV-related diseases. This case study sheds new light on the variety of HTA arrangements, methods and processes involved in the adoption and use of HPV vaccines in a selected sample of countries from central, eastern and southern Europe and Latin America and the Caribbean, all of them emerging in the use of HTA. A multi-country case study was designed. Mixed methods (document review, semi-structured surveys and personal communication with experts) were used for data collection and triangulation. This study shows that common elements of good practice exist in the processes and methods used, with all countries arriving at the same appraisal recommendations. However, the influence of socio-politico-economic factors appears to be decisive in the final decisions and access restrictions made. This case study intends to draw useful lessons for policymakers in emerging settings interested in the adoption of the HPV vaccine supported by evidence-informed processes, such as those offered by institutionalized HTA.

  5. The Role of Thermodynamic Processes in the Evolution of Single and Multi-banding within Winter Storms

    NASA Astrophysics Data System (ADS)

    Ganetis, Sara Anne

    Mesoscale precipitation bands within Northeast U.S. (NEUS) winter storms result in heterogeneous spatial and temporal snowfall. Several studies have provided analyses of snowbands focusing on larger, meso-beta scale bands with lengths (L) > 200 km known as single bands. NEUS winter storms can also exhibit multiple bands of meso-beta scale (L < 200 km) and similar spatial orientation, and when three or more occur they are termed multi-bands; however, the genesis and evolution of multi-bands is less well understood. Unlike single bands, there is no climatological study of multi-bands. In addition, there has been little detailed thermodynamic analysis of snowbands. This dissertation utilizes radar observations, reanalyses, and high-resolution model simulations to explore the thermodynamic evolution of single and multi-bands. Bands are identified within 20 cool season (October-April) NEUS storms. The 110-case dataset was classified using a combination of automated and manual methods into: single band only (SINGLE), multi-bands only (MULTI), both single and multi-bands (BOTH), and non-banded (NONE). Multi-bands occur together with a single band in 55.4% of the times used in this study, without a single band 18.1% of the time, and precipitation exhibits no banded characteristics 23.8% of the time. Most MULTI events occur in the northeast quadrant of a developing cyclone poleward of weak mid-level forcing along a warm front, whereas multi-bands associated with BOTH events mostly occur in the northwest quadrant of mature cyclones associated with strong mid-level frontogenesis and conditional symmetric instability. The non-banded precipitation associated with NONE events occurs in the eastern quadrants of developing and mature cyclones lacking the mid-level forcing to concentrate the precipitation into bands. A high-resolution mesoscale model is used to explore the evolution of single and multi-bands based on two case studies, one of a single band and one of multi-bands. The multi-bands form in response to intermittent mid-level frontogenetical forcing in a conditionally unstable environment. The bands form in a genesis region southeast of the single band and move northwest towards the single band with the 700-hPa steering flow. This allows for the formation of new multi-bands within the genesis region, unlike the single band, which remains fixed to a 700-hPa frontogenesis maximum. Latent heating within the band is shown to increase the intensity and duration of single and multi-bands through decreased geopotential height below the heating maximum, which leads to increased convergence within the band.

  6. Decision tree-based method for integrating gene expression, demographic, and clinical data to determine disease endotypes

    PubMed Central

    2013-01-01

    Background: Complex diseases are often difficult to diagnose, treat and study due to the multi-factorial nature of the underlying etiology. Large data sets are now widely available that can be used to define novel, mechanistically distinct disease subtypes (endotypes) in a completely data-driven manner. However, significant challenges exist with regard to how to segregate individuals into suitable subtypes of the disease and understand the distinct biological mechanisms of each when the goal is to maximize the discovery potential of these data sets. Results: A multi-step decision tree-based method is described for defining endotypes based on gene expression, clinical covariates, and disease indicators using childhood asthma as a case study. We attempted to use alternative approaches such as the Student’s t-test, single data domain clustering and the Modk-prototypes algorithm, which incorporates multiple data domains into a single analysis, and none performed as well as the novel multi-step decision tree method. This new method gave the best segregation of asthmatics and non-asthmatics, and it provides easy access to all genes and clinical covariates that distinguish the groups. Conclusions: The multi-step decision tree method described here will lead to better understanding of complex disease in general by allowing purely data-driven disease endotypes to facilitate the discovery of new mechanisms underlying these diseases. This application should be considered a complement to ongoing efforts to better define and diagnose known endotypes. When coupled with existing methods developed to determine the genetics of gene expression, these methods provide a mechanism for linking genetics and exposomics data and thereby accounting for both major determinants of disease. PMID:24188919

  7. Unconditionally energy stable time stepping scheme for Cahn–Morral equation: Application to multi-component spinodal decomposition and optimal space tiling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tavakoli, Rouhollah, E-mail: rtavakoli@sharif.ir

    An unconditionally energy stable time stepping scheme is introduced to solve Cahn–Morral-like equations in the present study. It is constructed based on the combination of David Eyre's time stepping scheme and a Schur complement approach. Although the presented method is general and independent of the choice of the homogeneous free energy density function term, logarithmic and polynomial energy functions are specifically considered in this paper. The method is applied to study spinodal decomposition in multi-component systems and optimal space tiling problems. A penalization strategy is developed, in the case of the latter problem, to avoid trivial solutions. Extensive numerical experiments demonstrate the success and performance of the presented method. According to the numerical results, the method is convergent and energy stable, independent of the choice of time stepsize. Its MATLAB implementation is included in the appendix for the numerical evaluation of the algorithm and reproduction of the presented results. -- Highlights: •Extension of Eyre's convex–concave splitting scheme to multiphase systems. •Efficient solution of spinodal decomposition in multi-component systems. •Efficient solution of the least-perimeter periodic space partitioning problem. •Developing a penalization strategy to avoid trivial solutions. •Presentation of the MATLAB implementation of the introduced algorithm.
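
    As background on the time stepping scheme referred to above, an Eyre-type convex–concave splitting for a single-component Cahn–Hilliard-type equation treats the convex part of the free energy implicitly and the concave part explicitly; a generic form is sketched below (the multi-component Cahn–Morral system and the Schur complement construction of the paper are not reproduced):

      \frac{u^{n+1} - u^{n}}{\Delta t} = \nabla \cdot \Big( M \, \nabla \big[ f_c'(u^{n+1}) - f_e'(u^{n}) - \epsilon^{2} \Delta u^{n+1} \big] \Big), \qquad f(u) = f_c(u) - f_e(u),

    where f_c and f_e are the convex and concave contributions to the homogeneous free energy density; treating f_c' and the interfacial term implicitly is what yields unconditional energy stability.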

  8. A Multi-Method Treatment for Child Survivors of Sexual Abuse: An Intervention Informed by Relational and Trauma Theories.

    ERIC Educational Resources Information Center

    Levendosky, Alytia A.; Buttenheim, Margaret

    2002-01-01

    Presents a case study of the treatment of a pre-adolescent female survivor of incest. The treatment integrated relational and trauma theory perspectives in focusing on reducing self-blame, preventing further isolation, creating a safe, secure environment, and helping the patient develop positive connections with others and feelings of…

  9. Silviculture and multi-resource management case studies for southwestern pinyon-juniper woodlands

    Treesearch

    Gerald J. Gottfried

    2008-01-01

    Southwestern pinyon-juniper and juniper woodlands cover large areas of the Western United States. The woodlands are heterogeneous, consisting of numerous combinations of tree, shrub, and herbaceous species and stand densities that are representative of the wide range of sites and habitat types they occupy. Silvicultural methods can be employed on better sites to meet...

  10. Engaging communities and climate change futures with Multi-Scale, Iterative Scenario Building (MISB) in the western United States

    Treesearch

    Daniel Murphy; Carina Wyborn; Laurie Yung; Daniel R. Williams; Cory Cleveland; Lisa Eby; Solomon Dobrowski; Erin Towler

    2016-01-01

    Current projections of future climate change foretell potentially transformative ecological changes that threaten communities globally. Using two case studies from the United States Intermountain West, this article highlights the ways in which a better articulation between theory and methods in research design can generate proactive applied tools that enable...

  11. Validating Remotely Sensed Land Surface Evapotranspiration Based on Multi-scale Field Measurements

    NASA Astrophysics Data System (ADS)

    Jia, Z.; Liu, S.; Ziwei, X.; Liang, S.

    2012-12-01

    The land surface evapotranspiration plays an important role in the surface energy balance and the water cycle. There have been significant technical and theoretical advances in our knowledge of evapotranspiration over the past two decades. Acquisition of the temporally and spatially continuous distribution of evapotranspiration using remote sensing technology has attracted the widespread attention of researchers and managers. However, remote sensing technology still has many uncertainties coming from model mechanisms, model inputs, parameterization schemes, and scaling issues in regional estimation. Obtaining remotely sensed evapotranspiration (RS_ET) estimates with quantified certainty is necessary but difficult. As a result, it is indispensable to develop validation methods to quantitatively assess the accuracy and error sources of regional RS_ET estimations. This study proposes an innovative validation method based on multi-scale evapotranspiration acquired from field measurements, with the validation results including the accuracy assessment, error source analysis, and uncertainty analysis of the validation process. It is a potentially useful approach to evaluate the accuracy and analyze the spatio-temporal properties of RS_ET at both the basin and local scales, and is appropriate for validating RS_ET at diverse resolutions and time scales. An independent RS_ET validation using this method was performed over the Hai River Basin, China, for 2002-2009 as a case study. Validation at the basin scale showed good agreement between the 1 km annual RS_ET and validation data such as water-balance evapotranspiration, MODIS evapotranspiration products, precipitation, and land use types. Validation at the local scale also gave good results for monthly and daily RS_ET at 30 m and 1 km resolutions, compared to the multi-scale evapotranspiration measurements from the EC and LAS, respectively, with the footprint model over three typical landscapes. Although some validation experiments demonstrated that the models yield accurate estimates at flux measurement sites, the question remains whether they are performing well over the broader landscape. Moreover, a large number of RS_ET products have been released in recent years. Thus, we also pay attention to the cross-validation of RS_ET derived from multi-source models. The "Multi-scale Observation Experiment on Evapotranspiration over Heterogeneous Land Surfaces: Flux Observation Matrix" campaign was carried out in the middle reaches of the Heihe River Basin, China, in 2012. Flux measurements from an observation matrix composed of 22 EC and 4 LAS were acquired to investigate the cross-validation of multi-source models over different landscapes. In this case, six remote sensing models, including the empirical statistical model, the one-source and two-source models, the Penman-Monteith equation based model, the Priestley-Taylor equation based model, and the complementary relationship based model, were used to perform an intercomparison. All the results from the two cases of RS_ET validation showed that the proposed validation methods are reasonable and feasible.
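
    A minimal sketch of the point-to-pixel comparison underlying such validations is given below: it computes bias, RMSE, relative error and correlation between remotely sensed ET and ground measurements. The arrays are synthetic placeholders; in practice the RS_ET values would be extracted from the pixels (or footprint-weighted pixels) containing each EC or LAS site.

```python
# Minimal sketch of a point-to-pixel RS_ET validation: compare remotely
# sensed ET with ground measurements (e.g. eddy-covariance or scintillometer)
# and report bias, RMSE, relative error and correlation. Arrays are synthetic
# placeholders for footprint-matched station/pixel pairs.
import numpy as np

rng = np.random.default_rng(0)
et_ground = rng.uniform(0.5, 5.0, size=120)          # daily ET, mm/day
et_rs = et_ground + rng.normal(0.1, 0.4, size=120)   # RS estimate with error

def validation_stats(obs, est):
    diff = est - obs
    return {
        "bias": float(np.mean(diff)),                       # mm/day
        "rmse": float(np.sqrt(np.mean(diff**2))),           # mm/day
        "mape_pct": float(100 * np.mean(np.abs(diff) / obs)),
        "r": float(np.corrcoef(obs, est)[0, 1]),
    }

print(validation_stats(et_ground, et_rs))
```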

  12. Multi-criteria decision making development of ion chromatographic method for determination of inorganic anions in oilfield waters based on artificial neural networks retention model.

    PubMed

    Stefanović, Stefica Cerjan; Bolanča, Tomislav; Luša, Melita; Ukić, Sime; Rogošić, Marko

    2012-02-24

    This paper describes the development of an ad hoc methodology for the determination of inorganic anions in oilfield waters, since their composition often differs significantly from the average (concentration of components and/or matrix). Therefore, fast and reliable method development has to be performed in order to ensure the monitoring of desired properties under new conditions. The method development was based on a computer-assisted multi-criteria decision making strategy. The criteria used were: maximal value of the objective functions used, maximal robustness of the separation method, minimal analysis time, and maximal retention distance between the two nearest components. Artificial neural networks were used for modeling anion retention. The reliability of the developed method was extensively tested by validation of its performance characteristics. Based on the validation results, the developed method shows satisfactory performance characteristics, proving the successful application of computer-assisted methodology in the described case study. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Multi-Frequency Signal Detection Based on Frequency Exchange and Re-Scaling Stochastic Resonance and Its Application to Weak Fault Diagnosis

    PubMed Central

    Leng, Yonggang; Fan, Shengbo

    2018-01-01

    Mechanical fault diagnosis usually requires not only identification of the fault characteristic frequency, but also detection of its second and/or higher harmonics. However, it is difficult to detect a multi-frequency fault signal through the existing Stochastic Resonance (SR) methods, because the characteristic frequency of the fault signal as well as its second and higher harmonics frequencies tend to be large parameters. To solve the problem, this paper proposes a multi-frequency signal detection method based on Frequency Exchange and Re-scaling Stochastic Resonance (FERSR). In the method, frequency exchange is implemented using filtering technique and Single SideBand (SSB) modulation. This new method can overcome the limitation of "sampling ratio" which is the ratio of the sampling frequency to the frequency of target signal. It also ensures that the multi-frequency target signals can be processed to meet the small-parameter conditions. Simulation results demonstrate that the method shows good performance for detecting a multi-frequency signal with low sampling ratio. Two practical cases are employed to further validate the effectiveness and applicability of this method. PMID:29693577
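
    The sketch below simulates classical bistable stochastic resonance with an Euler-Maruyama integrator and checks that the output spectrum is enhanced at the weak drive frequency; the parameters are illustrative assumptions, and the paper's frequency-exchange (SSB filtering) and re-scaling steps, which map large characteristic frequencies into this small-parameter regime, are not reproduced.

```python
# Minimal sketch of bistable stochastic resonance with Euler-Maruyama:
# a weak, low-frequency periodic signal plus noise drives
# dx/dt = a*x - b*x**3 + s(t) + noise, and the output spectrum shows an
# enhanced peak at the drive frequency. Parameters are illustrative; the
# paper's frequency-exchange (SSB) and re-scaling steps are not reproduced.
import numpy as np

a, b = 1.0, 1.0                 # bistable potential parameters
f0, A = 0.01, 0.3               # weak, low-frequency drive (small parameter)
D = 0.35                        # noise intensity
fs, T = 5.0, 20000.0            # sampling rate and duration
n = int(T * fs)
dt = 1.0 / fs

rng = np.random.default_rng(2)
t = np.arange(n) * dt
s = A * np.cos(2 * np.pi * f0 * t)

x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):          # Euler-Maruyama integration
    drift = a * x[i] - b * x[i] ** 3 + s[i]
    x[i + 1] = x[i] + drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal()

spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
drive_bin = int(round(f0 * T))          # spectral bin of the drive frequency
snr_like = spec[drive_bin] / np.median(spec[1:])
print(f"power at drive frequency / median power ~ {snr_like:.1f}")
```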

  14. Multi-method Assessment of Psychopathy in Relation to Factors of Internalizing and Externalizing from the Personality Assessment Inventory: The Impact of Method Variance and Suppressor Effects

    PubMed Central

    Blonigen, Daniel M.; Patrick, Christopher J.; Douglas, Kevin S.; Poythress, Norman G.; Skeem, Jennifer L.; Lilienfeld, Scott O.; Edens, John F.; Krueger, Robert F.

    2010-01-01

    Research to date has revealed divergent relations across factors of psychopathy measures with criteria of internalizing (INT; anxiety, depression) and externalizing (EXT; antisocial behavior, substance use). However, failure to account for method variance and suppressor effects has obscured the consistency of these findings across distinct measures of psychopathy. Using a large correctional sample, the current study employed a multi-method approach to psychopathy assessment (self-report, interview/file review) to explore convergent and discriminant relations between factors of psychopathy measures and latent criteria of INT and EXT derived from the Personality Assessment Inventory (PAI; L. Morey, 2007). Consistent with prediction, scores on the affective-interpersonal factor of psychopathy were negatively associated with INT and negligibly related to EXT, whereas scores on the social deviance factor exhibited positive associations (moderate and large, respectively) with both INT and EXT. Notably, associations were highly comparable across the psychopathy measures when accounting for method variance (in the case of EXT) and when assessing for suppressor effects (in the case of INT). Findings are discussed in terms of implications for clinical assessment and evaluation of the validity of interpretations drawn from scores on psychopathy measures. PMID:20230156

  15. Assessment of health-care waste disposal methods using a VIKOR-based fuzzy multi-criteria decision making method.

    PubMed

    Liu, Hu-Chen; Wu, Jing; Li, Ping

    2013-12-01

    Nowadays, selection of the appropriate treatment method in health-care waste (HCW) management has become a challenging task for municipal authorities, especially in developing countries. Assessment of HCW disposal alternatives can be regarded as a complicated multi-criteria decision making (MCDM) problem which requires consideration of multiple alternative solutions and conflicting tangible and intangible criteria. The objective of this paper is to present a new MCDM technique based on fuzzy set theory and the VIKOR method for evaluating HCW disposal methods. Linguistic variables are used by decision makers to assess the ratings and weights for the established criteria. The ordered weighted averaging (OWA) operator is utilized to aggregate individual opinions of decision makers into a group assessment. The computational procedure of the proposed framework is illustrated through a case study in Shanghai, one of the largest cities of China. The HCW treatment alternatives considered in this study include "incineration", "steam sterilization", "microwave" and "landfill". The results obtained using the proposed approach are analyzed in a comparative way. Copyright © 2013. Published by Elsevier Ltd.
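
    A minimal crisp VIKOR sketch for this kind of ranking problem is shown below; the alternative names follow the abstract, but the criteria, scores, weights and the omission of the fuzzy ratings and OWA aggregation are illustrative simplifications.

```python
# Minimal crisp VIKOR sketch for ranking health-care waste (HCW) treatment
# alternatives against weighted criteria. The criteria, scores and weights
# are illustrative placeholders; the paper's fuzzy ratings and OWA-based
# aggregation of decision makers are not reproduced.
import numpy as np

alternatives = ["incineration", "steam sterilization", "microwave", "landfill"]
criteria = ["cost", "emissions", "effectiveness", "public acceptance"]
benefit = np.array([False, False, True, True])   # True = larger is better
w = np.array([0.3, 0.3, 0.25, 0.15])             # criterion weights (sum to 1)

# Decision matrix: rows = alternatives, columns = criteria (made-up scores).
X = np.array([[7.0, 8.0, 9.0, 4.0],
              [5.0, 3.0, 7.0, 7.0],
              [6.0, 2.0, 6.0, 6.0],
              [2.0, 6.0, 3.0, 3.0]])

f_best = np.where(benefit, X.max(axis=0), X.min(axis=0))
f_worst = np.where(benefit, X.min(axis=0), X.max(axis=0))
norm = w * (f_best - X) / (f_best - f_worst)     # weighted, normalized regret

S = norm.sum(axis=1)                              # group utility
R = norm.max(axis=1)                              # individual regret
v = 0.5                                           # consensus weight
Q = v * (S - S.min()) / (S.max() - S.min()) + \
    (1 - v) * (R - R.min()) / (R.max() - R.min())

for i in np.argsort(Q):                           # smaller Q ranks higher
    print(f"{alternatives[i]:<20} S={S[i]:.3f} R={R[i]:.3f} Q={Q[i]:.3f}")
```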

  16. The role of economics in the QUERI program: QUERI Series.

    PubMed

    Smith, Mark W; Barnett, Paul G

    2008-04-22

    The United States (U.S.) Department of Veterans Affairs (VA) Quality Enhancement Research Initiative (QUERI) has implemented economic analyses in single-site and multi-site clinical trials. To date, no one has reviewed whether the QUERI Centers are taking an optimal approach to doing so. Consistent with the continuous learning culture of the QUERI Program, this paper provides such a reflection. We present a case study of QUERI as an example of how economic considerations can and should be integrated into implementation research within both single and multi-site studies. We review theoretical and applied cost research in implementation studies outside and within VA. We also present a critique of the use of economic research within the QUERI program. Economic evaluation is a key element of implementation research. QUERI has contributed many developments in the field of implementation but has only recently begun multi-site implementation trials across multiple regions within the national VA healthcare system. These trials are unusual in their emphasis on developing detailed costs of implementation, as well as in the use of business case analyses (budget impact analyses). Economics appears to play an important role in QUERI implementation studies, but only after implementation has reached the stage of multi-site trials. Economic analysis could better inform the choice of which clinical best practices to implement and the choice of implementation interventions to employ. QUERI economics also would benefit from research on costing methods and development of widely accepted international standards for implementation economics.

  17. Vacuum tube operation analysis under multi-harmonic driving and heavy beam loading effect in J-PARC RCS

    NASA Astrophysics Data System (ADS)

    Yamamoto, M.; Nomura, M.; Shimada, T.; Tamura, F.; Hara, K.; Hasegawa, K.; Ohmori, C.; Toda, M.; Yoshii, M.; Schnase, A.

    2016-11-01

    An rf cavity in the J-PARC RCS not only covers the frequency range of the fundamental acceleration pattern but also generates multi-harmonic rf voltage because it has a broadband impedance. However, analyzing the vacuum tube operation in the multi-harmonic case is very complicated because many variables must be solved in a self-consistent manner. We developed a method to analyze the vacuum tube operation using a well-known formula that includes the dependence of some variables on the anode current. The calculation method is verified with beam tests, and the results indicate that it is efficient under multi-harmonic conditions with a heavy beam loading effect.

  18. A Multi-Year Program Developing an Explicit Reflective Pedagogy for Teaching Pre-service Teachers the Nature of Science by Ostention

    NASA Astrophysics Data System (ADS)

    Smith, Mike U.; Scharmann, Lawrence

    2008-02-01

    This investigation delineates a multi-year action research agenda designed to develop an instructional model for teaching the nature of science (NOS) to preservice science teachers. Our past research strongly supports the use of explicit reflective instructional methods, which include Thomas Kuhn’s notion of learning by ostention and treating science as a continuum (i.e., comparing fields of study to one another for relative placement as less to more scientific). Instruction based on conceptual change precepts, however, also exhibits promise. Thus, the investigators sought to ascertain the degree to which conceptual change took place among students (n = 15) participating in the NOS instructional model. Three case studies are presented to illustrate successful conceptual changes that took place as a result of the NOS instructional model. All three cases represent students who claim a very conservative Christian heritage and for whom evolution was not considered a legitimate scientific theory prior to participating in the NOS instructional model. All three case study individuals, along with their twelve classmates, placed evolution as most scientific when compared to intelligent design and a fictional field of study called “Umbrellaology.”

  19. A WENO-Limited, ADER-DT, Finite-Volume Scheme for Efficient, Robust, and Communication-Avoiding Multi-Dimensional Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norman, Matthew R

    2014-01-01

    The novel ADER-DT time discretization is applied to two-dimensional transport in a quadrature-free, WENO- and FCT-limited, Finite-Volume context. Emphasis is placed on (1) the serial and parallel computational properties of ADER-DT and this framework and (2) the flexibility of ADER-DT and this framework in efficiently balancing accuracy with other constraints important to transport applications. This study demonstrates a range of choices for the user when approaching their specific application while maintaining good parallel properties. In this method, genuine multi-dimensionality, single-step and single-stage time stepping, strict positivity, and a flexible range of limiting are all achieved with only one parallel synchronization and data exchange per time step. In terms of parallel data transfers per simulated time interval, this improves upon multi-stage time stepping and post-hoc filtering techniques such as hyperdiffusion. This method is evaluated with standard transport test cases over a range of limiting options to demonstrate quantitatively and qualitatively what a user should expect when employing this method in their application.

  20. A multi-method approach toward de novo glycan characterization: a Man-5 case study.

    PubMed

    Prien, Justin M; Prater, Bradley D; Cockrill, Steven L

    2010-05-01

    Regulatory agencies’ expectations for biotherapeutic approval are becoming more stringent with regard to product characterization, where minor species as low as 0.1% of a given profile are typically identified. The mission of this manuscript is to demonstrate a multi-method approach toward de novo glycan characterization and quantitation, including minor species at or approaching the 0.1% benchmark. Recently, unexpected isomers of Man(5)GlcNAc(2) (M(5)) were reported (Prien JM, Ashline DJ, Lapadula AJ, Zhang H, Reinhold VN. 2009. The high mannose glycans from bovine ribonuclease B isomer characterization by ion trap mass spectrometry (MS). J Am Soc Mass Spectrom. 20:539-556). In the current study, quantitative analysis of these isomers found in commercial M(5) standard demonstrated that they are in low abundance (<1% of the total) and therefore an exemplary "litmus test" for minor species characterization. A simple workflow devised around three core, well-established analytical procedures: (1) fluorescence derivatization; (2) online rapid resolution reversed-phase separation coupled with negative-mode sequential mass spectrometry (RRRP-(-)-MS(n)); and (3) permethylation derivatization with nanospray sequential mass spectrometry (NSI-MS(n)) provides comprehensive glycan structural determination. All methods have limitations; however, a multi-method workflow is an at-line stopgap solution which mitigates each method's individual shortcomings, providing greater opportunity for more comprehensive characterization. This manuscript is the first to demonstrate quantitative chromatographic separation of the M(5) isomers and the use of a commercially available stable isotope variant of 2-aminobenzoic acid to detect and chromatographically resolve multiple M(5) isomers in bovine ribonuclease B. With this multi-method approach, we have the capability to comprehensively characterize a biotherapeutic's glycan array in a de novo manner, including structural isomers at ≥0.1% of the total chromatographic peak area.

  1. Optimal subsystem approach to multi-qubit quantum state discrimination and experimental investigation

    NASA Astrophysics Data System (ADS)

    Xue, ShiChuan; Wu, JunJie; Xu, Ping; Yang, XueJun

    2018-02-01

    Quantum computing is a significant computing paradigm that can be superior to classical computing because of its superposition feature. Distinguishing several quantum states from quantum algorithm outputs is often a vital computational task. In most cases, the quantum states tend to be non-orthogonal due to superposition; quantum mechanics shows that they cannot be discriminated perfectly by measurement, forcing repeated measurements. Hence, it is important to determine the optimal measuring method, which requires fewer repetitions and a lower error rate. However, extending current measurement approaches, which mainly target quantum cryptography, to multi-qubit situations for quantum computing confronts challenges, such as conducting global operations, which have considerable costs in the experimental realm. Therefore, in this study, we have proposed an optimal subsystem method to avoid these difficulties. We have provided an analysis of the comparison between the reduced subsystem method and the global minimum error method for two-qubit problems; the conclusions have been verified experimentally. The results showed that the subsystem method could effectively discriminate non-orthogonal two-qubit states, such as separable states, entangled pure states, and mixed states, with an acceptable error rate, while the cost of the experimental process was significantly reduced in most circumstances. We believe the optimal subsystem method is the most valuable and promising approach for multi-qubit quantum computing applications.
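
    The sketch below compares minimum-error (Helstrom) discrimination of two two-qubit states using the full states versus their reduced single-qubit subsystems obtained by a partial trace; the two example states and the equal priors are assumptions, not the states used in the experiment.

```python
# Minimal sketch comparing minimum-error (Helstrom) discrimination of two
# two-qubit states using (a) the full global states and (b) their reduced
# single-qubit subsystems obtained by a partial trace. The example states
# and equal priors are illustrative assumptions.
import numpy as np

def ket(*amps):
    v = np.array(amps, dtype=complex)
    return v / np.linalg.norm(v)

def density(psi):
    return np.outer(psi, psi.conj())

def partial_trace_B(rho):
    """Trace out the second qubit of a 4x4 two-qubit density matrix."""
    return rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

def helstrom_error(rho1, rho2, p1=0.5):
    """Minimum error probability for discriminating rho1 vs rho2."""
    eigvals = np.linalg.eigvalsh(p1 * rho1 - (1 - p1) * rho2)
    return 0.5 * (1.0 - np.sum(np.abs(eigvals)))

# Two non-orthogonal two-qubit states (a separable one and an entangled one).
psi1 = ket(1, 0, 0, 0)                  # |00>
psi2 = ket(1, 0, 0, 1)                  # (|00> + |11>)/sqrt(2)
rho1, rho2 = density(psi1), density(psi2)

print("global Helstrom error   :", helstrom_error(rho1, rho2))
print("subsystem Helstrom error:",
      helstrom_error(partial_trace_B(rho1), partial_trace_B(rho2)))
```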

  2. Waste management barriers in developing country hospitals: Case study and AHP analysis.

    PubMed

    Delmonico, Diego V de Godoy; Santos, Hugo H Dos; Pinheiro, Marco Ap; de Castro, Rosani; de Souza, Regiane M

    2018-01-01

    Healthcare waste management is an essential field for both researchers and practitioners. Although it has been the subject of several studies in different contexts, few have used statistical methods for its evaluation. Furthermore, the known precarious waste management practices in developing countries raise questions about their potential barriers. This study aims to investigate the barriers in healthcare waste management and their relevance. For this purpose, this paper analyses waste management practices in two Brazilian hospitals using a case study approach and the Analytic Hierarchy Process (AHP) method. The barriers were organized into three categories - human factors, management, and infrastructure - and the main findings suggest that cost and employee awareness were the most significant barriers. These results highlight the main barriers to more sustainable waste management and provide an empirical basis for multi-criteria evaluation in the literature.
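
    A minimal AHP sketch is given below: it derives weights for the three barrier categories from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio. The comparison judgements are placeholders, not the values elicited in the two hospitals.

```python
# Minimal AHP sketch: derive barrier weights from a pairwise comparison
# matrix via the principal eigenvector and check the consistency ratio.
# The comparison judgements below are illustrative placeholders.
import numpy as np

barriers = ["human factors", "management", "infrastructure"]
# Saaty-scale pairwise comparisons (A[i, j] = importance of i over j).
A = np.array([[1.0, 3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)                # consistency index
RI = 0.58                                      # random index for n = 3
CR = CI / RI                                   # consistency ratio (< 0.1 is OK)

for name, weight in zip(barriers, w):
    print(f"{name:<15} weight = {weight:.3f}")
print(f"consistency ratio = {CR:.3f}")
```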

  3. Loop-mediated isothermal amplification (LAMP) assay for speedy diagnosis of tubercular lymphadenitis: The multi-targeted 60-minute approach.

    PubMed

    Sharma, Megha; Sharma, Kusum; Sharma, Aman; Gupta, Nalini; Rajwanshi, Arvind

    2016-09-01

    Tuberculous lymphadenitis (TBLA), the most common presentation of tuberculosis, poses a significant diagnostic challenge in developing countries. Timely, accurate and cost-effective diagnosis can decrease the high morbidity associated with TBLA, especially in resource-poor, high-endemic regions. The loop-mediated isothermal amplification (LAMP) assay, using two targets, was evaluated for the diagnosis of TBLA. The LAMP assay, using 3 sets of primers (each for IS6110 and MPB64), was performed on 170 fine needle aspiration samples (85 confirmed, 35 suspected, and 50 control cases of TBLA). Results were compared against IS6110 PCR, cytology, culture and smear. The overall sensitivity and specificity of the LAMP assay, using the multi-targeted approach, were 90% and 100%, respectively, in diagnosing TBLA. The sensitivity of multi-targeted LAMP, MPB64-only LAMP, IS6110-only LAMP and IS6110 PCR was 91.7%, 89.4%, 84.7% and 75.2%, respectively, among confirmed cases and 85.7%, 77.1%, 68.5% and 60%, respectively, among suspected cases of TBLA. An additional 12/120 (10%) cases were detected using the multi-targeted method. The multi-targeted LAMP, with its speedy and reliable results, is a potential diagnostic test for TBLA in low-resource countries. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Water supply management using an extended group fuzzy decision-making method: a case study in north-eastern Iran

    NASA Astrophysics Data System (ADS)

    Minatour, Yasser; Bonakdari, Hossein; Zarghami, Mahdi; Bakhshi, Maryam Ali

    2015-09-01

    The purpose of this study was to develop a group fuzzy multi-criteria decision-making method to be applied in rating problems associated with water resources management. Here, Chen's group fuzzy TOPSIS method is extended by a difference technique to handle the uncertainties of applying group decision making. The extended group fuzzy TOPSIS method is then combined with a consistency check. In the presented method, linguistic judgments are first screened via a consistency-checking process, and these judgments are then used in the extended Chen's fuzzy TOPSIS method. Each expert's opinion is converted into precise mathematical numbers and then, to account for uncertainties, the opinions of the group are converted into fuzzy numbers using three mathematical operators. The proposed method is applied to select the optimal strategy for the rural water supply of Nohoor village in north-eastern Iran, as a case study and illustrated example. Sensitivity analyses of the results and comparison with project reality showed that the proposed method offers good results for water resources projects.
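
    For orientation, the sketch below runs a plain, crisp TOPSIS ranking of hypothetical water-supply strategies; the strategy names, criteria, scores and weights are invented, and the paper's fuzzy group judgements, difference technique and consistency check are not reproduced.

```python
# Minimal crisp TOPSIS sketch: normalize the decision matrix, weight it, and
# rank alternatives by relative closeness to the ideal solution. Strategy
# names, criteria, scores and weights are illustrative placeholders.
import numpy as np

strategies = ["deep well", "spring transfer", "rainwater harvesting", "qanat rehab"]
benefit = np.array([True, True, False, True])    # criterion direction
w = np.array([0.35, 0.25, 0.25, 0.15])           # criterion weights

X = np.array([[8.0, 6.0, 5.0, 7.0],              # rows = strategies
              [7.0, 7.0, 6.0, 6.0],              # cols = yield, reliability,
              [5.0, 5.0, 3.0, 8.0],              #        cost, social acceptance
              [6.0, 8.0, 4.0, 9.0]])

R = X / np.linalg.norm(X, axis=0)                # vector normalization
V = R * w                                        # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

for i in np.argsort(-closeness):
    print(f"{strategies[i]:<22} closeness = {closeness[i]:.3f}")
```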

  5. A custom correlation coefficient (CCC) approach for fast identification of multi-SNP association patterns in genome-wide SNPs data.

    PubMed

    Climer, Sharlee; Yang, Wei; de las Fuentes, Lisa; Dávila-Román, Victor G; Gu, C Charles

    2014-11-01

    Complex diseases are often associated with sets of multiple interacting genetic factors and possibly with unique sets of the genetic factors in different groups of individuals (genetic heterogeneity). We introduce a novel concept of a custom correlation coefficient (CCC) between single nucleotide polymorphisms (SNPs) that addresses genetic heterogeneity by measuring subset correlations autonomously. It is used to develop a 3-step process to identify candidate multi-SNP patterns: (1) pairwise (SNP-SNP) correlations are computed using CCC; (2) clusters of so-correlated SNPs are identified; and (3) frequencies of these clusters in disease cases and controls are compared to identify disease-associated multi-SNP patterns. This method identified 42 candidate multi-SNP associations with hypertensive heart disease (HHD), among which one cluster of 22 SNPs (six genes) included 13 in SLC8A1 (aka NCX1, an essential component of cardiac excitation-contraction coupling) and another of 32 SNPs had 29 from a different segment of SLC8A1. While allele frequencies show little difference between cases and controls, the cluster of 22 associated alleles was found in 20% of controls but no cases and the other in 3% of controls but 20% of cases. These results suggest that both protective and risk effects on HHD could be exerted by combinations of variants in different regions of SLC8A1, modified by variants from other genes. The results demonstrate that this new correlation metric identifies disease-associated multi-SNP patterns overlooked by commonly used correlation measures. Furthermore, computation time using CCC is a small fraction of that required by other methods, thereby enabling the analyses of large GWAS datasets. © 2014 WILEY PERIODICALS, INC.
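
    The sketch below walks through the abstract's 3-step pipeline on synthetic genotypes, with plain Pearson correlation standing in for the CCC metric (whose subset-correlation formula is not reproduced here); the sample sizes, threshold and carrier-pattern definition are illustrative assumptions.

```python
# Minimal sketch of the 3-step pipeline on synthetic genotype data:
# (1) compute pairwise SNP-SNP correlations, (2) cluster highly correlated
# SNPs, (3) compare cluster carrier frequencies in cases vs. controls.
# Plain Pearson correlation stands in for the paper's custom correlation
# coefficient (CCC); the threshold and pattern definition are illustrative.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(0)
n_cases, n_controls, n_snps = 150, 150, 60
geno = rng.integers(0, 3, size=(n_cases + n_controls, n_snps))  # 0/1/2 alleles
is_case = np.r_[np.ones(n_cases, bool), np.zeros(n_controls, bool)]

# Step 1: pairwise correlations (stand-in for CCC).
corr = np.corrcoef(geno, rowvar=False)

# Step 2: cluster SNPs whose |correlation| exceeds an illustrative threshold.
adj = csr_matrix(np.abs(corr) > 0.15)
n_clusters, labels = connected_components(adj, directed=False)

# Step 3: for each multi-SNP cluster, compare carrier frequencies of the
# "all minor alleles present" pattern between cases and controls.
for c in range(n_clusters):
    snps = np.flatnonzero(labels == c)
    if len(snps) < 2:
        continue
    carrier = (geno[:, snps] > 0).all(axis=1)
    f_case = carrier[is_case].mean()
    f_ctrl = carrier[~is_case].mean()
    print(f"cluster of {len(snps)} SNPs: cases {f_case:.2%}, controls {f_ctrl:.2%}")
```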

  6. A custom correlation coefficient (CCC) approach for fast identification of multi-SNP association patterns in genome-wide SNPs data

    PubMed Central

    Climer, Sharlee; Yang, Wei; de las Fuentes, Lisa; Dávila-Román, Victor G.; Gu, C. Charles

    2014-01-01

    Complex diseases are often associated with sets of multiple interacting genetic factors and possibly with unique sets of the genetic factors in different groups of individuals (genetic heterogeneity). We introduce a novel concept of Custom Correlation Coefficient (CCC) between single nucleotide polymorphisms (SNPs) that addresses genetic heterogeneity by measuring subset correlations autonomously. It is used to develop a 3-step process to identify candidate multi-SNP patterns: (1) pairwise (SNP-SNP) correlations are computed using CCC; (2) clusters of so-correlated SNPs are identified; and (3) frequencies of these clusters in disease cases and controls are compared to identify disease-associated multi-SNP patterns. This method identified 42 candidate multi-SNP associations with hypertensive heart disease (HHD), among which one cluster of 22 SNPs (6 genes) included 13 in SLC8A1 (aka NCX1, an essential component of cardiac excitation-contraction coupling) and another of 32 SNPs had 29 from a different segment of SLC8A1. While allele frequencies show little difference between cases and controls, the cluster of 22 associated alleles was found in 20% of controls but no cases and the other in 3% of controls but 20% of cases. These results suggest that both protective and risk effects on HHD could be exerted by combinations of variants in different regions of SLC8A1, modified by variants from other genes. The results demonstrate that this new correlation metric identifies disease-associated multi-SNP patterns overlooked by commonly used correlation measures. Furthermore, computation time using CCC is a small fraction of that required by other methods, thereby enabling the analyses of large GWAS datasets. PMID:25168954

  7. Case managers for older persons with multi-morbidity and their everyday work – a focused ethnography

    PubMed Central

    2013-01-01

    Background Modern-day health systems are complex, making it difficult to assure continuity of care for older persons with multi-morbidity. One way of intervening in a health system that is leading to fragmented care is by utilising Case Management (CM). CM aims to improve co-ordination of healthcare and social services. To better understand and advance the development of CM, there is a need for additional research that provides rich descriptions of CM in practice. This knowledge is important as there could be unknown mechanisms, contextual or interpersonal, that contribute to the success or failure of a CM intervention. Furthermore, the CM intervention in this study is conducted in the context of the Swedish health system, which prior to this intervention was unfamiliar with this kind of coordinative service. The aim of this study was to explore the everyday work undertaken by case managers within a CM intervention, with a focus on their experiences. Methods The study design was qualitative and inductive, utilising a focused ethnographic approach. Data collection consisted of participant observations with field notes as well as a group interview and individual interviews with nine case managers, conducted in 2012/2013. The interviews were recorded, transcribed verbatim and subjected to thematic analysis. Results An overarching theme emerged from the data: Challenging current professional identity, with three sub-themes. The sub-themes were 1) Adjusting to familiar work in an unfamiliar role; 2) Striving to improve the health system through a new role; 3) Trust is vital to advocacy. Conclusions Findings from this study shed some light on the complexity of CM for older persons with multi-morbidity, as seen from the perspective of case managers. The findings illustrate how their everyday work as case managers represents a challenge to their current professional identity. These findings could help to understand and promote the development of CM models aimed at a population of older persons with complex health needs. PMID:24279695

  8. Toward an Understanding of Development of Learning to Solve Ill-Defined Problems in an Online Context: A Multi-Year Qualitative Exploratory Study

    ERIC Educational Resources Information Center

    Peddibhotla, Naren

    2016-01-01

    The case study is a classic tool used in several educational programs that emphasize the solving of ill-defined problems. Though it has been used in classroom-based teaching and educators have developed a rich repertoire of methods, its use in online courses presents different challenges. To explore factors that develop skills in solving ill-defined…

  9. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    NASA Astrophysics Data System (ADS)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

    A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) mitigating computational time due to the introduction of the Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each duration on the basis of four criteria. Results indicated that the most desirable management strategy lay in action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management.
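
    A minimal sketch of the Monte Carlo treatment of interval-valued criteria is shown below: each criterion value is drawn uniformly from its interval, alternatives are scored with a weighted sum, and the frequency with which each alternative ranks best is tallied. The action names echo the abstract, but the intervals, weights and scoring rule are illustrative assumptions.

```python
# Minimal sketch of Monte Carlo sampling over interval-valued criteria:
# draw each criterion value uniformly from its interval, score alternatives
# with a weighted sum of cost-type criteria, and count how often each
# alternative comes out best. Intervals and weights are placeholders.
import numpy as np

rng = np.random.default_rng(3)
alternatives = ["action 2", "action 8", "action 12", "action 15"]
w = np.array([0.4, 0.3, 0.3])          # cost, concentration, health risk
# Interval data: lower and upper bounds per alternative and criterion.
lo = np.array([[1.0, 0.8, 0.5],
               [0.7, 0.9, 0.6],
               [0.9, 0.6, 0.7],
               [0.6, 0.7, 0.4]])
hi = lo + np.array([[0.4, 0.3, 0.3],
                    [0.5, 0.2, 0.3],
                    [0.3, 0.4, 0.2],
                    [0.4, 0.3, 0.4]])

n_samples = 20000
wins = np.zeros(len(alternatives), dtype=int)
for _ in range(n_samples):
    sample = rng.uniform(lo, hi)                  # one realization of the intervals
    score = (sample * w).sum(axis=1)              # lower total is better
    wins[np.argmin(score)] += 1

for name, count in zip(alternatives, wins):
    print(f"{name:<10} best in {count / n_samples:.1%} of samples")
```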

  10. A Case Study of Dynamic Response Analysis and Safety Assessment for a Suspended Monorail System.

    PubMed

    Bao, Yulong; Li, Yongle; Ding, Jiajie

    2016-11-10

    A suspended monorail transit system is a category of urban rail transit, which is effective in alleviating traffic pressure and preventing injuries. Meanwhile, with the advantages of low cost and short construction time, suspended monorail transit systems show vast potential for future development. However, the suspended monorail has not been systematically studied in China, and there is a lack of relevant knowledge and analytical methods. To ensure the health and reliability of a suspended monorail transit system, the driving safety of vehicles and the structural dynamic behaviors when vehicles are running on the bridge should be analyzed and evaluated. Based on the method of vehicle-bridge coupling vibration theory, the finite element method (FEM) software ANSYS and the multi-body dynamics software SIMPACK are adopted respectively to establish the finite element model for the bridge and the multi-body vehicle model. A co-simulation method is employed to investigate the vehicle-bridge coupling vibration for the transit system. The traffic operation factors, including train formation, track irregularity and tire stiffness, are incorporated into the models separately to analyze the bridge and vehicle responses. The results show that the coupled dynamic effects between vehicle and bridge in the suspended monorail system are significant in the case studied, and it is strongly suggested to take necessary measures for vibration suppression. The simulation of track irregularity is a critical factor for vibration safety, and track irregularity at A-level road roughness negatively influences the system vibration safety.

  11. A Case Study of Dynamic Response Analysis and Safety Assessment for a Suspended Monorail System

    PubMed Central

    Bao, Yulong; Li, Yongle; Ding, Jiajie

    2016-01-01

    A suspended monorail transit system is a category of urban rail transit, which is effective in alleviating traffic pressure and preventing injuries. Meanwhile, with the advantages of low cost and short construction time, suspended monorail transit systems show vast potential for future development. However, the suspended monorail has not been systematically studied in China, and there is a lack of relevant knowledge and analytical methods. To ensure the health and reliability of a suspended monorail transit system, the driving safety of vehicles and the structural dynamic behaviors when vehicles are running on the bridge should be analyzed and evaluated. Based on the method of vehicle-bridge coupling vibration theory, the finite element method (FEM) software ANSYS and the multi-body dynamics software SIMPACK are adopted respectively to establish the finite element model for the bridge and the multi-body vehicle model. A co-simulation method is employed to investigate the vehicle-bridge coupling vibration for the transit system. The traffic operation factors, including train formation, track irregularity and tire stiffness, are incorporated into the models separately to analyze the bridge and vehicle responses. The results show that the coupled dynamic effects between vehicle and bridge in the suspended monorail system are significant in the case studied, and it is strongly suggested to take necessary measures for vibration suppression. The simulation of track irregularity is a critical factor for vibration safety, and track irregularity at A-level road roughness negatively influences the system vibration safety. PMID:27834923

  12. Application of GA-SVM method with parameter optimization for landslide development prediction

    NASA Astrophysics Data System (ADS)

    Li, X. Z.; Kong, J. M.

    2013-10-01

    Prediction of the landslide development process is always a hot issue in landslide research. So far, many methods for landslide displacement series prediction have been proposed. The support vector machine (SVM) has been proved to be a novel algorithm with good performance. However, the performance strongly depends on the right selection of the parameters (C and γ) of the SVM model. In this study, we present an application of the GA-SVM method with parameter optimization to landslide displacement rate prediction. We selected a typical large-scale landslide in a hydro-electric engineering area of Southwest China as a case. On the basis of analyzing the basic characteristics and monitoring data of the landslide, a single-factor GA-SVM model and a multi-factor GA-SVM model of the landslide were built. Moreover, the models were compared with single-factor and multi-factor SVM models of the landslide. The results show that the four models have high prediction accuracies, but the accuracies of the GA-SVM models are slightly higher than those of the SVM models, and the accuracies of the multi-factor models are slightly higher than those of the single-factor models for landslide prediction. The accuracy of the multi-factor GA-SVM model is the highest, with the smallest RMSE of 0.0009 and the biggest RI of 0.9992.
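
    The sketch below shows a GA-style search over SVR hyper-parameters (C, gamma) evaluated by cross-validation on lagged features of a synthetic displacement series; the data, lag construction and the tiny GA are illustrative stand-ins for the monitoring data and GA-SVM setup used in the paper.

```python
# Minimal sketch of GA-based selection of SVR hyper-parameters (C, gamma)
# for a displacement-rate prediction task. The synthetic series, the
# lag-feature construction and the tiny GA are illustrative stand-ins.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic "displacement rate" series with trend, seasonality and noise.
t = np.arange(300, dtype=float)
series = 0.02 * t + np.sin(2 * np.pi * t / 50) + 0.2 * rng.standard_normal(300)

def lag_features(y, n_lags=5):
    X = np.column_stack([y[i:len(y) - n_lags + i] for i in range(n_lags)])
    return X, y[n_lags:]

X, y = lag_features(series)

def fitness(log_c, log_g):
    model = SVR(C=10.0 ** log_c, gamma=10.0 ** log_g)
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

# Tiny generational GA over log10(C) in [-2, 3] and log10(gamma) in [-4, 1].
pop = rng.uniform([-2, -4], [3, 1], size=(20, 2))
for generation in range(15):
    scores = np.array([fitness(c, g) for c, g in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]    # truncation selection
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10, size=2)]
        child = np.where(rng.random(2) < 0.5, a, b)  # uniform crossover
        child += rng.normal(0, 0.2, size=2)          # Gaussian mutation
        children.append(np.clip(child, [-2, -4], [3, 1]))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(c, g) for c, g in pop])]
print(f"selected C = {10 ** best[0]:.3g}, gamma = {10 ** best[1]:.3g}")
```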

  13. Care pathways across the primary-hospital care continuum: using the multi-level framework in explaining care coordination

    PubMed Central

    2013-01-01

    Background Care pathways are widely used in hospitals for a structured and detailed planning of the care process. There is a growing interest in extending care pathways into primary care to improve quality of care by increasing care coordination. Evidence is sparse about the relationship between care pathways and care coordination. The multi-level framework explores care coordination across organizations and states that (inter)organizational mechanisms have an effect on the relationships between healthcare professionals, resulting in quality and efficiency of care. The aim of this study was to assess the extent to which care pathways support or create elements of the multi-level framework necessary to improve care coordination across the primary - hospital care continuum. Methods This study is an in-depth analysis of five existing local community projects located in four different regions in Flanders (Belgium) to determine whether the available empirical evidence supported or refuted the theoretical expectations from the multi-level framework. Data were gathered using mixed methods, including structured face-to-face interviews, participant observations, documentation and a focus group. Multiple cases were analyzed performing a cross case synthesis to strengthen the results. Results The development of a care pathway across the primary-hospital care continuum, supported by a step-by-step scenario, led to the use of existing and newly constructed structures, data monitoring and the development of information tools. The construction and use of these inter-organizational mechanisms had a positive effect on exchanging information, formulating and sharing goals, defining and knowing each other’s roles, expectations and competences and building qualitative relationships. Conclusion Care pathways across the primary-hospital care continuum enhance the components of care coordination. PMID:23919518

  14. Path synthesis of four-bar mechanisms using synergy of polynomial neural network and Stackelberg game theory

    NASA Astrophysics Data System (ADS)

    Ahmadi, Bahman; Nariman-zadeh, Nader; Jamali, Ali

    2017-06-01

    In this article, a novel approach based on game theory is presented for multi-objective optimal synthesis of four-bar mechanisms. The multi-objective optimization problem is modelled as a Stackelberg game. The more important objective function, tracking error, is considered as the leader, and the other objective function, deviation of the transmission angle from 90° (TA), is considered as the follower. In a new approach, a group method of data handling (GMDH)-type neural network is also utilized to construct an approximate model for the rational reaction set (RRS) of the follower. Using the proposed game-theoretic approach, the multi-objective optimal synthesis of a four-bar mechanism is then cast into a single-objective optimal synthesis using the leader variables and the obtained RRS of the follower. The superiority of using the synergy game-theoretic method of Stackelberg with a GMDH-type neural network is demonstrated for two case studies on the synthesis of four-bar mechanisms.

  15. Mass measurement using energy spectra in three-body decays

    DOE PAGES

    Agashe, Kaustubh; Franceschini, Roberto; Kim, Doojin; ...

    2016-05-24

    In previous works we have demonstrated how the energy distribution of massless decay products in two body decays can be used to measure the mass of decaying particles. In this study, we show how such results can be generalized to the case of multi-body decays. The key ideas that allow us to deal with multi-body final states are an extension of our previous results to the case of massive decay products and the factorization of the multi-body phase space. The mass measurement strategy that we propose is distinct from alternative methods because it does not require an accurate reconstruction of the entire event, as it does not involve, for instance, the missing transverse momentum, but rather requires measuring only the visible decay products of the decay of interest. To demonstrate the general strategy, we study a supersymmetric model wherein pair-produced gluinos each decay to a stable neutralino and a bottom quark-antiquark pair via an off-shell bottom squark. The combinatorial background stemming from the indistinguishable visible final states on both decay sides can be treated by an “event mixing” technique, the performance of which is discussed in detail. In conclusion, taking into account dominant backgrounds, we are able to show that the mass of the gluino and, in favorable cases, that of the neutralino can be determined by this mass measurement strategy.

  16. How do strategic decisions and operative practices affect operating room productivity?

    PubMed

    Peltokorpi, Antti

    2011-12-01

    Surgical operating rooms are cost-intensive parts of health service production. Managing operating units efficiently is essential when hospitals and healthcare systems aim to maximize health outcomes with limited resources. Previous research about operating room management has focused on studying the effect of management practices and decisions on efficiency by utilizing mainly a modeling approach or before-after analysis in single-hospital cases. The purpose of this research is to analyze the synergic effect of strategic decisions and operative management practices on operating room productivity and to use a multiple case study method enabling statistical hypothesis testing with empirical data. Eleven hypotheses that propose connections between the use of strategic and operative practices and productivity were tested in a multi-hospital study that included 26 units. The results indicate that operative practices, such as personnel management, case scheduling and performance measurement, affect productivity more remarkably than do strategic decisions that relate to, e.g., units' size, scope or academic status. Units with different strategic positions should apply different operative practices: Focused hospital units benefit most from sophisticated case scheduling and parallel processing whereas central and ambulatory units should apply flexible working hours, incentives and multi-skilled personnel. Operating units should be more active in applying management practices which are adequate for their strategic orientation.

  17. A fuzzy MCDM model with objective and subjective weights for evaluating service quality in hotel industries

    NASA Astrophysics Data System (ADS)

    Zoraghi, Nima; Amiri, Maghsoud; Talebi, Golnaz; Zowghi, Mahdi

    2013-12-01

    This paper presents a fuzzy multi-criteria decision-making (FMCDM) model by integrating both subjective and objective weights for ranking and evaluating the service quality in hotels. The objective method selects weights of criteria through mathematical calculation, while the subjective method uses judgments of decision makers. In this paper, we use a combination of weights obtained by both approaches in evaluating service quality in hotel industries. A real case study that considered ranking five hotels is illustrated. Examples are shown to indicate capabilities of the proposed method.

  18. Decerns: A framework for multi-criteria decision analysis

    DOE PAGES

    Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...

    2015-02-27

    A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems in risk management is introduced. The Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.

  19. A Qualitative Multi-Site Case Study: Examining Principals' Leadership Styles and School Performance

    ERIC Educational Resources Information Center

    Preyear, Loukisha

    2015-01-01

    The purpose of this qualitative multi-site case study was to explore the impact of principals' leadership styles on student academic achievement in a high-poverty low-performing school district in Louisiana. A total of 17 participants, principals and teachers, from this school district were used in this study. Data source triangulation of…

  20. Effects of Early and Later Family Violence on Children's Behavior Problems and Depression: A Longitudinal, Multi-Informant Perspective

    ERIC Educational Resources Information Center

    Sternberg, Kathleen J.; Lamb, Michael E.; Guterman, Eva; Abbott, Craig B.

    2006-01-01

    Objectives: To examine the effects of different forms of family violence at two developmental stages by assessing a sample of 110 Israeli children, drawn from the case files of Israeli family service agencies, studied longitudinally in both middle childhood and adolescence. Methods: Information about the children's adjustment was obtained from…

  1. Application of Standard Project Management Tools to Research--A Case Study from a Multi-National Clinical Trial

    ERIC Educational Resources Information Center

    Gist, Peter; Langley, David

    2007-01-01

    PRINCE2, which stands for Projects in Controlled Environments, is a project management method covering the organisation, management, and control of projects and is widely used in both government and commercial IT and building projects in the UK. This paper describes the application of PRINCE2 to the management of large clinical trials…

  2. An integrative multi-criteria decision making techniques for supplier evaluation problem with its application

    NASA Astrophysics Data System (ADS)

    Fatrias, D.; Kamil, I.; Meilani, D.

    2018-03-01

    Coordinating business operations with suppliers becomes increasingly important to survive and prosper in a dynamic business environment. A good partnership with suppliers not only increases efficiency, but also strengthens corporate competitiveness. Associated with such concerns, this study aims to develop a practical approach to multi-criteria supplier evaluation using the combined methods of the Taguchi loss function (TLF), the best-worst method (BWM) and VIse Kriterijumska Optimizacija i Kompromisno Resenje (VIKOR). A new integrative framework adopting these methods is our main contribution to the supplier evaluation literature. In this integrated approach, a compromise supplier ranking list based on the loss scores of suppliers is obtained using the efficient steps of a pairwise-comparison-based decision-making process. Implementation on a case problem with real data from the crumb rubber industry shows the usefulness of the proposed approach. Finally, suitable managerial implications are presented.

  3. An interval-parameter mixed integer multi-objective programming for environment-oriented evacuation management

    NASA Astrophysics Data System (ADS)

    Wu, C. Z.; Huang, G. H.; Yan, X. P.; Cai, Y. P.; Li, Y. P.

    2010-05-01

    Large crowds are increasingly common at political, social, economic, cultural and sports events in urban areas. This has drawn attention to the management of evacuations in such situations. In this study, we optimise an approximation method for vehicle allocation and route planning in the case of an evacuation. This method, based on an interval-parameter multi-objective optimisation model, has potential for use in a flexible decision support system for evacuation management. The modeling solutions are obtained by sequentially solving two sub-models corresponding to lower and upper bounds of the desired objective function value. The interval solutions are feasible and stable in the given decision space, and this may reduce the negative effects of uncertainty, thereby improving decision makers' estimates under different conditions. The resulting model can be used for a systematic analysis of the complex relationships among evacuation time, cost and environmental considerations. The results of a case study used to validate the proposed model show that the model does generate useful solutions for planning evacuation management and practices. Furthermore, these results are useful for evacuation planners, not only in making vehicle allocation decisions but also for providing insight into the tradeoffs among evacuation time, environmental considerations and economic objectives.

  4. School-Based Cognitive-Behavioral Therapy for an Adolescent Presenting with ADHD and Explosive Anger: A Case Study

    ERIC Educational Resources Information Center

    Parker, Janise; Zaboski, Brian; Joyce-Beaulieu, Diana

    2016-01-01

    This case demonstrates the efficacy of utilizing an intensive, multi-faceted behavioral intervention paradigm. A comprehensive, integrative, school-based service model was applied to address attention deficit hyperactivity disorder symptomology, oppositional behaviors, and explosive anger at the secondary level. The case reviews a multi-modal…

  5. Density Effects on Post-shock Turbulence Structure

    NASA Astrophysics Data System (ADS)

    Tian, Yifeng; Jaberi, Farhad; Livescu, Daniel; Li, Zhaorui; Michigan State University Collaboration; Los Alamos National Laboratory Collaboration; Texas A&M University-Corpus Christi Collaboration

    2017-11-01

    The effects of density variations due to mixture composition on post-shock turbulence structure are studied using turbulence-resolving shock-capturing simulations. This work extends the canonical Shock-Turbulence Interaction (STI) problem to involve significant variable density effects. The numerical method has been verified using a series of grid and LIA convergence tests, and is used to generate accurate post-shock turbulence data for a detailed flow study. Density effects on post-shock turbulent statistics are shown to be significant, leading to an increased amplification of turbulent kinetic energy (TKE). Eulerian and Lagrangian analyses show that the increase in the post-shock correlation between rotation and strain is weakened in the case with significant density variations (referred to as the ``multi-fluid'' case). Similar to previous single-fluid results and LIA predictions, the shock wave significantly changes the topology of the turbulent structures, exhibiting a symmetrization of the joint PDF of second and third invariant of the deviatoric part of velocity gradient tensor. In the multi-fluid case, this trend is more significant and mainly manifested in the heavy fluid regions. Lagrangian data are also used to study the evolution of turbulence structure away from the shock wave and assess the accuracy of Lagrangian dynamical models.

  6. An adaptive multi-level simulation algorithm for stochastic biological systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lester, C., E-mail: lesterc@maths.ox.ac.uk; Giles, M. B.; Baker, R. E.

    2015-01-14

    Discrete-state, continuous-time Markov models are widely used in the modeling of biochemical reaction networks. Their complexity often precludes analytic solution, and we rely on stochastic simulation algorithms (SSA) to estimate system statistics. The Gillespie algorithm is exact, but computationally costly as it simulates every single reaction. As such, approximate stochastic simulation algorithms such as the tau-leap algorithm are often used. Potentially computationally more efficient, the system statistics generated suffer from significant bias unless tau is relatively small, in which case the computational time can be comparable to that of the Gillespie algorithm. The multi-level method [Anderson and Higham, “Multi-level Monte Carlo for continuous time Markov chains, with applications in biochemical kinetics,” SIAM Multiscale Model. Simul. 10(1), 146–179 (2012)] tackles this problem. A base estimator is computed using many (cheap) sample paths at low accuracy. The bias inherent in this estimator is then reduced using a number of corrections. Each correction term is estimated using a collection of paired sample paths where one path of each pair is generated at a higher accuracy compared to the other (and so more expensive). By sharing random variables between these paired paths, the variance of each correction estimator can be reduced. This renders the multi-level method very efficient as only a relatively small number of paired paths are required to calculate each correction term. In the original multi-level method, each sample path is simulated using the tau-leap algorithm with a fixed value of τ. This approach can result in poor performance when the reaction activity of a system changes substantially over the timescale of interest. By introducing a novel adaptive time-stepping approach where τ is chosen according to the stochastic behaviour of each sample path, we extend the applicability of the multi-level method to such cases. We demonstrate the efficiency of our method using a number of examples.
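
    For context, the sketch below implements the two building blocks that the multi-level method couples, an exact Gillespie SSA and a fixed-tau tau-leap simulator, for a simple decay reaction; the paired-path coupling, correction levels and adaptive tau selection of the method itself are not reproduced.

```python
# Minimal sketch of the two building blocks the multi-level method couples:
# an exact Gillespie SSA and an approximate (fixed-tau) tau-leap simulator,
# here for a simple decay process X -> 0 with rate k.
import numpy as np

rng = np.random.default_rng(0)
k, x0, t_end = 0.1, 1000, 30.0

def gillespie_decay(x0, k, t_end):
    """Exact SSA for X -> 0: exponential waiting times, one event at a time."""
    x, t = x0, 0.0
    while x > 0:
        t += rng.exponential(1.0 / (k * x))
        if t > t_end:
            break
        x -= 1
    return x

def tau_leap_decay(x0, k, t_end, tau=0.5):
    """Tau-leap: Poisson number of decay events per fixed step of length tau."""
    x, t = x0, 0.0
    while t < t_end and x > 0:
        x = max(x - rng.poisson(k * x * tau), 0)
        t += tau
    return x

n_paths = 2000
exact = np.mean([gillespie_decay(x0, k, t_end) for _ in range(n_paths)])
leap = np.mean([tau_leap_decay(x0, k, t_end) for _ in range(n_paths)])
print(f"E[X(t_end)]  analytic: {x0 * np.exp(-k * t_end):.1f}"
      f"  SSA: {exact:.1f}  tau-leap: {leap:.1f}")
```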

  7. Quasilinear parabolic variational inequalities with multi-valued lower-order terms

    NASA Astrophysics Data System (ADS)

    Carl, Siegfried; Le, Vy K.

    2014-10-01

    In this paper, we provide an analytical framework for a multi-valued parabolic variational inequality in a cylindrical domain: find a solution belonging to a closed and convex subset of the underlying function space, together with a selection of the multi-valued lower-order term, such that the variational inequality is satisfied, where A is a time-dependent quasilinear elliptic operator and the multi-valued function is assumed to be upper semicontinuous only, so that Clarke's generalized gradient is included as a special case. Thus, parabolic variational-hemivariational inequalities are special cases of the problem considered here. The extension of parabolic variational-hemivariational inequalities to the general class of multi-valued problems considered in this paper is not only of disciplinary interest, but is motivated by the need in applications. The main goals are as follows. First, we provide an existence theory for the above-stated problem under coercivity assumptions. Second, in the noncoercive case, we establish an appropriate sub-supersolution method that allows us to get existence, comparison, and enclosure results. Third, the order structure of the solution set enclosed by sub-supersolutions is revealed. In particular, it is shown that the solution set within the sector of sub-supersolutions is a directed set. As an application, a multi-valued parabolic obstacle problem is treated.
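
    A generic formulation of this class of problems, in standard notation and not necessarily the authors' exact statement, reads as follows.

```latex
% Generic multi-valued parabolic variational inequality (schematic form).
% The notation is an assumption standard in this literature, not a
% reproduction of the paper's exact statement.
\begin{align*}
&\text{Find } u \in K \subseteq X_0 \text{ and } \eta \in L^{q'}(Q), \quad
  \eta(x,t) \in f\bigl(x,t,u(x,t)\bigr) \ \text{a.e. in } Q, \text{ such that} \\
&\qquad \langle u_t + A(t)\,u,\; v-u\rangle
  \;+\; \int_Q \eta\,(v-u)\,dx\,dt \;\ge\; 0
  \qquad \text{for all } v \in K,
\end{align*}
where $Q=\Omega\times(0,\tau)$, $K$ is closed and convex, $A(t)$ is a
time-dependent quasilinear elliptic operator, and the multi-valued term $f$
is only upper semicontinuous (e.g.\ Clarke's generalized gradient
$\partial j$ of a locally Lipschitz function $j$).
```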

  8. Single-scale renormalisation group improvement of multi-scale effective potentials

    NASA Astrophysics Data System (ADS)

    Chataignier, Leonardo; Prokopec, Tomislav; Schmidt, Michael G.; Świeżewska, Bogumiła

    2018-03-01

    We present a new method for renormalisation group improvement of the effective potential of a quantum field theory with an arbitrary number of scalar fields. The method amounts to solving the renormalisation group equation for the effective potential with the boundary conditions chosen on the hypersurface where quantum corrections vanish. This hypersurface is defined through a suitable choice of a field-dependent value for the renormalisation scale. The method can be applied to any order in perturbation theory and it is a generalisation of the standard procedure valid for the one-field case. In our method, however, the choice of the renormalisation scale does not eliminate individual logarithmic terms but rather the entire loop corrections to the effective potential. It allows us to evaluate the improved effective potential for arbitrary values of the scalar fields using the tree-level potential with running coupling constants as long as they remain perturbative. This opens the possibility of studying various applications which require an analysis of multi-field effective potentials across different energy scales. In particular, the issue of stability of the scalar potential can be easily studied beyond tree level.
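
    For orientation, the renormalisation group equation being solved for the effective potential has the familiar form below, written in terms of the beta functions and anomalous dimensions (a textbook expression, not a formula quoted from the paper):

    ```latex
    \left[\mu\frac{\partial}{\partial\mu}
      + \sum_i \beta_i(\lambda)\,\frac{\partial}{\partial\lambda_i}
      - \sum_a \gamma_a(\lambda)\,\phi_a\frac{\partial}{\partial\phi_a}\right]
      V_{\mathrm{eff}}(\mu,\lambda_i,\phi_a) = 0
    ```

    Choosing a field-dependent scale $\mu=\mu_*(\phi)$ on the hypersurface where the loop corrections vanish then reduces $V_{\mathrm{eff}}$ to the tree-level potential evaluated with couplings run to $\mu_*(\phi)$, which is the sense in which the entire loop correction, rather than individual logarithms, is eliminated.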

  9. Orientation of airborne laser scanning point clouds with multi-view, multi-scale image blocks.

    PubMed

    Rönnholm, Petri; Hyyppä, Hannu; Hyyppä, Juha; Haggrén, Henrik

    2009-01-01

    Comprehensive 3D modeling of our environment requires integration of terrestrial and airborne data, which is collected, preferably, using laser scanning and photogrammetric methods. However, integration of these multi-source data requires accurate relative orientations. In this article, two methods for solving relative orientation problems are presented. The first method includes registration by minimizing the distances between an airborne laser point cloud and a 3D model. The 3D model was derived from photogrammetric measurements and terrestrial laser scanning points. The first method was used as a reference and for validation. Having completed registration in the object space, the relative orientation between images and laser point cloud is known. The second method utilizes an interactive orientation method between a multi-scale image block and a laser point cloud. The multi-scale image block includes both aerial and terrestrial images. Experiments with the multi-scale image block revealed that the accuracy of a relative orientation increased when more images were included in the block. The orientations of the first and second methods were compared. The comparison showed that correct rotations were the most difficult to detect accurately by using the interactive method. Because the interactive method forces laser scanning data to fit with the images, inaccurate rotations cause corresponding shifts to image positions. However, in a test case, in which the orientation differences included only shifts, the interactive method could solve the relative orientation of an aerial image and airborne laser scanning data repeatedly within a couple of centimeters.

  10. Orientation of Airborne Laser Scanning Point Clouds with Multi-View, Multi-Scale Image Blocks

    PubMed Central

    Rönnholm, Petri; Hyyppä, Hannu; Hyyppä, Juha; Haggrén, Henrik

    2009-01-01

    Comprehensive 3D modeling of our environment requires integration of terrestrial and airborne data, which is collected, preferably, using laser scanning and photogrammetric methods. However, integration of these multi-source data requires accurate relative orientations. In this article, two methods for solving relative orientation problems are presented. The first method includes registration by minimizing the distances between an airborne laser point cloud and a 3D model. The 3D model was derived from photogrammetric measurements and terrestrial laser scanning points. The first method was used as a reference and for validation. Having completed registration in the object space, the relative orientation between images and laser point cloud is known. The second method utilizes an interactive orientation method between a multi-scale image block and a laser point cloud. The multi-scale image block includes both aerial and terrestrial images. Experiments with the multi-scale image block revealed that the accuracy of a relative orientation increased when more images were included in the block. The orientations of the first and second methods were compared. The comparison showed that correct rotations were the most difficult to detect accurately by using the interactive method. Because the interactive method forces laser scanning data to fit with the images, inaccurate rotations cause corresponding shifts to image positions. However, in a test case, in which the orientation differences included only shifts, the interactive method could solve the relative orientation of an aerial image and airborne laser scanning data repeatedly within a couple of centimeters. PMID:22454569

  11. Factors Influencing Teachers' Technology Self-Efficacy: A Case Study

    ERIC Educational Resources Information Center

    Farah, Amy Caroline

    2012-01-01

    Factors influencing teachers' levels of technology self-efficacy were examined through a qualitative multi-site, multi-subject case study research design. An initial survey was administered to all full-time, certified teachers at three school sites in order to gauge teachers' current level of technology self-efficacy. From that…

  12. The association of contact lens solution use and Acanthamoeba keratitis

    PubMed Central

    Joslin, Charlotte E.; Tu, Elmer Y.; Shoff, Megan E.; Booton, Gregory C.; Fuerst, Paul A.; McMahon, Timothy T.; Anderson, Robert J.; Dworkin, Mark S.; Sugar, Joel; Davis, Faith G.; Stayner, Leslie T.

    2009-01-01

    Purpose Diagnosis of Acanthamoeba keratitis, a rare but serious corneal infection, has recently increased significantly at the University of Illinois at Chicago (UIC) Cornea Service. The purpose is to investigate Acanthamoeba keratitis risk factors. Design Retrospective case-control study. Methods Setting University, tertiary care hospital. Patients Fifty-five Acanthamoeba keratitis cases with contact lens use were diagnosed between May 1, 2003 and September 15, 2006. Clinic-matched controls with contact lens use were recruited. Subjects completed surveys targeting lens hygiene, contact lens solution use, and water exposure. Main Outcome Measure Acanthamoeba keratitis. Results Thirty-nine (73.6%) cases and 113 (65.3%) controls participated; 38 cases had complete contact lens data. Thirty-five of 38 cases (92.1%) and 47 of 100 controls (47.0%) used soft lenses. Analysis was performed on 30 cases and 39 controls with matched pairs with soft lens use. Exclusive use of AMO Complete MoisturePlus Multi-Purpose Solution was independently associated with Acanthamoeba keratitis in multivariable analysis (55.2% vs. 10.5%; OR, 16.67; 95% CI, 2.11–162.63; p = 0.008). However, 38.8% of cases reported no use of AMO Complete MoisturePlus Multi-Purpose Solution or used it in combination with other solutions. Although not statistically significant, additional hygiene-related variables (solution ‘reuse’, lack of ‘rubbing’, and showering with lenses) suggest a pattern of risk. Conclusions AMO Complete MoisturePlus Multi-Purpose Solution use is independently associated with Acanthamoeba keratitis among soft contact lens users. However, it does not explain all cases, suggesting additional factors. Further research into environmental risk factors and hygiene practices is warranted, especially considering this is the second outbreak of an atypical, contact lens-related infection. PMID:17588524

  13. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to the situation where the frequent causal relationships between the different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examining the consequence of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  14. A multi-resolution strategy for a multi-objective deformable image registration framework that accommodates large anatomical differences

    NASA Astrophysics Data System (ADS)

    Alderliesten, Tanja; Bosman, Peter A. N.; Sonke, Jan-Jakob; Bel, Arjan

    2014-03-01

    Currently, two major challenges dominate the field of deformable image registration. The first challenge is related to the tuning of the developed methods to specific problems (i.e. how to best combine different objectives such as similarity measure and transformation effort). This is one of the reasons why, despite significant progress, clinical implementation of such techniques has proven to be difficult. The second challenge is to account for large anatomical differences (e.g. large deformations, (dis)appearing structures) that occurred between image acquisitions. In this paper, we study a framework based on multi-objective optimization to improve registration robustness and to simplify tuning for specific applications. Within this framework we specifically consider the use of an advanced model-based evolutionary algorithm for optimization and a dual-dynamic transformation model (i.e. two "non-fixed" grids: one for the source- and one for the target image) to accommodate for large anatomical differences. The framework computes and presents multiple outcomes that represent efficient trade-offs between the different objectives (a so-called Pareto front). In image processing it is common practice, for reasons of robustness and accuracy, to use a multi-resolution strategy. This is, however, only well-established for single-objective registration methods. Here we describe how such a strategy can be realized for our multi-objective approach and compare its results with a single-resolution strategy. For this study we selected the case of prone-supine breast MRI registration. Results show that the well-known advantages of a multi-resolution strategy are successfully transferred to our multi-objective approach, resulting in superior (i.e. Pareto-dominating) outcomes.
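
    The multi-resolution strategy itself follows the usual coarse-to-fine pattern; the sketch below illustrates only the looping structure, with `register` standing in as a hypothetical single-resolution (multi-objective) solver and the rescaling of the transform between levels omitted for brevity. This is not the authors' algorithm.

    ```python
    import numpy as np
    from scipy.ndimage import zoom

    def multiresolution_register(fixed, moving, register, n_levels=3):
        """Coarse-to-fine loop: solve on heavily downsampled images first and
        use each solution to initialise the next, finer level."""
        solution = None
        for level in reversed(range(n_levels)):      # coarsest level first
            factor = 1.0 / 2**level
            f = zoom(np.asarray(fixed, dtype=float), factor, order=1)
            m = zoom(np.asarray(moving, dtype=float), factor, order=1)
            solution = register(f, m, solution)      # hypothetical solver
        return solution
    ```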

  15. Improved Topographic Mapping Through Multi-Baseline SAR Interferometry with MAP Estimation

    NASA Astrophysics Data System (ADS)

    Dong, Yuting; Jiang, Houjun; Zhang, Lu; Liao, Mingsheng; Shi, Xuguo

    2015-05-01

    There is an inherent contradiction between the sensitivity of height measurement and the accuracy of phase unwrapping for SAR interferometry (InSAR) over rough terrain. This contradiction can be resolved by multi-baseline InSAR analysis, which exploits multiple phase observations with different normal baselines to improve phase unwrapping accuracy, or even avoid phase unwrapping. In this paper we propose a maximum a posteriori (MAP) estimation method assisted by SRTM DEM data for multi-baseline InSAR topographic mapping. Based on our method, a data processing flow is established and applied in processing multi-baseline ALOS/PALSAR dataset. The accuracy of resultant DEMs is evaluated by using a standard Chinese national DEM of scale 1:10,000 as reference. The results show that multi-baseline InSAR can improve DEM accuracy compared with single-baseline case. It is noteworthy that phase unwrapping is avoided and the quality of multi-baseline InSAR DEM can meet the DTED-2 standard.
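
    A per-pixel MAP height estimate of the kind described can be sketched as a grid search over candidate heights, combining a circular multi-baseline phase likelihood with a Gaussian prior from the external DEM; the likelihood form, parameter names and grid search are assumptions for illustration, not the paper's algorithm.

    ```python
    import numpy as np

    def map_height(phases, height_of_ambiguity, h_prior, sigma_prior,
                   h_search=np.linspace(-200.0, 200.0, 4001)):
        """Grid-search MAP height estimate for one pixel (illustrative sketch).

        phases: wrapped interferometric phases (rad) from several baselines;
        height_of_ambiguity: per-baseline height of ambiguity (m);
        h_prior, sigma_prior: external DEM (e.g. SRTM) prior mean and std (m).
        """
        phases = np.asarray(phases, dtype=float)[:, None]
        hoa = np.asarray(height_of_ambiguity, dtype=float)[:, None]
        expected = 2.0 * np.pi * h_search[None, :] / hoa   # phase predicted by each h
        log_like = np.sum(np.cos(phases - expected), axis=0)   # von Mises-like term
        log_prior = -0.5 * ((h_search - h_prior) / sigma_prior) ** 2
        return h_search[np.argmax(log_like + log_prior)]
    ```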

  16. A GIS-based multi-source and multi-box modeling approach (GMSMB) for air pollution assessment--a North American case study.

    PubMed

    Wang, Bao-Zhen; Chen, Zhi

    2013-01-01

    This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source and multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of data including emission sources, air quality monitoring, meteorological data, and spatial location information required for air quality modeling are brought into an integrated modeling environment. This allows finer details of the spatial variation in source distributions and meteorological conditions to be analyzed quantitatively. The developed modeling approach has been applied to predict the spatial concentration distribution of four air pollutants (CO, NO(2), SO(2) and PM(2.5)) for the State of California. The modeling results are compared with the monitoring data. Good agreement is obtained, which demonstrates that the developed modeling approach can deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
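
    The point-source building block of such a model is the standard Gaussian plume formula; the sketch below shows it for a single source with ground reflection. Parameter names are generic, and the dispersion parameters are assumed to be supplied for the receptor's downwind distance; this is a textbook formula, not the GMSMB implementation.

    ```python
    import numpy as np

    def gaussian_plume(q, u, y, z, stack_height, sigma_y, sigma_z):
        """Ground-reflected Gaussian plume concentration from one point source.

        q: emission rate (g/s), u: wind speed (m/s), (y, z): crosswind and
        vertical receptor coordinates (m), sigma_y/sigma_z: dispersion
        parameters at the receptor's downwind distance (m).
        """
        lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = (np.exp(-(z - stack_height)**2 / (2.0 * sigma_z**2))
                    + np.exp(-(z + stack_height)**2 / (2.0 * sigma_z**2)))
        return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical
    ```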

  17. Multi-GPU and multi-CPU accelerated FDTD scheme for vibroacoustic applications

    NASA Astrophysics Data System (ADS)

    Francés, J.; Otero, B.; Bleda, S.; Gallego, S.; Neipp, C.; Márquez, A.; Beléndez, A.

    2015-06-01

    The Finite-Difference Time-Domain (FDTD) method is applied to the analysis of vibroacoustic problems and to the study of the propagation of longitudinal and transversal waves in stratified media. The potential of the scheme and the relevance of each acceleration strategy for massive FDTD computations are demonstrated in this work. We propose two new implementations of the two-dimensional FDTD scheme using multi-CPU and multi-GPU hardware, respectively. In the first implementation, an open-source message passing interface (OMPI) is included in order to fully exploit the resources of a biprocessor workstation with two Intel Xeon processors. Moreover, in the CPU code version, the streaming SIMD extensions (SSE) and the advanced vector extensions (AVX) are combined with shared-memory approaches that take advantage of multi-core platforms. The second implementation, the multi-GPU code version, is based on the peer-to-peer communications available in CUDA on two GPUs (NVIDIA GTX 670). The paper then presents an analysis of the influence of the different code versions, including shared-memory approaches, vector instructions and multi-processor setups (both CPU and GPU), and compares them in order to delimit the degree of improvement offered by distributed solutions based on multi-CPU and multi-GPU. The performance of both approaches was analysed, and it was shown that adding shared-memory schemes to CPU computing substantially improves the performance of vector instructions by enlarging the simulation sizes that use the CPU cache efficiently. In this case GPU computing is roughly twice as fast as the fine-tuned CPU version for both one and two nodes. However, for massive computations, explicit vector instructions are not worthwhile, since memory bandwidth is the limiting factor and the performance tends to match that of the sequential version with auto-vectorisation and the shared-memory approach. In this scenario GPU computing is the best option since it provides homogeneous behaviour. More specifically, the speedup of GPU computing reaches an upper limit of 12 for both one and two GPUs, whereas the performance reaches peak values of 80 GFlops and 146 GFlops for one GPU and two GPUs, respectively. Finally, the method is applied to an earth crust profile in order to demonstrate the potential of our approach and the necessity of acceleration strategies in this type of application.
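
    As a point of reference for what these kernels accelerate, the sketch below shows one leapfrog update of a simplified two-dimensional acoustic staggered-grid FDTD scheme in plain NumPy; it is an illustrative acoustic analogue with assumed array layout and parameter names, not the authors' vibroacoustic implementation.

    ```python
    import numpy as np

    def fdtd_step(p, vx, vy, rho, c, dt, dx):
        """One leapfrog update of a 2D acoustic staggered-grid FDTD scheme.
        p: pressure field (nx, ny); vx, vy: particle velocities on staggered grids."""
        # Update particle velocities from the pressure gradient.
        vx[:-1, :] -= dt / (rho * dx) * (p[1:, :] - p[:-1, :])
        vy[:, :-1] -= dt / (rho * dx) * (p[:, 1:] - p[:, :-1])
        # Update pressure from the divergence of the velocity field.
        p[1:-1, 1:-1] -= rho * c**2 * dt / dx * (
            vx[1:-1, 1:-1] - vx[:-2, 1:-1] + vy[1:-1, 1:-1] - vy[1:-1, :-2])
        return p, vx, vy
    ```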

  18. Implementing service improvement projects within pre-registration nursing education: a multi-method case study evaluation.

    PubMed

    Baillie, Lesley; Bromley, Barbara; Walker, Moira; Jones, Rebecca; Mhlanga, Fortune

    2014-01-01

    Preparing healthcare students for quality and service improvement is important internationally. A United Kingdom (UK) initiative aims to embed service improvement in pre-registration education. A UK university implemented service improvement teaching for all nursing students. In addition, the degree pathway students conducted service improvement projects as the basis for their dissertations. The study aimed to evaluate the implementation of service improvement projects within a pre-registration nursing curriculum. A multi-method case study was conducted, using student questionnaires, focus groups with students and academic staff, and observation of action learning sets. Questionnaire data were analysed using SPSS v19. Qualitative data were analysed using Ritchie and Spencer's (1994) Framework Approach. Students were very positive about service improvement. The degree students, who conducted service improvement projects in practice, felt more knowledgeable than advanced diploma students. Selecting the project focus was a key issue and students encountered some challenges in practice. Support for student service improvement projects came from action learning sets, placement staff, and academic staff. Service improvement projects had a positive effect on students' learning. An effective partnership between the university and partner healthcare organisations, and support for students in practice, is essential. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Performance of an Artificial Multi-observer Deep Neural Network for Fully Automated Segmentation of Polycystic Kidneys.

    PubMed

    Kline, Timothy L; Korfiatis, Panagiotis; Edwards, Marie E; Blais, Jaime D; Czerwiec, Frank S; Harris, Peter C; King, Bernard F; Torres, Vicente E; Erickson, Bradley J

    2017-08-01

    Deep learning techniques are being rapidly applied to medical imaging tasks, from organ and lesion segmentation to tissue and tumor classification. These techniques are becoming the leading algorithmic approaches to solve inherently difficult image processing tasks. Currently, the most critical requirement for successful implementation lies in the need for relatively large datasets that can be used for training the deep learning networks. Based on our initial studies of MR imaging examinations of the kidneys of patients affected by polycystic kidney disease (PKD), we have generated a unique database of imaging data and corresponding reference standard segmentations of polycystic kidneys. In the study of PKD, segmentation of the kidneys is needed in order to measure total kidney volume (TKV). Automated methods to segment the kidneys and measure TKV are needed to increase measurement throughput and alleviate the inherent variability of human-derived measurements. We hypothesize that deep learning techniques can be leveraged to perform fast, accurate, reproducible, and fully automated segmentation of polycystic kidneys. Here, we describe a fully automated approach for segmenting PKD kidneys within MR images that simulates a multi-observer approach in order to create an accurate and robust method for the task of segmentation and computation of TKV for PKD patients. A total of 2000 cases were used for training and validation, and 400 cases were used for testing. The multi-observer ensemble method had mean ± SD percent volume difference of 0.68 ± 2.2% compared with the reference standard segmentations. The complete framework performs fully automated segmentation at a level comparable with interobserver variability and could be considered as a replacement for the task of segmentation of PKD kidneys by a human.
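
    The "multi-observer" idea of combining several networks' outputs can be sketched in a few lines; this is a generic ensemble-averaging illustration with assumed inputs (per-network probability maps and a voxel volume), not the authors' pipeline.

    ```python
    import numpy as np

    def ensemble_tkv(prob_maps, voxel_volume_ml, threshold=0.5):
        """Combine per-network probability maps into a consensus kidney mask and
        a total kidney volume (TKV) estimate.

        prob_maps: list of arrays with values in [0, 1], one per 'observer' network;
        voxel_volume_ml: volume of one voxel in millilitres.
        """
        consensus = np.mean(np.stack(prob_maps, axis=0), axis=0)
        mask = consensus >= threshold
        return mask, mask.sum() * voxel_volume_ml
    ```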

  20. Principal Sensemaking of Inclusion: A Multi-Case Study of Five Urban School Principals

    ERIC Educational Resources Information Center

    DeMatthews, David Edward

    2012-01-01

    This study examined how five principals working in one urban school district made sense of inclusion. I employed a multi-case study guided by the theoretical framework of sensemaking. Weick's sensemaking theory was useful in examining the way principals made sense of inclusion. Each of the seven characteristics of Weick's sensemaking…

  1. Slowly changing potential problems in Quantum Mechanics: Adiabatic theorems, ergodic theorems, and scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fishman, S., E-mail: fishman@physics.technion.ac.il; Soffer, A., E-mail: soffer@math.rutgers.edu

    2016-07-15

    We employ the recently developed multi-time scale averaging method to study the large time behavior of slowly changing (in time) Hamiltonians. We treat some known cases in a new way, such as the Zener problem, and we give another proof of the adiabatic theorem in the gapless case. We prove a new uniform ergodic theorem for slowly changing unitary operators. This theorem is then used to derive the adiabatic theorem, do the scattering theory for such Hamiltonians, and prove some classical propagation estimates and asymptotic completeness.

  2. Methods for using clinical laboratory test results as baseline confounders in multi-site observational database studies when missing data are expected.

    PubMed

    Raebel, Marsha A; Shetterly, Susan; Lu, Christine Y; Flory, James; Gagne, Joshua J; Harrell, Frank E; Haynes, Kevin; Herrinton, Lisa J; Patorno, Elisabetta; Popovic, Jennifer; Selvan, Mano; Shoaibi, Azadeh; Wang, Xingmei; Roy, Jason

    2016-07-01

    Our purpose was to quantify missing baseline laboratory results, assess predictors of missingness, and examine performance of missing data methods. Using the Mini-Sentinel Distributed Database from three sites, we selected three exposure-outcome scenarios with laboratory results as baseline confounders. We compared hazard ratios (HRs) or risk differences (RDs) and 95% confidence intervals (CIs) from models that omitted laboratory results, included only available results (complete cases), and included results after applying missing data methods (multiple imputation [MI] regression, MI predictive mean matching [PMM], indicator). Scenario 1 considered glucose among second-generation antipsychotic users and diabetes. Across sites, glucose was available for 27.7-58.9%. Results differed between complete case and missing data models (e.g., olanzapine: HR 0.92 [CI 0.73, 1.12] vs 1.02 [0.90, 1.16]). Across-site models employing different MI approaches provided similar HR and CI; site-specific models provided differing estimates. Scenario 2 evaluated creatinine among individuals starting high versus low dose lisinopril and hyperkalemia. Creatinine availability: 44.5-79.0%. Results differed between complete case and missing data models (e.g., HR 0.84 [CI 0.77, 0.92] vs. 0.88 [0.83, 0.94]). HR and CI were identical across MI methods. Scenario 3 examined international normalized ratio (INR) among warfarin users starting interacting versus noninteracting antimicrobials and bleeding. INR availability: 20.0-92.9%. Results differed between ignoring INR versus including INR using missing data methods (e.g., RD 0.05 [CI -0.03, 0.13] vs 0.09 [0.00, 0.18]). Indicator and PMM methods gave similar estimates. Multi-site studies must consider site variability in missing data. Different missing data methods performed similarly. Copyright © 2016 John Wiley & Sons, Ltd.
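
    As a simple illustration of one of the approaches discussed (the missing-indicator method), a baseline laboratory confounder can be carried into an outcome model as a filled value plus an indicator of missingness; the sketch below uses a hypothetical column name and is not the Mini-Sentinel code.

    ```python
    import pandas as pd

    def add_missing_indicator(df, lab_col):
        """Missing-indicator handling of a baseline laboratory confounder:
        keep an indicator of missingness and fill the value column so both
        can enter an outcome model."""
        out = df.copy()
        out[lab_col + "_missing"] = out[lab_col].isna().astype(int)
        out[lab_col] = out[lab_col].fillna(out[lab_col].median())
        return out

    # Example with a hypothetical baseline glucose column:
    # cohort = add_missing_indicator(cohort, "baseline_glucose")
    ```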

  3. Stochastic resource allocation in emergency departments with a multi-objective simulation optimization algorithm.

    PubMed

    Feng, Yen-Yi; Wu, I-Chin; Chen, Tzu-Li

    2017-03-01

    The number of emergency cases or emergency room visits rapidly increases annually, thus leading to an imbalance in supply and demand and to the long-term overcrowding of hospital emergency departments (EDs). However, current solutions to increase medical resources and improve the handling of patient needs are either impractical or infeasible in the Taiwanese environment. Therefore, EDs must optimize resource allocation given limited medical resources to minimize the average length of stay of patients and medical resource waste costs. This study constructs a multi-objective mathematical model for medical resource allocation in EDs in accordance with emergency flow or procedure. The proposed mathematical model is complex and difficult to solve because its performance value is stochastic; furthermore, the model considers both objectives simultaneously. Thus, this study develops a multi-objective simulation optimization algorithm by integrating a non-dominated sorting genetic algorithm II (NSGA II) with multi-objective computing budget allocation (MOCBA) to address the challenges of multi-objective medical resource allocation. NSGA II is used to investigate plausible solutions for medical resource allocation, and MOCBA identifies effective sets of feasible Pareto (non-dominated) medical resource allocation solutions in addition to effectively allocating simulation or computation budgets. The discrete event simulation model of ED flow is inspired by a Taiwan hospital case and is constructed to estimate the expected performance values of each medical allocation solution as obtained through NSGA II. Finally, computational experiments are performed to verify the effectiveness and performance of the integrated NSGA II and MOCBA method, as well as to derive non-dominated medical resource allocation solutions from the algorithms.
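
    Since the approach hinges on identifying non-dominated (Pareto) allocations, a minimal sketch of the dominance check at the core of NSGA-II-style sorting is given below; it is a generic illustration with assumed two-objective inputs, not the integrated NSGA II/MOCBA algorithm.

    ```python
    import numpy as np

    def pareto_front(objectives):
        """Return indices of non-dominated solutions for minimisation objectives.

        objectives: array of shape (n_solutions, n_objectives), e.g. rows of
        [mean_length_of_stay, resource_waste_cost]."""
        obj = np.asarray(objectives, dtype=float)
        keep = []
        for i in range(obj.shape[0]):
            # Solution i is dominated if some j is no worse in all objectives
            # and strictly better in at least one.
            dominated = np.any(np.all(obj <= obj[i], axis=1) &
                               np.any(obj < obj[i], axis=1))
            if not dominated:
                keep.append(i)
        return keep
    ```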

  4. Multi-scale Gaussian representation and outline-learning based cell image segmentation

    PubMed Central

    2013-01-01

    Background High-throughput genome-wide screening to study gene-specific functions, e.g. for drug discovery, demands fast automated image analysis methods to assist in unraveling the full potential of such studies. Image segmentation is typically at the forefront of such analysis as the performance of the subsequent steps, for example, cell classification, cell tracking etc., often relies on the results of segmentation. Methods We present a cell cytoplasm segmentation framework which first separates cell cytoplasm from image background using novel approach of image enhancement and coefficient of variation of multi-scale Gaussian scale-space representation. A novel outline-learning based classification method is developed using regularized logistic regression with embedded feature selection which classifies image pixels as outline/non-outline to give cytoplasm outlines. Refinement of the detected outlines to separate cells from each other is performed in a post-processing step where the nuclei segmentation is used as contextual information. Results and conclusions We evaluate the proposed segmentation methodology using two challenging test cases, presenting images with completely different characteristics, with cells of varying size, shape, texture and degrees of overlap. The feature selection and classification framework for outline detection produces very simple sparse models which use only a small subset of the large, generic feature set, that is, only 7 and 5 features for the two cases. Quantitative comparison of the results for the two test cases against state-of-the-art methods show that our methodology outperforms them with an increase of 4-9% in segmentation accuracy with maximum accuracy of 93%. Finally, the results obtained for diverse datasets demonstrate that our framework not only produces accurate segmentation but also generalizes well to different segmentation tasks. PMID:24267488
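
    A toy version of the first stage (foreground/background separation by k-means with k = 2 on multi-scale Gaussian features) might look as follows; the feature construction and the choice of scales are assumptions for illustration, loosely following the scale-space coefficient-of-variation idea rather than the paper's exact pipeline.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from sklearn.cluster import KMeans

    def cytoplasm_foreground(image, sigmas=(1, 2, 4, 8)):
        """Separate cytoplasm-like foreground from background via k-means (k=2)
        on intensity plus a coefficient of variation across Gaussian scales."""
        stack = np.stack([gaussian_filter(image.astype(float), s) for s in sigmas])
        cv = stack.std(axis=0) / (stack.mean(axis=0) + 1e-8)
        features = np.column_stack([image.ravel(), cv.ravel()])
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        labels = labels.reshape(image.shape)
        # Assume the cluster with the higher mean intensity is foreground.
        fg_cluster = np.argmax([image[labels == k].mean() for k in (0, 1)])
        return labels == fg_cluster
    ```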

  5. A comparative study of linear and nonlinear MIMO feedback configurations

    NASA Technical Reports Server (NTRS)

    Desoer, C. A.; Lin, C. A.

    1984-01-01

    In this paper, a comparison is conducted of several feedback configurations which have appeared in the literature (e.g. unity-feedback, model-reference, etc.). The linear time-invariant multi-input multi-output case is considered. For each configuration, the stability conditions are specified, the relation between achievable I/O maps and the achievable disturbance-to-output maps is examined, and the effect of various subsystem perturbations on the system performance is studied. In terms of these considerations, it is demonstrated that one of the configurations considered is better than all the others. The results are then extended to the nonlinear multi-input multi-output case.

  6. Evaluation of 3-D graphics software: A case study

    NASA Technical Reports Server (NTRS)

    Lores, M. E.; Chasen, S. H.; Garner, J. M.

    1984-01-01

    An efficient 3-D geometry graphics software package which is suitable for advanced design studies was developed. The advanced design system is called GRADE--Graphics for Advanced Design. Efficiency and ease of use are gained by sacrificing flexibility in surface representation. The immediate options were either to continue development of GRADE or to acquire a commercially available system which would replace or complement GRADE. Test cases which would reveal the ability of each system to satisfy the requirements were developed. A scoring method which adequately captured the relative capabilities of the three systems was presented. While more complex multi-attribute decision methods could be used, the selected method provides all the needed information without being so complex that it is difficult to understand. If the value factors are modestly perturbed, system Z is a clear winner based on its overall capabilities. System Z is superior in two vital areas: surfacing and ease of interface with application programs.

  7. Carbon nanotubes significance in Darcy-Forchheimer flow

    NASA Astrophysics Data System (ADS)

    Hayat, Tasawar; Rafique, Kiran; Muhammad, Taseer; Alsaedi, Ahmed; Ayub, Muhammad

    2018-03-01

    The present article examines Darcy-Forchheimer flow of water-based carbon nanotubes. Flow is induced by a curved stretchable surface. The heat transfer mechanism is analyzed in the presence of a convective heating process. The Xue model of nanofluid is employed to study the characteristics of both single-walled carbon nanotubes (SWCNTs) and multi-walled carbon nanotubes (MWCNTs). Results for both SWCNTs and MWCNTs are obtained and compared. Appropriate transformations lead to a strongly nonlinear ordinary differential system. The optimal homotopy analysis method (OHAM) is used for the solution development of the resulting system. The contributions of the pertinent variables to the velocity and temperature are studied. Further, the skin friction coefficient and local Nusselt number are analyzed graphically for both the SWCNT and MWCNT cases.

  8. Advanced Mitigation Process (AMP) for Improving Laser Damage Threshold of Fused Silica Optics

    NASA Astrophysics Data System (ADS)

    Ye, Xin; Huang, Jin; Liu, Hongjie; Geng, Feng; Sun, Laixi; Jiang, Xiaodong; Wu, Weidong; Qiao, Liang; Zu, Xiaotao; Zheng, Wanguo

    2016-08-01

    The laser damage precursors in the subsurface of fused silica (e.g. photosensitive impurities, scratches and redeposited silica compounds) were mitigated by mineral acid leaching and by HF etching with multi-frequency ultrasonic agitation, respectively. The morphology of scratches after static etching and after high-frequency ultrasonic agitation etching was compared in our case, and the laser-induced damage resistance of scratched and non-scratched fused silica surfaces after HF etching with high-frequency ultrasonic agitation was also investigated in this study. The global laser-induced damage resistance increased significantly after the laser damage precursors were mitigated in this case. Redeposition of the reaction product was avoided by combining the multi-frequency ultrasonic and chemical leaching processes. These methods made the increase of the laser damage threshold more stable. In addition, no scratch-related damage initiations were found on the samples treated by the Advanced Mitigation Process.

  9. Advanced Mitigation Process (AMP) for Improving Laser Damage Threshold of Fused Silica Optics

    PubMed Central

    Ye, Xin; Huang, Jin; Liu, Hongjie; Geng, Feng; Sun, Laixi; Jiang, Xiaodong; Wu, Weidong; Qiao, Liang; Zu, Xiaotao; Zheng, Wanguo

    2016-01-01

    The laser damage precursors in the subsurface of fused silica (e.g. photosensitive impurities, scratches and redeposited silica compounds) were mitigated by mineral acid leaching and by HF etching with multi-frequency ultrasonic agitation, respectively. The morphology of scratches after static etching and after high-frequency ultrasonic agitation etching was compared in our case, and the laser-induced damage resistance of scratched and non-scratched fused silica surfaces after HF etching with high-frequency ultrasonic agitation was also investigated in this study. The global laser-induced damage resistance increased significantly after the laser damage precursors were mitigated in this case. Redeposition of the reaction product was avoided by combining the multi-frequency ultrasonic and chemical leaching processes. These methods made the increase of the laser damage threshold more stable. In addition, no scratch-related damage initiations were found on the samples treated by the Advanced Mitigation Process. PMID:27484188

  10. Inverse modeling methods for indoor airborne pollutant tracking: literature review and fundamentals.

    PubMed

    Liu, X; Zhai, Z

    2007-12-01

    Reduction in indoor environment quality calls for effective control and improvement measures. Accurate and prompt identification of contaminant sources ensures that they can be quickly removed and contaminated spaces isolated and cleaned. This paper discusses the use of inverse modeling to identify potential indoor pollutant sources with limited pollutant sensor data. The study reviews various inverse modeling methods for advection-dispersion problems and summarizes the methods into three major categories: forward, backward, and probability inverse modeling methods. The adjoint probability inverse modeling method is indicated as an appropriate model for indoor air pollutant tracking because it can quickly find source location, strength and release time without prior information. The paper introduces the principles of the adjoint probability method and establishes the corresponding adjoint equations for both multi-zone airflow models and computational fluid dynamics (CFD) models. The study proposes a two-stage inverse modeling approach integrating both multi-zone and CFD models, which can provide a rapid estimate of indoor pollution status and history for a whole building. Preliminary case study results indicate that the adjoint probability method is feasible for indoor pollutant inverse modeling. The proposed method can help identify contaminant source characteristics (location and release time) with limited sensor outputs. This will ensure an effective and prompt execution of building management strategies and thus achieve a healthy and safe indoor environment. The method can also help design optimal sensor networks.

  11. An Evaluation of Practical Applicability of Multi-Assortment Production Break-Even Analysis based on Mining Companies

    NASA Astrophysics Data System (ADS)

    Fuksa, Dariusz; Trzaskuś-Żak, Beata; Gałaś, Zdzisław; Utrata, Arkadiusz

    2017-03-01

    In practice, the vast majority of mining companies produce more than one product. In their case, the break-even analysis, referred to as CVP (Cost-Volume-Profit) analysis (Wilkinson, 2005; Czopek, 2003), is significantly complicated by the necessity to include the multi-assortment structure of production; an offer may comprise more than 20 assortments (depending on grain size), as in the case of open-pit mines. The article presents methods of evaluating the break-even point (in volume and value terms) for both single-assortment and multi-assortment production. The complexity of break-even evaluation for multi-assortment production has resulted in many methods and, simultaneously, various approaches to the analysis, differing especially in how fixed costs are accounted for: they may be fully allocated among particular assortments, related to the company as a whole, or partially allocated among assortments and partially related to the company as a whole. The evaluation of the chosen methods of break-even analysis, given the availability of data, was based on two examples of mining companies: an open-pit mine of rock materials and an underground hard coal mine. The selection of methods was determined by the available data provided by the companies. The data for the analysis come from the internal documentation of the mines - financial statements, breakdowns and cost calculations.
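
    For concreteness, a textbook multi-assortment break-even calculation with a weighted-average contribution margin (one of the simpler variants alluded to above, with fully allocated fixed costs and a fixed sales mix) can be sketched as:

    ```python
    def multi_assortment_break_even(fixed_costs, prices, unit_costs, mix_shares):
        """Break-even sales volume for a multi-assortment production programme.

        prices, unit_costs, mix_shares: per-assortment lists; mix_shares must
        sum to 1. Returns the break-even quantity of each assortment."""
        weighted_margin = sum(s * (p - c)
                              for s, p, c in zip(mix_shares, prices, unit_costs))
        total_units = fixed_costs / weighted_margin
        return [total_units * s for s in mix_shares]

    # Example: two assortments sold in a 70/30 mix.
    # multi_assortment_break_even(1_000_000, prices=[40, 55],
    #                             unit_costs=[25, 30], mix_shares=[0.7, 0.3])
    ```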

  12. "This Is a Beautiful School." "This School Is Useless!!" Explaining Disengagement in a Greek Vocational School through the Examination of Teacher Ideologies

    ERIC Educational Resources Information Center

    Giannakaki, Marina-Stefania; Batziakas, Georgios

    2016-01-01

    This multi-method case study of a Greek vocational school explored teachers' culture (including beliefs about education, teachers' role, and students' nature) using the concept of pupil control ideology to explain problems of disengagement and low morale among staff and students, as well as tensions in relationships. A prominent custodial culture…

  13. The Human Dimension of Energy Conservation and Sustainability: A Case Study of the University of Michigan's Energy Conservation Program

    ERIC Educational Resources Information Center

    Marans, Robert W.; Edelstein, Jack Y.

    2010-01-01

    Purpose: The purpose of this paper is to determine the behaviors, attitudes, and levels of understanding among faculty, staff, and students in efforts to design programs aimed at reducing energy use in University of Michigan (UM) buildings. Design/methodology/approach: A multi-method approach is used in five diverse pilot buildings including focus…

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samet, J.; Gilliland, F.D.

    This project incorporates two related research projects directed toward understanding respiratory carcinogenesis in radon-exposed former uranium miners. The first project involved a continuation of the tissue resource of lung cancer cases from former underground uranium miners and comparison cases from non-miners. The second project was a pilot study for a proposed longitudinal study of respiratory carcinogenesis in former uranium miners. The objectives included facilitating the investigation of molecular changes in radon-exposed lung cancer cases, developing methods for prospectively studying clinical, cytologic, cytogenetic, and molecular changes in the multi-event process of respiratory carcinogenesis, and assessing the feasibility of recruiting former uranium miners into a longitudinal study that collected multiple biological specimens. A pilot study was conducted to determine whether blood collection, induced sputum, bronchial brushing, washings, and mucosal biopsies from participants at two of the hospitals could be included efficiently. A questionnaire was developed for the extended study and all protocols for specimen collection and tissue handling were completed. Resource utilization is in progress at ITRI and the methods have been developed to study molecular and cellular changes in exfoliated cells contained in sputum as well as susceptibility factors.

  15. Surrogate Based Uni/Multi-Objective Optimization and Distribution Estimation Methods

    NASA Astrophysics Data System (ADS)

    Gong, W.; Duan, Q.; Huo, X.

    2017-12-01

    Parameter calibration has been demonstrated as an effective way to improve the performance of dynamic models, such as hydrological models, land surface models, weather and climate models, etc. Traditional optimization algorithms usually require a huge number of model evaluations, making dynamic model calibration very difficult, or even computationally prohibitive. With the help of a series of recently developed adaptive surrogate-modelling based optimization methods: the uni-objective optimization method ASMO, the multi-objective optimization method MO-ASMO, and the probability distribution estimation method ASMO-PODE, the number of model evaluations can be significantly reduced to several hundreds, making it possible to calibrate very expensive dynamic models, such as regional high resolution land surface models, weather forecast models such as WRF, and intermediate complexity earth system models such as LOVECLIM. This presentation provides a brief introduction to the common framework of the adaptive surrogate-based optimization algorithms ASMO, MO-ASMO and ASMO-PODE, a case study of Common Land Model (CoLM) calibration in the Heihe river basin in Northwest China, and an outlook on the potential applications of surrogate-based optimization methods.
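
    The common loop behind such adaptive surrogate-based optimizers can be sketched as follows; the Gaussian-process surrogate and the random candidate search are simplifying assumptions for illustration, not the ASMO implementation.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    def surrogate_minimise(f, bounds, n_init=20, n_iter=30, seed=0):
        """Adaptive surrogate-based minimisation of an expensive model f.

        bounds: list of (low, high) tuples, one per parameter. The surrogate
        is refit after each new (expensive) model evaluation."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        X = rng.uniform(lo, hi, size=(n_init, len(bounds)))
        y = np.array([f(x) for x in X])
        for _ in range(n_iter):
            gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
            cand = rng.uniform(lo, hi, size=(2000, len(bounds)))
            x_new = cand[np.argmin(gp.predict(cand))]   # exploit surrogate minimum
            X = np.vstack([X, x_new])
            y = np.append(y, f(x_new))                  # one expensive evaluation
        return X[np.argmin(y)], y.min()
    ```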

  16. Identification of rice field using Multi-Temporal NDVI and PCA method on Landsat 8 (Case Study: Demak, Central Java)

    NASA Astrophysics Data System (ADS)

    Sukmono, Abdi; Ardiansyah

    2017-01-01

    Paddy is one of the most important agricultural crops in Indonesia. Indonesia's per-capita rice consumption in 2013 amounted to 78.82 kg/capita/year. In 2017, the Indonesian government has the mission of making Indonesia self-sufficient in food. The government therefore needs to secure a stable fulfillment of basic food needs, which calls for reliable rice field mapping. Accurate rice field mapping can be carried out with quick and practical methods such as remote sensing. In this study, multi-temporal Landsat 8 imagery is used to identify rice fields based on rice planting time, combined with other methods for extracting information from the imagery: the Normalized Difference Vegetation Index (NDVI), Principal Component Analysis (PCA) and band combination. Image classification used nine classes: water, settlements, mangrove, gardens, fields, rice fields 1st, rice fields 2nd, rice fields 3rd and rice fields 4th. The results showed that the rice field area obtained from the PCA method was 50,009 ha, from the band combination 51,016 ha, and from the NDVI method 45,893 ha. The accuracy levels obtained were 84.848% for the PCA method, 81.818% for the band combination, and 75.758% for the NDVI method.
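
    The NDVI itself is the standard normalized band ratio; for Landsat 8 OLI the near-infrared and red reflectances come from Band 5 and Band 4, respectively. A minimal sketch (the band assignment is standard, but the function is only an illustration, not the paper's processing chain):

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index, (NIR - Red) / (NIR + Red).
        For Landsat 8 OLI, nir = Band 5 and red = Band 4 surface reflectance."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red + 1e-10)   # small epsilon avoids 0/0
    ```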

  17. Generalized Buneman Pruning for Inferring the Most Parsimonious Multi-state Phylogeny

    NASA Astrophysics Data System (ADS)

    Misra, Navodit; Blelloch, Guy; Ravi, R.; Schwartz, Russell

    Accurate reconstruction of phylogenies remains a key challenge in evolutionary biology. Most biologically plausible formulations of the problem are formally NP-hard, with no known efficient solution. The standard in practice is fast heuristic methods that are empirically known to work very well in general, but can yield results arbitrarily far from optimal. Practical exact methods, which yield exponential worst-case running times but generally much better times in practice, provide an important alternative. We report progress in this direction by introducing a provably optimal method for the weighted multi-state maximum parsimony phylogeny problem. The method is based on generalizing the notion of the Buneman graph, a construction key to efficient exact methods for binary sequences, so as to apply to sequences with arbitrary finite numbers of states with arbitrary state transition weights. We implement an integer linear programming (ILP) method for the multi-state problem using this generalized Buneman graph and demonstrate that the resulting method is able to solve data sets that are intractable by prior exact methods in run times comparable with those of popular heuristics. Our work provides the first method for provably optimal maximum parsimony phylogeny inference that is practical for multi-state data sets of more than a few characters.

  18. Societal Risk Evaluation Scheme (SRES): Scenario-Based Multi-Criteria Evaluation of Synthetic Biology Applications

    PubMed Central

    2017-01-01

    Synthetic biology (SB) applies engineering principles to biology for the construction of novel biological systems designed for useful purposes. From an oversight perspective, SB products come with significant uncertainty. Yet there is a need to anticipate and prepare for SB applications before deployment. This study develops a Societal Risk Evaluation Scheme (SRES) in order to advance methods for anticipatory governance of emerging technologies such as SB. The SRES is based upon societal risk factors that were identified as important through a policy Delphi study. These factors range from those associated with traditional risk assessment, such as health and environmental consequences, to broader features of risk such as those associated with reversibility, manageability, anticipated levels of public concern, and uncertainty. A multi-disciplinary panel with diverse perspectives and affiliations assessed four case studies of SB using the SRES. Rankings of the SRES components are compared within and across the case studies. From these comparisons, we found levels of controllability and familiarity associated with the cases to be important for overall SRES rankings. From a theoretical standpoint, this study illustrates the applicability of the psychometric paradigm to evaluating SB cases. In addition, our paper describes how the SRES can be incorporated into anticipatory governance models as a screening tool to prioritize research, information collection, and dialogue in the face of the limited capacity of governance systems. To our knowledge, this is the first study to elicit data on specific cases of SB with the goal of developing theory and tools for risk governance. PMID:28052080

  19. Inexact fuzzy-stochastic mixed-integer programming approach for long-term planning of waste management--Part A: methodology.

    PubMed

    Guo, P; Huang, G H

    2009-01-01

    In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating uncertainties expressed as multiple uncertainties of intervals and dual probability distributions within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets and their incorporation; thirdly, it facilitates dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, the penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.

  20. Automated unsupervised multi-parametric classification of adipose tissue depots in skeletal muscle

    PubMed Central

    Valentinitsch, Alexander; Karampinos, Dimitrios C.; Alizai, Hamza; Subburaj, Karupppasamy; Kumar, Deepak; Link, Thomas M.; Majumdar, Sharmila

    2012-01-01

    Purpose To introduce and validate an automated unsupervised multi-parametric method for segmentation of the subcutaneous fat and muscle regions in order to determine subcutaneous adipose tissue (SAT) and intermuscular adipose tissue (IMAT) areas based on data from a quantitative chemical shift-based water-fat separation approach. Materials and Methods Unsupervised standard k-means clustering was employed to define sets of similar features (k = 2) within the whole multi-modal image after the water-fat separation. The automated image processing chain was composed of three primary stages including tissue, muscle and bone region segmentation. The algorithm was applied on calf and thigh datasets to compute SAT and IMAT areas and was compared to a manual segmentation. Results The IMAT area using the automatic segmentation had excellent agreement with the IMAT area using the manual segmentation for all the cases in the thigh (R2: 0.96) and for cases with up to moderate IMAT area in the calf (R2: 0.92). The group with the highest grade of muscle fat infiltration in the calf had the highest error in the inner SAT contour calculation. Conclusion The proposed multi-parametric segmentation approach combined with quantitative water-fat imaging provides an accurate and reliable method for an automated calculation of the SAT and IMAT areas reducing considerably the total post-processing time. PMID:23097409

  1. Multi-criteria evaluation methods in the production scheduling

    NASA Astrophysics Data System (ADS)

    Kalinowski, K.; Krenczyk, D.; Paprocka, I.; Kempa, W.; Grabowik, C.

    2016-08-01

    The paper presents a discussion on the practical application of different methods of multi-criteria evaluation in the process of scheduling in manufacturing systems. Among the methods, two main groups are specified: methods based on a distance function (using a metacriterion) and methods that create a Pareto set of possible solutions. The basic criteria used for scheduling are also described. The overall procedure of the evaluation process in production scheduling is presented. It takes into account the actions in the whole scheduling process and the participation of a human decision maker (HDM). The specified HDM decisions relate to creating and editing a set of evaluation criteria, selecting the multi-criteria evaluation method, interacting in the search process, using informal criteria and making final changes in the schedule before implementation. Depending on needs, scheduling may be completely or partially automated. Full automation is possible in the case of a metacriterion-based objective function; if a Pareto set is used, the final decision has to be made by the HDM.
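
    A generic distance-function metacriterion of the kind mentioned above can be written as follows (a common textbook form, not necessarily the exact one used in the paper):

    ```latex
    d(S) = \left[\sum_{k=1}^{K} w_k
      \left(\frac{f_k(S)-f_k^{*}}{f_k^{\max}-f_k^{*}}\right)^{p}\right]^{1/p},
    \qquad \sum_{k=1}^{K} w_k = 1
    ```

    Here $f_k(S)$ is the value of criterion $k$ for schedule $S$, $f_k^{*}$ and $f_k^{\max}$ are its best and worst attainable values, $w_k$ are weights expressing the HDM's preferences, and $p$ selects the metric ($p=1$ gives a weighted sum, $p=2$ a Euclidean distance to the ideal point).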

  2. Development of a generalized multi-pixel and multi-parameter satellite remote sensing algorithm for aerosol properties

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Nakajima, T.; Takenaka, H.; Higurashi, A.

    2013-12-01

    We develop a new satellite remote sensing algorithm to retrieve the properties of aerosol particles in the atmosphere. In recent years, high-resolution, multi-wavelength and multiple-angle observation data have been obtained by ground-based spectral radiometers and imaging sensors on board satellites. With this development, optimized multi-parameter remote sensing methods based on Bayesian theory have become widely used (Turchin and Nozik, 1969; Rodgers, 2000; Dubovik et al., 2000). Additionally, direct use of radiative transfer calculations has been employed for non-linear remote sensing problems, taking the place of look-up-table methods, supported by the progress of computing technology (Dubovik et al., 2011; Yoshida et al., 2011). We are developing a flexible multi-pixel and multi-parameter remote sensing algorithm for aerosol optical properties. In this algorithm, the inversion method is a combination of the MAP method (maximum a posteriori method, Rodgers, 2000) and the Phillips-Twomey method (Phillips, 1962; Twomey, 1963) as a smoothing constraint for the state vector. Furthermore, we include a radiative transfer code, Rstar (Nakajima and Tanaka, 1986, 1988), solved numerically at each iteration of the solution search. The Rstar code has been directly used in the AERONET operational processing system (Dubovik and King, 2000). The retrieved parameters in our algorithm are aerosol optical properties, such as the aerosol optical thickness (AOT) of fine mode, sea salt, and dust particles, the volume soot fraction in fine mode particles, and the ground surface albedo at each observed wavelength. We simultaneously retrieve all the parameters that characterize the pixels in each of the horizontal sub-domains constituting the target area, and then successively apply the retrieval method to all the sub-domains in the target area. We conducted numerical tests of the retrieval of aerosol properties and ground surface albedo for GOSAT/CAI imager data to test the algorithm over land. In this test, we simulated satellite-observed radiances for a sub-domain consisting of 5 by 5 pixels with the Rstar code, assuming wavelengths of 380, 674, 870 and 1600 [nm], the US standard atmosphere, and several aerosol and ground surface conditions. The results showed that the AOTs of fine mode and dust particles, the soot fraction and the ground surface albedo at 674 [nm] are retrieved within absolute differences of 0.04, 0.01, 0.06 and 0.006 from the true values, respectively, for the dark-surface case, and within 0.06, 0.03, 0.04 and 0.10, respectively, for the bright-surface case. We will conduct more tests to study the information content of the parameters needed for aerosol and land surface remote sensing with different boundary conditions among sub-domains.
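
    A cost function consistent with this description, combining a MAP term with a Phillips-Twomey smoothing term, would take the generic form below (shown as a standard expression, not a formula quoted from the abstract):

    ```latex
    J(\mathbf{x}) =
      \big[\mathbf{y}-\mathbf{F}(\mathbf{x})\big]^{\mathrm{T}}
      \mathbf{S}_{\epsilon}^{-1}\big[\mathbf{y}-\mathbf{F}(\mathbf{x})\big]
      + (\mathbf{x}-\mathbf{x}_{a})^{\mathrm{T}}\mathbf{S}_{a}^{-1}(\mathbf{x}-\mathbf{x}_{a})
      + \gamma\,\lVert\mathbf{L}\mathbf{x}\rVert^{2}
    ```

    Here $\mathbf{y}$ is the vector of observed radiances, $\mathbf{F}$ the forward radiative-transfer model, $\mathbf{x}_a$ and $\mathbf{S}_a$ the a priori state and its covariance (the MAP part), and $\mathbf{L}$ a discrete smoothing operator acting on the state vector across neighbouring pixels with weight $\gamma$ (the Phillips-Twomey part).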

  3. Unstructured Finite Volume Computational Thermo-Fluid Dynamic Method for Multi-Disciplinary Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul

    1998-01-01

    This paper describes a finite volume computational thermo-fluid dynamics method to solve the Navier-Stokes equations in conjunction with the energy equation and the thermodynamic equation of state in an unstructured coordinate system. The system of equations has been solved by a simultaneous Newton-Raphson method and compared with several benchmark solutions. Excellent agreement has been obtained in each case and the method has been found to be significantly faster than conventional Computational Fluid Dynamics (CFD) methods and therefore has the potential for implementation in multi-disciplinary analysis and design optimization in fluid and thermal systems. The paper also describes an algorithm for design optimization based on the Newton-Raphson method which has recently been tested in a turbomachinery application.
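
    The simultaneous Newton-Raphson iteration referred to above has the standard form below, where F collects the discretised conservation and state-equation residuals (a textbook expression, not one quoted from the paper):

    ```latex
    \mathbf{x}^{(k+1)} = \mathbf{x}^{(k)}
      - \mathbf{J}\big(\mathbf{x}^{(k)}\big)^{-1}\,\mathbf{F}\big(\mathbf{x}^{(k)}\big),
    \qquad J_{ij} = \frac{\partial F_i}{\partial x_j}
    ```

    In practice the linear system $\mathbf{J}\,\Delta\mathbf{x} = -\mathbf{F}$ is solved at each iteration for the unknown pressures, flow rates and temperatures rather than forming the inverse explicitly.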

  4. A temperature match based optimization method for daily load prediction considering DLC effect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Z.

    This paper presents a unique optimization method for short term load forecasting. The new method is based on the optimal template temperature match between the future and past temperatures. The optimal error reduction technique is a new concept introduced in this paper. Two case studies show that for hourly load forecasting, this method can yield results as good as the rather complicated Box-Jenkins Transfer Function method, and better than the Box-Jenkins method; for peak load prediction, this method is comparable in accuracy to the neural network method with back propagation, and can produce more accurate results than the multi-linear regression method. The DLC effect on system load is also considered in this method.

  5. A Qualitative Multi-Case Study of the Influence of Personal and Professional Ethics on the Leadership of Public School Superintendents

    ERIC Educational Resources Information Center

    McDermott, Brian J.

    2010-01-01

    The purpose of this study is to examine the influence of personal and professional ethics on the leadership of public school superintendents. A multi-case, qualitative research design was used to gather data from four practicing public school superintendents. Transformational leadership theory and the three pillars of ethics of leadership…

  6. A Multi-Case Study of University Students' Language-Learning Experience Mediated by Mobile Technologies: A Socio-Cultural Perspective

    ERIC Educational Resources Information Center

    Ma, Qing

    2017-01-01

    Emerging mobile technologies can be considered a new form of social and cultural artefact that mediates people's language learning. This multi-case study investigates how mobile technologies mediate a group of Hong Kong university students' L2 learning, which serves as a lens with which to capture the personalised, unique, contextual and…

  7. The Influence of Game Design on the Collaborative Problem Solving Process: A Cross-Case Study of Multi-Player Collaborative Gameplay Analysis

    ERIC Educational Resources Information Center

    Yildirim, Nilay

    2013-01-01

    This cross-case study examines the relationships between game design attributes and collaborative problem solving process in the context of multi-player video games. The following game design attributes: sensory stimuli elements, level of challenge, and presentation of game goals and rules were examined to determine their influence on game…

  8. Improved quantitation and reproducibility in multi-PET/CT lung studies by combining CT information.

    PubMed

    Holman, Beverley F; Cuplov, Vesna; Millner, Lynn; Endozo, Raymond; Maher, Toby M; Groves, Ashley M; Hutton, Brian F; Thielemans, Kris

    2018-06-05

    Matched attenuation maps are vital for obtaining accurate and reproducible kinetic and static parameter estimates from PET data. With increased interest in PET/CT imaging of diffuse lung diseases for assessing disease progression and treatment effectiveness, understanding the extent of the effect of respiratory motion and establishing methods for correction are becoming more important. In a previous study, we have shown that using the wrong attenuation map leads to large errors due to density mismatches in the lung, especially in dynamic PET scans. Here, we extend this work to the case where the study is sub-divided into several scans, e.g. for patient comfort, each with its own CT (cine-CT and 'snap shot' CT). A method to combine multi-CT information into a combined-CT has then been developed, which averages the CT information from each study section to produce composite CT images with the lung density more representative of that in the PET data. This combined-CT was applied to nine patients with idiopathic pulmonary fibrosis, imaged with dynamic 18F-FDG PET/CT to determine the improvement in the precision of the parameter estimates. Using XCAT simulations, errors in the influx rate constant were found to be as high as 60% in multi-PET/CT studies. Analysis of patient data identified displacements between study sections in the time activity curves, which led to an average standard error in the estimates of the influx rate constant of 53% with conventional methods. This reduced to within 5% after use of combined-CTs for attenuation correction of the study sections. Use of combined-CTs to reconstruct the sections of a multi-PET/CT study, as opposed to using the individually acquired CTs at each study stage, produces more precise parameter estimates and may improve discrimination between diseased and normal lung.
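
    The core of the combined-CT idea, as described, is an average of the CT information from each study section. A minimal sketch, assuming the section CTs are already co-registered and resampled to a common grid (registration and HU-to-attenuation conversion are omitted):

```python
import numpy as np

# Hedged sketch of the combined-CT idea: average equally shaped, co-registered
# CT volumes (one per study section) so the lung density in the attenuation map
# is more representative of the whole PET acquisition.  Registration, resampling
# and HU-to-attenuation conversion are assumed to happen elsewhere.

def combined_ct(ct_volumes):
    """ct_volumes: list of equally shaped 3-D arrays, one CT per study section."""
    return np.stack(ct_volumes, axis=0).mean(axis=0)

# Example with tiny synthetic volumes (values loosely resemble lung HU)
sections = [np.full((2, 2, 2), hu) for hu in (-850.0, -800.0, -780.0)]
print(combined_ct(sections)[0, 0, 0])   # -810.0
```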

  9. Multi-fidelity Gaussian process regression for prediction of random fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parussini, L.; Venturi, D., E-mail: venturi@ucsc.edu; Perdikaris, P.

    We propose a new multi-fidelity Gaussian process regression (GPR) approach for prediction of random fields based on observations of surrogate models or hierarchies of surrogate models. Our method builds upon recent work on recursive Bayesian techniques, in particular recursive co-kriging, and extends it to vector-valued fields and various types of covariances, including separable and non-separable ones. The framework we propose is general and can be used to perform uncertainty propagation and quantification in model-based simulations, multi-fidelity data fusion, and surrogate-based optimization. We demonstrate the effectiveness of the proposed recursive GPR techniques through various examples. Specifically, we study the stochastic Burgers equation and the stochastic Oberbeck–Boussinesq equations describing natural convection within a square enclosure. In both cases we find that the standard deviation of the Gaussian predictors as well as the absolute errors relative to benchmark stochastic solutions are very small, suggesting that the proposed multi-fidelity GPR approaches can yield highly accurate results.
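
    A much-simplified, two-level sketch in the spirit of recursive co-kriging: a Gaussian process is fit to plentiful low-fidelity data, and a second Gaussian process models the high-fidelity response given the input and the low-fidelity prediction. The functions, kernels and sample sizes below are illustrative, not those of the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Two-level sketch in the spirit of recursive co-kriging: train a GP on cheap
# low-fidelity data, then train a second GP on [x, low-fidelity prediction] to
# model the expensive high-fidelity response.

f_lo = lambda x: np.sin(8 * x)                      # cheap surrogate model
f_hi = lambda x: 1.2 * np.sin(8 * x) + 0.3 * x      # expensive "truth"

x_lo = np.linspace(0, 1, 25)[:, None]
x_hi = np.linspace(0, 1, 6)[:, None]

gp_lo = GaussianProcessRegressor(ConstantKernel() * RBF()).fit(x_lo, f_lo(x_lo))
aug_hi = np.hstack([x_hi, gp_lo.predict(x_hi).reshape(-1, 1)])
gp_hi = GaussianProcessRegressor(ConstantKernel() * RBF()).fit(aug_hi, f_hi(x_hi))

x_test = np.linspace(0, 1, 5)[:, None]
aug_test = np.hstack([x_test, gp_lo.predict(x_test).reshape(-1, 1)])
mean, std = gp_hi.predict(aug_test, return_std=True)
print(mean.ravel(), std)     # high-fidelity prediction and its standard deviation
```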

  10. Provisional-Ideal-Point-Based Multi-objective Optimization Method for Drone Delivery Problem

    NASA Astrophysics Data System (ADS)

    Omagari, Hiroki; Higashino, Shin-Ichiro

    2018-04-01

    In this paper, we propose a new evolutionary multi-objective optimization method for solving the drone delivery problem (DDP), which can be formulated as a constrained multi-objective optimization problem. In our previous research, we proposed the "aspiration-point-based method" to solve multi-objective optimization problems. However, that method needs to calculate the optimal value of each objective function in advance, and it does not consider constraint conditions other than the objective functions. Therefore, it cannot be applied to the DDP, which has many constraint conditions. To address these issues, we propose the "provisional-ideal-point-based method." The proposed method defines a "penalty value" to search for feasible solutions. It also defines a new reference solution, named the "provisional-ideal point," to search for the solution preferred by a decision maker. In this way, we eliminate the preliminary calculations and the limited application scope. The results on benchmark test problems show that the proposed method can generate the preferred solution efficiently. The usefulness of the proposed method is also demonstrated by applying it to the DDP. As a result, the delivery path combining one drone and one truck drastically reduces the traveling distance and the delivery time compared with the case of using only one truck.

  11. An overlapped grid method for multigrid, finite volume/difference flow solvers: MaGGiE

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Lessard, Victor R.

    1990-01-01

    The objective is to develop a domain decomposition method via overlapping/embedding the component grids, which is to be used by upwind, multi-grid, finite volume solution algorithms. A computer code, given the name MaGGiE (Multi-Geometry Grid Embedder), is developed to meet this objective. MaGGiE takes independently generated component grids as input, and automatically constructs the composite mesh and interpolation data, which can be used by the finite volume solution methods with or without multigrid convergence acceleration. Six demonstrative examples showing various aspects of the overlap technique are presented and discussed. These cases are used for developing the procedure for overlapping grids of different topologies, and to evaluate the grid connection and interpolation data for finite volume calculations on a composite mesh. Time fluxes are transferred between mesh interfaces using a trilinear interpolation procedure. Conservation losses are minimal at the interfaces using this method. The multi-grid solution algorithm, using the coarser grid connections, improves the convergence time history as compared to the solution on the composite mesh without multi-gridding.
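
    The interface transfer mentioned above relies on trilinear interpolation within a donor cell. A minimal sketch of that basic operation (corner ordering and local coordinates are illustrative):

```python
import numpy as np

# Trilinear interpolation of a nodal field at a fractional location inside a
# unit cell, the basic operation used to transfer data across overlapped-grid
# interfaces.  'c' holds the eight corner values and (u, v, w) are local
# coordinates in [0, 1].

def trilinear(c, u, v, w):
    c00 = c[0, 0, 0] * (1 - u) + c[1, 0, 0] * u
    c01 = c[0, 0, 1] * (1 - u) + c[1, 0, 1] * u
    c10 = c[0, 1, 0] * (1 - u) + c[1, 1, 0] * u
    c11 = c[0, 1, 1] * (1 - u) + c[1, 1, 1] * u
    c0 = c00 * (1 - v) + c10 * v
    c1 = c01 * (1 - v) + c11 * v
    return c0 * (1 - w) + c1 * w

corners = np.arange(8, dtype=float).reshape(2, 2, 2)
print(trilinear(corners, 0.5, 0.5, 0.5))   # 3.5, the cell-centre average
```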

  12. Australian Schizophrenia Research Bank: a database of comprehensive clinical, endophenotypic and genetic data for aetiological studies of schizophrenia.

    PubMed

    Loughland, Carmel; Draganic, Daren; McCabe, Kathryn; Richards, Jacqueline; Nasir, Aslam; Allen, Joanne; Catts, Stanley; Jablensky, Assen; Henskens, Frans; Michie, Patricia; Mowry, Bryan; Pantelis, Christos; Schall, Ulrich; Scott, Rodney; Tooney, Paul; Carr, Vaughan

    2010-11-01

    This article describes the establishment of the Australian Schizophrenia Research Bank (ASRB), which operates to collect, store and distribute linked clinical, cognitive, neuroimaging and genetic data from a large sample of people with schizophrenia and healthy controls. Recruitment sources for the schizophrenia sample include a multi-media national advertising campaign, inpatient and community treatment services and non-government support agencies. Healthy controls have been recruited primarily through multi-media advertisements. All participants undergo an extensive diagnostic and family history assessment, neuropsychological evaluation, and blood sample donation for genetic studies. Selected individuals also complete structural MRI scans. Preliminary analyses of 493 schizophrenia cases and 293 healthy controls are reported. Mean age was 39.54 years (SD = 11.1) for the schizophrenia participants and 37.38 years (SD = 13.12) for healthy controls. Compared to the controls, features of the schizophrenia sample included a higher proportion of males (cases 65.9%; controls 46.8%), fewer living in married or de facto relationships (cases 16.1%; controls 53.6%) and fewer years of education (cases 13.05, SD = 2.84; controls 15.14, SD = 3.13), as well as lower current IQ (cases 102.68, SD = 15.51; controls 118.28, SD = 10.18). These and other sample characteristics are compared to those reported in another large Australian sample (i.e. the Low Prevalence Disorders Study), revealing some differences that reflect the different sampling methods of these two studies. The ASRB is a valuable and accessible schizophrenia research facility for use by approved scientific investigators. As recruitment continues, the approach to sampling for both cases and controls will need to be modified to ensure that the ASRB samples are as broadly representative as possible of all cases of schizophrenia and healthy controls.

  13. Development and Applications of Advanced Electronic Structure Methods

    NASA Astrophysics Data System (ADS)

    Bell, Franziska

    This dissertation contributes to three different areas in electronic structure theory. The first part of this thesis advances the fundamentals of orbital active spaces. Orbital active spaces are not only essential in multi-reference approaches, but have also become of interest in single-reference methods as they allow otherwise intractably large systems to be studied. However, despite their great importance, the optimal choice and, more importantly, their physical significance are still not fully understood. In order to address this problem, we studied the higher-order singular value decomposition (HOSVD) in the context of electronic structure methods. We were able to gain a physical understanding of the resulting orbitals and proved a connection to unrelaxed natural orbitals in the case of Moller-Plesset perturbation theory to second order (MP2). In the quest to find the optimal choice of the active space, we proposed a HOSVD for energy-weighted integrals, which yielded the fastest convergence in MP2 correlation energy for small- to medium-sized active spaces to date, and is also potentially transferable to coupled-cluster theory. In the second part, we studied monomeric and dimeric glycerol radical cations and their photo-induced dissociation in collaboration with Prof. Leone and his group. Understanding the mechanistic details involved in these processes is essential for further studies on the combustion of glycerol and carbohydrates. To our surprise, we found that in most cases, the experimentally observed appearance energies arise from the separation of product fragments from one another rather than rearrangement to products. The final chapters of this work focus on the development, assessment, and application of the spin-flip method, which is a single-reference approach, but capable of describing multi-reference problems. Systems exhibiting multi-reference character, which arises from the (near-) degeneracy of orbital energies, are amongst the most interesting in chemistry, biology and materials science, yet amongst the most challenging to study with electronic structure methods. In particular, we explored a substituted dimeric BPBP molecule with potential tetraradical character, which gained attention as one of the most promising candidates for an organic conductor. Furthermore, we extended the spin-flip approach to include variable orbital active spaces and multiple spin-flips. This allowed us to perform wave-function-based studies of ground- and excited-states of polynuclear metal complexes, polyradicals, and bond-dissociation processes involving three or more bonds.
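
    For reference, a compact sketch of the higher-order singular value decomposition mentioned above, applied to a generic dense tensor: factor matrices come from the SVD of each mode unfolding and the core tensor follows by mode-wise projection. This is the generic HOSVD, not the energy-weighted variant proposed in the dissertation.

```python
import numpy as np

# Minimal HOSVD sketch: factor matrices are the left singular vectors of each
# mode unfolding, and the core tensor follows by mode-wise projection.
# The tensor and its shape are illustrative.

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0]
               for m in range(T.ndim)]
    core = T
    for m, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, m, 0), axes=1), 0, m)
    return core, factors

T = np.random.rand(4, 5, 6)
core, factors = hosvd(T)

# Reconstruct and check that the decomposition is exact (up to round-off)
R = core
for m, U in enumerate(factors):
    R = np.moveaxis(np.tensordot(U, np.moveaxis(R, m, 0), axes=1), 0, m)
print(np.allclose(R, T))   # True
```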

  14. Candida Infective Endocarditis

    PubMed Central

    Baddley, John W.; Benjamin, Daniel K.; Patel, Mukesh; Miró, José; Athan, Eugene; Barsic, Bruno; Bouza, Emilio; Clara, Liliana; Elliott, Tom; Kanafani, Zeina; Klein, John; Lerakis, Stamatios; Levine, Donald; Spelman, Denis; Rubinstein, Ethan; Tornos, Pilar; Morris, Arthur J.; Pappas, Paul; Fowler, Vance G.; Chu, Vivian H.; Cabell, Christopher

    2009-01-01

    Purpose Candida infective endocarditis (IE) is uncommon but often fatal. Most epidemiologic data are derived from small case series or case reports. This study was conducted to explore epidemiology, treatment patterns, and outcomes of patients with Candida IE. Methods We compared 33 Candida IE cases to 2716 patients with non-fungal IE in the International Collaboration on Endocarditis - Prospective Cohort Study. Patients were enrolled and data collected from June 2000 until August 2005. Results Patients with Candida IE were more likely to have prosthetic valves (p<0.001), short-term indwelling catheters (p<0.0001), and healthcare-associated infection (p<0.001). Reasons for surgery differed between the two groups: myocardial abscess (46.7% vs. 22.2%, p=0.026) and persistent positive blood cultures (33.3% vs. 9.9%, p=0.003) were more common among those with Candida IE. Mortality at discharge was higher in patients with Candida IE (30.3%) when compared to non-fungal cases (17%, p=0.046). Among Candida patients, mortality was similar in patients who received combination surgical and antifungal therapy versus antifungal therapy alone (33.3% vs. 27.8%, p=0.26). New antifungal drugs, particularly echinocandins, were used frequently. Conclusions These multi-center data suggest distinct epidemiologic features of Candida IE when compared to non-fungal cases. Indications for surgical intervention are different and mortality is increased. Newer antifungal treatment options are increasingly used. Large, multi-center studies are needed to help better define Candida IE. PMID:18283504

  15. Privacy Preserving Facial and Fingerprint Multi-biometric Authentication

    NASA Astrophysics Data System (ADS)

    Anzaku, Esla Timothy; Sohn, Hosik; Ro, Yong Man

    Cases of identity theft can be mitigated by the adoption of secure authentication methods. Biohashing and its variants, which utilize secret keys and biometrics, are promising methods for secure authentication; however, their shortcoming is degraded performance under the assumption that the secret keys are compromised. In this paper, we extend the concept of Biohashing to multi-biometrics - facial and fingerprint traits. We chose these traits because they are widely used, although little research attention has been given to designing privacy-preserving multi-biometric systems that use them. Instead of using a single modality (facial or fingerprint), we present a framework for using both modalities. The improved performance of the proposed method, using face and fingerprint, compared with either the facial or the fingerprint trait used in isolation, is evaluated using two chimerical bimodal databases formed from publicly available facial and fingerprint databases.
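
    A hedged sketch of the basic Biohashing idea at feature level, with face and fingerprint feature vectors simply concatenated before hashing: a user-specific secret key seeds a pseudo-random orthonormal projection whose outputs are thresholded into a binary code. The fusion strategy, dimensions and threshold are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

# Illustrative Biohashing: project the fused feature vector onto key-seeded,
# orthonormalised random directions and threshold to obtain a binary BioCode.

def biohash(features, secret_key, n_bits=32, threshold=0.0):
    rng = np.random.default_rng(secret_key)
    R = rng.standard_normal((n_bits, features.size))
    Q, _ = np.linalg.qr(R.T)                 # orthonormalise the random directions
    projections = Q.T[:n_bits] @ features
    return (projections > threshold).astype(np.uint8)

face_feat = np.random.rand(64)               # stand-ins for real feature extractors
finger_feat = np.random.rand(64)
fused = np.concatenate([face_feat, finger_feat])   # simple feature-level fusion
code = biohash(fused, secret_key=1234)
print(code)
```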

  16. Applying multi-criteria decision-making to improve the waste reduction policy in Taiwan.

    PubMed

    Su, Jun-Pin; Hung, Ming-Lung; Chao, Chia-Wei; Ma, Hwong-wen

    2010-01-01

    Over the past two decades, the waste reduction problem has been a major issue in environmental protection. Both recycling and waste reduction policies have become increasingly important. As the complexity of decision-making has increased, it has become evident that more factors must be considered in the development and implementation of policies aimed at resource recycling and waste reduction. Many studies have focused on waste management but have excluded waste reduction; this study pays particular attention to waste reduction. Social, economic, and management aspects of waste treatment policies were considered in this study. Further, a life-cycle assessment model was applied as an evaluation system for the environmental aspect. Results of both quantitative and qualitative analyses on the social, economic, and management aspects were integrated via the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method into the comprehensive decision-making support system of multi-criteria decision-making (MCDM). A case study evaluating the waste reduction policy in Taoyuan County is presented to demonstrate the feasibility of this model. In the case study, reinforcement of MSW sorting was shown to be the best practice. The model in this study can be applied to other cities facing waste reduction problems.
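
    A minimal sketch of the TOPSIS step used above: alternatives are ranked by their relative closeness to the ideal and anti-ideal solutions after vector normalisation and weighting. The decision matrix, weights and criterion directions below are invented for illustration.

```python
import numpy as np

# Minimal TOPSIS: rank alternatives by closeness to the ideal solution.

def topsis(X, weights, benefit):
    """X: (alternatives x criteria); benefit[j] is True if larger is better."""
    N = X / np.linalg.norm(X, axis=0)              # vector-normalise each criterion
    V = N * weights                                # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)            # closeness coefficient

X = np.array([[7.0, 9.0, 9.0],      # e.g. three candidate waste-reduction policies
              [8.0, 7.0, 8.0],
              [9.0, 6.0, 8.0]])
weights = np.array([0.4, 0.3, 0.3])
scores = topsis(X, weights, benefit=np.array([True, True, False]))
print(scores, scores.argmax())       # highest score = preferred policy
```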

  17. Functions of behavior change interventions when implementing multi-professional teamwork at an emergency department: a comparative case study

    PubMed Central

    2014-01-01

    Background While there is strong support for the benefits of working in multi-professional teams in health care, the implementation of multi-professional teamwork is reported to be complex and challenging. Implementation strategies combining multiple behavior change interventions are recommended, but the understanding of how and why the behavior change interventions influence staff behavior is limited. There is a lack of studies focusing on the functions of different behavior change interventions and the mechanisms driving behavior change. In this study, applied behavior analysis is used to analyze the function and impact of different behavior change interventions when implementing multi-professional teamwork. Methods A comparative case study design was applied. Two sections of an emergency department implemented multi-professional teamwork involving changes in work processes, aimed at increasing inter-professional collaboration. Behavior change interventions and staff behavior change were studied using observations, interviews and document analysis. Using a hybrid thematic analysis, the behavior change interventions were categorized according to the DCOM® model. The functions of the behavior change interventions were then analyzed using applied behavior analysis. Results The two sections used different behavior change interventions, resulting in a large difference in the degree of staff behavior change. The successful section enabled staff performance of teamwork behaviors with a strategy based on ongoing problem-solving and frequent clarification of directions. Managerial feedback initially played an important role in motivating teamwork behaviors. Gradually, as staff started to experience positive outcomes of the intervention, motivation for teamwork behaviors was replaced by positive task-generated feedback. Conclusions The functional perspective of applied behavior analysis offers insight into the behavioral mechanisms that describe how and why behavior change interventions influence staff behavior. The analysis demonstrates how enabling behavior change interventions, managerial feedback and task-related feedback interact in their influence on behavior and have complementary functions during different stages of implementation. PMID:24885212

  18. Multifocal necrotising fasciitis: an overlooked entity?

    PubMed

    El-Khani, Ussamah; Nehme, Jean; Darwish, Ammar; Jamnadas-Khoda, Benjamin; Scerri, Godwin; Heppell, Simon; Bennett, Nicholas

    2012-04-01

    The aim of the study is to report a case of multi-focal necrotising fasciitis, review research on this subject to identify common aetiological factors and highlight suggestions to improve management. Necrotising fasciitis is a severe, life-threatening soft tissue infection that typically arises from a single area, usually secondary to a minor penetrating injury. Multi-focal necrotising fasciitis, where there is more than one non-contiguous area of necrosis, is much less commonly reported. There are no guidelines specific to the management of multi-focal necrotising fasciitis, and its under-reporting may lead to missed management opportunities. A systematic literature review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol. A search of MEDLINE, OLD MEDLINE and the Cochrane Collaboration was performed from 1966 to March 2011 using 16 search terms. All articles were screened for genuine non-contiguous multi-focal necrotising fasciitis. Of the papers that met this criterion, data on patient demographics, likely inciting injury, presentation time-line, microbial agents, sites affected, objective assessment scores, treatment and outcome were extracted. A total of 31 studies met our inclusion criteria and 33 individual cases of multi-focal necrotising fasciitis were included in the quantitative analysis. About half (52%) of cases were type II necrotising fasciitis; 42% of cases had identifiable inciting injuries; 21% of cases developed multi-focal lesions non-synchronously, of which 86% were type II. Nearly all (94%) of cases had incomplete objective assessment scores. One case identified inflammatory imaging findings prior to clinical necrosis. Multifocality in necrotising fasciitis is likely to be associated with type II disease. We postulate that validated objective tools will aid necrotising fasciitis management pathways that will identify high-risk groups for multifocality and advise early pre-emptive imaging. We recommend the adoption of regional multi-focal necrotising fasciitis registers. Copyright © 2011 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  19. Brand Coherence at a Major Multi-Campus Public Research University: An Exploratory Case Study

    ERIC Educational Resources Information Center

    Zinkan, Rob

    2016-01-01

    With increased competition and other market forces affecting higher education, branding has emerged as a strategic imperative for colleges and universities. Branding in academia faces many inherent challenges, including institutions' multi-pronged missions and decentralized organizational structures. In some cases, branding is not widely…

  20. A multi-criteria spatial deprivation index to support health inequality analyses.

    PubMed

    Cabrera-Barona, Pablo; Murphy, Thomas; Kienberger, Stefan; Blaschke, Thomas

    2015-03-20

    Deprivation indices are useful measures to analyze health inequalities. There are several methods to construct these indices; however, few studies have used Geographic Information Systems (GIS) and Multi-Criteria methods to construct a deprivation index. Therefore, this study applies Multi-Criteria Evaluation to calculate weights for the indicators that make up the deprivation index, and a GIS-based fuzzy approach to create different scenarios of this index is also implemented. The Analytical Hierarchy Process (AHP) is used to obtain the weights for the indicators of the index. The Ordered Weighted Averaging (OWA) method using linguistic quantifiers is applied in order to create different deprivation scenarios. Geographically Weighted Regression (GWR) and a Moran's I analysis are employed to explore spatial relationships between the different deprivation measures and two health factors: the distance to health services and the percentage of people that have never had a live birth. This last indicator was considered as the dependent variable in the GWR. The case study is Quito City, in Ecuador. The AHP-based deprivation index shows medium and high levels of deprivation (0.511 to 1.000) in specific zones of the study area, even though most of the study area has low values of deprivation. OWA results show deprivation scenarios that can be evaluated considering the different attitudes of decision makers. GWR results indicate that the deprivation index and its OWA scenarios can be considered as local estimators for health-related phenomena. Moran's I calculations demonstrate that several deprivation scenarios, in combination with the 'distance to health services' factor, could be explanatory variables to predict the percentage of people that have never had a live birth. The AHP-based deprivation index and the OWA deprivation scenarios developed in this study are Multi-Criteria instruments that can support the identification of highly deprived zones and can support health inequalities analysis in combination with different health factors. The methodology described in this study can be applied in other regions of the world to develop spatial deprivation indices based on Multi-Criteria analysis.
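
    A small sketch of the AHP weighting step named above: indicator weights are taken as the principal eigenvector of a pairwise comparison matrix, with the consistency ratio as a sanity check. The comparison matrix is purely illustrative, not the study's.

```python
import numpy as np

# AHP weights from a pairwise comparison matrix (principal eigenvector), plus
# the consistency ratio.  The 3x3 matrix below is an invented example.

def ahp_weights(A):
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                       # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                   # normalised weights
    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index (subset)
    return w, ci / ri                              # weights, consistency ratio

A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 3.0],
              [1/5., 1/3., 1.0]])
w, cr = ahp_weights(A)
print(w, cr)    # weights sum to 1; CR well below the usual 0.1 threshold
```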

  1. Cross disease analysis of co-functional microRNA pairs on a reconstructed network of disease-gene-microRNA tripartite.

    PubMed

    Peng, Hui; Lan, Chaowang; Zheng, Yi; Hutvagner, Gyorgy; Tao, Dacheng; Li, Jinyan

    2017-03-24

    MicroRNAs always function cooperatively in their regulation of gene expression. Dysfunctions of these co-functional microRNAs can play significant roles in disease development. We are interested in those multi-disease associated co-functional microRNAs that regulate their common dysfunctional target genes cooperatively in the development of multiple diseases. The research is potentially useful for human disease studies at the transcriptional level and for the study of multi-purpose microRNA therapeutics. We designed a computational method to detect multi-disease associated co-functional microRNA pairs and conducted cross disease analysis on a reconstructed disease-gene-microRNA (DGR) tripartite network. The DGR tripartite network is constructed by integrating newly predicted disease-microRNA associations with the relationships of diseases, microRNAs and genes maintained by existing databases. The prediction method uses a set of reliable negative samples of disease-microRNA association and a pre-computed kernel matrix instead of kernel functions. From this reconstructed DGR tripartite network, multi-disease associated co-functional microRNA pairs are detected together with their common dysfunctional target genes and ranked by a novel scoring method. We also conducted proof-of-concept case studies on cancer-related co-functional microRNA pairs as well as on non-cancer disease-related microRNA pairs. With the prioritization of the co-functional microRNAs that relate to a series of diseases, we found that the co-function phenomenon is not unusual. We also confirmed that the regulation of the microRNAs for the development of cancers is more complex and has more unique properties than that of non-cancer diseases.

  2. The Role of Attention in Somatosensory Processing: A Multi-Trait, Multi-Method Analysis

    ERIC Educational Resources Information Center

    Wodka, Ericka L.; Puts, Nicolaas A. J.; Mahone, E. Mark; Edden, Richard A. E.; Tommerdahl, Mark; Mostofsky, Stewart H.

    2016-01-01

    Sensory processing abnormalities in autism have largely been described by parent report. This study used a multi-method (parent-report and measurement), multi-trait (tactile sensitivity and attention) design to evaluate somatosensory processing in ASD. Results showed multiple significant within-method (e.g., parent report of different…

  3. Development and application of dynamic hybrid multi-region inventory analysis for macro-level environmental policy analysis: a case study on climate policy in Taiwan.

    PubMed

    Chao, Chia-Wei; Heijungs, Reinout; Ma, Hwong-wen

    2013-03-19

    We develop a novel inventory method called Dynamic Hybrid Multi-Region Inventory analysis (DHMRI), which integrates the EEMRIOA and Integrated Hybrid LCA and applies time-dependent environmental intervention information for inventory analysis. Consequently, DHMRI is able to quantify the change in the environmental footprint caused by a specific policy while taking structural changes and technological dynamics into consideration. DHMRI is applied to assess the change in the total CO2 emissions associated with the total final demand caused by the climate policy in Taiwan to demonstrate the practicality of this novel method. The evaluation reveals that the implementation of mitigation measures included in the existing climate policy, such as an enhancement in energy efficiency, promotion of renewable energy, and limitation of the growth of energy-intensive industries, will lead to a 28% increase in the total CO2 emissions and that the main driver is the export-oriented electronics industry. Moreover, a major increase in the total emissions is predicted to occur in Southeast Asia and China. The observations from the case study reveal that DHMRI is capable of overcoming the limitations of existing assessment tools at macro-level evaluation of environmental policies.

  4. An optimal design of wind turbine and ship structure based on neuro-response surface method

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Chul; Shin, Sung-Chul; Kim, Soo-Young

    2015-07-01

    The geometry of engineering systems affects their performance. For this reason, the shape of engineering systems needs to be optimized in the initial design stage. However, engineering system design problems consist of multi-objective optimization, and the performance analysis using commercial code or numerical analysis is generally time-consuming. To solve these problems, many engineers perform the optimization using an approximation model (response surface). The Response Surface Method (RSM) is generally used to predict system performance in the engineering research field, but RSM presents some prediction errors for highly nonlinear systems. The major objective of this research is to establish an optimal design method for multi-objective problems and confirm its applicability. The proposed process is composed of three parts: definition of geometry, generation of the response surface, and the optimization process. To reduce the time for performance analysis and minimize the prediction errors, the approximation model is generated using a Backpropagation Artificial Neural Network (BPANN), which is considered a Neuro-Response Surface Method (NRSM). The optimization is performed on the generated response surface by the non-dominated sorting genetic algorithm-II (NSGA-II). Through case studies of a marine system and a ship structure (the substructure of a floating offshore wind turbine considering hydrodynamic performance, and bulk carrier bottom stiffened panels considering structural performance), we have confirmed the applicability of the proposed method for multi-objective side constraint optimization problems.

  5. Applying the methodology of Design of Experiments to stability studies: a Partial Least Squares approach for evaluation of drug stability.

    PubMed

    Jordan, Nika; Zakrajšek, Jure; Bohanec, Simona; Roškar, Robert; Grabnar, Iztok

    2018-05-01

    The aim of the present research is to show that the methodology of Design of Experiments can be applied to stability data evaluation, as stability studies can be seen as multi-factor and multi-level experimental designs. Linear regression analysis is the usual approach for analyzing stability data, but multivariate statistical methods could also be used to assess drug stability during the development phase. Data from a stability study for a pharmaceutical product with hydrochlorothiazide (HCTZ) as an unstable drug substance were used as a case example in this paper. The design space of the stability study was modeled using Umetrics MODDE 10.1 software. We showed that a Partial Least Squares model could be used for a multi-dimensional presentation of all data generated in a stability study and for determination of the relationship among factors that influence drug stability. It might also be used for stability predictions and potentially for the optimization of the extent of stability testing needed to determine shelf life and storage conditions, which would be time- and cost-effective for the pharmaceutical industry.
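
    A hedged sketch of fitting a Partial Least Squares model to stability-study factors and a degradation response, using synthetic placeholder data (the factor set, coding and response below are assumptions, not the study's measurements):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Relate stability-study factors (time, temperature, relative humidity) to a
# response such as residual drug content with PLS.  All data are synthetic.

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(0, 24, 60),         # time (months)
    rng.choice([25, 30, 40], 60),   # temperature (deg C)
    rng.choice([60, 65, 75], 60),   # relative humidity (%)
])
# Synthetic degradation: content falls with time, faster at higher temperature
y = 100 - 0.3 * X[:, 0] - 0.02 * X[:, 0] * (X[:, 1] - 25) + rng.normal(0, 0.5, 60)

pls = PLSRegression(n_components=2).fit(X, y)
print(pls.score(X, y))               # R^2 of the fitted PLS model
print(pls.predict([[12, 40, 75]]))   # predicted content after 12 months at 40 C / 75% RH
```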

  6. Design synthesis and optimization of permanent magnet synchronous machines based on computationally-efficient finite element analysis

    NASA Astrophysics Data System (ADS)

    Sizov, Gennadi Y.

    In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as, winding flux linkages and voltages, average, cogging and ripple torques, stator core flux densities, core losses, efficiencies and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.

  7. Semi-Lagrangian particle methods for high-dimensional Vlasov-Poisson systems

    NASA Astrophysics Data System (ADS)

    Cottet, Georges-Henri

    2018-07-01

    This paper deals with the implementation of high order semi-Lagrangian particle methods to handle high dimensional Vlasov-Poisson systems. It is based on recent developments in the numerical analysis of particle methods, and the paper focuses on specific algorithmic features to handle large dimensions. The methods are tested with uniform particle distributions, in particular against a recent multi-resolution wavelet-based method, on a 4D plasma instability case and a 6D gravitational case. Conservation properties, accuracy and computational costs are monitored. The excellent accuracy/cost trade-off shown by the method opens new perspectives for accurate simulations of high dimensional kinetic equations by particle methods.

  8. Consensus oriented fuzzified decision support for oil spill contingency management.

    PubMed

    Liu, Xin; Wirtz, Kai W

    2006-06-30

    Studies on multi-group multi-criteria decision-making problems for oil spill contingency management are in their infancy. This paper presents a second-order fuzzy comprehensive evaluation (FCE) model to resolve decision-making problems in the area of contingency management after environmental disasters such as oil spills. To assess the performance of different oil combat strategies, second-order FCE allows for the utilization of lexical information, the consideration of ecological and socio-economic criteria and the involvement of a variety of stakeholders. On the other hand, the new approach can be validated by using internal and external checks, which refer to sensitivity tests regarding its internal setups and comparisons with other methods, respectively. Through a case study, the Pallas oil spill in the German Bight in 1998, it is demonstrated that this approach can help decision makers who search for an optimal strategy in multi-thread contingency problems and has a wider application potential in the field of integrated coastal zone management.

  9. Labeled RFS-Based Track-Before-Detect for Multiple Maneuvering Targets in the Infrared Focal Plane Array.

    PubMed

    Li, Miao; Li, Jun; Zhou, Yiyu

    2015-12-08

    The problem of jointly detecting and tracking multiple targets from the raw observations of an infrared focal plane array is a challenging task, especially for the case with uncertain target dynamics. In this paper a multi-model labeled multi-Bernoulli (MM-LMB) track-before-detect method is proposed within the labeled random finite sets (RFS) framework. The proposed track-before-detect method consists of two parts-MM-LMB filter and MM-LMB smoother. For the MM-LMB filter, original LMB filter is applied to track-before-detect based on target and measurement models, and is integrated with the interacting multiple models (IMM) approach to accommodate the uncertainty of target dynamics. For the MM-LMB smoother, taking advantage of the track labels and posterior model transition probability, the single-model single-target smoother is extended to a multi-model multi-target smoother. A Sequential Monte Carlo approach is also presented to implement the proposed method. Simulation results show the proposed method can effectively achieve tracking continuity for multiple maneuvering targets. In addition, compared with the forward filtering alone, our method is more robust due to its combination of forward filtering and backward smoothing.

  10. Labeled RFS-Based Track-Before-Detect for Multiple Maneuvering Targets in the Infrared Focal Plane Array

    PubMed Central

    Li, Miao; Li, Jun; Zhou, Yiyu

    2015-01-01

    The problem of jointly detecting and tracking multiple targets from the raw observations of an infrared focal plane array is a challenging task, especially for the case with uncertain target dynamics. In this paper a multi-model labeled multi-Bernoulli (MM-LMB) track-before-detect method is proposed within the labeled random finite sets (RFS) framework. The proposed track-before-detect method consists of two parts—MM-LMB filter and MM-LMB smoother. For the MM-LMB filter, original LMB filter is applied to track-before-detect based on target and measurement models, and is integrated with the interacting multiple models (IMM) approach to accommodate the uncertainty of target dynamics. For the MM-LMB smoother, taking advantage of the track labels and posterior model transition probability, the single-model single-target smoother is extended to a multi-model multi-target smoother. A Sequential Monte Carlo approach is also presented to implement the proposed method. Simulation results show the proposed method can effectively achieve tracking continuity for multiple maneuvering targets. In addition, compared with the forward filtering alone, our method is more robust due to its combination of forward filtering and backward smoothing. PMID:26670234

  11. Analysis of 2000 cases treated with gamma knife surgery: validating eligibility criteria for a prospective multi-institutional study of stereotactic radiosurgery alone for treatment of patients with 1-10 brain metastases (JLGK0901) in Japan

    PubMed Central

    Higuchi, Yoshinori; Nagano, Osamu; Sato, Yasunori; Yamamoto, Masaaki; Ono, Junichi; Saeki, Naokatsu; Miyakawa, Akifumi; Hirai, Tatsuo

    2012-01-01

    Objective The Japan Leksell Gamma Knife (JLGK) Society has conducted a prospective multi-institute study (JLGK0901, UNIN000001812) for selected patients in order to prove the effectiveness of stereotactic radiosurgery (SRS) alone using the gamma knife (GK) for 1-10 brain lesions. Herein, we verify the validity of 5 major patient selection criteria for the JLGK0901 trial. Materials and Methods Between 1998 and 2010, 2246 consecutive cases with 10352 brain metastases treated with GK were analyzed to determine the validity of the following 5 major JLGK0901 criteria: 1) 1-10 brain lesions, 2) less than 10 cm3 volume of the largest tumor, 3) no more than 15 cm3 total tumor volume, 4) no cerebrospinal fluid (CSF) dissemination, 5) Karnofsky performance status (KPS) score ≥70. Results For cases with >10 brain metastases, salvage treatments for new lesions were needed more frequently. The tumor control rate for lesions larger than 10 cm3 was significantly lower than that of tumors <10 cm3. Overall, neurological and qualitative survivals (OS, NS, QS) of cases with >15 cm3 total tumor volume or positive magnetic resonance imaging findings of CSF dissemination were significantly poorer. Outcomes in cases with KPS <70 were significantly poorer in terms of OS. Conclusion Our retrospective results of 2246 GK-treated cases verified the validity of the 5 major JLGK0901 criteria. The inclusion criteria for the JLGK0901 study are apparently good indications for SRS. PMID:29296339

  12. 'Away Days' in multi-centre randomised controlled trials: a questionnaire survey of their use and a case study on the effect of one Away Day on patient recruitment.

    PubMed

    Jefferson, Laura; Cook, Liz; Keding, Ada; Brealey, Stephen; Handoll, Helen; Rangan, Amar

    2015-11-06

    'Away Days' (trial promotion and training events for trial site personnel) are a well-established method used by trialists to encourage engagement of research sites in the recruitment of patients to multi-centre randomised controlled trials (RCTs). We explored the use of Away Days in multi-centre RCTs and analysed the effect on patient recruitment in a case study. Members of the United Kingdom Trial Managers' Network were surveyed in June 2013 to investigate their experiences in the design and conduct of Away Days in RCTs. We used data from a multi-centre pragmatic surgical trial to explore the effects of an Away Day on the screening and recruitment of patients. A total of 94 people responded to the survey. The majority (78%) of those who confirmed they had previously organised an Away Day found them to be useful, despite their costs. There was no evidence, however, from the analysis of data from a surgical trial that attendance at an Away Day increased the number of patients screened or recruited at participating sites. Although those responsible for managing RCTs in the UK tend to believe that trial Away Days are beneficial, evidence from a multi-centre surgical trial shows no improvement on a key indicator of trial success. This points to the need to carefully consider the aims, design and conduct of Away Days. Further, more rigorous research nested within RCTs would be valuable to evaluate the design and conduct of Away Days. This article is protected by copyright. All rights reserved.

  13. Multi-model ensemble hydrological simulation using a BP Neural Network for the upper Yalongjiang River Basin, China

    NASA Astrophysics Data System (ADS)

    Li, Zhanjie; Yu, Jingshan; Xu, Xinyi; Sun, Wenchao; Pang, Bo; Yue, Jiajia

    2018-06-01

    Hydrological models are important and effective tools for detecting complex hydrological processes. Different models have different strengths when capturing the various aspects of hydrological processes. Relying on a single model usually leads to simulation uncertainties. Ensemble approaches, based on multi-model hydrological simulations, can improve application performance over single models. In this study, the upper Yalongjiang River Basin was selected for a case study. Three commonly used hydrological models (SWAT, VIC, and BTOPMC) were selected and used for independent simulations with the same input and initial values. Then, the BP neural network method was employed to combine the results from the three models. The results show that the accuracy of BP ensemble simulation is better than that of the single models.
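
    A minimal sketch of the ensemble step described above: a back-propagation neural network maps the three single-model simulations to the observed series, and its skill is compared with that of the individual models. The arrays are synthetic placeholders for the SWAT, VIC and BTOPMC outputs.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# BP-network ensemble of three single-model simulations; all data are synthetic.
rng = np.random.default_rng(1)
obs = 50 + 10 * np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 2, 300)
sims = np.column_stack([obs * 0.9 + rng.normal(0, 4, 300),   # "SWAT"-like output
                        obs * 1.1 + rng.normal(0, 4, 300),   # "VIC"-like output
                        obs + 5 + rng.normal(0, 4, 300)])    # "BTOPMC"-like output

train, test = slice(0, 200), slice(200, 300)
bp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
bp.fit(sims[train], obs[train])

def nse(sim, o):   # Nash-Sutcliffe efficiency, a common hydrological skill score
    return 1 - np.sum((sim - o) ** 2) / np.sum((o - o.mean()) ** 2)

print(nse(bp.predict(sims[test]), obs[test]))                # ensemble skill
print([nse(sims[test, i], obs[test]) for i in range(3)])     # single-model skill
```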

  14. Merging Contexts and Shifting Practice within Professional Development Schools: Teachers as Teacher Educators and the Nature of Resistance--A Critical Multi-Case Study

    ERIC Educational Resources Information Center

    Wallace, Mary Ann

    2012-01-01

    This qualitative critical multi-case study examines the nature of resistance as it emerges within the work of two urban secondary teachers acting as teacher educators, each teaching a secondary teacher preparation course within their own respective school context. Both research sites are discursively and functionally similar in terms of their…

  15. Creating an Effective Educational Environment for Adult Learners: A Qualitative, Multi-Case Study of Off-Campus Center Administrator's Use of Invitational Leadership

    ERIC Educational Resources Information Center

    McKnight, Carolyn P.

    2012-01-01

    This qualitative, multi-case study was designed to examine off-campus centers and their administrators in creating an effective learning environment for adult learners. Serving as the conceptual framework, invitational leadership theory is a holistic approach which nurtures the belief that everyone is intrinsically motivated and it is the leaders'…

  16. The Multi-Disciplinary Graduate Program in Educational Research. Final Report, Part IV; The Utilization of Sociological Ideas in Organizational Planning: A Case Study.

    ERIC Educational Resources Information Center

    Lazarsfeld, Paul F., Ed.

    This document, the fourth in the final report on the Multi-Disciplinary Graduate Program in Educational Research, is a qualitative case study designed to show the form of sociological contributions to and the role of sociologists in policy formulation at an American Educational Research Association (AERA) colloquium. Discussions at the conference…

  17. Differential evolution-based multi-objective optimization for the definition of a health indicator for fault diagnostics and prognostics

    NASA Astrophysics Data System (ADS)

    Baraldi, P.; Bonfanti, G.; Zio, E.

    2018-03-01

    The identification of the current degradation state of an industrial component and the prediction of its future evolution is a fundamental step for the development of condition-based and predictive maintenance approaches. The objective of the present work is to propose a general method for extracting a health indicator to measure the amount of component degradation from a set of signals measured during operation. The proposed method is based on the combined use of feature extraction techniques, such as Empirical Mode Decomposition and Auto-Associative Kernel Regression, and a multi-objective Binary Differential Evolution (BDE) algorithm for selecting the subset of features optimal for the definition of the health indicator. The objectives of the optimization are desired characteristics of the health indicator, such as monotonicity, trendability and prognosability. A case study is considered, concerning the prediction of the remaining useful life of turbofan engines. The obtained results confirm that the method is capable of extracting health indicators suitable for accurate prognostics.
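
    As one concrete example of the objectives mentioned above, the sketch below computes a commonly used monotonicity metric for a candidate health indicator; trendability and prognosability would enter the multi-objective fitness in the same way. The metric definition is a standard one assumed here, not taken from the paper.

```python
import numpy as np

# Monotonicity of a health indicator: absolute difference between the fractions
# of positive and negative one-step differences (1 = perfectly monotonic).

def monotonicity(hi):
    d = np.diff(hi)
    return abs((d > 0).sum() - (d < 0).sum()) / (len(hi) - 1)

degrading = np.cumsum(np.abs(np.random.rand(100)))   # steadily increasing indicator
noisy = np.random.rand(100)                          # uninformative indicator
print(monotonicity(degrading), monotonicity(noisy))  # close to 1 vs. close to 0
```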

  18. A point-by-point multi-scale surface temperature reconstruction method and tests by pseudo proxy experiments

    NASA Astrophysics Data System (ADS)

    Chen, X.

    2016-12-01

    This study presents a multi-scale approach combining the Mode Decomposition and Variance Matching (MDVM) method with the basic process of the Point-by-Point Regression (PPR) method. Unlike the widely applied PPR method, the scanning radius for each grid box was recalculated considering the impact of topography (i.e., mean altitude and its fluctuations). Thus, appropriate proxy records were selected as candidates for reconstruction. This multi-scale methodology provides not only the reconstructed gridded temperatures but also the corresponding uncertainties at four typical timescales. In addition, the method has the further advantage that the spatial distribution of uncertainty at different scales can be quantified. To demonstrate the necessity of scale separation in calibration, we performed two sets of pseudo-proxy experiments (PPEs), with proxy record locations over East Asia, based on different ensembles of climate model simulations. One consists of seven simulations by five models (BCC-CSM1-1, CSIRO-MK3L-1-2, HadCM3, MPI-ESM-P, and GISS-E2-R) from the "past1000" experiment of the Coupled Model Intercomparison Project Phase 5. The other is based on the simulations of the Community Earth System Model Last Millennium Ensemble (CESM-LME). The pseudo-proxy network was obtained by adding white noise, with the signal-to-noise ratio (SNR) increasing from 0.1 to 1.0, to the simulated true state; the locations mainly followed the PAGES-2k network in Asia. In total, 400 years of simulation (1601-2000) were used for calibration and 600 years (1001-1600) for verification. The reconstructed results were evaluated with three metrics: 1) root-mean-square error (RMSE), 2) correlation, and 3) the reduction of error (RE) score. The PPE verification results show that, compared with the ordinary linear calibration method (variance matching), the RMSE and RE score of PPR-MDVM are improved, especially for areas with sparse proxy records. Notably, in some periods with large volcanic activity, the RMSE of MDVM becomes larger than that of VM for higher-SNR cases. This suggests that volcanic eruptions might blur the intrinsic multi-scale variability of the climate system, so the MDVM method shows less advantage in such cases.
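
    For reference, a small sketch of the three verification metrics listed above as they are typically evaluated in a pseudo-proxy experiment; the series and calibration mean are synthetic stand-ins.

```python
import numpy as np

# RMSE, correlation and reduction of error (RE) against the simulated truth.

def rmse(recon, truth):
    return np.sqrt(np.mean((recon - truth) ** 2))

def reduction_of_error(recon, truth, calib_mean):
    # RE compares the reconstruction against the calibration-period mean
    return 1 - np.sum((truth - recon) ** 2) / np.sum((truth - calib_mean) ** 2)

truth = np.random.randn(600)                 # stand-in for simulated truth (1001-1600)
recon = truth + np.random.normal(0, 0.5, 600)
calib_mean = 0.1                             # mean over the 1601-2000 calibration window
print(rmse(recon, truth),
      np.corrcoef(recon, truth)[0, 1],
      reduction_of_error(recon, truth, calib_mean))
```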

  19. A multi-material topology optimization approach for wrinkle-free design of cable-suspended membrane structures

    NASA Astrophysics Data System (ADS)

    Luo, Yangjun; Niu, Yanzhuang; Li, Ming; Kang, Zhan

    2017-06-01

    In order to eliminate stress-related wrinkles in cable-suspended membrane structures and to provide simple and reliable deployment, this study presents a multi-material topology optimization model and an effective solution procedure for generating optimal connected layouts for membranes and cables. On the basis of the principal stress criterion of membrane wrinkling behavior and the density-based interpolation of multi-phase materials, the optimization objective is to maximize the total structural stiffness while satisfying principal stress constraints and specified material volume requirements. By adopting the cosine-type relaxation scheme to avoid the stress singularity phenomenon, the optimization model is successfully solved through a standard gradient-based algorithm. Four-corner tensioned membrane structures with different loading cases were investigated to demonstrate the effectiveness of the proposed method in automatically finding the optimal design composed of curved boundary cables and wrinkle-free membranes.

  20. A high-order multi-zone cut-stencil method for numerical simulations of high-speed flows over complex geometries

    NASA Astrophysics Data System (ADS)

    Greene, Patrick T.; Eldredge, Jeff D.; Zhong, Xiaolin; Kim, John

    2016-07-01

    In this paper, we present a method for performing uniformly high-order direct numerical simulations of high-speed flows over arbitrary geometries. The method was developed with the goal of simulating and studying the effects of complex isolated roughness elements on the stability of hypersonic boundary layers. The simulations are carried out on Cartesian grids with the geometries imposed by a third-order cut-stencil method. A fifth-order hybrid weighted essentially non-oscillatory scheme was implemented to capture any steep gradients in the flow created by the geometries and a third-order Runge-Kutta method is used for time advancement. A multi-zone refinement method was also utilized to provide extra resolution at locations with expected complex physics. The combination results in a globally fourth-order scheme in space and third order in time. Results confirming the method's high order of convergence are shown. Two-dimensional and three-dimensional test cases are presented and show good agreement with previous results. A simulation of Mach 3 flow over the logo of the Ubuntu Linux distribution is shown to demonstrate the method's capabilities for handling complex geometries. Results for Mach 6 wall-bounded flow over a three-dimensional cylindrical roughness element are also presented. The results demonstrate that the method is a promising tool for the study of hypersonic roughness-induced transition.

  1. Case studies of aerosol and ocean color retrieval using a Markov chain radiative transfer model and AirMSPI measurements

    NASA Astrophysics Data System (ADS)

    Xu, F.; Diner, D. J.; Seidel, F. C.; Dubovik, O.; Zhai, P.

    2014-12-01

    A vector Markov chain radiative transfer method was developed for forward modeling of radiance and polarization fields in a coupled atmosphere-ocean system. The method was benchmarked against an independent Successive Orders of Scattering code and linearized through the use of Jacobians. Incorporated with the multi-patch optimization algorithm and look-up-table method, simultaneous aerosol and ocean color retrievals were performed using imagery acquired by the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) when it was operated in step-and-stare mode with 9 viewing angles ranging between ±67°. Data from channels near 355, 380, 445, 470*, 555, 660*, and 865* nm were used in the retrievals, where the asterisk denotes the polarimetric bands. Retrievals were run for AirMSPI overflights over Southern California and Monterey Bay, CA. For the relatively high aerosol optical depth (AOD) case (~0.28 at 550 nm), the retrieved aerosol concentration, size distribution, water-leaving radiance, and chlorophyll concentration were compared to those reported by the USC SeaPRISM AERONET-OC site off the coast of Southern California on 6 February 2013. For the relatively low AOD case (~0.08 at 550 nm), the retrieved aerosol concentration and size distribution were compared to those reported by the Monterey Bay AERONET site on 28 April 2014. Further, we evaluate the benefits of multi-angle and polarimetric observations by performing the retrievals using (a) all view angles and channels; (b) all view angles but radiances only (no polarization); (c) the nadir view angle only with both radiance and polarization; and (d) the nadir view angle without polarization. Optimized retrievals using different initial guesses were performed to provide a measure of retrieval uncertainty. Removal of multi-angular or polarimetric information resulted in increases in both parameter uncertainty and systematic bias. Potential accuracy improvements afforded by applying constraints on the surface and aerosol parametric models will also be discussed.

  2. Expert Knowledge-Based Automatic Sleep Stage Determination by Multi-Valued Decision Making Method

    NASA Astrophysics Data System (ADS)

    Wang, Bei; Sugi, Takenao; Kawana, Fusae; Wang, Xingyu; Nakamura, Masatoshi

    In this study, an expert knowledge-based automatic sleep stage determination system based on a multi-valued decision making method is developed. Visual inspection by a qualified clinician is adopted to obtain the expert knowledge database. The expert knowledge database consists of probability density functions of parameters for various sleep stages. Sleep stages are determined automatically according to the conditional probability. In total, four subjects participated. The automatic sleep stage determination results showed close agreement with the visual inspection for the sleep stages of awake, REM (rapid eye movement), light sleep and deep sleep. The constructed expert knowledge database reflects the distributions of characteristic parameters and can adapt to the variable sleep data encountered in hospitals. The developed automatic determination technique based on expert knowledge of visual inspection can be an assistant tool enabling further inspection of sleep disorder cases in clinical practice.
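
    A minimal sketch of the kind of conditional-probability decision rule described above is given below, assuming each stage is represented by per-parameter probability density functions estimated from expert-scored data; the Gaussian densities, parameter names, and numbers are invented placeholders, not the authors' knowledge base.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical expert knowledge base: (mean, std) per parameter and stage,
    # as if estimated from visually scored epochs (values are made up).
    knowledge_base = {
        "awake":       {"delta_power": (0.1, 0.05), "emg_level": (0.8, 0.20)},
        "rem":         {"delta_power": (0.2, 0.05), "emg_level": (0.1, 0.05)},
        "light_sleep": {"delta_power": (0.4, 0.10), "emg_level": (0.3, 0.10)},
        "deep_sleep":  {"delta_power": (0.8, 0.10), "emg_level": (0.2, 0.10)},
    }

    def stage_probabilities(features):
        """Conditional probability of each stage given the observed parameters,
        assuming independent Gaussian densities and equal priors."""
        scores = {}
        for stage, params in knowledge_base.items():
            likelihood = 1.0
            for name, (mu, sigma) in params.items():
                likelihood *= norm.pdf(features[name], mu, sigma)
            scores[stage] = likelihood
        total = sum(scores.values())
        return {s: v / total for s, v in scores.items()}

    epoch = {"delta_power": 0.75, "emg_level": 0.15}
    probs = stage_probabilities(epoch)
    print(max(probs, key=probs.get), probs)
    ```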

  3. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  4. The multi-reference retaining the excitation degree perturbation theory: A size-consistent, unitary invariant, and rapidly convergent wavefunction based ab initio approach

    NASA Astrophysics Data System (ADS)

    Fink, Reinhold F.

    2009-02-01

    The retaining the excitation degree (RE) partitioning [R.F. Fink, Chem. Phys. Lett. 428 (2006) 461] is reformulated and applied to multi-reference cases with complete active space (CAS) reference wave functions. The generalised van Vleck perturbation theory is employed to set up the perturbation equations. It is demonstrated that this leads to a consistent and well defined theory which fulfils all important criteria of a generally applicable ab initio method: the theory is proven numerically and analytically to be size-consistent and invariant with respect to unitary orbital transformations within the inactive, active and virtual orbital spaces. In contrast to most previously proposed multi-reference perturbation theories, the necessary condition for a proper perturbation theory to fulfil the zeroth order perturbation equation is exactly satisfied with the RE partitioning itself, without additional projectors on configurational spaces. The theory is applied to several excited states of the benchmark systems CH2, SiH2, and NH2, as well as to the lowest states of the carbon, nitrogen and oxygen atoms. In all cases comparisons are made with full configuration interaction results. The multi-reference (MR)-RE method is shown to provide very rapidly converging perturbation series. Energy differences between states of similar configurations converge even faster.

  5. Utility of a novel error-stepping method to improve gradient-based parameter identification by increasing the smoothness of the local objective surface: a case-study of pulmonary mechanics.

    PubMed

    Docherty, Paul D; Schranz, Christoph; Chase, J Geoffrey; Chiew, Yeong Shiong; Möller, Knut

    2014-05-01

    Accurate model parameter identification relies on accurate forward model simulations to guide convergence. However, some forward simulation methodologies lack the precision required to properly define the local objective surface and can cause failed parameter identification. The role of objective surface smoothness in identification of a pulmonary mechanics model was assessed using forward simulation from a novel error-stepping method and a proprietary Runge-Kutta method. The objective surfaces were compared via the identified parameter discrepancy generated in a Monte Carlo simulation and the local smoothness of the objective surfaces they generate. The error-stepping method generated significantly smoother error surfaces in each of the cases tested (p<0.0001) and more accurate model parameter estimates than the Runge-Kutta method in three of the four cases tested (p<0.0001), despite a 75% reduction in computational cost. Of note, parameter discrepancy in most cases was limited to a particular oblique plane, indicating that a non-intuitive multi-parameter trade-off was occurring. The error-stepping method consistently improved or equalled the outcomes of the Runge-Kutta time-integration method for forward simulations of the pulmonary mechanics model. This study indicates that accurate parameter identification relies on accurate definition of the local objective function, and that parameter trade-off can occur on oblique planes, resulting in prematurely halted parameter convergence. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Multi-scale Gaussian representation and outline-learning based cell image segmentation.

    PubMed

    Farhan, Muhammad; Ruusuvuori, Pekka; Emmenlauer, Mario; Rämö, Pauli; Dehio, Christoph; Yli-Harja, Olli

    2013-01-01

    High-throughput genome-wide screening to study gene-specific functions, e.g. for drug discovery, demands fast automated image analysis methods to assist in unraveling the full potential of such studies. Image segmentation is typically at the forefront of such analysis, as the performance of the subsequent steps, for example, cell classification, cell tracking etc., often relies on the results of segmentation. We present a cell cytoplasm segmentation framework which first separates cell cytoplasm from image background using a novel approach based on image enhancement and the coefficient of variation of a multi-scale Gaussian scale-space representation. A novel outline-learning based classification method is developed using regularized logistic regression with embedded feature selection, which classifies image pixels as outline/non-outline to give cytoplasm outlines. Refinement of the detected outlines to separate cells from each other is performed in a post-processing step where the nuclei segmentation is used as contextual information. We evaluate the proposed segmentation methodology using two challenging test cases, presenting images with completely different characteristics, with cells of varying size, shape, texture and degrees of overlap. The feature selection and classification framework for outline detection produces very simple sparse models which use only a small subset of the large, generic feature set, that is, only 7 and 5 features for the two cases. Quantitative comparison of the results for the two test cases against state-of-the-art methods shows that our methodology outperforms them with an increase of 4-9% in segmentation accuracy, reaching a maximum accuracy of 93%. Finally, the results obtained for diverse datasets demonstrate that our framework not only produces accurate segmentation but also generalizes well to different segmentation tasks.
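
    The cytoplasm/background separation step relies on the coefficient of variation of a Gaussian scale-space representation; the snippet below is a minimal sketch of that idea using standard SciPy filtering, with an arbitrary scale set rather than the one used in the paper.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def scale_space_cv(image, sigmas=(1, 2, 4, 8)):
        """Pixel-wise coefficient of variation (std/mean) across a Gaussian
        scale-space stack; higher values tend to highlight structured foreground."""
        stack = np.stack([gaussian_filter(image.astype(float), s) for s in sigmas])
        mean = stack.mean(axis=0)
        std = stack.std(axis=0)
        return std / (mean + 1e-8)

    # Toy usage with a random image standing in for a fluorescence micrograph.
    img = np.random.rand(128, 128)
    cv_map = scale_space_cv(img)
    foreground_mask = cv_map > cv_map.mean()  # crude threshold, for illustration only
    ```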

  7. Analytic hierarchy process-based approach for selecting a Pareto-optimal solution of a multi-objective, multi-site supply-chain planning problem

    NASA Astrophysics Data System (ADS)

    Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi

    2017-07-01

    The current manufacturing environment has changed from a traditional single-plant setting to a multi-site supply chain in which multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
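
    To illustrate the analytic hierarchy process step that ranks the Pareto-optimal solutions, here is a minimal sketch that derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks consistency; the judgments and solution scores are hypothetical, not those elicited in the case study.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Priority weights from an AHP pairwise comparison matrix (principal
        eigenvector method), plus the consistency ratio."""
        vals, vecs = np.linalg.eig(pairwise)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = pairwise.shape[0]
        ci = (vals[k].real - n) / (n - 1)              # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index
        return w, ci / ri

    # Cost vs. quality vs. customer satisfaction (hypothetical judgments).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])
    weights, cr = ahp_weights(A)
    scores = np.array([[0.6, 0.7, 0.9],   # each row: one Pareto solution's
                       [0.8, 0.5, 0.6]])  # normalized performance per criterion
    best_solution = int(np.argmax(scores @ weights))
    ```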

  8. Full space device optimization for solar cells.

    PubMed

    Baloch, Ahmer A B; Aly, Shahzada P; Hossain, Mohammad I; El-Mellouhi, Fedwa; Tabet, Nouar; Alharbi, Fahhad H

    2017-09-20

    Advances in computational materials science have paved the way to design efficient solar cells by identifying the optimal properties of the device layers. Conventionally, device optimization has been governed by single or double descriptors for an individual layer, mostly the absorbing layer. However, the performance of the device depends collectively on all the properties of the material and the geometry of each layer in the cell. To address this issue of multi-property optimization and to avoid the paradigm of reoccurring materials in the solar cell field, a full space material-independent optimization approach is developed and presented in this paper. The method is employed to obtain an optimized material data set for maximum efficiency and for targeted functionality for each layer. To ensure the robustness of the method, two cases are studied, namely perovskite solar cell device optimization and cadmium-free CIGS solar cells. The implementation determines the desirable optoelectronic properties of transport mediums and contacts that can maximize the efficiency for both cases. The resulting data sets of material properties can be matched with those in materials databases or realized by further microscopic material design. Moreover, the presented multi-property optimization framework can be extended to design any solid-state device.

  9. Leadership Development Institute: A California Community College Multi-College District Case Study

    ERIC Educational Resources Information Center

    Leon, Bianca R.

    2016-01-01

    The purpose of this study is to examine a community college district Grow Your Own (GYO) leadership program in the Western United States, the Multi College Leadership Development Institute (MCLDI). The MCLDI was developed in-house for a multi-campus community college district and offered to interested employees at all position levels with the…

  10. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods, single attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region of the Blackfoot field, Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher resolution porosity sections. A low impedance (6000-8000 m/s g/cc) and high porosity (>15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient method for predicting porosity in the inter-well region.

  11. Models for Type Ia Supernovae and Related Astrophysical Transients

    NASA Astrophysics Data System (ADS)

    Röpke, Friedrich K.; Sim, Stuart A.

    2018-06-01

    We give an overview of recent efforts to model Type Ia supernovae and related astrophysical transients resulting from thermonuclear explosions in white dwarfs. In particular we point out the challenges resulting from the multi-physics multi-scale nature of the problem and discuss possible numerical approaches to meet them in hydrodynamical explosion simulations and radiative transfer modeling. We give examples of how these methods are applied to several explosion scenarios that have been proposed to explain distinct subsets or, in some cases, the majority of the observed events. In each case we comment on some of the successes and shortcomings of these scenarios and highlight important outstanding issues.

  12. Parsimony and goodness-of-fit in multi-dimensional NMR inversion

    NASA Astrophysics Data System (ADS)

    Babak, Petro; Kryuchkov, Sergey; Kantzas, Apostolos

    2017-01-01

    Multi-dimensional nuclear magnetic resonance (NMR) experiments are often used for the study of molecular structure and dynamics of matter in core analysis and reservoir evaluation. Industrial applications of multi-dimensional NMR involve a high-dimensional measurement dataset with complicated correlation structure and require rapid and stable inversion algorithms from the time domain to the relaxation rate and/or diffusion domains. In practice, applying existing inverse algorithms with a large number of parameter values leads to an infinite number of solutions with a reasonable fit to the NMR data. The interpretation of such variability of multiple solutions and the selection of the most appropriate solution can be a very complex problem. In most cases the characteristics of materials have sparse signatures, and investigators would like to distinguish the most significant relaxation and diffusion values of the materials. To produce an easy-to-interpret and unique NMR distribution with a finite number of principal parameter values, we introduce a new method for NMR inversion. The method is constructed based on the trade-off between the conventional goodness-of-fit approach to multivariate data and the principle of parsimony, guaranteeing inversion with the least number of parameter values. We suggest performing the inversion of NMR data using a forward stepwise regression selection algorithm. To account for the trade-off between goodness-of-fit and parsimony, the objective function is selected based on the Akaike Information Criterion (AIC). The performance of the developed multi-dimensional NMR inversion method and its comparison with conventional methods are illustrated using real data for samples with bitumen, water and clay.
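
    To illustrate the parsimony/goodness-of-fit trade-off, the following is a minimal sketch of forward stepwise selection of relaxation components scored with an AIC-type criterion for least-squares fits; the single-exponential kernel and candidate grid are simplifying assumptions and do not reproduce the authors' full multi-dimensional inversion.

    ```python
    import numpy as np

    def aic_ls(residuals, k):
        """AIC-type score for a least-squares fit with k amplitude parameters."""
        n = residuals.size
        rss = np.sum(residuals ** 2)
        return n * np.log(rss / n) + 2 * k

    def forward_stepwise(t, signal, candidate_T2s, max_terms=5):
        """Greedily add single-exponential components exp(-t/T2) while AIC improves."""
        chosen, best_aic = [], np.inf
        while len(chosen) < max_terms:
            best_candidate = None
            for T2 in candidate_T2s:
                if T2 in chosen:
                    continue
                basis = np.column_stack([np.exp(-t / x) for x in chosen + [T2]])
                amps, *_ = np.linalg.lstsq(basis, signal, rcond=None)
                aic = aic_ls(signal - basis @ amps, len(chosen) + 1)
                if aic < best_aic:
                    best_aic, best_candidate = aic, T2
            if best_candidate is None:
                break                      # no candidate improves the criterion
            chosen.append(best_candidate)
        return chosen, best_aic

    # Toy data: two relaxation components plus noise.
    t = np.linspace(0.001, 1.0, 200)
    signal = np.exp(-t / 0.05) + 0.5 * np.exp(-t / 0.4) + 0.01 * np.random.randn(t.size)
    T2s, aic = forward_stepwise(t, signal, candidate_T2s=np.logspace(-2, 0, 20))
    ```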

  13. Novel ways to explore surgical interventions in randomised controlled trials: applying case study methodology in the operating theatre.

    PubMed

    Blencowe, Natalie S; Blazeby, Jane M; Donovan, Jenny L; Mills, Nicola

    2015-12-28

    Multi-centre randomised controlled trials (RCTs) in surgery are challenging. It is particularly difficult to establish standards of surgery and ensure that interventions are delivered as intended. This study developed and tested methods for identifying the key components of surgical interventions and standardising interventions within RCTs. Qualitative case studies of surgical interventions were undertaken within the internal pilot phase of a surgical RCT for obesity (the By-Band study). Each case study involved video data capture and non-participant observation of gastric bypass surgery in the operating theatre and interviews with surgeons. Methods were developed to transcribe and synchronise data from video recordings with observational data to identify key intervention components, which were then explored in the interviews with surgeons. Eight qualitative case studies were undertaken. A novel combination of video data capture, observation and interview data identified variations in intervention delivery between surgeons and centres. Although surgeons agreed that the most critical intervention component was the size and shape of the gastric pouch, there was no consensus regarding other aspects of the procedure. They conceded that evidence about the 'best way' to perform bypass was lacking and, combined with the pragmatic nature of the By-Band study, agreed that strict standardisation of bypass might not be required. This study has developed and tested methods for understanding how surgical interventions are designed and delivered in RCTs. Applying these methods more widely may help identify key components of interventions to be delivered by surgeons in trials, enabling monitoring of key components and adherence to the protocol. These methods are now being tested in the context of other surgical RCTs. Current Controlled Trials ISRCTN00786323, 05/09/2011.

  14. Low-cost and no-cost practice to achieve energy efficiency of government office buildings: A case study in federal territory of Malaysia

    NASA Astrophysics Data System (ADS)

    Tahir, Mohamad Zamhari; Nawi, Mohd Nasrun Mohd; Ibrahim, Amlus

    2016-08-01

    This paper presents the findings of a case study on achieving energy-efficient performance of conventional office buildings in Malaysia. Two multi-storey office buildings in the Federal Territory of Malaysia have been selected. The aim is to study the buildings' energy saving potential and then to highlight the appropriate measures that can be implemented. Data were collected using a benchmarking method, comparing the measured consumption to that of other similar office buildings, and through a series of preliminary audits involving interviews, a brief review of utility and operating data, and a walkthrough of the buildings. Additionally, in order to gain a better understanding of the major energy consumption in the selected buildings, a general audit was conducted to collect more detailed information about building operation. In the end, this study emphasizes low-cost and no-cost practices to achieve energy efficiency, with significant results in some cases.

  15. Object-based classification of global undersea topography and geomorphological features from the SRTM30_PLUS data

    NASA Astrophysics Data System (ADS)

    Dekavalla, Maria; Argialas, Demetre

    2017-07-01

    The analysis of undersea topography and geomorphological features provides necessary information to related disciplines and many applications. The development of an automated knowledge-based classification approach for undersea topography and geomorphological features is challenging due to their multi-scale nature. The aim of the study is to develop and evaluate an automated knowledge-based OBIA approach to: i) decompose the global undersea topography into multi-scale regions of distinct morphometric properties, and ii) assign the derived regions to characteristic geomorphological features. First, the global undersea topography was decomposed through the SRTM30_PLUS bathymetry data into so-called morphometric objects of discrete morphometric properties and spatial scales defined by data-driven methods (local variance graphs and nested means) and multi-scale analysis. The derived morphometric objects were combined with additional relative topographic position information computed with a self-adaptive pattern recognition method (geomorphons) and auxiliary data, and were assigned to characteristic undersea geomorphological feature classes through a knowledge base developed from standard definitions. The decomposition of the SRTM30_PLUS data into morphometric objects was considered successful with respect to the requirements of maximizing intra-object homogeneity and inter-object heterogeneity, based on the near-zero values of Moran's I and the low values of the weighted variance index. The knowledge-based classification approach was tested for its transferability in six case studies of various tectonic settings and achieved the efficient extraction of 11 undersea geomorphological feature classes. The classification results for the six case studies were compared with the digital global seafloor geomorphic features map (GSFM). The 11 undersea feature classes and their producer's accuracies with respect to the corresponding GSFM areas were Basin (95%), Continental Shelf (94.9%), Trough (88.4%), Plateau (78.9%), Continental Slope (76.4%), Trench (71.2%), Abyssal Hill (62.9%), Abyssal Plain (62.4%), Ridge (49.8%), Seamount (48.8%) and Continental Rise (25.4%). The knowledge-based OBIA classification approach was considered transferable since the percentages of spatial and thematic agreement between most of the classified undersea feature classes and the GSFM exhibited low deviations across the six case studies.

  16. A study of zero tolerance policies in schools: a multi-integrated systems approach to improve outcomes for adolescents.

    PubMed

    Teske, Steven C

    2011-05-01

    School officials throughout the United States have adopted zero tolerance policies to address student discipline, resulting in an increase in out-of-school suspensions and expulsions. The introduction of police on school campuses also increased the referral of students to the juvenile courts. Although school personnel generally view zero tolerance policies as a constructive measure, this approach disregards recent research on adolescent brain development showing that mischief is a foreseeable feature of adolescence. A case study method examined one juvenile court's innovative multi-integrated systems approach to the adverse trends associated with zero tolerance policies. A multi-disciplinary protocol resulted in more effective youth assessments that reduced out-of-school suspensions and school referrals, increased graduation rates by 20%, and decreased delinquent felony rates by nearly 50%. The resulting protocol changed how the system responds to disruptive students by significantly reducing out-of-school suspensions and school referrals, putting alternatives into place, and providing community resources to address the underlying causes of the behavior. A multi-systems approach that targets the reasons for disruptive behavior improves student educational and behavioral outcomes. © 2011 Wiley Periodicals, Inc.

  17. Integrating the Ergonomics Techniques with Multi Criteria Decision Making as a New Approach for Risk Management: An Assessment of Repetitive Tasks -Entropy Case Study.

    PubMed

    Khandan, Mohammad; Nili, Majid; Koohpaei, Alireza; Mosaferchi, Saeedeh

    2016-01-01

    Nowadays, decision makers in occupational health need to analyze a huge amount of data and consider many conflicting evaluation criteria and sub-criteria. An ergonomic evaluation of the work environment aimed at controlling occupational disorders can therefore be treated as a Multi Criteria Decision Making (MCDM) problem. In this study, the ergonomic risk factors that may influence health were evaluated in a manufacturing company in 2014, and the entropy method was then applied to prioritize the different risk factors. The study was carried out with a descriptive-analytical approach; 13 tasks performed by the 240 employees working in the seven halls of an opal manufacturing company were included. Required information was gathered with a demographic questionnaire and the Assessment of Repetitive Tasks (ART) method for repetitive task assessment. In addition, entropy was used to prioritize the risk factors according to ergonomic control needs. The total exposure score calculated with the ART method was 30.07 ± 12.43. Data analysis showed that 179 cases (74.6% of tasks) were at a high level of risk and 13.8% were at a medium level of risk. The ART-entropy results revealed that, based on the weighted factors, the highest weight belonged to the grip factor and the lowest weights were related to neck and hand posture and duration. Given limited financial resources, it appears that MCDM can be used successfully in many challenging situations, such as setting control procedures and priorities. Other MCDM methods for evaluating and prioritizing ergonomic problems are recommended.
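
    As a sketch of the entropy weighting step used to prioritize risk factors, the code below computes Shannon-entropy-based objective weights from a decision matrix of task-by-factor scores; the matrix values are invented for illustration and are not the ART scores reported above.

    ```python
    import numpy as np

    def entropy_weights(decision_matrix):
        """Objective criterion weights from Shannon entropy: criteria whose scores
        vary more across alternatives (lower entropy) receive larger weights."""
        X = np.asarray(decision_matrix, dtype=float)
        P = X / X.sum(axis=0)                        # column-normalized proportions
        m = X.shape[0]
        with np.errstate(divide="ignore", invalid="ignore"):
            logs = np.where(P > 0, np.log(P), 0.0)
        E = -(P * logs).sum(axis=0) / np.log(m)      # entropy per criterion in [0, 1]
        d = 1.0 - E                                  # degree of divergence
        return d / d.sum()

    # Rows: tasks, columns: risk factors (e.g., grip, posture, duration); hypothetical scores.
    scores = np.array([[8, 4, 6],
                       [9, 5, 6],
                       [3, 4, 5],
                       [7, 4, 6]])
    print(entropy_weights(scores))
    ```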

  18. Fuzzy Multi-Objective Vendor Selection Problem with Modified S-CURVE Membership Function

    NASA Astrophysics Data System (ADS)

    Díaz-Madroñero, Manuel; Peidro, David; Vasant, Pandian

    2010-06-01

    In this paper, the S-curve membership function methodology is used in a vendor selection (VS) problem. An interactive method for solving multi-objective VS problems with fuzzy goals is developed. The proposed method attempts to simultaneously minimize the total order costs, the number of rejected items and the number of late delivered items, subject to several constraints such as meeting buyers' demand, vendors' capacity, vendors' quota flexibility, vendors' allocated budget, etc. In an industrial case, we compare the performance of S-curve membership functions, which represent uncertain goals and constraints in VS problems, with that of linear membership functions.
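
    For readers unfamiliar with the modified S-curve, the sketch below shows a generic logistic-type membership function of the form used in fuzzy goal programming, mapping a cost value to a satisfaction degree between 0 and 1; the parameter values and cost bounds are illustrative assumptions rather than those calibrated in the paper.

    ```python
    import numpy as np

    def s_curve_membership(x, x_low, x_high, B=1.0, C=0.001, alpha=13.813):
        """Modified S-curve (logistic) membership: close to 1 near the best value
        x_low, close to 0 near the worst value x_high, smooth in between."""
        x = np.asarray(x, dtype=float)
        t = (x - x_low) / (x_high - x_low)            # normalize to [0, 1]
        mu = B / (1.0 + C * np.exp(alpha * t))        # fuzzy region between the bounds
        return np.clip(np.where(x <= x_low, 1.0, np.where(x >= x_high, 0.0, mu)), 0.0, 1.0)

    # Total order cost between 10,000 (fully satisfactory) and 25,000 (unacceptable).
    costs = np.array([9_000, 12_000, 18_000, 26_000])
    print(s_curve_membership(costs, x_low=10_000, x_high=25_000))
    ```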

  19. A volume-of-fluid method for simulation of compressible axisymmetric multi-material flow

    NASA Astrophysics Data System (ADS)

    de Niem, D.; Kührt, E.; Motschmann, U.

    2007-02-01

    A two-dimensional Eulerian hydrodynamic method for the numerical simulation of inviscid compressible axisymmetric multi-material flow in external force fields, for the situation of pure fluids separated by macroscopic interfaces, is presented. The method combines an implicit Lagrangian step with an explicit Eulerian advection step. Individual materials obey separate energy equations, fulfill general equations of state, and may possess different temperatures. Material volume is tracked using a piecewise linear volume-of-fluid method. An overshoot-free, logically simple and economical material advection algorithm for cylindrical coordinates is derived in an algebraic formulation. New aspects arising in the case of more than two materials, such as the material ordering strategy during transport, are presented. One- and two-dimensional numerical examples are given.

  20. Multi-probe-based resonance-frequency electrical impedance spectroscopy for detection of suspicious breast lesions: improving performance using partial ROC optimization

    NASA Astrophysics Data System (ADS)

    Lederman, Dror; Zheng, Bin; Wang, Xingwei; Wang, Xiao Hui; Gur, David

    2011-03-01

    We have developed a multi-probe resonance-frequency electrical impedance spectroscopy (REIS) system to detect breast abnormalities. Based on assessing asymmetry in REIS signals acquired between the left and right breasts, we developed several machine learning classifiers to classify younger women (i.e., under 50 years old) into two groups having high and low risk for developing breast cancer. In this study, we investigated a new method to optimize performance based on the area under a selected partial receiver operating characteristic (ROC) curve when optimizing an artificial neural network (ANN), and tested whether it could improve classification performance. From an ongoing prospective study, we selected a dataset of 174 cases for whom we have both REIS signals and diagnostic status verification. The dataset includes 66 "positive" cases recommended for biopsy due to detection of highly suspicious breast lesions and 108 "negative" cases determined by imaging-based examinations. A set of REIS-based feature differences, extracted from the two breasts using a mirror-matched approach, was computed and constituted an initial feature pool. Using a leave-one-case-out cross-validation method, we applied a genetic algorithm (GA) to train the ANN with an optimal subset of features. Two optimization criteria were separately used in the GA optimization, namely the area under the entire ROC curve (AUC) and the partial area under the ROC curve up to a predetermined threshold (i.e., 90% specificity). The results showed that although the ANN optimized using the entire AUC yielded higher overall performance (AUC = 0.83 versus 0.76), the ANN optimized using the partial ROC area criterion achieved substantially higher operational performance (i.e., increasing the sensitivity level from 28% to 48% at 95% specificity and/or from 48% to 58% at 90% specificity).
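
    The partial-ROC optimization criterion can be illustrated with a small helper that computes the area under the ROC curve restricted to a high-specificity region (here, specificity >= 90%); the scores and labels are synthetic, and this is only a sketch of the criterion, not the GA/ANN training pipeline itself.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    def partial_auc(y_true, y_score, max_fpr=0.10):
        """Area under the ROC curve restricted to false-positive rates <= max_fpr
        (i.e., specificity >= 1 - max_fpr), without normalization."""
        fpr, tpr, _ = roc_curve(y_true, y_score)
        mask = fpr <= max_fpr
        # Interpolate the TPR at exactly max_fpr so the region is closed on the right.
        fpr_p = np.append(fpr[mask], max_fpr)
        tpr_p = np.append(tpr[mask], np.interp(max_fpr, fpr, tpr))
        return auc(fpr_p, tpr_p)

    rng = np.random.default_rng(0)
    y = np.r_[np.zeros(108), np.ones(66)]                   # 108 negatives, 66 positives
    scores = np.r_[rng.normal(0, 1, 108), rng.normal(1, 1, 66)]
    print(partial_auc(y, scores, max_fpr=0.10))
    ```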

  1. Structural and Practical Identifiability Issues of Immuno-Epidemiological Vector-Host Models with Application to Rift Valley Fever.

    PubMed

    Tuncer, Necibe; Gulbudak, Hayriye; Cannataro, Vincent L; Martcheva, Maia

    2016-09-01

    In this article, we discuss the structural and practical identifiability of a nested immuno-epidemiological model of arbovirus diseases, where host-vector transmission rate, host recovery, and disease-induced death rates are governed by the within-host immune system. We incorporate the newest ideas and the most up-to-date features of numerical methods to fit multi-scale models to multi-scale data. For an immunological model, we use Rift Valley Fever Virus (RVFV) time-series data obtained from livestock under laboratory experiments, and for an epidemiological model we incorporate a human compartment to the nested model and use the number of human RVFV cases reported by the CDC during the 2006-2007 Kenya outbreak. We show that the immunological model is not structurally identifiable for the measurements of time-series viremia concentrations in the host. Thus, we study the non-dimensionalized and scaled versions of the immunological model and prove that both are structurally globally identifiable. After fixing estimated parameter values for the immunological model derived from the scaled model, we develop a numerical method to fit observable RVFV epidemiological data to the nested model for the remaining parameter values of the multi-scale system. For the given (CDC) data set, Monte Carlo simulations indicate that only three parameters of the epidemiological model are practically identifiable when the immune model parameters are fixed. Alternatively, we fit the multi-scale data to the multi-scale model simultaneously. Monte Carlo simulations for the simultaneous fitting suggest that the parameters of the immunological model and the parameters of the immuno-epidemiological model are practically identifiable. We suggest that analytic approaches for studying the structural identifiability of nested models are a necessity, so that identifiable parameter combinations can be derived to reparameterize the nested model to obtain an identifiable one. This is a crucial step in developing multi-scale models which explain multi-scale data.

  2. A hybrid degradation tendency measurement method for mechanical equipment based on moving window and Grey-Markov model

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Zhou, Jianzhong; Zheng, Yang; Liu, Han

    2017-11-01

    Accurate degradation tendency measurement is vital for the secure operation of mechanical equipment. However, the existing techniques and methodologies for degradation measurement still face challenges, such as the lack of an appropriate degradation indicator, insufficient accuracy, and poor capability to track data fluctuations. To solve these problems, a hybrid degradation tendency measurement method for mechanical equipment based on a moving window and a Grey-Markov model is proposed in this paper. In the proposed method, a 1D normalized degradation index based on multi-feature fusion is designed to assess the extent of degradation. Subsequently, the moving window algorithm is integrated with the Grey-Markov model for dynamic model updating. Two key parameters, namely the step size and the number of states, contribute to the adaptive modeling and multi-step prediction. Finally, three types of combination prediction models are established to measure the degradation trend of the equipment. The effectiveness of the proposed method is validated with a case study on the health monitoring of turbine engines. Experimental results show that the proposed method performs better, in terms of both measurement accuracy and data fluctuation tracking, than other conventional methods.
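
    For context on the grey-model component, here is a minimal sketch of a classical GM(1,1) forecaster applied to a sliding window of degradation-index values; the window contents are made up, and the Markov state correction and parameter-update logic of the proposed hybrid method are not reproduced.

    ```python
    import numpy as np

    def gm11_forecast(x0, steps=1):
        """Classical GM(1,1) grey model: fit a, b from the accumulated series and
        extrapolate `steps` values beyond the end of the window x0."""
        x0 = np.asarray(x0, dtype=float)
        n = x0.size
        x1 = np.cumsum(x0)                              # accumulated generating operation
        z1 = 0.5 * (x1[1:] + x1[:-1])                   # background values
        B = np.column_stack([-z1, np.ones(n - 1)])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        def x1_hat(k):                                  # fitted accumulated series
            return (x0[0] - b / a) * np.exp(-a * k) + b / a
        preds = [x1_hat(n - 1 + s) - x1_hat(n - 2 + s) for s in range(1, steps + 1)]
        return np.array(preds)

    # Moving-window usage on a hypothetical degradation index.
    index = np.array([0.12, 0.15, 0.19, 0.22, 0.27, 0.31, 0.38])
    window = index[-5:]                                 # window length of 5, chosen arbitrarily
    print(gm11_forecast(window, steps=2))
    ```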

  3. Simultaneous, proportional, multi-axis prosthesis control using multichannel surface EMG.

    PubMed

    Yatsenko, Dimitri; McDonnall, Daniel; Guillory, K Shane

    2007-01-01

    Most upper limb prosthesis controllers only allow the individual selection and control of single joints of the limb. The main limiting factor for simultaneous multi-joint control is usually the availability of reliable independent control signals that can be used intuitively. In this paper, a novel method is presented for extraction of individual muscle source signals from surface EMG array recordings, based on EMG energy orthonormalization along principal movement vectors. In cases where independently controllable muscles are present in residual limbs, this method can be used to provide simultaneous, multi-axis, proportional control of prosthetic systems. Initial results are presented for simultaneous control of wrist rotation, wrist flexion/extension, and grip open/close for two intact subjects under both isometric and non-isometric conditions and for one subject with transradial amputation.

  4. A new pulsed laser deposition technique: scanning multi-component pulsed laser deposition method.

    PubMed

    Fischer, D; de la Fuente, G F; Jansen, M

    2012-04-01

    The scanning multi-component pulsed laser deposition (PLD) method realizes uniform deposition of desired coatings via a modified pulsed laser deposition process, preferably with a femtosecond laser system. Multi-component coatings (single or multilayered) are thus deposited onto substrates via laser-induced ablation of segmented targets. This is achieved via horizontal line-scanning of a focused laser beam over a uniformly moving target surface. The process allows the desired coating composition to be deposited simultaneously, starting from the different segments of the target and adjusting the scan line as a function of target geometry. The sequence and thickness of multilayers can easily be adjusted by target architecture and motion, enabling inter/intra-layer concentration gradients and thus functionally graded coatings. This new, simple PLD method enables the achievement of uniform, large-area coatings. Case studies were performed with segmented targets containing aluminum, titanium, and niobium. Under the laser irradiation conditions applied, all three metals were uniformly ablated. The elemental composition within the rough coatings obtained was fixed by the scanned area to Ti-Al-Nb = 1:1:1. Crystalline aluminum, titanium, and niobium were found to coexist side by side at room temperature within the substrate, without alloy formation up to 600 °C. © 2012 American Institute of Physics.

  5. Reply to the comment by B. Ghobadipour and B. Mojarradi "M. Abedi, S.A. Torabi, G.-H. Norouzi and M. Hamzeh; ELECTRE III: A knowledge-driven method for integration of geophysical data with geological and geochemical data in mineral prospectivity mapping"

    NASA Astrophysics Data System (ADS)

    Abedi, Maysam

    2015-06-01

    This reply discusses the results of two previously developed approaches to mineral prospectivity/potential mapping (MPM), i.e., ELECTRE III and PROMETHEE II, which are well-known methods for multi-criteria decision-making (MCDM) problems. Various geo-data sets are integrated to prepare the MPM, and the generated maps show acceptable agreement with the drilled boreholes. Equal performance of the applied methods is indicated in the studied case. Complementary information on these methods is also provided in order to help interested readers implement them in the MPM process.

  6. Multi-level, multi-scale resource selection functions and resistance surfaces for conservation planning: Pumas as a case study

    PubMed Central

    Vickers, T. Winston; Ernest, Holly B.; Boyce, Walter M.

    2017-01-01

    The importance of examining multiple hierarchical levels when modeling resource use for wildlife has been acknowledged for decades. Multi-level resource selection functions have recently been promoted as a method to synthesize resource use across nested organizational levels into a single predictive surface. Analyzing multiple scales of selection within each hierarchical level further strengthens multi-level resource selection functions. We extend this multi-level, multi-scale framework to modeling resistance for wildlife by combining multi-scale resistance surfaces from two data types, genetic and movement. Resistance estimation has typically been conducted with one of these data types, or compared between the two. However, we contend it is not an either/or issue and that resistance may be better modeled using a combination of resistance surfaces that represent processes at different hierarchical levels. Resistance surfaces estimated from genetic data characterize temporally broad-scale dispersal and successful breeding over generations, whereas resistance surfaces estimated from movement data represent fine-scale travel and contextualized movement decisions. We used telemetry and genetic data from a long-term study on pumas (Puma concolor) in a highly developed landscape in southern California to develop a multi-level, multi-scale resource selection function and a multi-level, multi-scale resistance surface. We used these multi-level, multi-scale surfaces to identify resource use patches and resistant kernel corridors. Across levels, we found that pumas avoided urban and agricultural areas and roads, and preferred riparian areas and more rugged terrain. For other landscape features, selection differed among levels, as did the scales of selection for each feature. With these results, we developed a conservation plan for one of the most isolated puma populations in the U.S. Our approach captured a wide spectrum of ecological relationships for a population, resulted in effective conservation planning, and can be readily applied to other wildlife species. PMID:28609466

  7. Multi-level, multi-scale resource selection functions and resistance surfaces for conservation planning: Pumas as a case study.

    PubMed

    Zeller, Katherine A; Vickers, T Winston; Ernest, Holly B; Boyce, Walter M

    2017-01-01

    The importance of examining multiple hierarchical levels when modeling resource use for wildlife has been acknowledged for decades. Multi-level resource selection functions have recently been promoted as a method to synthesize resource use across nested organizational levels into a single predictive surface. Analyzing multiple scales of selection within each hierarchical level further strengthens multi-level resource selection functions. We extend this multi-level, multi-scale framework to modeling resistance for wildlife by combining multi-scale resistance surfaces from two data types, genetic and movement. Resistance estimation has typically been conducted with one of these data types, or compared between the two. However, we contend it is not an either/or issue and that resistance may be better modeled using a combination of resistance surfaces that represent processes at different hierarchical levels. Resistance surfaces estimated from genetic data characterize temporally broad-scale dispersal and successful breeding over generations, whereas resistance surfaces estimated from movement data represent fine-scale travel and contextualized movement decisions. We used telemetry and genetic data from a long-term study on pumas (Puma concolor) in a highly developed landscape in southern California to develop a multi-level, multi-scale resource selection function and a multi-level, multi-scale resistance surface. We used these multi-level, multi-scale surfaces to identify resource use patches and resistant kernel corridors. Across levels, we found that pumas avoided urban and agricultural areas and roads, and preferred riparian areas and more rugged terrain. For other landscape features, selection differed among levels, as did the scales of selection for each feature. With these results, we developed a conservation plan for one of the most isolated puma populations in the U.S. Our approach captured a wide spectrum of ecological relationships for a population, resulted in effective conservation planning, and can be readily applied to other wildlife species.
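
    A resource selection function of the kind referred to above is commonly fitted as a used-versus-available logistic regression; the sketch below shows that generic formulation on synthetic covariates and is not the multi-level, multi-scale model of the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    # Synthetic covariates at available (0) and used (1) locations: % urban cover, terrain ruggedness.
    n = 500
    X_avail = np.column_stack([rng.uniform(0, 1, n), rng.uniform(0, 1, n)])
    X_used = np.column_stack([rng.beta(2, 8, n), rng.beta(8, 2, n)])   # avoid urban, prefer rugged
    X = np.vstack([X_avail, X_used])
    y = np.r_[np.zeros(n), np.ones(n)]

    rsf = LogisticRegression().fit(X, y)
    # Relative selection strength over a small set of landscape cells: exponential of the
    # fitted coefficients applied to the covariates (intercept dropped for relative comparison).
    grid = np.column_stack([np.linspace(0, 1, 5), np.linspace(0, 1, 5)])
    relative_selection = np.exp(grid @ rsf.coef_.ravel())
    ```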

  8. Multi-Objective Algorithm for Blood Supply via Unmanned Aerial Vehicles to the Wounded in an Emergency Situation

    PubMed Central

    Wen, Tingxi; Zhang, Zhongnan; Wong, Kelvin K. L.

    2016-01-01

    Unmanned aerial vehicles (UAVs) have been widely used in many industries. In the medical environment, and especially in emergency situations, UAVs play an important role, for example in the rapid and efficient supply of medicines and blood. In this paper, we study the problem of multi-objective blood supply by UAVs in such emergency situations. This is a complex problem that includes maintaining a model of the supplied blood's temperature during transportation, scheduling the UAVs and planning their routes when multiple sites request blood, and respecting limited carrying capacity. Most importantly, we need to study the blood's temperature change due to the external environment, the heating agent (or refrigerant) and the time factor during transportation, and propose an optimal method for calculating the mixing proportion of blood and appendage in different circumstances and delivery conditions. Then, by introducing the idea of transportation appendage into the traditional Capacitated Vehicle Routing Problem (CVRP), this new problem is formulated according to the factors of distance and weight. Algorithmically, we use a combination of a decomposition-based multi-objective evolutionary algorithm and a local search method to perform a series of experiments on the CVRP public dataset. By comparing our technique with traditional ones, we show that our algorithm obtains better optimization results and time performance. PMID:27163361

  9. Multi-Objective Algorithm for Blood Supply via Unmanned Aerial Vehicles to the Wounded in an Emergency Situation.

    PubMed

    Wen, Tingxi; Zhang, Zhongnan; Wong, Kelvin K L

    2016-01-01

    Unmanned aerial vehicles (UAVs) have been widely used in many industries. In the medical environment, and especially in emergency situations, UAVs play an important role, for example in the rapid and efficient supply of medicines and blood. In this paper, we study the problem of multi-objective blood supply by UAVs in such emergency situations. This is a complex problem that includes maintaining a model of the supplied blood's temperature during transportation, scheduling the UAVs and planning their routes when multiple sites request blood, and respecting limited carrying capacity. Most importantly, we need to study the blood's temperature change due to the external environment, the heating agent (or refrigerant) and the time factor during transportation, and propose an optimal method for calculating the mixing proportion of blood and appendage in different circumstances and delivery conditions. Then, by introducing the idea of transportation appendage into the traditional Capacitated Vehicle Routing Problem (CVRP), this new problem is formulated according to the factors of distance and weight. Algorithmically, we use a combination of a decomposition-based multi-objective evolutionary algorithm and a local search method to perform a series of experiments on the CVRP public dataset. By comparing our technique with traditional ones, we show that our algorithm obtains better optimization results and time performance.
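
    To illustrate the decomposition idea behind the algorithm, the snippet below evaluates the weighted Tchebycheff scalarization that decomposition-based multi-objective evolutionary algorithms typically optimize for each weight vector; the objective values are synthetic, and the local-search and CVRP encoding details are not reproduced.

    ```python
    import numpy as np

    def tchebycheff(objectives, weights, ideal):
        """Weighted Tchebycheff scalarization used in decomposition-based MOEAs:
        smaller values are better for minimization problems."""
        f = np.asarray(objectives, dtype=float)
        return np.max(weights * np.abs(f - ideal), axis=-1)

    # Hypothetical candidate routes scored on (total distance, blood-temperature deviation).
    candidates = np.array([[120.0, 1.8],
                           [135.0, 0.9],
                           [150.0, 0.4]])
    ideal_point = candidates.min(axis=0)              # best value observed per objective
    weight_vector = np.array([0.5, 0.5])              # one subproblem of the decomposition
    best = candidates[np.argmin(tchebycheff(candidates, weight_vector, ideal_point))]
    ```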

  10. Family members of older persons with multi-morbidity and their experiences of case managers in Sweden: an interpretive phenomenological approach.

    PubMed

    Hjelm, Markus; Holmgren, Ann-Charlotte; Willman, Ania; Bohman, Doris; Holst, Göran

    2015-01-01

    Family members of older persons (75+) with multi-morbidity are likely to benefit from utilising case management services performed by case managers. However, research has not yet explored their experiences of case managers. The aim of the study was to deepen the understanding of the importance of case managers to family members of older persons (75+) with multi-morbidity. The study design was based on an interpretive phenomenological approach. Data were collected through individual interviews with 16 family members in Sweden. The interviews were analysed by means of an interpretive phenomenological approach. The findings revealed one overarching theme: "Helps to fulfil my unmet needs", based on three sub-themes: (1) "Helps me feel secure - Experiencing a trusting relationship", (2) "Confirms and strengthens me - Challenging my sense of being alone" and (3) "Being my personal guide - Increasing my competence". The findings indicate that case managers were able to fulfil unmet needs of family members. The latter recognised the importance of case managers providing them with professional services tailored to their individual needs. The findings can contribute to the improvement of case management models not only for older persons but also for their family members.

  11. New approach to information fusion for Lipschitz classifiers ensembles: Application in multi-channel C-OTDR-monitoring systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timofeev, Andrey V.; Egorov, Dmitry V.

    This paper presents new results concerning the selection of an optimal information fusion formula for an ensemble of Lipschitz classifiers. The goal of information fusion is to create an integral classifier which could provide better generalization ability of the ensemble while achieving a practically acceptable level of effectiveness. The problem of information fusion is very relevant for data processing in multi-channel C-OTDR-monitoring systems. In this case we have to effectively classify targeted events which appear in the vicinity of the monitored object. Solution of this problem is based on the use of an ensemble of Lipschitz classifiers, each of which corresponds to a respective channel. We suggest a brand new method for information fusion in the case of an ensemble of Lipschitz classifiers. This method is called "The Weighing of Inversely as Lipschitz Constants" (WILC). Results of practical usage of the WILC method in multichannel C-OTDR monitoring systems are presented.
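
    The fusion rule's name suggests channel weights inversely proportional to each classifier's Lipschitz constant; the sketch below implements that reading of the scheme on synthetic per-channel scores, and both the constants and the decision scores are invented for illustration (the original WILC derivation is not reproduced here).

    ```python
    import numpy as np

    def wilc_fuse(channel_scores, lipschitz_constants):
        """Fuse per-channel decision scores with weights inversely proportional to
        the Lipschitz constants of the corresponding channel classifiers."""
        L = np.asarray(lipschitz_constants, dtype=float)
        w = (1.0 / L) / np.sum(1.0 / L)          # smoother classifiers get more weight
        return w @ np.asarray(channel_scores, dtype=float)

    # Three C-OTDR channels scoring two candidate event classes (hypothetical values).
    scores = np.array([[0.7, 0.3],
                       [0.4, 0.6],
                       [0.8, 0.2]])
    L = np.array([2.0, 10.0, 4.0])
    fused = wilc_fuse(scores, L)
    decision = int(np.argmax(fused))
    ```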

  12. A scale space feature based registration technique for fusion of satellite imagery

    NASA Technical Reports Server (NTRS)

    Raghavan, Srini; Cromp, Robert F.; Campbell, William C.

    1997-01-01

    Feature based registration is one of the most reliable methods to register multi-sensor images (both active and passive imagery), since features are often more reliable than intensity or radiometric values. The only situation where a feature based approach will fail is when the scene is completely homogeneous or densely textural, in which case a combination of feature and intensity based methods may yield better results. In this paper, we present some preliminary results of testing our scale space feature based registration technique, a modified version of a feature based method developed earlier for classification of multi-sensor imagery. The proposed approach removes the sensitivity in parameter selection experienced in the earlier version, as explained later.

  13. Medicaid medical directors quality improvement studies: a case study of evolving methods for a research network.

    PubMed

    Fairbrother, Gerry; Trudnak, Tara; Griffith, Katherine

    2014-01-01

    To describe the evolution of methods and share lessons learned from conducting multi-state studies with Medicaid Medical Directors (MMD) using state administrative data. There was a great need for these studies, but also much to be learned about conducting network-based research and ensuring comparability of results. This was a network-level case study. The findings were drawn from the experience developing and executing network analyses with the MMDs, as well as from participant feedback on lessons learned. For the latter, nine interviews with MMD project leads, state data analysts, and outside researchers involved with the projects were conducted. Interviews were transcribed, coded and analyzed using NVivo 10.0 analytic software. MMD study methodology involved many steps: developing research questions, defining data specifications, organizing an aggregated data collection spreadsheet form, assuring quality through review, and analyzing and reporting state data at the national level. State analysts extracted the data from their state Medicaid administrative (claims) databases (and sometimes other datasets). Analysis at the national level aggregated state data overall, by demographics and other sub groups, and displayed descriptive statistics and cross-tabs. Projects in the MMD multi-state network address high-priority clinical issues in Medicaid and impact quality of care through sharing of data and policies among states. Further, these studies contribute not only to high-quality, cost-effective health care for Medicaid beneficiaries, but also add to our knowledge of network-based research. Continuation of these studies requires funding for a permanent research infrastructure nationally, as well as at the state-level to strengthen capacity.

  14. Autonomous Motion Learning for Intra-Vehicular Activity Space Robot

    NASA Astrophysics Data System (ADS)

    Watanabe, Yutaka; Yairi, Takehisa; Machida, Kazuo

    Space robots will be needed in future space missions. So far, many types of space robots have been developed, but in particular, Intra-Vehicular Activity (IVA) space robots that support human activities should be developed to reduce human risk in space. In this paper, we study a motion learning method for an IVA space robot with a multi-link mechanism. The advantage is that this space robot moves using the reaction forces of the multi-link mechanism and contact forces from the walls, as an astronaut does when space walking, rather than using propulsion. The control approach is based on reinforcement learning with the actor-critic algorithm. We demonstrate the effectiveness of this approach using a 5-link space robot model in simulation. First, we simulate the space robot learning motion control, including the contact phase, in the two-dimensional case. Next, we simulate the space robot learning motion control with a changing base attitude in the three-dimensional case.
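
    For readers unfamiliar with the actor-critic formulation, here is a minimal one-step actor-critic update on a tiny discrete toy problem; it only illustrates the TD-error-driven actor and critic updates named above and does not model the 5-link robot dynamics, which would require a physics simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_states, n_actions = 5, 2
    V = np.zeros(n_states)                       # critic: state-value estimates
    prefs = np.zeros((n_states, n_actions))      # actor: action preferences (softmax policy)
    alpha_v, alpha_pi, gamma = 0.1, 0.05, 0.95

    def policy(s):
        p = np.exp(prefs[s] - prefs[s].max())
        return p / p.sum()

    def env_step(s, a):
        """Toy chain environment: action 1 moves right, action 0 moves left;
        reaching the last state gives reward 1 and ends the episode."""
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        done = s2 == n_states - 1
        return s2, (1.0 if done else 0.0), done

    for episode in range(500):
        s, done = 0, False
        while not done:
            p = policy(s)
            a = rng.choice(n_actions, p=p)
            s2, r, done = env_step(s, a)
            td_error = r + (0.0 if done else gamma * V[s2]) - V[s]
            V[s] += alpha_v * td_error                     # critic update
            grad = -p
            grad[a] += 1.0                                 # d log pi(a|s) / d prefs[s]
            prefs[s] += alpha_pi * td_error * grad         # actor update
            s = s2
    ```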

  15. Case study method and problem-based learning: utilizing the pedagogical model of progressive complexity in nursing education.

    PubMed

    McMahon, Michelle A; Christopher, Kimberly A

    2011-08-19

    As the complexity of health care delivery continues to increase, educators are challenged to determine educational best practices to prepare BSN students for the ambiguous clinical practice setting. Integrative, active, and student-centered curricular methods are encouraged to foster student ability to use clinical judgment for problem solving and informed clinical decision making. The proposed pedagogical model of progressive complexity in nursing education suggests gradually introducing students to complex and multi-contextual clinical scenarios through the utilization of case studies and problem-based learning activities, with the intention to transition nursing students into autonomous learners and well-prepared practitioners at the culmination of a nursing program. Exemplar curricular activities are suggested to potentiate student development of a transferable problem solving skill set and a flexible knowledge base to better prepare students for practice in future novel clinical experiences, which is a mutual goal for both educators and students.

  16. A measured approach to sustainability and other multi-criteria assessment

    EPA Science Inventory

    From determining what to have for lunch to deciding where to invest our resources, we, as individuals and societies, are constantly involved in multi-criteria assessment. Using sustainability assessment as a case study, in this talk I will demonstrate the ubiquity of multi-crite...

  17. Importance of multi-modal approaches to effectively identify cataract cases from electronic health records

    PubMed Central

    Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B

    2012-01-01

    Objective There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. Materials and methods We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. Results An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. Discussion A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. Conclusion We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries. PMID:22319176

  18. Hybrid algorithms for fuzzy reverse supply chain network design.

    PubMed

    Che, Z H; Chiang, Tzu-An; Kuo, Y C; Cui, Zhihua

    2014-01-01

    In consideration of capacity constraints, fuzzy defect ratio, and fuzzy transport loss ratio, this paper attempts to establish an optimized decision model for production planning and distribution of a multi-phase, multi-product reverse supply chain, which addresses defects returned to original manufacturers, and, in addition, develops hybrid algorithms such as Particle Swarm Optimization-Genetic Algorithm (PSO-GA), Genetic Algorithm-Simulated Annealing (GA-SA), and Particle Swarm Optimization-Simulated Annealing (PSO-SA) for solving the optimized model. Through a case study of a multi-phase, multi-product reverse supply chain network, this paper demonstrates the suitability of the optimized decision model and the applicability of the algorithms. Finally, the hybrid algorithms show excellent solving capability when compared with the original GA and PSO methods.

  19. Hybrid Algorithms for Fuzzy Reverse Supply Chain Network Design

    PubMed Central

    Che, Z. H.; Chiang, Tzu-An; Kuo, Y. C.

    2014-01-01

    In consideration of capacity constraints, fuzzy defect ratio, and fuzzy transport loss ratio, this paper establishes an optimized decision model for production planning and distribution in a multi-phase, multi-product reverse supply chain that handles defects returned to original manufacturers, and develops hybrid algorithms, namely Particle Swarm Optimization-Genetic Algorithm (PSO-GA), Genetic Algorithm-Simulated Annealing (GA-SA), and Particle Swarm Optimization-Simulated Annealing (PSO-SA), for solving the optimized model. Through a case study of a multi-phase, multi-product reverse supply chain network, the paper demonstrates the suitability of the optimized decision model and the applicability of the algorithms. Finally, the hybrid algorithms show excellent solving capability compared with the original GA and PSO methods. PMID:24892057

  20. Re-use of pilot data and interim analysis of pivotal data in MRMC studies: a simulation study

    NASA Astrophysics Data System (ADS)

    Chen, Weijie; Samuelson, Frank; Sahiner, Berkman; Petrick, Nicholas

    2017-03-01

    Novel medical imaging devices are often evaluated with multi-reader multi-case (MRMC) studies in which radiologists read images of patient cases for a specified clinical task (e.g., cancer detection). A pilot study is often used to measure the effect size and variance parameters that are necessary for sizing a pivotal study (including sizing readers, non-diseased and diseased cases). Due to the practical difficulty of collecting patient cases or recruiting clinical readers, some investigators attempt to include the pilot data as part of their pivotal study. In other situations, some investigators attempt to perform an interim analysis of their pivotal study data based upon which the sample sizes may be re-estimated. Re-use of the pilot data or interim analyses of the pivotal data may inflate the type I error of the pivotal study. In this work, we use the Roe and Metz model to simulate MRMC data under the null hypothesis (i.e., two devices have equal diagnostic performance) and investigate the type I error rate for several practical designs involving re-use of pilot data or interim analysis of pivotal data. Our preliminary simulation results indicate that, under the simulation conditions we investigated, the inflation of type I error is none or only marginal for some design strategies (e.g., re-use of patient data without re-using readers, and size re-estimation without using the effect-size estimated in the interim analysis). Upon further verifications, these are potentially useful design methods in that they may help make a study less burdensome and have a better chance to succeed without substantial loss of the statistical rigor.
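    The sketch below is a deliberately simplified null-hypothesis simulation, not the Roe and Metz model used in the paper: it draws per-reader AUCs for two modalities that share a common reader effect, applies a paired t-test, and reports the fraction of false rejections as an empirical type I error rate. The variance components, reader count, and test statistic are illustrative assumptions.

```python
# Simplified null-hypothesis simulation for a reader study. This is NOT the
# Roe and Metz MRMC model; it only illustrates how an empirical type I error
# rate is estimated by repeated simulation and counting of false rejections.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_readers, n_trials, alpha = 8, 5000, 0.05
sigma_reader, sigma_noise = 0.03, 0.02   # assumed variance components

rejections = 0
for _ in range(n_trials):
    # Under the null, both modalities share each reader's true performance.
    reader_effect = rng.normal(0.0, sigma_reader, n_readers)
    auc_a = 0.85 + reader_effect + rng.normal(0.0, sigma_noise, n_readers)
    auc_b = 0.85 + reader_effect + rng.normal(0.0, sigma_noise, n_readers)
    # Paired t-test on per-reader AUC differences.
    _, p_value = stats.ttest_rel(auc_a, auc_b)
    rejections += p_value < alpha

print("empirical type I error:", rejections / n_trials)  # should be near 0.05
```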

  1. Multi-objective decision-making model based on CBM for an aircraft fleet

    NASA Astrophysics Data System (ADS)

    Luo, Bin; Lin, Lin

    2018-04-01

    Modern production management patterns, in which multiple units (e.g., a fleet of aircraft) are managed in a holistic manner, have brought new challenges for multi-unit maintenance decision making. To schedule a good maintenance plan, not only does the maintenance of each individual machine have to be considered, but the maintenance of the other individuals also has to be taken into account. Most condition-based maintenance research for aircraft has focused solely on reducing maintenance cost or on maximizing the availability of a single aircraft, and little work has addressed both objectives, namely minimizing cost and maximizing the availability of a fleet (the total number of available aircraft in the fleet); a multi-objective decision-making model based on condition-based maintenance that addresses both objectives is therefore established. Furthermore, considering that the decision maker may prefer the final optimal result in the form of discrete intervals instead of a set of points (non-dominated solutions) in real decision-making problems, a novel multi-objective optimization method based on support vector regression is proposed to solve the above multi-objective decision-making model. Finally, a case study regarding a fleet is conducted, with the results showing that the approach efficiently generates outcomes that meet the schedule requirements.

  2. Automated Test Case Generator for Phishing Prevention Using Generative Grammars and Discriminative Methods

    ERIC Educational Resources Information Center

    Palka, Sean

    2015-01-01

    This research details a methodology designed for creating content in support of various phishing prevention tasks including live exercises and detection algorithm research. Our system uses probabilistic context-free grammars (PCFG) and variable interpolation as part of a multi-pass method to create diverse and consistent phishing email content on…

  3. Prospective multi-center study of an automatic online seizure detection system for epilepsy monitoring units.

    PubMed

    Fürbass, F; Ossenblok, P; Hartmann, M; Perko, H; Skupch, A M; Lindinger, G; Elezi, L; Pataraia, E; Colon, A J; Baumgartner, C; Kluge, T

    2015-06-01

    A method for automatic detection of epileptic seizures in long-term scalp-EEG recordings called EpiScan will be presented. EpiScan is used as an alarm device to notify medical staff of epilepsy monitoring units (EMUs) in case of a seizure. A prospective multi-center study was performed in three EMUs including 205 patients. A comparison between EpiScan and the Persyst seizure detector on the prospective data will be presented. In addition, the detection results of EpiScan on retrospective EEG data of 310 patients and the publicly available CHB-MIT dataset will be shown. A detection sensitivity of 81% was reached for unequivocal electrographic seizures with a false alarm rate of only 7 per day. No statistically significant differences in the detection sensitivities could be found between the centers. The comparison with the Persyst seizure detector showed a lower false alarm rate for EpiScan, but the difference was not statistically significant. The automatic seizure detection method EpiScan showed high sensitivity and a low false alarm rate in a prospective multi-center study on a large number of patients. The application as a seizure alarm device in EMUs becomes feasible and will raise the efficiency of video-EEG monitoring and the safety of patients. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  4. Selection of Malaysia School Youth Cadet Corps leader by using analytical hierarchy process: A case study at SMK Ahmad Boestamam

    NASA Astrophysics Data System (ADS)

    Mohamed, Nurul Huda; Ahmat, Norhayati; Mohamed, Nurul Akmal; Razmi, Syazwani Che; Mohamed, Nurul Farihan

    2017-05-01

    This research is a case study to identify the best criteria that a person should have as the leader of the Malaysia School Youth Cadet Corps (Kadet Remaja Sekolah (KRS)) at SMK Ahmad Boestamam, Sitiawan, in order to select the most appropriate person to hold the position. The approach used in this study is the Analytical Hierarchy Process (AHP), which includes pairwise comparisons of both the criteria and the candidates. There are four criteria, namely charisma, interpersonal communication, personality and physical. Four candidates (1, 2, 3 and 4) are considered in this study. Purposive sampling and questionnaires are used as instruments to obtain the data, which are then analyzed using the AHP method. The final output indicates that Candidate 1 has the highest score, followed by Candidate 2, Candidate 4 and Candidate 3. This shows that the method is very helpful in multi-criteria decision making when several options are available.
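    A minimal numerical sketch of the AHP priority calculation follows: a hypothetical 4x4 pairwise comparison matrix over the four criteria is reduced to a priority vector via its principal eigenvector, and Saaty's consistency ratio is checked. The judgments in the matrix are illustrative, not the questionnaire data from the study.

```python
# Minimal AHP sketch with a hypothetical 4x4 pairwise comparison matrix for the
# four criteria (charisma, communication, personality, physical). The judgments
# below are illustrative, not the questionnaire data from the study.
import numpy as np

A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # criteria priority vector

# Saaty's consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI
n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)
ri = 0.90                                        # Saaty's random index for n = 4
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
```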

  5. A comparison of two multi-variable integrator windup protection schemes

    NASA Technical Reports Server (NTRS)

    Mattern, Duane

    1993-01-01

    Two methods are examined for limit and integrator wind-up protection for multi-input, multi-output linear controllers subject to actuator constraints. The methods begin with an existing linear controller that satisfies the specifications for the nominal, small perturbation, linear model of the plant. The controllers are formulated to include an additional contribution to the state derivative calculations. The first method to be examined is the multi-variable version of the single-input, single-output, high gain, Conventional Anti-Windup (CAW) scheme. Except for the actuator limits, the CAW scheme is linear. The second scheme to be examined, denoted the Modified Anti-Windup (MAW) scheme, uses a scalar to modify the magnitude of the controller output vector while maintaining the vector direction. The calculation of the scalar modifier is a nonlinear function of the controller outputs and the actuator limits. In both cases the constrained actuator is tracked. These two integrator windup protection methods are demonstrated on a turbofan engine control system with five measurements, four control variables, and four actuators. The closed-loop responses of the two schemes are compared and contrasted during limit operation. The issue of maintaining the direction of the controller output vector using the Modified Anti-Windup scheme is discussed and the advantages and disadvantages of both of the IWP methods are presented.
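    The direction-preserving idea behind the Modified Anti-Windup scheme can be sketched as a single scalar that shrinks the whole controller output vector until every actuator respects its limits, as below. This is an illustration of the scaling concept only; it omits the integrator-state feedback of the actual CAW/MAW schemes and assumes actuator limits that bracket zero.

```python
# Sketch of direction-preserving command limiting in the spirit of the MAW
# scheme: one scalar shrinks the whole controller output vector so that every
# actuator stays within its limits. This illustrates the idea only, not the
# paper's exact anti-windup law or its integrator feedback terms.
import numpy as np

def scale_to_limits(u, u_min, u_max):
    """Return alpha*u with the largest alpha in (0, 1] keeping u inside limits."""
    alpha = 1.0
    for ui, lo, hi in zip(u, u_min, u_max):
        if ui > hi:
            alpha = min(alpha, hi / ui)
        elif ui < lo:
            alpha = min(alpha, lo / ui)
    return alpha * np.asarray(u), alpha

u_cmd = np.array([2.0, -0.5, 4.0, 1.0])          # unconstrained controller output
u_lim, alpha = scale_to_limits(u_cmd, [-1, -1, -1, -1], [1, 1, 1, 1])
print(u_lim, alpha)   # direction preserved, largest channel sits on its limit
```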

  6. Multi-agent robotic systems and applications for satellite missions

    NASA Astrophysics Data System (ADS)

    Nunes, Miguel A.

    A revolution in the space sector is happening. It is expected that in the next decade there will be more satellites launched than in the previous sixty years of space exploration. Major challenges are associated with this growth of space assets, such as the autonomy and management of large groups of satellites, in particular small satellites. There are two main objectives for this work. First, a flexible and distributed software architecture is presented to expand the possibilities of spacecraft autonomy and in particular autonomous motion in attitude and position. The approach taken is based on the concept of distributed software agents, also referred to as a multi-agent robotic system. Agents are defined as software programs that are social, reactive and proactive to autonomously maximize the chances of achieving the set goals. Part of the work is to demonstrate that a multi-agent robotic system is a feasible approach for different problems of autonomy such as satellite attitude determination and control and autonomous rendezvous and docking. The second main objective is to develop a method to optimize multi-satellite configurations in space, also known as satellite constellations. This automated method generates new optimal mega-constellation designs for Earth observations and fast revisit times over large ground areas. The optimal satellite constellation can be used by researchers as the baseline for new missions. The first contribution of this work is the development of a new multi-agent robotic system for distributing the attitude determination and control subsystem for HiakaSat. The multi-agent robotic system is implemented and tested on the satellite hardware-in-the-loop testbed that simulates a representative space environment. The results show that the newly proposed system for this particular case achieves an equivalent control performance when compared to the monolithic implementation. In terms of computational efficiency, it is found that the multi-agent robotic system has a consistently lower CPU load of 0.29 +/- 0.03, compared to 0.35 +/- 0.04 for the monolithic implementation, a 17.1% reduction. The second contribution of this work is the development of a multi-agent robotic system for the autonomous rendezvous and docking of multiple spacecraft. To compute the maneuvers, guidance, navigation and control algorithms are implemented as part of the multi-agent robotic system. The navigation and control functions are implemented using existing algorithms, but one important contribution of this section is the introduction of a new six-degrees-of-freedom guidance method, which is part of the guidance, navigation and control architecture. This new method is an explicit solution to the guidance problem, and is particularly useful for real-time guidance for attitude and position, as opposed to typical guidance methods, which are based on numerical solutions and are therefore computationally intensive. A simulation scenario is run for docking four CubeSats deployed radially from a launch vehicle. Considering fully actuated CubeSats, the simulations show docking maneuvers that are successfully completed within 25 minutes, which is approximately 30% of a full orbital period in low Earth orbit. The final section investigates the problem of optimization of satellite constellations for fast revisit time, and introduces a new method to generate different constellation configurations that are evaluated with a genetic algorithm. Two case studies are presented. The first is the optimization of a constellation for rapid coverage of the oceans of the globe in 24 hours or less. Results show that, for an 80 km sensor swath width, 50 satellites are required to cover the oceans with a 24-hour revisit time. The second constellation configuration study focuses on optimization for rapid coverage of the North Atlantic Tracks for air traffic monitoring in 3 hours or less. The results show that, for a fixed swath width of 160 km and a 3-hour revisit time, 52 satellites are required.

  7. Magnetic MIMO Signal Processing and Optimization for Wireless Power Transfer

    NASA Astrophysics Data System (ADS)

    Yang, Gang; Moghadam, Mohammad R. Vedady; Zhang, Rui

    2017-06-01

    In magnetic resonant coupling (MRC) enabled multiple-input multiple-output (MIMO) wireless power transfer (WPT) systems, multiple transmitters (TXs) each with one single coil are used to enhance the efficiency of simultaneous power transfer to multiple single-coil receivers (RXs) by constructively combining their induced magnetic fields at the RXs, a technique termed "magnetic beamforming". In this paper, we study the optimal magnetic beamforming design in a multi-user MIMO MRC-WPT system. We introduce the multi-user power region that constitutes all the achievable power tuples for all RXs, subject to the given total power constraint over all TXs as well as their individual peak voltage and current constraints. We characterize each boundary point of the power region by maximizing the sum-power deliverable to all RXs subject to their minimum harvested power constraints. For the special case without the TX peak voltage and current constraints, we derive the optimal TX current allocation for the single-RX setup in closed-form as well as that for the multi-RX setup. In general, the problem is a non-convex quadratically constrained quadratic programming (QCQP), which is difficult to solve. For the case of one single RX, we show that the semidefinite relaxation (SDR) of the problem is tight. For the general case with multiple RXs, based on SDR we obtain two approximate solutions by applying time-sharing and randomization, respectively. Moreover, for practical implementation of magnetic beamforming, we propose a novel signal processing method to estimate the magnetic MIMO channel due to the mutual inductances between TXs and RXs. Numerical results show that our proposed magnetic channel estimation and adaptive beamforming schemes are practically effective, and can significantly improve the power transfer efficiency and multi-user performance trade-off in MIMO MRC-WPT systems.
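    A generic semidefinite relaxation (SDR) sketch for the single-RX power maximization is given below: the rank-one current matrix is relaxed to a PSD variable, the relaxed problem is solved with cvxpy (assumed available), and a transmit current vector is recovered from the dominant eigenvector. The coupling matrix is a random Hermitian stand-in, not the paper's mutual-inductance model; for this single-constraint case the relaxation is tight, consistent with the single-RX result noted above.

```python
# Generic semidefinite relaxation (SDR) sketch for a single-RX power
# maximization: maximize i^H A i subject to ||i||^2 <= P, relaxed to a PSD
# matrix variable X = i i^H. The coupling matrix A below is a random stand-in
# for the mutual-inductance model in the paper, and cvxpy is assumed available.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, P = 4, 1.0                                   # number of TX coils, power budget
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = M.conj().T @ M                              # Hermitian PSD coupling surrogate

X = cp.Variable((n, n), hermitian=True)
prob = cp.Problem(cp.Maximize(cp.real(cp.trace(A @ X))),
                  [X >> 0, cp.real(cp.trace(X)) <= P])
prob.solve(solver=cp.SCS)

# Rank-1 extraction: take the dominant eigenvector as the TX current vector.
w, V = np.linalg.eigh(X.value)
i_tx = np.sqrt(w[-1]) * V[:, -1]
print("delivered-power objective:", np.real(i_tx.conj() @ A @ i_tx))
```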

  8. Paradise nearly Gained. Volume 2: Case Studies of Impact and Diversity for Frontline Management Initiative Practice.

    ERIC Educational Resources Information Center

    Barratt-Pugh, Llandis; Soutar, Geoffrey N.

    This document presents the case studies from a multi-phase study of the impact of Australia's Frontline Management Initiative (FMI), which provides a framework for competency-based development of frontline managers in Australian enterprises. Nineteen organizational case studies and one individual case study of the FMI's impacts are included. The…

  9. Diagnosis of neglected tropical diseases among patients with persistent digestive disorders (diarrhoea and/or abdominal pain ≥14 days): a multi-country, prospective, non-experimental case-control study.

    PubMed

    Polman, Katja; Becker, Sören L; Alirol, Emilie; Bhatta, Nisha K; Bhattarai, Narayan R; Bottieau, Emmanuel; Bratschi, Martin W; Burza, Sakib; Coulibaly, Jean T; Doumbia, Mama N; Horié, Ninon S; Jacobs, Jan; Khanal, Basudha; Landouré, Aly; Mahendradhata, Yodi; Meheus, Filip; Mertens, Pascal; Meyanti, Fransiska; Murhandarwati, Elsa H; N'Goran, Eliézer K; Peeling, Rosanna W; Ravinetto, Raffaella; Rijal, Suman; Sacko, Moussa; Saye, Rénion; Schneeberger, Pierre H H; Schurmans, Céline; Silué, Kigbafori D; Thobari, Jarir A; Traoré, Mamadou S; van Lieshout, Lisette; van Loen, Harry; Verdonck, Kristien; von Müller, Lutz; Yansouni, Cédric P; Yao, Joel A; Yao, Patrick K; Yap, Peiling; Boelaert, Marleen; Chappuis, François; Utzinger, Jürg

    2015-08-18

    Diarrhoea still accounts for considerable mortality and morbidity worldwide. The highest burden is concentrated in tropical areas where populations lack access to clean water, adequate sanitation and hygiene. In contrast to acute diarrhoea (<14 days), the spectrum of pathogens that may give rise to persistent diarrhoea (≥14 days) and persistent abdominal pain is poorly understood. It is conceivable that pathogens causing neglected tropical diseases play a major role, but few studies investigated this issue. Clinical management and diagnostic work-up of persistent digestive disorders in the tropics therefore remain inadequate. Hence, important aspects regarding the pathogenesis, epidemiology, clinical symptomatology and treatment options for patients presenting with persistent diarrhoea and persistent abdominal pain should be investigated in multi-centric clinical studies. This multi-country, prospective, non-experimental case-control study will assess persistent diarrhoea (≥14 days; in individuals aged ≥1 year) and persistent abdominal pain (≥14 days; in children/adolescents aged 1-18 years) in up to 2000 symptomatic patients and 2000 matched controls. Subjects from Côte d'Ivoire, Indonesia, Mali and Nepal will be clinically examined and interviewed using a detailed case report form. Additionally, each participant will provide a stool sample that will be examined using a suite of diagnostic methods (i.e., microscopic techniques, rapid diagnostic tests, stool culture and polymerase chain reaction) for the presence of bacterial and parasitic pathogens. Treatment will be offered to all infected participants and the clinical treatment response will be recorded. Data obtained will be utilised to develop patient-centred clinical algorithms that will be validated in primary health care centres in the four study countries in subsequent studies. Our research will deepen the understanding of the importance of persistent diarrhoea and related digestive disorders in the tropics. A diversity of intestinal pathogens will be assessed for potential associations with persistent diarrhoea and persistent abdominal pain. Different diagnostic methods will be compared, clinical symptoms investigated and diagnosis-treatment algorithms developed for validation in selected primary health care centres. The findings from this study will improve differential diagnosis and evidence-based clinical management of digestive syndromes in the tropics. ClinicalTrials.gov; identifier: NCT02105714 .

  10. Sustainability assessment of alternative end-uses for disused areas based on multi-criteria decision-making method.

    PubMed

    De Feo, Giovanni; De Gisi, Sabino; De Vita, Sabato; Notarnicola, Michele

    2018-08-01

    The main aim of this study was to define and apply a multidisciplinary, multi-criteria approach to sustainability for evaluating alternative end-uses of disused areas. Taking into account the three pillars of sustainability (the social, economic and environmental dimensions) as well as the need for stakeholders to have new practical instruments, the innovative approach consists of four modules: (i) sociological, (ii) economic, (iii) environmental and (iv) multi-criteria assessment. By means of a case study on a small Municipality in Southern Italy, three end-use alternatives, representing three essential services for citizens, were selected: a municipal gym; a market area; and a Municipal Solid Waste (MSW) separate collection centre. The sociological module was used to select the most socially sound alternative by means of a consultative referendum, simulated with a structured questionnaire administered to a sample of the population. The economic evaluation was conducted by defining the bill of quantities with regard to six main items (soil handling, landfill disposal tax, public services, structure and services, completion work, equipment and furnishings). The environmental evaluation was performed by applying the Delphi method with local technicians, who carried out a qualitative-quantitative evaluation of the three alternatives with regard to eight possible environmental impacts (landscape impact, soil handling, odour, traffic, noise, atmospheric pollution, wastewater, waste). Finally, Simple Additive Weighting was used as the multi-criteria technique to rank the alternatives. The results showed that multi-criteria analysis is a useful decision-support tool able to identify, transparently and efficiently, the most sustainable solutions to a complex social problem. Copyright © 2018 Elsevier B.V. All rights reserved.
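    A minimal Simple Additive Weighting (SAW) sketch is shown below for the three end-use alternatives; the criterion weights and normalized scores are hypothetical placeholders rather than the values elicited in the case study.

```python
# Minimal Simple Additive Weighting (SAW) sketch. The scores and weights are
# hypothetical placeholders, not the values elicited in the case study.
import numpy as np

alternatives = ["Municipal gym", "Market area", "MSW collection centre"]
criteria_weights = np.array([0.4, 0.3, 0.3])        # social, economic, environmental

# Rows: alternatives; columns: normalized benefit scores per criterion in [0, 1].
scores = np.array([[0.8, 0.5, 0.6],
                   [0.6, 0.7, 0.5],
                   [0.4, 0.6, 0.9]])

saw_scores = scores @ criteria_weights
ranking = sorted(zip(alternatives, saw_scores), key=lambda t: -t[1])
for name, s in ranking:
    print(f"{name}: {s:.2f}")
```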

  11. A tuning algorithm for model predictive controllers based on genetic algorithms and fuzzy decision making.

    PubMed

    van der Lee, J H; Svrcek, W Y; Young, B R

    2008-01-01

    Model Predictive Control (MPC) is a valuable tool for the process control engineer in a wide variety of applications. Because of this, the structure of an MPC can vary dramatically from application to application. There have been a number of works dedicated to MPC tuning for specific cases. Since MPCs can differ significantly, these tuning methods often become inapplicable, and a trial-and-error tuning approach must be used instead. This can be quite time-consuming and can result in non-optimum tuning. In an attempt to resolve this, a generalized automated tuning algorithm for MPCs was developed. This approach is numerically based and combines a genetic algorithm with multi-objective fuzzy decision-making. The key advantages of this approach are that genetic algorithms are not problem-specific and only need to be adapted to account for the number and ranges of tuning parameters for a given MPC. In addition, multi-objective fuzzy decision-making can handle qualitative statements of what optimum control is and can use multiple inputs to determine the tuning parameters that best match the desired results. This is particularly useful for multi-input, multi-output (MIMO) cases, where the definition of "optimum" control is subject to the opinion of the control engineer tuning the system. A case study is presented to illustrate the use of the tuning algorithm, including how different definitions of "optimum" control can arise and how they are accounted for in the multi-objective decision-making algorithm. The resulting tuning parameters from each definition set are compared, showing that the parameters vary to meet each definition of optimum control and thus that the generalized automated tuning approach for MPCs is feasible.

  12. Site selection for managed aquifer recharge using fuzzy rules: integrating geographical information system (GIS) tools and multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Malekmohammadi, Bahram; Ramezani Mehrian, Majid; Jafari, Hamid Reza

    2012-11-01

    One of the most important water-resources management strategies for arid lands is managed aquifer recharge (MAR). In establishing a MAR scheme, site selection is the prime prerequisite that can be assisted by geographic information system (GIS) tools. One of the most important uncertainties in the site-selection process using GIS is finite ranges or intervals resulting from data classification. In order to reduce these uncertainties, a novel method has been developed involving the integration of multi-criteria decision making (MCDM), GIS, and a fuzzy inference system (FIS). The Shemil-Ashkara plain in the Hormozgan Province of Iran was selected as the case study; slope, geology, groundwater depth, potential for runoff, land use, and groundwater electrical conductivity have been considered as site-selection factors. By defining fuzzy membership functions for the input layers and the output layer, and by constructing fuzzy rules, a FIS has been developed. Comparison of the results produced by the proposed method and the traditional simple additive weighted (SAW) method shows that the proposed method yields more precise results. In conclusion, fuzzy-set theory can be an effective method to overcome associated uncertainties in classification of geographic information data.

  13. Deep multi-scale location-aware 3D convolutional neural networks for automated detection of lacunes of presumed vascular origin.

    PubMed

    Ghafoorian, Mohsen; Karssemeijer, Nico; Heskes, Tom; Bergkamp, Mayra; Wissink, Joost; Obels, Jiri; Keizer, Karlijn; de Leeuw, Frank-Erik; Ginneken, Bram van; Marchiori, Elena; Platel, Bram

    2017-01-01

    Lacunes of presumed vascular origin (lacunes) are associated with an increased risk of stroke, gait impairment, and dementia and are a primary imaging feature of small vessel disease. Quantification of lacunes may be of great importance to elucidate the mechanisms behind neuro-degenerative disorders and is recommended as part of study standards for small vessel disease research. However, due to the different appearance of lacunes in various brain regions and the existence of other similar-looking structures, such as perivascular spaces, manual annotation is a difficult, laborious and subjective task, which can potentially be greatly improved by reliable and consistent computer-aided detection (CAD) routines. In this paper, we propose an automated two-stage method using deep convolutional neural networks (CNN). We show that this method has good performance and can considerably benefit readers. We first use a fully convolutional neural network to detect initial candidates. In the second step, we employ a 3D CNN as a false positive reduction tool. As the location information is important to the analysis of candidate structures, we further equip the network with contextual information using multi-scale analysis and integration of explicit location features. We trained, validated and tested our networks on a large dataset of 1075 cases obtained from two different studies. Subsequently, we conducted an observer study with four trained observers and compared our method with them using a free-response operating characteristic analysis. Evaluated on a test set of 111 cases, the resulting CAD system exhibits performance similar to the trained human observers and achieves a sensitivity of 0.974 with 0.13 false positives per slice. A feasibility study also showed that a trained human observer would considerably benefit once aided by the CAD system.

  14. Detection of 22 common leukemic fusion genes using a single-step multiplex qRT-PCR-based assay.

    PubMed

    Lyu, Xiaodong; Wang, Xianwei; Zhang, Lina; Chen, Zhenzhu; Zhao, Yu; Hu, Jieying; Fan, Ruihua; Song, Yongping

    2017-07-25

    Fusion genes generated from chromosomal translocation play an important role in hematological malignancies. Detection of fusion genes currently employs either conventional RT-PCR methods or fluorescence in situ hybridization (FISH); both methods involve tedious methodologies and require prior characterization of chromosomal translocation events as determined by cytogenetic analysis. In this study, we describe a real-time quantitative reverse transcription PCR (qRT-PCR)-based multi-fusion gene screening method with the capacity to detect 22 fusion genes commonly found in leukemia. This method does not require pre-characterization of gene translocation events, thereby facilitating immediate diagnosis and therapeutic management. We performed fluorescent qRT-PCR (F-qRT-PCR) using a commercially available multi-fusion gene detection kit on a patient cohort of 345 individuals comprising 108 cases diagnosed with acute myeloid leukemia (AML) for initial evaluation; the remaining patients within the cohort were assayed for confirmatory diagnosis. Results obtained by F-qRT-PCR were compared with patient analysis by cytogenetic characterization. Gene translocations were detected by F-qRT-PCR in 69.4% of AML cases, comparable to the 68.5% diagnosed by cytogenetic analysis, thereby demonstrating 99.1% concordance. Overall, gene fusion was detected in 53.7% of the patient population by F-qRT-PCR, in 52.9% by cytogenetic prediction in leukemia, and in 9.1% of non-leukemia patients by both methods. The overall concordance rate was calculated to be 99.0%. Fusion genes were detected by F-qRT-PCR in 97.3% of patients with CML, followed by 69.4% with AML, 33.3% with acute lymphoblastic leukemia (ALL), 9.1% with myelodysplastic syndromes (MDS), and 0% with chronic lymphocytic leukemia (CLL). We describe the use of an F-qRT-PCR-based multi-fusion gene screening method as an efficient one-step diagnostic procedure and an effective alternative to lengthy conventional diagnostic procedures requiring cytogenetic analysis followed by targeted quantitative reverse transcription (qRT-PCR) methods, thus allowing timely patient management.

  15. Chondrocyte Deformations as a Function of Tibiofemoral Joint Loading Predicted by a Generalized High-Throughput Pipeline of Multi-Scale Simulations

    PubMed Central

    Sibole, Scott C.; Erdemir, Ahmet

    2012-01-01

    Cells of the musculoskeletal system are known to respond to mechanical loading, and chondrocytes within the cartilage are no exception. However, understanding how joint-level loads relate to cell-level deformations, e.g. in the cartilage, is not a straightforward task. In this study, a multi-scale analysis pipeline was implemented to post-process the results of a macro-scale finite element (FE) tibiofemoral joint model to provide joint-mechanics-based displacement boundary conditions to micro-scale cellular FE models of the cartilage, for the purpose of characterizing chondrocyte deformations in relation to tibiofemoral joint loading. It was possible to identify the load distribution within the knee among its tissue structures and ultimately within the cartilage among its extracellular matrix, pericellular environment and resident chondrocytes. Various cellular deformation metrics (aspect ratio change, volumetric strain, cellular effective strain and maximum shear strain) were calculated. To illustrate further utility of this multi-scale modeling pipeline, two micro-scale cartilage constructs were considered: an idealized single cell at the centroid of a 100×100×100 μm block commonly used in past research studies, and an anatomically based representation of the middle zone of tibiofemoral cartilage (an 11-cell model of the same volume). In both cases, chondrocytes experienced amplified deformations compared to those at the macro-scale, predicted by simulating one body weight compressive loading on the tibiofemoral joint. In the 11-cell case, all cells experienced less deformation than in the single-cell case, and also exhibited a larger cell-to-cell variance in deformation within the same block. The coupling method proved to be highly scalable due to micro-scale model independence that allowed for exploitation of distributed memory computing architecture. The method’s generalized nature also allows for substitution of any macro-scale and/or micro-scale model, providing application to other multi-scale continuum mechanics problems. PMID:22649535

  16. Integration of DNA sample collection into a multi-site birth defects case-control study.

    PubMed

    Rasmussen, Sonja A; Lammer, Edward J; Shaw, Gary M; Finnell, Richard H; McGehee, Robert E; Gallagher, Margaret; Romitti, Paul A; Murray, Jeffrey C

    2002-10-01

    Advances in quantitative analysis and molecular genotyping have provided unprecedented opportunities to add biological sampling and genetic information to epidemiologic studies. The purpose of this article is to describe the incorporation of DNA sample collection into the National Birth Defects Prevention Study (NBDPS), an ongoing case-control study in an eight-state consortium with a primary goal to identify risk factors for birth defects. Babies with birth defects are identified through birth defects surveillance systems in the eight participating centers. Cases are infants with one or more of over 30 major birth defects. Controls are infants without defects from the same geographic area. Epidemiologic information is collected through an hour-long interview with mothers of both cases and controls. We added the collection of buccal cytobrush DNA samples for case-infants, control-infants, and their parents to this study. We describe here the methods by which the samples have been collected and processed, establishment of a centralized resource for DNA banking, and quality control, database management, access, informed consent, and confidentiality issues. Biological sampling and genetic analyses are important components to epidemiologic studies of birth defects aimed at identifying risk factors. The DNA specimens collected in this study can be used for detection of mutations, study of polymorphic variants that confer differential susceptibility to teratogens, and examination of interactions among genetic risk factors. Information on the methods used and issues faced by the NBDPS may be of value to others considering the addition of DNA sampling to epidemiologic studies.

  17. Assessment of multiple geophysical techniques for the characterization of municipal waste deposit sites

    NASA Astrophysics Data System (ADS)

    Gaël, Dumont; Tanguy, Robert; Nicolas, Marck; Frédéric, Nguyen

    2017-10-01

    In this study, we tested the ability of geophysical methods to characterize a large technical landfill installed in a former sand quarry. The geophysical surveys specifically aimed at delimiting the horizontal extent of the deposit site, estimating its thickness, and characterizing the waste material composition (the moisture content in the present case). The site delimitation was conducted with electromagnetic (in-phase and out-of-phase) and magnetic (vertical gradient and total field) methods that clearly showed the transition between the waste deposit and the host formation. Regarding waste deposit thickness evaluation, electrical resistivity tomography (ERT) appeared inefficient on this particularly thick deposit site. Thus, we propose a combination of horizontal to vertical noise spectral ratio (HVNSR) and multichannel analysis of surface waves (MASW), which successfully determined the approximate waste deposit thickness in our test landfill. However, ERT appeared to be an appropriate tool to characterize the moisture content of the waste, which is key prior information for the organic waste biodegradation process. The overall multi-scale, multi-method geophysical survey offers valuable information for site rehabilitation studies, water content mitigation processes for enhanced biodegradation, or landfill mining operation planning.

  18. A multi-level approach for investigating socio-economic and agricultural risk factors associated with rates of reported cases of Escherichia coli O157 in humans in Alberta, Canada.

    PubMed

    Pearl, D L; Louie, M; Chui, L; Doré, K; Grimsrud, K M; Martin, S W; Michel, P; Svenson, L W; McEwen, S A

    2009-10-01

    Using negative binomial and multi-level Poisson models, the authors determined the statistical significance of agricultural and socio-economic risk factors for rates of reported disease associated with Escherichia coli O157 in census subdivisions (CSDs) in Alberta, Canada, 2000-2002. Variables relating to population stability, aboriginal composition of the CSDs, and the economic relationship between CSDs and urban centres were significant risk factors. The percentage of individuals living in low-income households was not a statistically significant risk factor for rates of disease. The statistical significance of cattle density, recorded at a higher geographical level, depended on the method used to correct for overdispersion, the number of levels included in the multi-level models, and the choice of using all reported cases or only sporadic cases. Our results highlight the importance of local socio-economic risk factors in determining rates of disease associated with E. coli O157, but their relationship with individual risk factors requires further evaluation.

  19. Fish intake, cooking practices, and risk of prostate cancer: results from a multi-ethnic case-control study.

    PubMed

    Joshi, Amit D; John, Esther M; Koo, Jocelyn; Ingles, Sue A; Stern, Mariana C

    2012-03-01

    Studies conducted to assess the association between fish consumption and prostate cancer (PCA) risk are inconclusive. However, few studies have distinguished between fatty and lean fish, and no studies have considered the role of different cooking practices, which may lead to differential accumulation of chemical carcinogens. In this study, we investigated the association between fish intake and localized and advanced PCA taking into account fish types (lean vs. fatty) and cooking practices. We analyzed data for 1,096 controls, 717 localized and 1,140 advanced cases from the California Collaborative Prostate Cancer Study, a multiethnic, population-based case-control study. We used multivariate conditional logistic regression to estimate odds ratios using nutrient density converted variables of fried fish, tuna, dark fish and white fish consumption. We tested for effect modification by cooking methods (high- vs. low-temperature methods) and levels of doneness. We observed that high white fish intake was associated with increased risk of advanced PCA among men who cooked with high-temperature methods (pan-frying, oven-broiling and grilling) until fish was well done (p (trend) = 0.001). No associations were found among men who cooked fish at low temperature and/or just until done (white fish x cooking method p (interaction) = 0.040). Our results indicate that consideration of fish type (oily vs. lean), specific fish cooking practices and levels of doneness of cooked fish helps elucidate the association between fish intake and PCA risk and suggest that avoiding high-temperature cooking methods for white fish may lower PCA risk.

  20. A unified inversion scheme to process multifrequency measurements of various dispersive electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Han, Y.; Misra, S.

    2018-04-01

    Multi-frequency measurement of a dispersive electromagnetic (EM) property, such as electrical conductivity, dielectric permittivity, or magnetic permeability, is commonly analyzed for purposes of material characterization. Such an analysis requires inversion of the multi-frequency measurement based on a specific relaxation model, such as the Cole-Cole model or Pelton's model. We develop a unified inversion scheme that can be coupled to various types of relaxation models to independently process multi-frequency measurements of varied EM properties for improved EM-based geomaterial characterization. The proposed inversion scheme is first tested on a few synthetic cases in which different relaxation models are coupled into the inversion scheme, and it is then applied to multi-frequency complex conductivity, complex resistivity, complex permittivity, and complex impedance measurements. The method estimates up to seven relaxation-model parameters, exhibiting convergence and accuracy for random initializations of the relaxation-model parameters varying by up to three orders of magnitude around the true parameter values. The proposed inversion method implements a bounded Levenberg algorithm with tuned initial values of the damping parameter and its iterative adjustment factor, which are fixed in all the cases shown in this paper, irrespective of the type of measured EM property and the type of relaxation model. Notably, a jump-out step and a jump-back-in step are implemented as automated methods in the inversion scheme to prevent the inversion from getting trapped in local minima and to honor the physical bounds of the model parameters. The proposed inversion scheme can be easily used to process various types of EM measurements without major changes to the inversion scheme.
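    As a hedged illustration of this kind of multi-frequency inversion, the sketch below fits a Cole-Cole relaxation model to synthetic complex permittivity data using scipy's bounded least_squares (a trust-region reflective solver), which stands in for the authors' bounded Levenberg scheme with jump-out/jump-back-in steps. The frequencies, true parameters, and bounds are assumed values.

```python
# Sketch of inverting multi-frequency complex permittivity data with a
# Cole-Cole relaxation model. scipy's bounded least_squares (trust-region
# reflective) stands in for the authors' bounded Levenberg scheme, and the
# "measured" data are synthetic.
import numpy as np
from scipy.optimize import least_squares

def cole_cole(omega, eps_inf, d_eps, tau, alpha):
    return eps_inf + d_eps / (1.0 + (1j * omega * tau) ** (1.0 - alpha))

omega = 2 * np.pi * np.logspace(1, 6, 40)            # 10 Hz .. 1 MHz
true = (5.0, 75.0, 1e-4, 0.15)                       # eps_inf, d_eps, tau, alpha
rng = np.random.default_rng(2)
data = cole_cole(omega, *true) * (1 + 0.01 * rng.normal(size=omega.size))

def residuals(p):
    model = cole_cole(omega, *p)
    return np.concatenate([(model - data).real, (model - data).imag])

x0 = (1.0, 10.0, 1e-3, 0.5)                          # poor initial guess
fit = least_squares(residuals, x0,
                    bounds=([1.0, 0.0, 1e-7, 0.0], [100.0, 200.0, 1.0, 1.0]))
print(np.round(fit.x, 5))                            # should approach `true`
```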

  1. Using Ensemble Decisions and Active Selection to Improve Low-Cost Labeling for Multi-View Data

    NASA Technical Reports Server (NTRS)

    Rebbapragada, Umaa; Wagstaff, Kiri L.

    2011-01-01

    This paper seeks to improve low-cost labeling in terms of training set reliability (the fraction of correctly labeled training items) and test set performance for multi-view learning methods. Co-training is a popular multiview learning method that combines high-confidence example selection with low-cost (self) labeling. However, co-training with certain base learning algorithms significantly reduces training set reliability, causing an associated drop in prediction accuracy. We propose the use of ensemble labeling to improve reliability in such cases. We also discuss and show promising results on combining low-cost ensemble labeling with active (low-confidence) example selection. We unify these example selection and labeling strategies under collaborative learning, a family of techniques for multi-view learning that we are developing for distributed, sensor-network environments.
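    The reliability idea behind ensemble labeling can be sketched as follows: several classifiers trained on different resamples of the labeled pool (a stand-in for different views) vote on each unlabeled item, confident majority labels are accepted as self-labels, and the rest are routed to active (low-confidence) labeling. This toy uses scikit-learn and is not the paper's collaborative learning framework.

```python
# Toy sketch of ensemble self-labeling: several classifiers trained on different
# bootstrap samples (a stand-in for different "views") only label an unlabeled
# item when most of them agree, which is the reliability idea behind ensemble
# labeling. This illustrates the concept, not the paper's collaborative framework.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_lab, y_lab, X_unlab = X[:100], y[:100], X[100:]

rng = np.random.default_rng(0)
ensemble = []
for _ in range(7):                                   # small bagged ensemble
    idx = rng.integers(0, len(X_lab), len(X_lab))
    ensemble.append(DecisionTreeClassifier(max_depth=3).fit(X_lab[idx], y_lab[idx]))

votes = np.stack([clf.predict(X_unlab) for clf in ensemble])
majority = (votes.mean(axis=0) > 0.5).astype(int)
agreement = np.abs(votes.mean(axis=0) - 0.5) * 2     # 1.0 = unanimous

# Accept self-labels when at least six of seven members agree; queue the rest
# for active (low-confidence) labeling.
confident = agreement >= 5 / 7
print("self-labeled:", confident.sum(), "sent to active labeling:", (~confident).sum())
```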

  2. Multi-Unmanned Aerial Vehicle (UAV) Cooperative Fault Detection Employing Differential Global Positioning (DGPS), Inertial and Vision Sensors.

    PubMed

    Heredia, Guillermo; Caballero, Fernando; Maza, Iván; Merino, Luis; Viguria, Antidio; Ollero, Aníbal

    2009-01-01

    This paper presents a method to increase the reliability of Unmanned Aerial Vehicle (UAV) sensor Fault Detection and Identification (FDI) in a multi-UAV context. Differential Global Positioning System (DGPS) and inertial sensors are used for sensor FDI in each UAV. The method uses additional position estimations that augment the individual UAV FDI system. These additional estimations are obtained using images of the same planar scene taken from two different UAVs. Since the accuracy and noise level of the estimation depend on several factors, dynamic replanning of the multi-UAV team can be used to obtain a better estimation in the case of faults caused by slow-growing errors in absolute position estimation that cannot be detected using local FDI in the UAVs. Experimental results with data from two real UAVs are also presented.

  3. Numerical Investigations of Two Typical Unsteady Flows in Turbomachinery Using the Multi-Passage Model

    NASA Astrophysics Data System (ADS)

    Zhou, Di; Lu, Zhiliang; Guo, Tongqing; Shen, Ennan

    2016-06-01

    In this paper, two types of unsteady flow problems in turbomachinery, blade flutter and rotor-stator interaction, are studied by means of numerical simulation. For the former, the energy method is often used to predict the aeroelastic stability by calculating the aerodynamic work per vibration cycle. The inter-blade phase angle (IBPA) is an important parameter in the computation and may have significant effects on aeroelastic behavior. For the latter, the numbers of blades in each row are usually not equal and the unsteady rotor-stator interactions can be strong. An effective way to perform multi-row calculations is the domain scaling method (DSM). These two cases share a common feature: given their respective characteristics, the computational domain has to be extended to multiple passages (MP). The present work is aimed at modeling these two issues with the developed MP model. A computational fluid dynamics (CFD) technique is applied to solve the unsteady Reynolds-averaged Navier-Stokes (RANS) equations and simulate the flow fields. With the parallel technique, the additional time cost of modeling more passages can be largely reduced. Results are presented for two test cases, a vibrating rotor blade and a turbine stage.

  4. Confidence-based ensemble for GBM brain tumor segmentation

    NASA Astrophysics Data System (ADS)

    Huo, Jing; van Rikxoort, Eva M.; Okada, Kazunori; Kim, Hyun J.; Pope, Whitney; Goldin, Jonathan; Brown, Matthew

    2011-03-01

    It is a challenging task to automatically segment glioblastoma multiforme (GBM) brain tumors on T1w post-contrast isotropic MR images. A semi-automated system using fuzzy connectedness has recently been developed for computing the tumor volume, which reduces the cost of manual annotation. In this study, we propose an ensemble method that combines multiple segmentation results into a final ensemble segmentation. The method is evaluated on a dataset of 20 cases from a multi-center pharmaceutical drug trial and compared to the fuzzy connectedness method. Three individual methods were used in the framework: fuzzy connectedness, GrowCut, and voxel classification. The combination method is a confidence map averaging (CMA) method. The CMA method shows an improved ROC curve compared to the fuzzy connectedness method (p < 0.001). The CMA ensemble result is more robust compared to the three individual methods.
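    The confidence map averaging step itself is simple to sketch: per-voxel confidence maps from the individual segmenters are averaged and thresholded to form the ensemble mask. The arrays below are synthetic placeholders for the fuzzy-connectedness, GrowCut, and voxel-classification outputs, and the 0.5 threshold is an assumption (in practice it would be tuned, e.g., along the ROC curve).

```python
# Sketch of confidence map averaging (CMA): per-voxel confidence maps from
# several segmentation methods are averaged and thresholded to give the
# ensemble mask. The three maps below are synthetic placeholders for the
# fuzzy-connectedness, GrowCut, and voxel-classification outputs.
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64, 24)                                  # toy MR volume
conf_fuzzy = rng.random(shape)
conf_growcut = rng.random(shape)
conf_voxelcls = rng.random(shape)

ensemble_conf = np.mean([conf_fuzzy, conf_growcut, conf_voxelcls], axis=0)
ensemble_mask = ensemble_conf >= 0.5                  # threshold can be tuned via ROC

print("tumor voxels in ensemble mask:", int(ensemble_mask.sum()))
```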

  5. [Multi-central controlled study on three-part massage therapy for treatment of insomnia of deficiency of both the heart and spleen].

    PubMed

    Zhou, Yun-feng; Wei, Yu-long; Zhang, Pu-lin; Gao, Shan; Ning, Guo-li; Zhang, Zhen-qiang; Hu, Bin; Wang, Dan-yi; Yan, Mei-rong; Liu, Wen-jun

    2006-06-01

    To make a multi-central clinical evaluation of three-part massage therapy for treatment of insomnia of deficiency of both the heart and spleen. One hundred and sixty-six cases were randomly divided into a test group (n = 84) and a control group (n = 82). Multi-central, randomized and controlled methods were adopted. The test group was treated with the three-part massage therapy, i.e., acupoints at the head, abdomen and back were massaged once each day; the control group was treated by oral administration of Guipi Pills, 8 pills each time, thrice daily. The treatment was given for 15 consecutive days and then the therapeutic effects were observed. Sixty-seven cases were cured, 11 markedly effective, 3 effective, and 3 ineffective in the test group; the corresponding figures were 10, 21, 29 and 22 in the control group, with a very significant difference between the two groups (P < 0.001). The test group was superior to the control group in improvement of the Pittsburgh Sleep Quality Index (PSQI), Sleepless Anxiety Scale (SAS) and Sleepless Depression Scale (SDS) scores (P < 0.001). The three-part massage therapy has a definite therapeutic effect on insomnia of deficiency of both the heart and spleen and is safe.

  6. Fuzzy Multi-Objective Transportation Planning with Modified S-Curve Membership Function

    NASA Astrophysics Data System (ADS)

    Peidro, D.; Vasant, P.

    2009-08-01

    In this paper, the S-Curve membership function methodology is used in a transportation planning decision (TPD) problem. An interactive method for solving multi-objective TPD problems with fuzzy goals, available supply and forecast demand is developed. The proposed method attempts simultaneously to minimize the total production and transportation costs and the total delivery time with reference to budget constraints and available supply, machine capacities at each source, as well as forecast demand and warehouse space constraints at each destination. We compare in an industrial case the performance of S-curve membership functions, representing uncertainty goals and constraints in TPD problems, with linear membership functions.
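    A generic logistic S-curve membership function is sketched below for a fuzzy goal such as "total cost should be low"; the exact modified S-curve parameterization and vagueness parameter used by the authors may differ, so the gamma value and cost thresholds here are assumptions.

```python
# A generic S-shaped (logistic) membership function for a fuzzy goal such as
# "total cost should be low". The exact modified S-curve parameterization used
# in the paper may differ; this sketch only illustrates how the degree of
# satisfaction decays smoothly between a fully acceptable and a fully
# unacceptable value of the objective.
import numpy as np

def s_curve_membership(x, x_accept, x_reject, gamma=13.8):
    """Membership ~1 below x_accept, ~0 above x_reject, S-shaped in between."""
    t = (np.asarray(x, dtype=float) - x_accept) / (x_reject - x_accept)
    mu = 1.0 / (1.0 + np.exp(gamma * (t - 0.5)))
    return np.clip(mu, 0.001, 0.999)        # avoid the exact 0/1 endpoints

costs = np.array([90_000, 120_000, 150_000, 180_000])
print(np.round(s_curve_membership(costs, x_accept=100_000, x_reject=170_000), 3))
```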

  7. Research of Simple Multi-Attribute Rating Technique for Decision Support

    NASA Astrophysics Data System (ADS)

    Siregar, Dodi; Arisandi, Diki; Usman, Ari; Irwan, Dedy; Rahim, Robbi

    2017-12-01

    One of the roles of a decision support system is to assist the decision maker in selecting the appropriate alternative against the desired criteria; one method the decision maker can apply is the SMART method for multi-criteria decision making. In this multi-criteria decision-making theory, every alternative is described by criteria, each of which has a value and a weight, and the authors use this approach to facilitate decision making in an illustrative case. The problems discussed in this paper are classified as multi-objective (multiple goals to be accomplished) and multi-criteria (many decisive criteria in reaching such decisions).
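    A minimal SMART sketch follows: criterion weights are normalized, each alternative receives a utility in [0, 1] per criterion, and alternatives are ranked by the weighted sum. The criteria, weights, and utilities are hypothetical, not data from the paper.

```python
# Minimal SMART sketch: criterion weights are normalized, each alternative gets
# a utility score in [0, 1] per criterion, and alternatives are ranked by the
# weighted sum. Weights and utilities are hypothetical, not data from the paper.
import numpy as np

criteria = ["cost", "quality", "delivery time"]
raw_weights = np.array([40.0, 35.0, 25.0])
weights = raw_weights / raw_weights.sum()            # SMART normalization

utilities = np.array([[0.7, 0.8, 0.6],               # alternative A
                      [0.9, 0.5, 0.7],               # alternative B
                      [0.6, 0.9, 0.8]])              # alternative C

scores = utilities @ weights
best = ["A", "B", "C"][int(np.argmax(scores))]
print(np.round(scores, 3), "-> best alternative:", best)
```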

  8. Effects of heterogeneous traffic with speed limit zone on the car accidents

    NASA Astrophysics Data System (ADS)

    Marzoug, R.; Lakouari, N.; Bentaleb, K.; Ez-Zahraouy, H.; Benyoussef, A.

    2016-06-01

    Using the extended Nagel-Schreckenberg (NS) model, we numerically study the impact of traffic heterogeneity with a speed limit zone (SLZ) on the probability of occurrence of car accidents (Pac). An SLZ in heterogeneous traffic has an important effect, particularly in the mixed-velocity case. In the deterministic case, the SLZ leads to the appearance of car accidents even at low densities; in this region, Pac increases with the fraction of fast vehicles (Ff). In the non-deterministic case, the SLZ reduces the effect of the braking probability Pb at low densities. Furthermore, the impact of multiple SLZs on the probability Pac is also studied. In contrast with the homogeneous case [X. Li, H. Kuang, Y. Fan and G. Zhang, Int. J. Mod. Phys. C 25 (2014) 1450036], it is found that at low densities the probability Pac without an SLZ (n = 0) is lower than Pac with multiple SLZs (n > 0). However, the existence of multiple SLZs on the road decreases the risk of collision in the congestion phase.
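    The underlying traffic dynamics can be sketched with a compact Nagel-Schreckenberg update that mixes two vehicle classes and imposes a lower maximum speed inside a speed limit zone, as below. The accident (dangerous situation) criterion used to compute Pac in the paper is not reproduced here; the road length, densities, and SLZ location are assumed values.

```python
# Compact Nagel-Schreckenberg (NS) cellular automaton with two vehicle classes
# (fast and slow) and a speed limit zone (SLZ) where the allowed maximum speed
# is reduced. The car-accident probability criterion used in the paper is not
# reproduced here; this sketch only shows the underlying traffic dynamics.
import numpy as np

L, N, steps, p_brake = 200, 40, 500, 0.2
slz = range(80, 120)                                  # cells forming the SLZ
v_slz = 2                                             # speed limit inside the SLZ
rng = np.random.default_rng(0)

pos = np.sort(rng.choice(L, N, replace=False))
vmax = rng.choice([3, 5], N, p=[0.5, 0.5])            # slow / fast vehicles
vel = np.zeros(N, dtype=int)

for _ in range(steps):
    order = np.argsort(pos)
    pos, vel, vmax = pos[order], vel[order], vmax[order]
    gaps = (np.roll(pos, -1) - pos - 1) % L           # empty cells ahead (periodic road)
    limit = np.where(np.isin(pos, slz), np.minimum(vmax, v_slz), vmax)
    vel = np.minimum(vel + 1, limit)                  # acceleration
    vel = np.minimum(vel, gaps)                       # deceleration to avoid overlap
    brake = rng.random(N) < p_brake
    vel = np.maximum(vel - brake, 0)                  # random braking
    pos = (pos + vel) % L                             # movement

print("mean speed after relaxation:", vel.mean())
```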

  9. The effects of particulate air pollution on daily deaths: a multi-city case crossover analysis

    PubMed Central

    Schwartz, J

    2004-01-01

    Background: Numerous studies have reported that day-to-day changes in particulate air pollution are associated with day-to-day changes in deaths. Recently, several reports have indicated that the software used to control for season and weather in some of these studies had deficiencies. Aims: To investigate the use of the case-crossover design as an alternative. Methods: This approach compares the exposure of each case to their exposure on a nearby day, when they did not die. Hence it controls for seasonal patterns and for all slowly varying covariates (age, smoking, etc) by matching rather than complex modelling. A key feature is that temperature can also be controlled by matching. This approach was applied to a study of 14 US cities. Weather and day of the week were controlled for in the regression. Results: A 10 µg/m3 increase in PM10 was associated with a 0.36% increase in daily deaths from internal causes (95% CI 0.22% to 0.50%). Results were little changed if, instead of symmetrical sampling of control days the time stratified method was applied, when control days were matched on temperature, or when more lags of winter time temperatures were used. Similar results were found using a Poisson regression, but the case-crossover method has the advantage of simplicity in modelling, and of combining matched strata across multiple locations in a single stage analysis. Conclusions: Despite the considerable differences in analytical design, the previously reported associations of particles with mortality persisted in this study. The association appeared quite linear. Case-crossover designs represent an attractive method to control for season and weather by matching. PMID:15550600

  10. Three-Dimensional Surface Parameters and Multi-Fractal Spectrum of Corroded Steel

    PubMed Central

    Shanhua, Xu; Songbo, Ren; Youde, Wang

    2015-01-01

    To study multi-fractal behavior of corroded steel surface, a range of fractal surfaces of corroded surfaces of Q235 steel were constructed by using the Weierstrass-Mandelbrot method under a high total accuracy. The multi-fractal spectrum of fractal surface of corroded steel was calculated to study the multi-fractal characteristics of the W-M corroded surface. Based on the shape feature of the multi-fractal spectrum of corroded steel surface, the least squares method was applied to the quadratic fitting of the multi-fractal spectrum of corroded surface. The fitting function was quantitatively analyzed to simplify the calculation of multi-fractal characteristics of corroded surface. The results showed that the multi-fractal spectrum of corroded surface was fitted well with the method using quadratic curve fitting, and the evolution rules and trends were forecasted accurately. The findings can be applied to research on the mechanisms of corroded surface formation of steel and provide a new approach for the establishment of corrosion damage constitutive models of steel. PMID:26121468
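    For orientation, a one-dimensional Weierstrass-Mandelbrot profile is sketched below; the paper constructs two-dimensional fractal surfaces, but the same superposition of scaled cosines is easiest to show in 1D, and the fractal dimension, frequency ratio, and number of terms used here are assumptions.

```python
# One-dimensional Weierstrass-Mandelbrot (W-M) profile as a brevity-oriented
# stand-in for the corroded-surface construction: the paper builds 2D fractal
# surfaces, but the same superposition of scaled cosines is easiest to show in 1D.
import numpy as np

def wm_profile(x, D=1.5, gamma=1.5, n_min=0, n_max=40, seed=0):
    """W(x) = sum_n gamma**((D-2)*n) * cos(2*pi*gamma**n * x + phi_n), 1 < D < 2."""
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0, 2 * np.pi, n_max - n_min + 1)
    w = np.zeros_like(x, dtype=float)
    for n, phi in zip(range(n_min, n_max + 1), phases):
        w += gamma ** ((D - 2) * n) * np.cos(2 * np.pi * gamma ** n * x + phi)
    return w

x = np.linspace(0.0, 1.0, 2000)
profile = wm_profile(x, D=1.7)          # rougher profile as D approaches 2
print(profile.min(), profile.max())
```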

  11. Three-Dimensional Surface Parameters and Multi-Fractal Spectrum of Corroded Steel.

    PubMed

    Shanhua, Xu; Songbo, Ren; Youde, Wang

    2015-01-01

    To study multi-fractal behavior of corroded steel surface, a range of fractal surfaces of corroded surfaces of Q235 steel were constructed by using the Weierstrass-Mandelbrot method under a high total accuracy. The multi-fractal spectrum of fractal surface of corroded steel was calculated to study the multi-fractal characteristics of the W-M corroded surface. Based on the shape feature of the multi-fractal spectrum of corroded steel surface, the least squares method was applied to the quadratic fitting of the multi-fractal spectrum of corroded surface. The fitting function was quantitatively analyzed to simplify the calculation of multi-fractal characteristics of corroded surface. The results showed that the multi-fractal spectrum of corroded surface was fitted well with the method using quadratic curve fitting, and the evolution rules and trends were forecasted accurately. The findings can be applied to research on the mechanisms of corroded surface formation of steel and provide a new approach for the establishment of corrosion damage constitutive models of steel.

  12. Multi-stakeholder perspectives in defining health-services quality in cataract care.

    PubMed

    Stolk-Vos, Aline C; van de Klundert, Joris J; Maijers, Niels; Zijlmans, Bart L M; Busschbach, Jan J V

    2017-08-01

    To develop a method to define a multi-stakeholder perspective on health-service quality that enables the expression of differences in systematically identified stakeholders' perspectives, and to pilot the approach for cataract care. Mixed-method study between 2014 and 2015. Cataract care in the Netherlands. Stakeholder representatives. We first identified and classified stakeholders using stakeholder theory. Participants established a multi-stakeholder perspective on the quality of cataract care using concept mapping, which yielded a cluster map based on multivariate statistical analyses. Consensus-based quality dimensions were subsequently defined in a plenary stakeholder session. Stakeholders and multi-stakeholder perspective on health-service quality. Our analysis identified seven definitive stakeholders, as follows: the Dutch Ophthalmology Society, ophthalmologists, general practitioners, optometrists, health insurers, hospitals and private clinics. Patients, as dependent stakeholders, were considered to lack power by other stakeholders; hence, they were not classified as definitive stakeholders. Overall, 18 stakeholders representing ophthalmologists, general practitioners, optometrists, health insurers, hospitals, private clinics, patients, patient federations and the Dutch Healthcare Institute sorted 125 systematically collected indicators into the following seven clusters: patient centeredness and accessibility, interpersonal conduct and expectations, experienced outcome, clinical outcome, process and structure, medical technical acting and safety. Importance scores from stakeholders directly involved in the cataract service delivery process correlated strongly, as did scores from stakeholders not directly involved in this process. Using a case study on cataract care, the proposed methods enable different views among stakeholders concerning quality dimensions to be systematically revealed, and the stakeholders jointly agreed on these dimensions. The methods helped to unify different quality definitions and facilitated operationalisation of quality measurement in a way that was accepted by relevant stakeholders. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  13. Identification of multivariable nonlinear systems in the presence of colored noises using iterative hierarchical least squares algorithm.

    PubMed

    Jafari, Masoumeh; Salimifard, Maryam; Dehghani, Maryam

    2014-07-01

    This paper presents an efficient method for the identification of nonlinear Multi-Input Multi-Output (MIMO) systems in the presence of colored noises. The method studies multivariable nonlinear Hammerstein and Wiener models, in which the nonlinear memory-less block is approximated using arbitrary vector-based basis functions. The linear time-invariant (LTI) block is modeled by an autoregressive moving average with exogenous inputs (ARMAX) model, which can effectively describe the moving average noises as well as the autoregressive and exogenous dynamics. Owing to the multivariable nature of the system, a pseudo-linear-in-the-parameters model is obtained that includes two different kinds of unknown parameters, a vector and a matrix. Therefore, the standard least squares algorithm cannot be applied directly. To overcome this problem, a Hierarchical Least Squares Iterative (HLSI) algorithm is used to simultaneously estimate the vector and the matrix of unknown parameters as well as the noises. The efficiency of the proposed identification approach is investigated through three nonlinear MIMO case studies. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
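    The hierarchical estimation idea can be illustrated with a stripped-down alternating least-squares loop for a bilinear-in-parameters Hammerstein-type model y_t = G Φ(u_t) c, in which a gain vector G and a basis-coefficient vector c are estimated in turn. This is a simplified sketch under synthetic data, not the paper's HLSI algorithm; the colored-noise/ARMAX terms are omitted and the basis Φ is a hypothetical polynomial choice.

```python
# Alternating (hierarchical) least squares for y_t = G * (Phi(u_t) @ c) + e_t.
# Simplified illustration only; all data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_out, n_basis, n_samp = 2, 3, 400

def phi(u):
    # Hypothetical polynomial basis for the static nonlinear block.
    return np.array([u, u**2, u**3])                        # shape (n_basis,)

# Synthetic "true" system, used only to generate identification data.
G_true = np.array([1.5, -0.7])                              # output gains of the linear block
c_true = np.array([1.0, -0.5, 0.2])                         # nonlinearity coefficients

U = rng.uniform(-1.0, 1.0, n_samp)
Y = np.array([G_true * (phi(u) @ c_true) for u in U])       # (n_samp, n_out)
Y += 0.01 * rng.normal(size=Y.shape)

c_hat = np.ones(n_basis)                                    # initial guess
for _ in range(20):
    # Step 1: with c fixed, the model is linear in G -> least squares for the gain vector.
    z = np.array([phi(u) @ c_hat for u in U])               # (n_samp,)
    g_hat = (Y.T @ z) / (z @ z)                             # (n_out,)
    # Step 2: with G fixed, the model is linear in c -> stacked least squares for c.
    M = np.vstack([np.outer(g_hat, phi(u)) for u in U])     # (n_samp*n_out, n_basis)
    c_hat, *_ = np.linalg.lstsq(M, Y.reshape(-1), rcond=None)
    c_hat = c_hat / c_hat[0]                                # fix the G/c scale ambiguity

z = np.array([phi(u) @ c_hat for u in U])
g_hat = (Y.T @ z) / (z @ z)
print("estimated c:", np.round(c_hat, 3))                   # should be close to c_true
print("estimated G:", np.round(g_hat, 3))                   # should be close to G_true
```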

  14. Characterisation of carbon nanotubes in the context of toxicity studies

    USGS Publications Warehouse

    Berhanu, D.; Dybowska, A.; Misra, S.K.; Stanley, C.J.; Ruenraroengsak, P.; Boccaccini, A.R.; Tetley, T.D.; Luoma, S.N.; Plant, J.A.; Valsami-Jones, E.

    2009-01-01

    Nanotechnology has the potential to revolutionise our futures, but has also prompted concerns about the possibility that nanomaterials may harm humans or the biosphere. The unique properties of nanoparticles, which give them novel size-dependent functionalities, may also have the potential to cause harm. Discrepancies in existing human health and environmental studies have shown the importance of good quality, well-characterised reference nanomaterials for toxicological studies. Here we make a case for the importance of detailed characterisation of nanoparticles using several methods, particularly to allow the recognition of impurities and the presence of chemically identical but structurally distinct phases. Methods to fully characterise commercially available multi-wall carbon nanotubes at different scales are presented. © 2009 Berhanu et al; licensee BioMed Central Ltd.

  15. C–IBI: Targeting cumulative coordination within an iterative protocol to derive coarse-grained models of (multi-component) complex fluids

    DOE PAGES

    de Oliveira, Tiago E.; Netz, Paulo A.; Kremer, Kurt; ...

    2016-05-03

    We present a coarse-graining strategy that we test for aqueous mixtures. The method uses pair-wise cumulative coordination as a target function within an iterative Boltzmann inversion (IBI) like protocol. We name this method coordination iterative Boltzmann inversion (C–IBI). While the underlying coarse-grained model is still structure based and, thus, preserves pair-wise solution structure, our method also reproduces solvation thermodynamics of binary and/or ternary mixtures. In addition, we observe much faster convergence within C–IBI compared to IBI. To validate the robustness, we apply C–IBI to test cases of solvation thermodynamics of aqueous urea and of triglycine in aqueous urea.
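    The core of an IBI-like protocol is a logarithmic potential update against a target function; for C–IBI the target is the cumulative coordination C(r) rather than g(r) itself. The sketch below shows one such update step under the assumed form V_{i+1}(r) = V_i(r) + kT ln(C_i(r)/C_target(r)); the RDFs, density, and temperature are placeholder values, not results from the paper.

```python
# One coordination-IBI (C-IBI) potential update step; g(r) arrays are placeholders
# that would normally come from coarse-grained and reference simulations.
import numpy as np

kT = 2.494          # kJ/mol at ~300 K
rho = 33.4          # number density (nm^-3), hypothetical
r = np.linspace(0.05, 1.2, 200)                     # nm

def cumulative_coordination(r, g, rho):
    """C(r) = 4*pi*rho * integral_0^r g(r') r'^2 dr' (trapezoidal rule)."""
    integrand = g * r**2
    return 4.0 * np.pi * rho * np.concatenate(
        ([0.0], np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r)))
    )

# Placeholder RDFs: reference (target) and current CG iteration.
g_target = 1.0 + 2.0 * np.exp(-((r - 0.30) / 0.05) ** 2)
g_current = 1.0 + 1.8 * np.exp(-((r - 0.32) / 0.06) ** 2)

C_target = cumulative_coordination(r, g_target, rho)
C_current = cumulative_coordination(r, g_current, rho)

V_current = np.zeros_like(r)                        # previous CG pair potential
eps = 1e-12                                         # avoid log(0) near r = 0
V_next = V_current + kT * np.log((C_current + eps) / (C_target + eps))
print("max potential correction (kJ/mol):", round(float(np.max(np.abs(V_next - V_current))), 3))
```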

  16. SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Z; Folkert, M; Wang, J

    2016-06-15

    Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. A Pareto solution set with many feasible solutions results from the multi-objective optimization. In this work, an optimal solution Selection methodology for Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto solution set. The proposed SMOLER method used the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules. The solution with the highest utility was chosen as the optimal solution. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: A total of 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00% and 80.00%, respectively. Conclusion: An optimal solution selection methodology for multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.
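    A much-simplified stand-in for the selection step is shown below: each Pareto-optimal (sensitivity, specificity) pair receives a utility from a pre-set weighting rule and the highest-utility solution is picked. This is only a weighted-sum illustration of rule-based utility scoring, not the evidential reasoning calculus of SMOLER; the solutions and weights are hypothetical.

```python
# Picking one solution from a Pareto set of (sensitivity, specificity) pairs
# by a rule-based utility score; data and weights are hypothetical.
import numpy as np

pareto = np.array([          # columns: sensitivity, specificity (hypothetical)
    [0.95, 0.60],
    [0.90, 0.72],
    [0.85, 0.80],
    [0.78, 0.88],
    [0.70, 0.93],
])

weights = np.array([0.5, 0.5])          # pre-set preference between the two objectives

# Utility of each candidate = weighted sum of min-max normalized objectives.
lo, hi = pareto.min(axis=0), pareto.max(axis=0)
normalized = (pareto - lo) / (hi - lo)
utility = normalized @ weights

best = int(np.argmax(utility))
print("selected solution:", pareto[best], "utility:", round(float(utility[best]), 3))
```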

  17. A case study of multi-seam coal mine entry stability analysis with strength reduction method

    PubMed Central

    Tulu, Ihsan Berk; Esterhuizen, Gabriel S; Klemetti, Ted; Murphy, Michael M.; Sumner, James; Sloan, Michael

    2017-01-01

    In this paper, the advantage of using numerical models with the strength reduction method (SRM) to evaluate entry stability in complex multiple-seam conditions is demonstrated. A coal mine under variable topography from the Central Appalachian region is used as a case study. At this mine, unexpected roof conditions were encountered during development below previously mined panels. Stress mapping and observation of ground conditions were used to quantify the success of entry support systems in three room-and-pillar panels. Numerical model analyses were initially conducted to estimate the stresses induced by the multiple-seam mining at the locations of the affected entries. The SRM was used to quantify the stability factor of the supported roof of the entries at selected locations. The SRM-calculated stability factors were compared with observations made during the site visits, and the results demonstrate that the SRM adequately identifies the unexpected roof conditions in this complex case. It is concluded that the SRM can be used to effectively evaluate the likely success of roof supports and the stability condition of entries in coal mines. PMID:28239503

  18. A case study of multi-seam coal mine entry stability analysis with strength reduction method.

    PubMed

    Tulu, Ihsan Berk; Esterhuizen, Gabriel S; Klemetti, Ted; Murphy, Michael M; Sumner, James; Sloan, Michael

    2016-03-01

    In this paper, the advantage of using numerical models with the strength reduction method (SRM) to evaluate entry stability in complex multiple-seam conditions is demonstrated. A coal mine under variable topography from the Central Appalachian region is used as a case study. At this mine, unexpected roof conditions were encountered during development below previously mined panels. Stress mapping and observation of ground conditions were used to quantify the success of entry support systems in three room-and-pillar panels. Numerical model analyses were initially conducted to estimate the stresses induced by the multiple-seam mining at the locations of the affected entries. The SRM was used to quantify the stability factor of the supported roof of the entries at selected locations. The SRM-calculated stability factors were compared with observations made during the site visits, and the results demonstrate that the SRM adequately identifies the unexpected roof conditions in this complex case. It is concluded that the SRM can be used to effectively evaluate the likely success of roof supports and the stability condition of entries in coal mines.
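    Conceptually, the SRM divides the material strength parameters by a trial factor until the numerical model can no longer reach equilibrium; the stability factor is the last factor at which it could. The sketch below illustrates that loop with a hypothetical run_model_is_stable stand-in for the actual numerical entry model; it is not the solver or the parameters used in the study.

```python
# Conceptual strength reduction method (SRM) loop: reduce cohesion c and friction
# angle phi by a factor F until the model fails; report the last stable F.
import math

def run_model_is_stable(cohesion_kpa: float, friction_deg: float) -> bool:
    # Placeholder: in practice this would build and solve the numerical entry model
    # with the reduced strengths and report whether it converged to equilibrium.
    # A fake strength threshold is used here purely to make the sketch executable.
    return cohesion_kpa * math.tan(math.radians(friction_deg)) > 250.0

def strength_reduction_factor(c0: float, phi0: float, f_max: float = 5.0, step: float = 0.05) -> float:
    f = 1.0
    while f <= f_max:
        c_red = c0 / f
        phi_red = math.degrees(math.atan(math.tan(math.radians(phi0)) / f))
        if not run_model_is_stable(c_red, phi_red):
            return f - step            # last factor at which the model was still stable
        f += step
    return f_max

print("SRM stability factor:", round(strength_reduction_factor(c0=1000.0, phi0=30.0), 2))
```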

  19. Point-of-care testing in the early diagnosis of acute pesticide intoxication: The example of paraquat.

    PubMed

    Wei, Ting-Yen; Yen, Tzung-Hai; Cheng, Chao-Min

    2018-01-01

    Acute pesticide intoxication is a common method of suicide globally. This article reviews current diagnostic methods and makes suggestions for future development. Paraquat intoxication, for example, is characterized by multi-organ failure, causing substantial mortality and morbidity. Early diagnosis may save the life of a paraquat intoxication patient. Conventional paraquat intoxication diagnostic methods, such as symptom review and urine sodium dithionite assay, are time-consuming and impractical in resource-scarce areas where most intoxication cases occur. Several experimental and clinical studies have shown the potential of portable Surface Enhanced Raman Scattering (SERS), paper-based devices, and machine learning for paraquat intoxication diagnosis. Portable SERS and new SERS substrates maintain the sensitivity of SERS while being less costly and more convenient than conventional SERS. Paper-based devices provide the advantages of price and portability. Machine learning algorithms can be implemented as a mobile phone application and facilitate diagnosis in resource-limited areas. Although these methods have not yet met all features of an ideal diagnostic method, the combination and development of these methods offer much promise.

  20. Multi-electrolyte-step anodic aluminum oxide method for the fabrication of self-organized nanochannel arrays

    PubMed Central

    2012-01-01

    Nanochannel arrays were fabricated by the self-organized multi-electrolyte-step anodic aluminum oxide [AAO] method in this study. In the first step of the multi-electrolyte-step AAO method, a phosphoric acid solution was used as the electrolyte with a high applied voltage; the phosphoric acid was then replaced by an oxalic acid solution as the electrolyte, with a low applied voltage. This method produced self-organized nanochannel arrays with good regularity and circularity, with less power loss and processing time than the multi-step AAO method. PMID:22333268

  1. A Bayesian method for assessing multiscale species-habitat relationships

    USGS Publications Warehouse

    Stuber, Erica F.; Gruber, Lutz F.; Fontaine, Joseph J.

    2017-01-01

    Context: Scientists face several theoretical and methodological challenges in appropriately describing fundamental wildlife-habitat relationships in models. The spatial scales of habitat relationships are often unknown, and are expected to follow a multi-scale hierarchy. Typical frequentist or information theoretic approaches often suffer under collinearity in multi-scale studies, fail to converge when models are complex or represent an intractable computational burden when candidate model sets are large. Objectives: Our objective was to implement an automated, Bayesian method for inference on the spatial scales of habitat variables that best predict animal abundance. Methods: We introduce Bayesian latent indicator scale selection (BLISS), a Bayesian method to select spatial scales of predictors using latent scale indicator variables that are estimated with reversible-jump Markov chain Monte Carlo sampling. BLISS does not suffer from collinearity, and substantially reduces computation time of studies. We present a simulation study to validate our method and apply our method to a case-study of land cover predictors for ring-necked pheasant (Phasianus colchicus) abundance in Nebraska, USA. Results: Our method returns accurate descriptions of the explanatory power of multiple spatial scales, and unbiased and precise parameter estimates under commonly encountered data limitations including spatial scale autocorrelation, effect size, and sample size. BLISS outperforms commonly used model selection methods including stepwise and AIC, and reduces runtime by 90%. Conclusions: Given the pervasiveness of scale-dependency in ecology, and the implications of mismatches between the scales of analyses and ecological processes, identifying the spatial scales over which species are integrating habitat information is an important step in understanding species-habitat relationships. BLISS is a widely applicable method for identifying important spatial scales, propagating scale uncertainty, and testing hypotheses of scaling relationships.
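    The idea of weighting candidate spatial scales can be illustrated, in a much simpler form than BLISS, by fitting one regression per candidate scale and converting BIC differences into approximate scale weights. The sketch below uses simulated data and is not the reversible-jump MCMC sampler described in the abstract; the candidate buffer radii are hypothetical.

```python
# Simplified scale selection: fit one model per candidate spatial scale and form
# approximate scale weights from BIC. Illustration only; all data are simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 300
scales = [250, 500, 1000, 2000]                 # candidate buffer radii (m), hypothetical

# Simulate the same covariate summarized at each scale; only the 1000 m version drives y.
X = {s: rng.normal(size=n) for s in scales}
y = 2.0 + 1.5 * X[1000] + rng.normal(scale=1.0, size=n)

def bic_linear(x, y):
    """BIC of a simple linear regression y ~ 1 + x with Gaussian errors."""
    A = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    sigma2 = resid @ resid / len(y)
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1)
    k = 3                                        # intercept, slope, variance
    return k * np.log(len(y)) - 2 * loglik

bics = np.array([bic_linear(X[s], y) for s in scales])
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()
for s, w in zip(scales, weights):
    print(f"scale {s:>4} m: approximate weight {w:.3f}")
```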

  2. Outcomes of Moral Case Deliberation - the development of an evaluation instrument for clinical ethics support (the Euro-MCD)

    PubMed Central

    2014-01-01

    Background: Clinical ethics support, in particular Moral Case Deliberation, aims to support health care providers to manage ethically difficult situations. However, there is a lack of evaluation instruments regarding outcomes of clinical ethics support in general and regarding Moral Case Deliberation (MCD) in particular. There is also a lack of clarity and consensus regarding which MCD outcomes are beneficial. In addition, MCD outcomes might be context-sensitive. Against this background, there is a need for a standardised but flexible outcome evaluation instrument. The aim of this study was to develop a multi-contextual evaluation instrument measuring health care providers’ experiences and perceived importance of outcomes of Moral Case Deliberation. Methods: A multi-item instrument for assessing outcomes of Moral Case Deliberation (MCD) was constructed through an iterative process, founded on a literature review and modified through a multistep review by ethicists and health care providers. The instrument measures perceived importance of outcomes before and after MCD, as well as experienced outcomes during MCD and in daily work. A purposeful sample of 86 European participants contributed to a Delphi panel and content validity testing. The Delphi panel (n = 13), consisting of ethicists and ethics researchers, participated in three Delphi rounds. Health care providers (n = 73) participated in the content validity testing through ‘think-aloud’ interviews and a method using the Content Validity Index. Results: The development process resulted in the European Moral Case Deliberation Outcomes Instrument (Euro-MCD), which consists of two sections, one to be completed before a participant’s first MCD and the other after completing multiple MCDs. The instrument contains a few open-ended questions and 26 specific items with a corresponding rating/response scale representing various MCD outcomes. The items were categorised into the following six domains: Enhanced emotional support, Enhanced collaboration, Improved moral reflexivity, Improved moral attitude, Improvement on organizational level and Concrete results. Conclusions: A tentative instrument has been developed that seems to cover the main outcomes of Moral Case Deliberation. The next step will be to test the Euro-MCD in a field study. PMID:24712735

  3. Hindsight Is 20/20: A Case Study of Vision and Reading Issues Sheds Light for Teacher Education Programs

    ERIC Educational Resources Information Center

    Chandler, Kristie B.; Box, Jean A.

    2012-01-01

    This paper presents a case study designed to educate students in pre-service teacher education programs about the importance of a comprehensive eye exam. The case study chronicles a family's multi-year search for solutions to their child's reading difficulties. The research supporting the case study explores the connection between vision…

  4. Applicability Assessment of Uavsar Data in Wetland Monitoring: a Case Study of Louisiana Wetland

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Niu, Y.; Lu, Z.; Yang, J.; Li, P.; Liu, W.

    2018-04-01

    Wetlands are highly productive and support a wide variety of ecosystem goods and services. Monitoring wetlands is therefore essential and has great potential. Because of the repeat-pass nature of satellite and airborne platforms, time series of remote sensing data can be obtained to monitor wetlands. UAVSAR is a compact, pod-mounted, polarimetric NASA L-band synthetic aperture radar (SAR) instrument for interferometric repeat-track observations. UAVSAR images can accurately map crustal deformations associated with natural hazards, such as volcanoes and earthquakes, and the instrument's polarization agility facilitates terrain and land-use classification and change detection. In this paper, multi-temporal UAVSAR data are applied to monitoring wetland change. Using the multi-temporal polarimetric SAR (PolSAR) data, change detection maps are obtained by unsupervised and supervised methods, and the coherence extracted from the interferometric SAR (InSAR) data is used to verify the accuracy of the change detection maps. The experimental results show that multi-temporal UAVSAR data are well suited to wetland monitoring.

  5. Probing optimal measurement configuration for optical scatterometry by the multi-objective genetic algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Xiuguo; Gu, Honggang; Jiang, Hao; Zhang, Chuanwei; Liu, Shiyuan

    2018-04-01

    Measurement configuration optimization (MCO) is a ubiquitous and important issue in optical scatterometry, whose aim is to probe the optimal combination of measurement conditions, such as wavelength, incidence angle, azimuthal angle, and/or polarization directions, to achieve a higher measurement precision for a given measuring instrument. In this paper, the MCO problem is investigated and formulated as a multi-objective optimization problem, which is then solved by the multi-objective genetic algorithm (MOGA). The case study on the Mueller matrix scatterometry for the measurement of a Si grating verifies the feasibility of the MOGA in handling the MCO problem in optical scatterometry by making a comparison with the Monte Carlo simulations. Experiments performed at the achieved optimal measurement configuration also show good agreement between the measured and calculated best-fit Mueller matrix spectra. The proposed MCO method based on MOGA is expected to provide a more general and practical means to solve the MCO problem in the state-of-the-art optical scatterometry.

  6. Design for sustainability of industrial symbiosis based on emergy and multi-objective particle swarm optimization.

    PubMed

    Ren, Jingzheng; Liang, Hanwei; Dong, Liang; Sun, Lu; Gao, Zhiqiu

    2016-08-15

    Industrial symbiosis provides a novel and practical pathway to design for sustainability. Decision support tools for its verification are necessary for practitioners and policy makers, but to date quantitative research is limited. The objective of this work is to present an innovative approach for supporting decision-making in design for sustainability with the implementation of industrial symbiosis in a chemical complex. By incorporating emergy theory, the model is formulated as a multi-objective approach that can optimize both the economic benefit and the sustainable performance of the integrated industrial system. A set of emergy-based evaluation indices is designed. A multi-objective particle swarm algorithm is proposed to solve the model, and decision-makers are allowed to choose suitable solutions from the Pareto solutions. An illustrative case has been studied with the proposed method, and a number of compromises between high profitability and high sustainability can be obtained for the decision-makers/stakeholders. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. PROMETHEE II: A knowledge-driven method for copper exploration

    NASA Astrophysics Data System (ADS)

    Abedi, Maysam; Ali Torabi, S.; Norouzi, Gholam-Hossain; Hamzeh, Mohammad; Elyasi, Gholam-Reza

    2012-09-01

    This paper describes the application of a well-known Multi Criteria Decision Making (MCDM) technique called Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE II) to explore porphyry copper deposits. Various raster-based evidential layers involving geological, geophysical, and geochemical geo-datasets are integrated to prepare a mineral prospectivity mapping (MPM). In a case study, thirteen layers of the Now Chun copper deposit located in the Kerman province of Iran are used to explore the region of interest. The PROMETHEE II technique is applied to produce the desired MPM, and the outputs are validated using twenty-one boreholes that have been classified into five classes. This proposed method shows a high performance when providing the MPM while reducing the cost of exploratory drilling in the study area.
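    For reference, the PROMETHEE II ranking itself reduces to computing pairwise preference indices, positive and negative outranking flows, and a net flow per alternative. The sketch below does this with the simple "usual" preference function; the decision matrix and weights are hypothetical scores, not the thirteen evidential layers of the Now Chun case study.

```python
# Minimal PROMETHEE II ranking with the "usual" preference function
# (preference = 1 whenever one alternative strictly outperforms another on a criterion).
import numpy as np

# rows: spatial cells (alternatives); columns: criteria (all to be maximized); hypothetical
scores = np.array([
    [0.8, 0.3, 0.6],
    [0.5, 0.9, 0.4],
    [0.7, 0.6, 0.7],
    [0.2, 0.4, 0.9],
])
weights = np.array([0.5, 0.3, 0.2])
n, m = scores.shape

# Aggregated preference index pi(a, b) = sum_j w_j * [score_j(a) > score_j(b)]
pi = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a != b:
            pi[a, b] = weights @ (scores[a] > scores[b]).astype(float)

phi_plus = pi.sum(axis=1) / (n - 1)     # leaving (positive) flow
phi_minus = pi.sum(axis=0) / (n - 1)    # entering (negative) flow
net_flow = phi_plus - phi_minus

ranking = np.argsort(-net_flow)
print("net flows:", np.round(net_flow, 3), "ranking (best first):", ranking)
```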

  8. Managing risks of noncancer health effects at hazardous waste sites: A case study using the Reference Concentration (RfC) of trichloroethylene (TCE).

    PubMed

    Dourson, Michael L; Gadagbui, Bernard K; Thompson, Rod B; Pfau, Edward J; Lowe, John

    2016-10-01

    A method for determining a safety range for non-cancer risks is proposed, similar in concept to the range used for cancer in the management of waste sites. This safety range brings transparency to the chemical specific Reference Dose or Concentration by replacing their "order of magnitude" definitions with a scientifically-based range. EPA's multiple RfCs for trichloroethylene (TCE) were evaluated as a case study. For TCE, a multi-endpoint safety range was judged to be 3 μg/m(3) to 30 μg/m(3), based on a review of kidney effects found in NTP (1988), thymus effects found in Keil et al. (2009) and cardiac effects found in the Johnson et al. (2003) study. This multi-endpoint safety range is derived from studies for which the appropriate averaging time corresponds to different exposure durations, and, therefore, can be applied to both long- and short-term exposures with appropriate consideration of exposure averaging times. For shorter-term exposures, averaging time should be based on the time of cardiac development in humans during fetal growth, an average of approximately 20-25 days. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  9. Grantee Spotlight: Habtom Ressom, Ph.D.

    Cancer.gov

    Dr. Habtom W. Ressom, an NCI CRCHD R21 grantee, works to identify a method for detecting liver cancer early by employing bioinformatic technologies to collect, and make sense of, multi-omic data from liver cancer cases and patients with liver cirrhosis.

  10. Radiation-free quantification of head malformations in craniosynostosis patients from 3D photography

    NASA Astrophysics Data System (ADS)

    Tu, Liyun; Porras, Antonio R.; Oh, Albert; Lepore, Natasha; Mastromanolis, Manuel; Tsering, Deki; Paniagua, Beatriz; Enquobahrie, Andinet; Keating, Robert; Rogers, Gary F.; Linguraru, Marius George

    2018-02-01

    The evaluation of cranial malformations plays an essential role both in the early diagnosis and in the decision to perform surgical treatment for craniosynostosis. In clinical practice, both cranial shape and suture fusion are evaluated using CT images, which involve the use of harmful radiation on children. Three-dimensional (3D) photography offers noninvasive, radiation-free, and anesthetic-free evaluation of craniofacial morphology. The aim of this study is to develop an automated framework to objectively quantify cranial malformations in patients with craniosynostosis from 3D photography. We propose a new method that automatically extracts the cranial shape by identifying a set of landmarks from a 3D photograph. Specifically, it registers the 3D photograph of a patient to a reference template in which the position of the landmarks is known. Then, the method finds the closest cranial shape to that of the patient from a normative statistical shape multi-atlas built from 3D photographs of healthy cases, and uses it to quantify objectively cranial malformations. We calculated the cranial malformations on 17 craniosynostosis patients and we compared them with the malformations of the normative population used to build the multi-atlas. The average malformations of the craniosynostosis cases were 2.68 +/- 0.75 mm, which is significantly higher (p<0.001) than the average malformations of 1.70 +/- 0.41 mm obtained from the normative cases. Our approach can support the quantitative assessment of surgical procedures for cranial vault reconstruction without exposing pediatric patients to harmful radiation.

  11. A Data Management System for Multi-Phase Case-Control Studies

    PubMed Central

    Gibeau, Joanne M.; Steinfeldt, Lois C.; Stine, Mark J.; Tullis, Katherine V.; Lynch, H. Keith

    1983-01-01

    The design of a computerized system for the management of data in multi-phase epidemiologic case-control studies is described. Typical study phases include case-control selection, abstracting of data from medical records, and interview of study subjects or next of kin. In consultation with project personnel, requirements for the system were established: integration of data from all study phases into one data base, accurate follow-up of subjects through the study, sophisticated data editing capabilities, ready accessibility of specified programs to project personnel, and generation of current status and exception reports for project management. SIR (Scientific Information Retrieval), a commercially available data base management system, was selected as the foundation of this system. The system forms a comprehensive data management system applicable to many types of public health research studies.

  12. Care pathways across the primary-hospital care continuum: using the multi-level framework in explaining care coordination.

    PubMed

    Van Houdt, Sabine; Heyrman, Jan; Vanhaecht, Kris; Sermeus, Walter; De Lepeleire, Jan

    2013-08-06

    Care pathways are widely used in hospitals for a structured and detailed planning of the care process. There is a growing interest in extending care pathways into primary care to improve quality of care by increasing care coordination. Evidence is sparse about the relationship between care pathways and care coordination. The multi-level framework explores care coordination across organizations and states that (inter)organizational mechanisms have an effect on the relationships between healthcare professionals, resulting in quality and efficiency of care. The aim of this study was to assess the extent to which care pathways support or create elements of the multi-level framework necessary to improve care coordination across the primary-hospital care continuum. This study is an in-depth analysis of five existing local community projects located in four different regions in Flanders (Belgium) to determine whether the available empirical evidence supported or refuted the theoretical expectations from the multi-level framework. Data were gathered using mixed methods, including structured face-to-face interviews, participant observations, documentation and a focus group. Multiple cases were analyzed performing a cross case synthesis to strengthen the results. The development of a care pathway across the primary-hospital care continuum, supported by a step-by-step scenario, led to the use of existing and newly constructed structures, data monitoring and the development of information tools. The construction and use of these inter-organizational mechanisms had a positive effect on exchanging information, formulating and sharing goals, defining and knowing each other's roles, expectations and competences and building qualitative relationships. Care pathways across the primary-hospital care continuum enhance the components of care coordination.

  13. Advances in multi-scale modeling of solidification and casting processes

    NASA Astrophysics Data System (ADS)

    Liu, Baicheng; Xu, Qingyan; Jing, Tao; Shen, Houfa; Han, Zhiqiang

    2011-04-01

    The development of the aviation, energy and automobile industries requires an advanced integrated product/process R&D system that can optimize both the product and the process design. Integrated computational materials engineering (ICME) is a promising approach to fulfill this requirement and make product and process development efficient, economic, and environmentally friendly. Advances in multi-scale modeling of solidification and casting processes, including mathematical models as well as engineering applications, are presented in the paper. Dendrite morphology of magnesium and aluminum alloys during solidification, simulated using phase field and cellular automaton methods, mathematical models of segregation in large steel ingots, and microstructure models of unidirectionally solidified turbine blade castings are studied and discussed. In addition, some engineering case studies, including microstructure simulation of aluminum castings for the automobile industry, segregation of large steel ingots for the energy industry, and microstructure simulation of unidirectionally solidified turbine blade castings for the aviation industry, are discussed.

  14. Multi-category micro-milling tool wear monitoring with continuous hidden Markov models

    NASA Astrophysics Data System (ADS)

    Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon

    2009-02-01

    In-process monitoring of tool conditions is important in micro-machining due to the high precision requirement and high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous Hidden Markov models (HMMs) are adapted for modeling the tool wear process in micro-milling and for estimating the tool wear state from cutting force features. For a noise-robust approach, the HMM outputs are passed through a median filter to suppress spurious tool-state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.

  15. Nonlinear dynamics and control of a vibrating rectangular plate

    NASA Technical Reports Server (NTRS)

    Shebalin, J. V.

    1983-01-01

    The von Karman equations of nonlinear elasticity are solved for the case of a vibrating rectangular plate by means of a Fourier spectral transform method. The amplification of a particular Fourier mode by nonlinear transfer of energy is demonstrated for this conservative system. The multi-mode system is reduced to a minimal (two mode) system, retaining the qualitative features of the multi-mode system. The effect of a modal control law on the dynamics of this minimal nonlinear elastic system is examined.

  16. Using MultiMedia Content to Present Business Ethics: An Empirical Study

    ERIC Educational Resources Information Center

    Stanwick, Peter A.

    2010-01-01

    The purpose of this study is to empirically examine whether presenting a multimedia case study enhances the learning experience of students in an undergraduate management class. A questionnaire was administered before and after the presentation of the case study and the results showed that the multimedia case did indeed enhance the learning…

  17. [Reform and practice of teaching methods for culture of medicinal plant].

    PubMed

    Si, Jinping; Zhu, Yuqiu; Liu, Jingjing; Bai, Yan; Zhang, Xinfeng

    2012-02-01

    The culture of medicinal plants is a comprehensive, multi-disciplinary subject with a long history of application. In order to improve the quality of this course, several reforms have been carried out, including stimulating enthusiasm for learning, refining the basic concepts and theories, promoting case studies, emphasizing the latest achievements, enhancing exercises in the laboratory and at planting bases, and guiding students in scientific and technological innovation. Meanwhile, the authors point out some remaining teaching problems of this course.

  18. A multi-criteria assessment of scenarios on thermal processing of infectious hospital wastes: A case study for Central Macedonia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karagiannidis, A.; Papageorgiou, A., E-mail: apapa@auth.g; Perkoulidis, G.

    In Greece more than 14,000 tonnes of infectious hospital waste are produced yearly; a significant part of it is still mismanaged. Only one off-site licensed incineration facility for hospital wastes is in operation, with the remainder of the market covered by various hydroclave and autoclave units, whereas numerous problems are still generally encountered regarding waste segregation, collection, transportation and management, as well as often excessive entailed costs. Everyday practices still include dumping the majority of solid hospital waste into household disposal sites and landfills after sterilization, still largely without any preceding recycling and separation steps. Discussed in the present paper are the implemented and future treatment practices of infectious hospital wastes in Central Macedonia; produced quantities are reviewed, actual treatment costs are addressed critically, and the overall situation in Greece is discussed. Moreover, thermal treatment processes that could be applied for the treatment of infectious hospital wastes in the region are assessed via the multi-criteria decision method Analytic Hierarchy Process. Furthermore, a sensitivity analysis was performed and the analysis demonstrated that a centralized autoclave or hydroclave plant near Thessaloniki is the best performing option, depending however on the selection and weighing of criteria of the multi-criteria process. Moreover the study found that a common treatment option for the treatment of all infectious hospital wastes produced in the Region of Central Macedonia could offer cost and environmental benefits. In general the multi-criteria decision method, as well as the conclusions and remarks of this study, can be used as a basis for future planning and anticipation of the needs for investments in the area of medical waste management.
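    The Analytic Hierarchy Process step referred to above derives criterion weights from a pairwise-comparison matrix (principal eigenvector) and checks their consistency ratio. The sketch below shows that calculation on a hypothetical three-criterion comparison matrix, not the actual judgements used in the Central Macedonia assessment.

```python
# AHP criterion weighting: principal eigenvector of a pairwise-comparison matrix
# plus Saaty's consistency ratio. The comparison values are hypothetical.
import numpy as np

# Pairwise comparisons of three criteria (e.g., cost, emissions, public acceptance).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
n = A.shape[0]

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                          # criterion priorities

lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)                   # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # random index (Saaty)
CR = CI / RI                                      # consistency ratio (should be < 0.1)

print("weights:", np.round(weights, 3), "consistency ratio:", round(CR, 3))
```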

  19. Indentation theory on a half-space of transversely isotropic multi-ferroic composite medium: sliding friction effect

    NASA Astrophysics Data System (ADS)

    Wu, F.; Wu, T.-H.; Li, X.-Y.

    2018-03-01

    This article aims to present a systematic indentation theory on a half-space of multi-ferroic composite medium with transverse isotropy. The effect of sliding friction between the indenter and substrate is taken into account. The cylindrical flat-ended indenter is assumed to be electrically/magnetically conducting or insulating, which leads to four sets of mixed boundary-value problems. The indentation forces in the normal and tangential directions are related to the Coulomb friction law. For each case, the integral equations governing the contact behavior are developed by means of the generalized method of potential theory, and the corresponding coupling field is obtained in terms of elementary functions. The effect of sliding on the contact behavior is investigated. Finite element method (FEM) in the context of magneto-electro-elasticity is developed to discuss the validity of the analytical solutions. The obtained analytical solutions may serve as benchmarks to various simplified analyses and numerical codes and as a guide for future experimental studies.

  20. Association between circulating vitamin K1 and coronary calcium progression in community-dwelling adults: the Multi-Ethnic Study of Atherosclerosis

    USDA-ARS?s Scientific Manuscript database

    While animal studies found vitamin K treatment reduced vascular calcification, human data are limited. Using a case-cohort design, we determined the association between vitamin K status and coronary artery calcium (CAC) progression in the Multi-ethnic Study of Atherosclerosis. Serum phylloquinone (v...

  1. Computer-aided diagnosis system: a Bayesian hybrid classification method.

    PubMed

    Calle-Alonso, F; Pérez, C J; Arias-Nicolás, J P; Martín, J

    2013-10-01

    A novel method to classify multi-class biomedical objects is presented. The method is based on a hybrid approach which combines pairwise comparison, Bayesian regression and the k-nearest neighbor technique. It can be applied in a fully automatic way or in a relevance feedback framework. In the latter case, the information obtained from both an expert and the automatic classification is iteratively used to improve the results until a certain accuracy level is achieved; then the learning process is finished and new classifications can be performed automatically. The method has been applied in two biomedical contexts by following the same cross-validation schemes as in the original studies. The first one refers to cancer diagnosis, leading to an accuracy of 77.35% versus the 66.37% originally obtained. The second one considers the diagnosis of pathologies of the vertebral column. The original method achieves accuracies ranging from 76.5% to 96.7%, and from 82.3% to 97.1% in two different cross-validation schemes. Even with no supervision, the proposed method reaches 96.71% and 97.32% in these two cases. By using a supervised framework the achieved accuracy is 97.74%. Furthermore, all abnormal cases were correctly classified. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Multi-energy Coordinated Evaluation for Energy Internet

    NASA Astrophysics Data System (ADS)

    Jia, Dongqiang; Sun, Jian; Wang, Cunping; Hong, Xiao; Ma, Xiufan; Xiong, Wenting; Shen, Yaqi

    2017-05-01

    This paper reviews the current research status of multi-energy coordinated evaluation for the energy Internet. Taking the coordinated optimization of wind energy, solar energy and other energy sources into consideration, 17 evaluation indexes, such as the substitution coefficient of cooling, heat and power, the ratio of wind and solar energy, and the energy storage ratio, were designed from five aspects: the acceptance of renewable energy, the benefits of complementary energy substitution, peak-valley difference, the degree of equipment utilization, and user needs. At the same time, this article attaches importance to the economic and social benefits of coordinating multiple energy sources. Ultimately, a comprehensive multi-energy coordination evaluation index system for the regional energy Internet was put forward covering four aspects: safe operation, coordination and optimization, economic benefits, and social benefits; a comprehensive evaluation model was then established. The model uses an optimal combination weighting method based on moment estimation together with the TOPSIS evaluation method, so that both the subjective and objective weights of the indexes are considered and the coordinated evaluation of multiple energy sources is realized. Finally, the soundness of the index system and the validity of the evaluation method are verified by a case analysis.
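    The TOPSIS part of such a model ranks alternatives by their relative closeness to an ideal solution computed from a weighted, normalized decision matrix. The sketch below illustrates the calculation with hypothetical scheme scores and weights rather than the 17-index system of the paper.

```python
# Minimal TOPSIS ranking: weighted vector normalization, distances to the ideal and
# anti-ideal solutions, and relative closeness. All numbers are hypothetical.
import numpy as np

# rows: regional energy-Internet schemes; columns: evaluation indexes (all benefit-type here)
X = np.array([
    [0.62, 0.80, 0.55],
    [0.75, 0.60, 0.70],
    [0.58, 0.72, 0.85],
])
w = np.array([0.4, 0.35, 0.25])                   # combined subjective/objective weights

V = X / np.linalg.norm(X, axis=0) * w             # weighted, vector-normalized matrix
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)  # all indexes treated as benefits here

d_plus = np.linalg.norm(V - ideal, axis=1)        # distance to the ideal solution
d_minus = np.linalg.norm(V - anti_ideal, axis=1)  # distance to the anti-ideal solution
closeness = d_minus / (d_plus + d_minus)

print("closeness:", np.round(closeness, 3), "best scheme:", int(np.argmax(closeness)))
```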

  3. A dimension reduction method for flood compensation operation of multi-reservoir system

    NASA Astrophysics Data System (ADS)

    Jia, B.; Wu, S.; Fan, Z.

    2017-12-01

    Cooperative compensation operations of multiple reservoirs coping with uncontrolled floods play a vital role in real-time flood mitigation. This paper proposes a reservoir flood compensation operation index (ResFCOI), formed from elements of flood control storage, flood inflow volume, flood transmission time and the cooperative operations period. A flood cooperative compensation operations model of a multi-reservoir system is then established, in which the ResFCOI determines the computational order of the reservoirs, and finally a differential evolution algorithm computes the flood compensation optimization for each single reservoir in turn, so that a dimension reduction method is formed to reduce computational complexity. The Shiguan River Basin, with two large reservoirs and an extensive uncontrolled flood area, is used as a case study. Results show that (a) the reservoirs' flood discharges and the uncontrolled flood are superimposed at Jiangjiaji Station while keeping the resulting flood peak flow as small as possible; (b) cooperative compensation operations slightly increase the usage of flood storage capacity in the reservoirs compared to rule-based operations; and (c) computing a cooperative compensation operations scheme takes 50 seconds on average. The dimension reduction method for guiding flood compensation operations of a multi-reservoir system allows each reservoir to adjust its flood discharge strategy dynamically according to the uncontrolled flood magnitude and pattern, so as to mitigate the downstream flood disaster.
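    The single-reservoir optimization inside such a scheme can be carried out with a standard differential evolution (DE/rand/1/bin) loop. The sketch below minimizes a toy surrogate objective (downstream peak after superposition plus a volume-balance penalty); the hydrograph, bounds, and objective are hypothetical, not the routing model of the study.

```python
# Minimal differential evolution (DE/rand/1/bin) over a release schedule; toy objective only.
import numpy as np

rng = np.random.default_rng(42)

uncontrolled = np.array([200.0, 800.0, 1500.0, 1200.0, 600.0])   # hypothetical hydrograph (m^3/s)
inflow_volume = 3000.0                                           # total volume to release

def objective(release):
    """Toy surrogate: downstream peak after superposition, plus a volume-balance penalty."""
    peak = np.max(uncontrolled + release)
    return peak + 10.0 * abs(release.sum() - inflow_volume)

dim, pop_size, F, CR, gens = len(uncontrolled), 30, 0.6, 0.9, 300
pop = rng.uniform(0.0, 1000.0, size=(pop_size, dim))
fit = np.array([objective(x) for x in pop])

for _ in range(gens):
    for i in range(pop_size):
        # Mutation: three distinct individuals other than i.
        a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
        mutant = np.clip(a + F * (b - c), 0.0, 1000.0)
        # Binomial crossover with at least one mutant gene.
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection.
        f_trial = objective(trial)
        if f_trial < fit[i]:
            pop[i], fit[i] = trial, f_trial

best = pop[np.argmin(fit)]
print("best releases:", np.round(best, 1), "objective:", round(float(np.min(fit)), 1))
```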

  4. [Relationship between multi-slice spiral CT angiography imaging features and in-hospital death of patients with aortic dissection].

    PubMed

    Xiao, Z Y; Wang, H J; Yao, C L; Gu, G R; Xue, Y; Yin, J; Chen, J; Zhang, C; Tong, C Y; Song, Z J

    2017-03-24

    Objective: To explore the imaging manifestations of multi-slice spiral CT angiography (CTA) and their relationship with in-hospital death in patients with aortic dissection (AD). Methods: The clinical data of 429 patients with AD who underwent CTA in Zhongshan Hospital of Fudan University between January 2009 and January 2016 were retrospectively analyzed. AD patients were divided into 2 groups: an operation group who underwent surgery or interventional therapy (370 cases) and a non-operation group who underwent medical conservative treatment (59 cases). The multi-slice spiral CTA imaging features of AD were analyzed, and multivariate logistic regression analysis was used to investigate the relationship between imaging manifestations and in-hospital death in AD patients. Results: There were 12 cases (3.24%) of in-hospital death in the operation group and 28 cases (47.46%) of in-hospital death in the non-operation group (P<0.001). AD involved different vascular branches. Multi-slice spiral CTA can clearly show the dissection of the true and false lumen; intimal tear was detected in 363 (84.62%) cases, outer wall calcification was revealed in 63 (14.69%) cases, and thrombus formation was present in 227 (52.91%) cases. The multivariate logistic regression analysis showed that the number of branch vessels involved (OR=1.374, 95% CI 1.081-1.745, P=0.009) and the tearing false lumen range (OR=2.059, 95% CI 1.252-3.385, P=0.004) were independent risk factors of in-hospital death in AD patients; the number of branch vessels involved (OR=1.600, 95% CI 1.062-2.411, P=0.025) was an independent risk factor of in-hospital death in the operation group, while the tearing false lumen range (OR=2.315, 95% CI 1.019-5.262, P=0.045) was an independent risk factor of in-hospital death in the non-operation group. Conclusions: Multi-slice spiral CTA can clearly show the entire AD, the true and false lumen, intimal tear, wall calcification and thrombosis in AD patients. The number of branch vessels involved and the tearing false lumen range are independent risk factors of in-hospital death in AD patients.

  5. Multi-dimensional multi-species modeling of transient electrodeposition in LIGA microfabrication.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Gregory Herbert; Chen, Ken Shuang

    2004-06-01

    This report documents the efforts and accomplishments of the LIGA electrodeposition modeling project which was headed by the ASCI Materials and Physics Modeling Program. A multi-dimensional framework based on GOMA was developed for modeling time-dependent diffusion and migration of multiple charged species in a dilute electrolyte solution with reduction electro-chemical reactions on moving deposition surfaces. By combining the species mass conservation equations with the electroneutrality constraint, a Poisson equation that explicitly describes the electrolyte potential was derived. The set of coupled, nonlinear equations governing species transport, electric potential, velocity, hydrodynamic pressure, and mesh motion were solved in GOMA, using the finite-element method and a fully-coupled implicit solution scheme via Newton's method. By treating the finite-element mesh as a pseudo solid with an arbitrary Lagrangian-Eulerian formulation and by repeatedly performing re-meshing with CUBIT and re-mapping with MAPVAR, the moving deposition surfaces were tracked explicitly from start of deposition until the trenches were filled with metal, thus enabling the computation of local current densities that potentially influence the microstructure and frictional/mechanical properties of the deposit. The multi-dimensional, multi-species, transient computational framework was demonstrated in case studies of two-dimensional nickel electrodeposition in single and multiple trenches, without and with bath stirring or forced flow. Effects of buoyancy-induced convection on deposition were also investigated. To further illustrate its utility, the framework was employed to simulate deposition in microscreen-based LIGA molds. Lastly, future needs for modeling LIGA electrodeposition are discussed.

  6. A Case Study in User Support for Managing OpenSim Based Multi User Learning Environments

    ERIC Educational Resources Information Center

    Perera, Indika; Miller, Alan; Allison, Colin

    2017-01-01

    Immersive 3D Multi User Learning Environments (MULE) have shown sufficient success to warrant their consideration as a mainstream educational paradigm. These are based on 3D Multi User Virtual Environment platforms (MUVE), and although they have been used for various innovative educational projects their complex permission systems and large…

  7. FVCOM one-way and two-way nesting using ESMF: Development and validation

    NASA Astrophysics Data System (ADS)

    Qi, Jianhua; Chen, Changsheng; Beardsley, Robert C.

    2018-04-01

    Built on the Earth System Modeling Framework (ESMF), one-way and two-way nesting methods were implemented in the unstructured-grid Finite-Volume Community Ocean Model (FVCOM). These methods help utilize the unstructured-grid multi-domain nesting of FVCOM with the aim of resolving multi-scale physical and ecosystem processes. The procedures for implementing FVCOM within ESMF are described in detail. Experiments were conducted to validate and evaluate the performance of the nested-grid FVCOM system. The first was a wave-current interaction case with two-domain nesting, with an emphasis on the critical need for nesting to resolve a high-resolution feature near the coast and harbor with little loss in computational efficiency. The second was conducted for pseudo river plume cases to examine the differences in model-simulated salinity between the one-way and two-way nesting approaches and to evaluate the performance of the mass-conservative two-way nesting method. The third was carried out for the river plume case in the realistic geometric domain of Mass Bay, supporting the importance of two-way nesting for coastal-estuarine integrated modeling. The nesting method described in this paper has been used in the Northeast Coastal Ocean Forecast System (NECOFS), a global-regional-coastal nesting FVCOM system that has been in end-to-end forecast and hindcast operation since 2007.

  8. Spatial analysis of county-based gonorrhoea incidence in mainland China, from 2004 to 2009.

    PubMed

    Yin, Fei; Feng, Zijian; Li, Xiaosong

    2012-07-01

    Gonorrhoea is one of the most common sexually transmissible infections in mainland China. Effective spatial monitoring of gonorrhoea incidence is important for successful implementation of control and prevention programs. County-level gonorrhoea incidence rates for all of mainland China were monitored by examining spatial patterns. County-level data on gonorrhoea cases between 2004 and 2009 were obtained from the China Information System for Disease Control and Prevention. Bayesian smoothing and exploratory spatial data analysis (ESDA) methods were used to characterise the spatial distribution pattern of gonorrhoea cases. During the 6-year study period, the average annual gonorrhoea incidence was 12.41 cases per 100,000 people. Using empirical Bayes smoothed rates, the local Moran test identified one significant single-centre cluster and two significant multi-centre clusters of high gonorrhoea risk (all P-values <0.01). Bayesian smoothing and ESDA methods can assist public health officials in using gonorrhoea surveillance data to identify high-risk areas. Allocating more resources to such areas could effectively reduce gonorrhoea incidence.
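    Empirical Bayes rate smoothing of the kind applied before cluster detection can be sketched with the method-of-moments (global) estimator: each county's raw rate is shrunk toward the overall rate in proportion to its population. The counts and populations below are hypothetical.

```python
# Global empirical Bayes smoothing of disease rates (method-of-moments, Marshall 1991).
# County counts and populations are hypothetical.
import numpy as np

cases = np.array([2, 0, 30, 200, 7, 1])                       # reported cases per county
pop = np.array([20_000, 5_000, 60_000, 150_000, 40_000, 8_000])

raw = cases / pop
theta = cases.sum() / pop.sum()                               # global mean rate
s2 = np.sum(pop * (raw - theta) ** 2) / pop.sum()             # weighted variance of raw rates
A = max(s2 - theta / np.mean(pop), 0.0)                       # prior (between-county) variance

shrink = A / (A + theta / pop)                                # weight on the county's own rate
smoothed = shrink * raw + (1.0 - shrink) * theta

print("raw rates per 100k:     ", np.round(raw * 1e5, 1))
print("smoothed rates per 100k:", np.round(smoothed * 1e5, 1))
```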

  9. Quantifying aflatoxins in peanuts using fluorescence spectroscopy coupled with multi-way methods: Resurrecting second-order advantage in excitation-emission matrices with rank overlap problem

    NASA Astrophysics Data System (ADS)

    Sajjadi, S. Maryam; Abdollahi, Hamid; Rahmanian, Reza; Bagheri, Leila

    2016-03-01

    A rapid, simple and inexpensive method using fluorescence spectroscopy coupled with multi-way methods for the determination of aflatoxins B1 and B2 in peanuts has been developed. In this method, aflatoxins are extracted with a mixture of water and methanol (90:10), and then monitored by fluorescence spectroscopy producing EEMs. Although the combination of EEMs and multi-way methods is commonly used to determine analytes in complex chemical systems with unknown interference(s), the rank overlap problem in excitation and emission profiles may restrain the application of this strategy. If there is rank overlap in one mode, there are several three-way algorithms, such as PARAFAC under some constraints, that can resolve this kind of data successfully. However, the analysis of EEM data is impossible when some species have rank overlap in both modes, because the information in the data matrix is then equivalent to zero-order data for those species, which is the case in our study. Aflatoxins B1 and B2 have the same shape of spectral profiles in both excitation and emission modes, and we propose creating third-order data for each sample using the solvent as a new additional selectivity mode. This third-order data is, in turn, converted to second-order data by augmentation, a fact which resurrects the second-order advantage in the original EEMs. The three-way data are constructed by stacking the augmented data along the third mode, and then analyzed by two powerful second-order calibration methods (BLLS-RBL and PARAFAC) to quantify the analytes in four kinds of peanut samples. The results of both methods are in good agreement and reasonable recoveries are obtained.

  10. Multi-Model Ensemble Wake Vortex Prediction

    NASA Technical Reports Server (NTRS)

    Koerner, Stephan; Holzaepfel, Frank; Ahmad, Nash'at N.

    2015-01-01

    Several multi-model ensemble methods are investigated for predicting wake vortex transport and decay. This study is a joint effort between National Aeronautics and Space Administration and Deutsches Zentrum fuer Luft- und Raumfahrt to develop a multi-model ensemble capability using their wake models. An overview of different multi-model ensemble methods and their feasibility for wake applications is presented. The methods include Reliability Ensemble Averaging, Bayesian Model Averaging, and Monte Carlo Simulations. The methodologies are evaluated using data from wake vortex field experiments.
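    One simple way to combine several wake models is to weight each member by its skill on past observations and take the weighted mean. The sketch below uses an exponential-of-error weighting as a simplified stand-in for the REA/BMA schemes named in the abstract; all model outputs and observations are hypothetical.

```python
# Skill-based multi-model ensemble: weights from past errors, weighted-mean prediction.
# Simplified stand-in for REA/BMA; all numbers are hypothetical.
import numpy as np

# Past vortex predictions from three models vs. observations (hypothetical units).
past_pred = np.array([
    [110.0, 95.0, 80.0, 60.0],     # model A
    [120.0, 100.0, 85.0, 70.0],    # model B
    [100.0, 90.0, 70.0, 55.0],     # model C
])
observed = np.array([112.0, 96.0, 78.0, 62.0])

rmse = np.sqrt(np.mean((past_pred - observed) ** 2, axis=1))
sigma = rmse.mean()
weights = np.exp(-0.5 * (rmse / sigma) ** 2)      # better past skill -> larger weight
weights /= weights.sum()

# New-case predictions from each model are combined with the learned weights.
new_pred = np.array([105.0, 112.0, 98.0])
print("weights:", np.round(weights, 3),
      "ensemble prediction:", round(float(weights @ new_pred), 1))
```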

  11. Buildings Change Detection Based on Shape Matching for Multi-Resolution Remote Sensing Imagery

    NASA Astrophysics Data System (ADS)

    Abdessetar, M.; Zhong, Y.

    2017-09-01

    Building change detection can quantify temporal effects on urban areas for urban evolution studies or damage assessment in disaster cases. In this context, change analysis may involve the use of available satellite images with different resolutions for quick responses. In this paper, to avoid traditional methods with image resampling artifacts and salt-and-pepper effects, building change detection based on shape matching is proposed for multi-resolution remote sensing images. Since an object's shape can be extracted from remote sensing imagery and the shapes of corresponding objects in multi-scale images are similar, it is practical to detect building changes in multi-scale imagery using shape analysis. Therefore, the proposed methodology can deal with different pixel sizes when identifying new and demolished buildings in urban areas using the geometric properties of objects of interest. After rectifying the desired multi-date and multi-resolution images by image-to-image registration with an optimal RMS value, object-based image classification is performed to extract building shapes from the images. Next, centroid-coincident matching is conducted on the extracted building shapes, based on the Euclidean distance between shape centroids (from shape T0 to shape T1 and vice versa), in order to define corresponding building objects. Then, new and demolished buildings are identified from the obtained distances that are greater than the RMS value (no match at the same location).
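    The centroid-coincident matching step can be sketched as a bidirectional nearest-centroid search with a distance tolerance: T0 footprints with no close T1 counterpart are flagged as demolished, and vice versa for new buildings. The centroids and tolerance below are hypothetical map-unit values.

```python
# Bidirectional centroid matching between two building-footprint sets; hypothetical data.
import numpy as np

centroids_t0 = np.array([[100.0, 200.0], [150.0, 220.0], [300.0, 400.0]])   # older image
centroids_t1 = np.array([[101.0, 199.0], [152.0, 221.0], [500.0, 120.0]])   # newer image
tolerance = 5.0                                                             # RMS-based threshold (m)

def nearest_distance(src, dst):
    """For each point in src, the distance to its nearest point in dst."""
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
    return d.min(axis=1)

demolished = nearest_distance(centroids_t0, centroids_t1) > tolerance   # T0 shape with no T1 match
new_built = nearest_distance(centroids_t1, centroids_t0) > tolerance    # T1 shape with no T0 match

print("demolished buildings (T0 indices):", np.where(demolished)[0])
print("new buildings (T1 indices):", np.where(new_built)[0])
```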

  12. Evaluation of laser ablation crater relief by white light micro interferometer

    NASA Astrophysics Data System (ADS)

    Gurov, Igor; Volkov, Mikhail; Zhukova, Ekaterina; Ivanov, Nikita; Margaryants, Nikita; Potemkin, Andrey; Samokhvalov, Andrey; Shelygina, Svetlana

    2017-06-01

    A multi-view scanning method is suggested to assess complicated surface relief with a white light interferometer. Peculiarities of the method are demonstrated on a special object in the form of a quadrangular pyramidal cavity, which is formed during micro-hardness measurement of materials using a hardness tester. An algorithm for the joint processing of multi-view scanning results is developed that allows correct relief values to be recovered. Laser ablation craters were studied experimentally, and their relief was recovered using the developed method. It is shown that multi-view scanning reduces ambiguity when determining the local depth of the laser ablation crater micro-relief. Results of experimental studies of the multi-view scanning method and the data processing algorithm are presented.

  13. Multi-parameter optimization of piezoelectric actuators for multi-mode active vibration control of cylindrical shells

    NASA Astrophysics Data System (ADS)

    Hu, K. M.; Li, Hua

    2018-07-01

    A novel technique for the multi-parameter optimization of distributed piezoelectric actuators is presented in this paper. The proposed method is designed to improve the performance of multi-mode vibration control in cylindrical shells. The optimization parameters of the actuator patch configuration include position, size, and tilt angle. The modal control force of tilted orthotropic piezoelectric actuators is derived and the multi-parameter cylindrical shell optimization model is established. The linear quadratic energy index is employed as the optimization criterion. A geometric constraint is proposed to prevent overlap between tilted actuators, which is plugged into a genetic algorithm to search for the optimal configuration parameters. A simply supported closed cylindrical shell with two actuators serves as a case study. The vibration control efficiencies of various parameter sets are evaluated via frequency response and transient response simulations. The results show that the linear quadratic energy indexes of position and size optimization decreased by 14.0% compared to position optimization alone; those of position and tilt angle optimization decreased by 16.8%; and those of position, size, and tilt angle optimization decreased by 25.9%. This indicates that adding configuration optimization parameters is an efficient approach to improving the vibration control performance of piezoelectric actuators on shells.

  14. Adaptive control of a jet turboshaft engine driving a variable pitch propeller using multiple models

    NASA Astrophysics Data System (ADS)

    Ahmadian, Narjes; Khosravi, Alireza; Sarhadi, Pouria

    2017-08-01

    In this paper, a multiple model adaptive control (MMAC) method is proposed for a gas turbine engine. The model of a twin-spool turbo-shaft engine driving a variable pitch propeller includes various operating points. Variations in fuel flow and propeller pitch inputs produce different operating conditions, which force the controller to adapt rapidly. The important operating points are the idle, cruise and full-thrust cases across the entire flight envelope. A multi-input multi-output (MIMO) version of second level adaptation using multiple models is developed. A stability analysis using the Lyapunov method is also presented. The proposed method is compared with two conventional techniques: first level adaptation and model reference adaptive control. Simulation results for the JetCat SPT5 turbo-shaft engine demonstrate the performance and fidelity of the proposed method.

  15. Support vector machines-based fault diagnosis for turbo-pump rotor

    NASA Astrophysics Data System (ADS)

    Yuan, Sheng-Fa; Chu, Fu-Lei

    2006-05-01

    Most artificial intelligence methods used in fault diagnosis are based on the empirical risk minimisation principle and generalise poorly when fault samples are few. Support vector machines (SVM) are a general machine-learning tool based on the structural risk minimisation principle that exhibits good generalisation even when fault samples are few. Fault diagnosis based on SVM is discussed. Since the basic SVM is originally designed for two-class classification, while most fault diagnosis problems are multi-class cases, a new multi-class SVM classification algorithm named 'one to others' is presented to solve multi-class recognition problems. It is a binary tree classifier composed of several two-class classifiers organised by fault priority; it is simple, involves little repeated training, and speeds up both training and recognition. The effectiveness of the method is verified by application to fault diagnosis for a turbo-pump rotor.
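
    The 'one to others' tree can be pictured as a chain of binary classifiers, each peeling one fault class off before passing the remaining samples on. The sketch below, assuming scikit-learn, synthetic data, and an arbitrary priority order, is only an illustration of that idea, not the authors' implementation.

    ```python
    # Minimal sketch of a "one to others" binary-tree multi-class SVM (assumed setup).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=8, n_informative=6,
                               n_classes=4, n_clusters_per_class=1, random_state=0)

    priority_order = [0, 1, 2, 3]          # assumed fault priority, highest first
    tree = []                              # list of (class_label, binary SVC)
    X_rest, y_rest = X, y
    for label in priority_order[:-1]:      # the last class needs no classifier
        clf = SVC(kernel="rbf", gamma="scale").fit(X_rest, (y_rest == label).astype(int))
        tree.append((label, clf))
        keep = y_rest != label             # only the "others" go to the next node
        X_rest, y_rest = X_rest[keep], y_rest[keep]

    def predict(x):
        """Walk down the tree: stop at the first node that claims the sample."""
        for label, clf in tree:
            if clf.predict(x.reshape(1, -1))[0] == 1:
                return label
        return priority_order[-1]

    pred = np.array([predict(x) for x in X])
    print("training accuracy:", (pred == y).mean())
    ```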

  16. Newly occurred L4 spondylolysis in the lumbar spine with pre-existence L5 spondylolysis among sports players: case reports and biomechanical analysis.

    PubMed

    Sairyo, Koichi; Sakai, Toshinori; Yasui, Natsuo; Kiapour, Ali; Biyani, Ashok; Ebraheim, Nabil; Goel, Vijay K

    2009-10-01

    Case series and a biomechanical study using a finite element (FE) analysis. To report three cases with multi-level spondylolysis and to understand the mechanism biomechanically. Multi-level spondylolysis is a very rare condition. There have been few reports in the literature on multi-level spondylolysis among sports players. We reviewed three cases of the condition, clinically. These patients were very active young sports players and had newly developed fresh L4 spondylolysis and pre-existing L5 terminal stage spondylolysis. Thus, we assumed that L5 spondylolysis may have increased the pars stress at the cranial adjacent levels, leading to newly developed spondylolysis at these levels. Biomechanically, we investigated pars stress at L4 with or without spondylolysis at L5 using the finite element technique. L4 pars stress decreased in the presence of L5 spondylolysis, which does not support our first hypothesis. It seems that multi-level spondylolysis may occur due to genetic and not biomechanical reasons.

  17. Systemic Sclerosis Classification Criteria: Developing methods for multi-criteria decision analysis with 1000Minds

    PubMed Central

    Johnson, Sindhu R.; Naden, Raymond P.; Fransen, Jaap; van den Hoogen, Frank; Pope, Janet E.; Baron, Murray; Tyndall, Alan; Matucci-Cerinic, Marco; Denton, Christopher P.; Distler, Oliver; Gabrielli, Armando; van Laar, Jacob M.; Mayes, Maureen; Steen, Virginia; Seibold, James R.; Clements, Phillip; Medsger, Thomas A.; Carreira, Patricia E.; Riemekasten, Gabriela; Chung, Lorinda; Fessler, Barri J.; Merkel, Peter A.; Silver, Richard; Varga, John; Allanore, Yannick; Mueller-Ladner, Ulf; Vonk, Madelon C.; Walker, Ulrich A.; Cappelli, Susanna; Khanna, Dinesh

    2014-01-01

    Objective Classification criteria for systemic sclerosis (SSc) are being developed. The objectives were to: develop an instrument for collating case-data and evaluate its sensibility; use forced-choice methods to reduce and weight criteria; and explore agreement between experts on the probability that cases were classified as SSc. Study Design and Setting A standardized instrument was tested for sensibility. The instrument was applied to 20 cases covering a range of probabilities that each had SSc. Experts rank-ordered cases from highest to lowest probability; reduced and weighted the criteria using forced-choice methods; and re-ranked the cases. Consistency in rankings was evaluated using intraclass correlation coefficients (ICC). Results Experts endorsed clarity (83%), comprehensibility (100%), and face and content validity (100%). Criteria were weighted (points): finger skin thickening (14–22), finger-tip lesions (9–21), friction rubs (21), finger flexion contractures (16), pulmonary fibrosis (14), SSc-related antibodies (15), Raynaud’s phenomenon (13), calcinosis (12), pulmonary hypertension (11), renal crisis (11), telangiectasia (10), abnormal nailfold capillaries (10), esophageal dilation (7) and puffy fingers (5). The ICC across experts was 0.73 (95%CI 0.58,0.86) and improved to 0.80 (95%CI 0.68,0.90). Conclusions Using a sensible instrument and forced-choice methods, the number of criteria was reduced by 39% (23 to 14) and the criteria were weighted. Our methods reflect the rigors of measurement science and serve as a template for developing classification criteria. PMID:24721558
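
    As an illustration of how such weighted criteria combine into a case score, the sketch below sums assumed point values (single values picked from the ranges quoted above); the classification threshold is hypothetical, not the published cut-off.

    ```python
    # Illustrative scoring of weighted criteria; weights and threshold are assumptions.
    WEIGHTS = {
        "finger_skin_thickening": 14, "finger_tip_lesions": 9, "friction_rubs": 21,
        "finger_flexion_contractures": 16, "pulmonary_fibrosis": 14,
        "ssc_related_antibodies": 15, "raynauds_phenomenon": 13, "calcinosis": 12,
        "pulmonary_hypertension": 11, "renal_crisis": 11, "telangiectasia": 10,
        "abnormal_nailfold_capillaries": 10, "esophageal_dilation": 7, "puffy_fingers": 5,
    }

    def case_score(findings):
        """Sum the weights of the criteria present in a case."""
        return sum(WEIGHTS[f] for f in findings if f in WEIGHTS)

    example_case = {"raynauds_phenomenon", "puffy_fingers", "ssc_related_antibodies"}
    score = case_score(example_case)
    print(score, "-> classify as SSc" if score >= 30 else "-> below hypothetical threshold")
    ```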

  18. Demultiplexing based on frequency-domain joint decision MMA for MDM system

    NASA Astrophysics Data System (ADS)

    Caili, Gong; Li, Li; Guijun, Hu

    2016-06-01

    In this paper, we propose a demultiplexing method based on a frequency-domain joint decision multi-modulus algorithm (FD-JDMMA) for mode division multiplexing (MDM) systems. The performance of FD-JDMMA is compared with the frequency-domain multi-modulus algorithm (FD-MMA) and the frequency-domain least mean square (FD-LMS) algorithm. The simulation results show that FD-JDMMA outperforms FD-MMA in terms of BER and convergence speed for mQAM (m=4, 16 and 64) formats. It is also demonstrated that FD-JDMMA achieves better BER performance and converges faster than FD-LMS for 16QAM and 64QAM. Furthermore, FD-JDMMA maintains a computational complexity similar to that of both equalization algorithms.

  19. Multi-Institutional Evaluation of Digital Tomosynthesis, Dual-Energy Radiography, and Conventional Chest Radiography for the Detection and Management of Pulmonary Nodules.

    PubMed

    Dobbins, James T; McAdams, H Page; Sabol, John M; Chakraborty, Dev P; Kazerooni, Ella A; Reddy, Gautham P; Vikgren, Jenny; Båth, Magnus

    2017-01-01

    Purpose To conduct a multi-institutional, multireader study to compare the performance of digital tomosynthesis, dual-energy (DE) imaging, and conventional chest radiography for pulmonary nodule detection and management. Materials and Methods In this binational, institutional review board-approved, HIPAA-compliant prospective study, 158 subjects (43 subjects with normal findings) were enrolled at four institutions. Informed consent was obtained prior to enrollment. Subjects underwent chest computed tomography (CT) and imaging with conventional chest radiography (posteroanterior and lateral), DE imaging, and tomosynthesis with a flat-panel imaging device. Three experienced thoracic radiologists identified true locations of nodules (n = 516, 3-20-mm diameters) with CT and recommended case management by using Fleischner Society guidelines. Five other radiologists marked nodules and indicated case management by using images from conventional chest radiography, conventional chest radiography plus DE imaging, tomosynthesis, and tomosynthesis plus DE imaging. Sensitivity, specificity, and overall accuracy were measured by using the free-response receiver operating characteristic method and the receiver operating characteristic method for nodule detection and case management, respectively. Results were further analyzed according to nodule diameter categories (3-4 mm, >4 mm to 6 mm, >6 mm to 8 mm, and >8 mm to 20 mm). Results Maximum lesion localization fraction was higher for tomosynthesis than for conventional chest radiography in all nodule size categories (3.55-fold for all nodules, P < .001; 95% confidence interval [CI]: 2.96, 4.15). Case-level sensitivity was higher with tomosynthesis than with conventional chest radiography for all nodules (1.49-fold, P < .001; 95% CI: 1.25, 1.73). Case management decisions showed better overall accuracy with tomosynthesis than with conventional chest radiography, as given by the area under the receiver operating characteristic curve (1.23-fold, P < .001; 95% CI: 1.15, 1.32). There were no differences in any specificity measures. DE imaging did not significantly affect nodule detection when paired with either conventional chest radiography or tomosynthesis. Conclusion Tomosynthesis outperformed conventional chest radiography for lung nodule detection and determination of case management; DE imaging did not show significant differences over conventional chest radiography or tomosynthesis alone. These findings indicate performance likely achievable with a range of reader expertise. © RSNA, 2016 Online supplemental material is available for this article.

  20. Multi-Institutional Evaluation of Digital Tomosynthesis, Dual-Energy Radiography, and Conventional Chest Radiography for the Detection and Management of Pulmonary Nodules

    PubMed Central

    McAdams, H. Page; Sabol, John M.; Chakraborty, Dev P.; Kazerooni, Ella A.; Reddy, Gautham P.; Vikgren, Jenny; Båth, Magnus

    2017-01-01

    Purpose To conduct a multi-institutional, multireader study to compare the performance of digital tomosynthesis, dual-energy (DE) imaging, and conventional chest radiography for pulmonary nodule detection and management. Materials and Methods In this binational, institutional review board–approved, HIPAA-compliant prospective study, 158 subjects (43 subjects with normal findings) were enrolled at four institutions. Informed consent was obtained prior to enrollment. Subjects underwent chest computed tomography (CT) and imaging with conventional chest radiography (posteroanterior and lateral), DE imaging, and tomosynthesis with a flat-panel imaging device. Three experienced thoracic radiologists identified true locations of nodules (n = 516, 3–20-mm diameters) with CT and recommended case management by using Fleischner Society guidelines. Five other radiologists marked nodules and indicated case management by using images from conventional chest radiography, conventional chest radiography plus DE imaging, tomosynthesis, and tomosynthesis plus DE imaging. Sensitivity, specificity, and overall accuracy were measured by using the free-response receiver operating characteristic method and the receiver operating characteristic method for nodule detection and case management, respectively. Results were further analyzed according to nodule diameter categories (3–4 mm, >4 mm to 6 mm, >6 mm to 8 mm, and >8 mm to 20 mm). Results Maximum lesion localization fraction was higher for tomosynthesis than for conventional chest radiography in all nodule size categories (3.55-fold for all nodules, P < .001; 95% confidence interval [CI]: 2.96, 4.15). Case-level sensitivity was higher with tomosynthesis than with conventional chest radiography for all nodules (1.49-fold, P < .001; 95% CI: 1.25, 1.73). Case management decisions showed better overall accuracy with tomosynthesis than with conventional chest radiography, as given by the area under the receiver operating characteristic curve (1.23-fold, P < .001; 95% CI: 1.15, 1.32). There were no differences in any specificity measures. DE imaging did not significantly affect nodule detection when paired with either conventional chest radiography or tomosynthesis. Conclusion Tomosynthesis outperformed conventional chest radiography for lung nodule detection and determination of case management; DE imaging did not show significant differences over conventional chest radiography or tomosynthesis alone. These findings indicate performance likely achievable with a range of reader expertise. © RSNA, 2016 Online supplemental material is available for this article. PMID:27439324

  1. Organizational Context Matters: A Research Toolkit for Conducting Standardized Case Studies of Integrated Care Initiatives

    PubMed Central

    Grudniewicz, Agnes; Gray, Carolyn Steele; Wodchis, Walter P.; Carswell, Peter; Baker, G. Ross

    2017-01-01

    Introduction: The variable success of integrated care initiatives has led experts to recommend tailoring design and implementation to the organizational context. Yet, organizational contexts are rarely described, understood, or measured with sufficient depth and breadth in empirical studies or in practice. We thus lack knowledge of when and specifically how organizational contexts matter. To facilitate the accumulation of evidence, we developed a research toolkit for conducting case studies using standardized measures of the (inter-)organizational context for integrating care. Theory and Methods: We used a multi-method approach to develop the research toolkit: (1) development and validation of the Context and Capabilities for Integrating Care (CCIC) Framework, (2) identification, assessment, and selection of survey instruments, (3) development of document review methods, (4) development of interview guide resources, and (5) pilot testing of the document review guidelines, consolidated survey, and interview guide. Results: The toolkit provides a framework and measurement tools that examine 18 organizational and inter-organizational factors that affect the implementation and success of integrated care initiatives. Discussion and Conclusion: The toolkit can be used to characterize and compare organizational contexts across cases and enable comparison of results across studies. This information can enhance our understanding of the influence of organizational contexts, support the transfer of best practices, and help explain why some integrated care initiatives succeed and some fail. PMID:28970750

  2. Day-Case Treatment of Peripheral Arterial Disease: Results from a Multi-Center European Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spiliopoulos, Stavros, E-mail: stavspiliop@med.uoa.gr, E-mail: stavspiliop@upatras.gr; Karnabatidis, Dimitrios, E-mail: karnaby@med.upatras.gr; Katsanos, Konstantinos, E-mail: katsanos@med.upatras.gr

    Purpose: The purpose of the study was to investigate the safety and feasibility of day-case endovascular procedures for the management of peripheral arterial disease. Materials and Methods: This was a multi-center, retrospective study including all patients treated over a 30-month period with endovascular angioplasty or stenting for intermittent claudication (IC) or critical limb ischemia (CLI) on a day-case basis, in Interventional Radiology (IR) departments of three European tertiary hospitals. Exclusion criteria were not related to the type of lesion and included unavailability of an adult able to take care of the patient overnight, high bleeding risk, and ASA score ≥4. The primary efficacy outcome was the rate of procedures performed on an outpatient basis requiring no further hospitalization, and the primary safety outcome was freedom from major complications at 30 days. Results: The study included 652 patients (male 75 %; mean age 68 ± 10 years; range: 27–93), 24.6 % treated for CLI. In 53.3 % of the cases a 6Fr sheath was used. Technical success was 97.1 %. Haemostasis was obtained by manual compression in 52.4 % of the accesses. The primary efficacy outcome occurred in 95.4 % (622/652 patients) and the primary safety outcome in 98.6 % (643/652 patients). Major complications included five (0.7 %) retroperitoneal hematomas requiring transfusion; one (0.1 %) common femoral artery pseudoaneurysm successfully treated with US-guided thrombin injection; two cases of intra-procedural distal embolization treated with catheter-directed local thrombolysis; and one on-table cardiac arrest necessitating >24 h recovery. No major complication was noted after same-day discharge. Conclusions: Day-case endovascular procedures for the treatment of IC or CLI can be safely and efficiently performed in experienced IR departments of large tertiary hospitals.

  3. Examination of multi-model ensemble seasonal prediction methods using a simple climate system

    NASA Astrophysics Data System (ADS)

    Kang, In-Sik; Yoo, Jin Ho

    2006-02-01

    A simple climate model was designed as a proxy for the real climate system, and a number of prediction models were generated by slightly perturbing the physical parameters of the simple model. A set of long (240-year) historical hindcast predictions was performed with the various prediction models and used to examine issues of multi-model ensemble seasonal prediction, such as the best ways of blending multiple models and the selection of models. Based on these results, we suggest a feasible way of maximizing the benefit of using multiple models in seasonal prediction. In particular, three types of multi-model ensemble prediction systems, i.e., the simple composite, the superensemble, and the composite of statistically corrected individual predictions (corrected composite), are examined and compared. The superensemble suffers more from overfitting than the others, especially for small training samples and/or weak external forcing, and the corrected composite produces the best prediction skill among the multi-model systems.
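
    The three ensemble schemes compared above can be contrasted with a few lines of numpy on synthetic data: the simple composite averages raw model output, the corrected composite averages individually regression-corrected output (one plausible reading of "statistically correcting"), and the superensemble fits a single multiple regression across all models. The data and regression choices below are assumptions, not the paper's setup.

    ```python
    # Synthetic comparison of simple composite, corrected composite, and superensemble.
    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_test, n_models = 40, 200, 5
    truth = rng.standard_normal(n_train + n_test)
    # each "model" is the truth plus its own bias, scaling error and noise
    models = np.stack([0.5 * m + (0.6 + 0.1 * m) * truth + 0.8 * rng.standard_normal(truth.size)
                       for m in range(n_models)], axis=1)

    tr, te = slice(0, n_train), slice(n_train, None)

    simple = models[te].mean(axis=1)

    # corrected composite: regress each model onto observations, then average
    corrected = np.column_stack([np.polyval(np.polyfit(models[tr, m], truth[tr], 1), models[te, m])
                                 for m in range(n_models)]).mean(axis=1)

    # superensemble: one multiple linear regression using all models as predictors
    A = np.column_stack([models[tr], np.ones(n_train)])
    coef, *_ = np.linalg.lstsq(A, truth[tr], rcond=None)
    superens = np.column_stack([models[te], np.ones(n_test)]) @ coef

    for name, pred in [("simple", simple), ("corrected", corrected), ("superensemble", superens)]:
        print(name, "RMSE:", np.sqrt(np.mean((pred - truth[te]) ** 2)))
    ```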

  4. Unintended outcomes evaluation approach: A plausible way to evaluate unintended outcomes of social development programmes.

    PubMed

    Jabeen, Sumera

    2018-06-01

    Social development programmes are deliberate attempts to bring about change, and unintended outcomes can be considered inherent to any such intervention. There is now a solid consensus among the international evaluation community regarding the need to consider unintended outcomes as a key aspect in any evaluative study. However, this concern often amounts to little more than false piety. Existing evaluation theory suffers from overlapping terminology, inadequate categorisation of unintended outcomes, and a lack of guidance on how to study them. To advance the knowledge of evaluation theory, methods and practice, the author has developed an evaluation approach to study unintended effects using a theory building, testing and refinement process. A comprehensive classification of unintended outcomes on the basis of knowability, value, distribution and temporality helped specify the various types of unintended outcomes for programme evaluation. Corresponding to this classification, a three-step evaluation process was proposed: (a) outlining programme intentions; (b) forecasting likely unintended effects; and (c) mapping the anticipated, and understanding the unanticipated, unintended outcomes. This unintended outcomes evaluation approach (UOEA) was then trialled through a multi-site and multi-method case study of a poverty alleviation programme in Pakistan, and refinements were made to the approach. The case study revealed that the programme was producing a number of unintended effects, mostly negative, affecting those already disadvantaged such as the poorest, women and children. The trialling process demonstrated the effectiveness of the UOEA and suggests that it can serve as a useful guide for future evaluation practice. It also provides the discipline of evaluation with an empirically based reference point for further theoretical developments in the study of unintended outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Multi-fluid Dynamics for Supersonic Jet-and-Crossflows and Liquid Plug Rupture

    NASA Astrophysics Data System (ADS)

    Hassan, Ezeldin A.

    Multi-fluid dynamics simulations require appropriate numerical treatments based on the main flow characteristics, such as flow speed, turbulence, thermodynamic state, and time and length scales. In this thesis, two distinct problems are investigated: supersonic jet and crossflow interactions, and liquid plug propagation and rupture in an airway. The gaseous non-reactive ethylene jet and air crossflow simulation represents essential physics for fuel injection in SCRAMJET engines. The regime is highly unsteady, involving shocks, turbulent mixing, and large-scale vortical structures. An eddy-viscosity-based multi-scale turbulence model is proposed to resolve turbulent structures consistent with grid resolution and turbulence length scales. Predictions of the time-averaged fuel concentration from the multi-scale model are improved over Reynolds-averaged Navier-Stokes models originally derived for stationary flow. The benefit of the multi-scale model alone is, however, limited in cases where the vortical structures are small and scattered, which would require prohibitively expensive grids to resolve the flow field accurately. Statistical information related to turbulent fluctuations is utilized to estimate an effective turbulent Schmidt number, which is shown to vary strongly in space. Accordingly, an adaptive turbulent Schmidt number approach is proposed, by allowing the resolved field to adaptively influence the value of the turbulent Schmidt number in the multi-scale turbulence model. The proposed model estimates a time-averaged turbulent Schmidt number adapted to the computed flowfield, instead of the constant value common to eddy-viscosity-based Navier-Stokes models. This approach is assessed using a grid-refinement study for the normal injection case and tested with 30-degree injection, showing improved results over the constant turbulent Schmidt number model in both the mean and the variance of fuel concentration predictions. For the incompressible liquid plug propagation and rupture study, numerical simulations are conducted using an Eulerian-Lagrangian approach with a continuous-interface method. A reconstruction scheme is developed to allow topological changes during plug rupture by altering the connectivity information of the interface mesh. Rupture time is shown to be delayed as the initial precursor film thickness increases. During the plug rupture process, a sudden increase of mechanical stresses on the tube wall is recorded, which can cause tissue damage.

  6. Coordination of networked systems on digraphs with multiple leaders via pinning control

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Lewis, Frank L.

    2012-02-01

    It is well known that achieving consensus among a group of multi-vehicle systems by local distributed control is feasible if and only if all nodes in the communication digraph are reachable from a single (root) node. In this article, we consider the more general case in which the communication digraph of the networked multi-vehicle systems is weakly connected and has two or more zero-in-degree, strongly connected subgraphs, i.e. there are two or more leader groups. Based on the pinning control strategy, the feasibility of achieving second-order controlled consensus is studied. First, a necessary and sufficient condition is given for the case of fixed topology. Then the method for designing the controller and the rule for choosing the pinned vehicles are discussed. The proposed approach allows us to extend several existing results for undirected graphs to directed balanced graphs. A sufficient condition is proposed for the case where the coupling topology is variable. As an illustrative example, a second-order controlled consensus scheme is applied to coordinate the movement of networked multiple mobile robots.

  7. Automatic seizure detection based on the combination of newborn multi-channel EEG and HRV information

    NASA Astrophysics Data System (ADS)

    Mesbah, Mostefa; Balakrishnan, Malarvili; Colditz, Paul B.; Boashash, Boualem

    2012-12-01

    This article proposes a new method for newborn seizure detection that uses information extracted from both multi-channel electroencephalogram (EEG) and a single channel electrocardiogram (ECG). The aim of the study is to assess whether additional information extracted from ECG can improve the performance of seizure detectors based solely on EEG. Two different approaches were used to combine this extracted information. The first approach, known as feature fusion, involves combining features extracted from EEG and heart rate variability (HRV) into a single feature vector prior to feeding it to a classifier. The second approach, called classifier or decision fusion, is achieved by combining the independent decisions of the EEG and the HRV-based classifiers. Tested on recordings obtained from eight newborns with identified EEG seizures, the proposed neonatal seizure detection algorithms achieved 95.20% sensitivity and 88.60% specificity for the feature fusion case and 95.20% sensitivity and 94.30% specificity for the classifier fusion case. These results are considerably better than those involving classifiers using EEG only (80.90%, 86.50%) or HRV only (85.70%, 84.60%).
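
    The distinction between the two fusion strategies is easy to sketch. The example below, using synthetic feature matrices and scikit-learn SVMs (not the classifiers used in the study), concatenates features for feature fusion and averages per-classifier probabilities for decision fusion.

    ```python
    # Schematic feature fusion vs. decision fusion on synthetic EEG/HRV features.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_predict
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X_eeg, y = make_classification(n_samples=200, n_features=12, random_state=1)
    X_hrv = X_eeg[:, :4] + 0.5 * rng.standard_normal((200, 4))   # correlated "HRV" features

    # 1) feature fusion: concatenate feature vectors, train one classifier
    X_fused = np.hstack([X_eeg, X_hrv])
    pred_feat = cross_val_predict(SVC(probability=True), X_fused, y, cv=5)

    # 2) decision fusion: train separate classifiers, combine their probabilities
    p_eeg = cross_val_predict(SVC(probability=True), X_eeg, y, cv=5, method="predict_proba")
    p_hrv = cross_val_predict(SVC(probability=True), X_hrv, y, cv=5, method="predict_proba")
    pred_dec = ((p_eeg[:, 1] + p_hrv[:, 1]) / 2 > 0.5).astype(int)

    print("feature fusion accuracy:", (pred_feat == y).mean())
    print("decision fusion accuracy:", (pred_dec == y).mean())
    ```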

  8. Hybrid neural intelligent system to predict business failure in small-to-medium-size enterprises.

    PubMed

    Borrajo, M Lourdes; Baruque, Bruno; Corchado, Emilio; Bajo, Javier; Corchado, Juan M

    2011-08-01

    In recent years there has been a growing need for innovative tools that can help small-to-medium-sized enterprises predict business failure as well as financial crises. In this study we present a novel hybrid intelligent system aimed at monitoring the modus operandi of companies and predicting possible failures. The system is implemented by means of a neural-based multi-agent system that models the different actors of the companies as agents. The core of the multi-agent system is a type of agent that incorporates a case-based reasoning system and automates the business control process and failure prediction. The stages of the case-based reasoning system are implemented by means of web services: the retrieval stage uses an innovative weighted-voting summarization of self-organizing-map ensembles, and the reuse stage is implemented by means of a radial basis function neural network. An initial prototype was developed, and the results obtained for small and medium enterprises in a real scenario are presented.

  9. Real-Time PCR Typing of Escherichia coli Based on Multiple Single Nucleotide Polymorphisms--a Convenient and Rapid Method.

    PubMed

    Lager, Malin; Mernelius, Sara; Löfgren, Sture; Söderman, Jan

    2016-01-01

    Healthcare-associated infections caused by Escherichia coli and antibiotic resistance due to extended-spectrum beta-lactamase (ESBL) production constitute a threat to patient safety. To identify, track, and control outbreaks and to detect emerging virulent clones, typing tools of sufficient discriminatory power that generate reproducible and unambiguous data are needed. A probe-based real-time PCR method targeting multiple single nucleotide polymorphisms (SNP) was developed. The method was based on the multi-locus sequence typing scheme of the Institut Pasteur and on adaptation of previously described typing assays. An 8-SNP panel that reached a Simpson's diversity index of 0.95 was established, based on analysis of sporadic E. coli cases (ESBL n = 27 and non-ESBL n = 53). This multi-SNP assay was used to identify the sequence type 131 (ST131) complex according to the Achtman multi-locus sequence typing scheme. However, it did not fully discriminate within the complex but provided a diagnostic signature that outperformed a previously described detection assay. Pulsed-field gel electrophoresis typing of isolates from a presumed outbreak (n = 22) identified two outbreaks (ST127 and ST131) and three different non-outbreak-related isolates. Multi-SNP typing generated congruent data except for one non-outbreak-related ST131 isolate. We consider multi-SNP real-time PCR typing an accessible primary generic E. coli typing tool for rapid and uniform type identification.
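
    The Simpson's diversity index quoted above can be computed directly from the number of isolates falling into each type. A minimal sketch, with made-up SNP profiles, is shown below.

    ```python
    # Simpson's index of diversity for a typing scheme; the isolate counts are invented.
    from collections import Counter

    def simpsons_diversity(type_assignments):
        """D = 1 - sum(n_j*(n_j-1)) / (N*(N-1)), where n_j is the size of type j."""
        counts = Counter(type_assignments)
        n = sum(counts.values())
        return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

    # hypothetical 8-SNP profiles (one string per isolate)
    profiles = ["AGTCAGTC"] * 5 + ["AGTCAGTT"] * 3 + ["AGCCAGTC"] * 2 + ["TGTCAGTC"] * 2
    print("Simpson's diversity index:", round(simpsons_diversity(profiles), 3))
    ```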

  10. Multi-Population Invariance with Dichotomous Measures: Combining Multi-Group and MIMIC Methodologies in Evaluating the General Aptitude Test in the Arabic Language

    ERIC Educational Resources Information Center

    Sideridis, Georgios D.; Tsaousis, Ioannis; Al-harbi, Khaleel A.

    2015-01-01

    The purpose of the present study was to extend the model of measurement invariance by simultaneously estimating invariance across multiple populations in the dichotomous instrument case using multi-group confirmatory factor analytic and multiple indicator multiple causes (MIMIC) methodologies. Using the Arabic version of the General Aptitude Test…

  11. Physical and Electronic Isolation of Carbon Nanotube Conductors

    NASA Technical Reports Server (NTRS)

    OKeeffe, James; Biegel, Bryan (Technical Monitor)

    2001-01-01

    Multi-walled nanotubes are proposed as a method to electrically and physically isolate nanoscale conductors from their surroundings. We use tight binding (TB) and density functional theory (DFT) to simulate the effects of an external electric field on multi-wall nanotubes. Two categories of multi-wall nanotube are investigated, those with metallic and semiconducting outer shells. In the metallic case, simulations show that the outer wall effectively screens the inner core from an applied electric field. This offers the ability to reduce crosstalk between nanotube conductors. A semiconducting outer shell is found not to perturb an electric field incident on the inner core, thereby providing physical isolation while allowing the tube to remain electrically coupled to its surroundings.

  12. An Examination of Diversity within Three Southeastern Academic Libraries: A Mixed-Methods, Multi-Site Study

    ERIC Educational Resources Information Center

    Shaffer, Christopher A.

    2014-01-01

    The purpose of this study was to determine the extent to which three academic libraries in the Southeastern United States could be considered diverse. This was a multi-site, mixed methods study. It examined the climate and culture of the libraries, which was assessed through two methods; the first, through survey responses from full-time faculty…

  13. Mathematical model of snake-type multi-directional wave generation

    NASA Astrophysics Data System (ADS)

    Muarif; Halfiani, Vera; Rusdiana, Siti; Munzir, Said; Ramli, Marwan

    2018-01-01

    Research on extreme wave generation is an active area of water wave study because such waves in the ocean can cause serious damage to ships and offshore structures. One method used to generate these waves is self-correcting, which controls the signal sent to the wavemakers in a wave tank. Some studies also consider nonlinear wave generation in a wave tank using a numerical approach. The study of wave generation is essential for effective and efficient model testing of offshore structures before they are operated in the ocean. Generally, two types of wavemakers are implemented in hydrodynamic laboratories: piston-type and flap-type. The flap-type is preferred for testing a ship in deep water. The single-flap wavemaker has been explained in many studies, yet the snake-type wavemaker (with more than one flap) still needs to be examined. Hence, the formulation controlling the wavemaker needs to be precisely analyzed so that the given input generates the desired wave in the space-limited wave tank. By applying the same analogy and methodology as the previous study, this article presents multi-directional wave generation implemented with snake-type wavemakers.

  14. SU-E-T-395: Multi-GPU-Based VMAT Treatment Plan Optimization Using a Column-Generation Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Z; Shi, F; Jia, X

    Purpose: GPU has been employed to speed up VMAT optimizations from hours to minutes. However, its limited memory capacity makes it difficult to handle cases with a huge dose-deposition-coefficient (DDC) matrix, e.g. those with a large target size, multiple arcs, small beam angle intervals and/or small beamlet size. We propose multi-GPU-based VMAT optimization to solve this memory issue and make GPU-based VMAT more practical for clinical use. Methods: Our column-generation-based method generates apertures sequentially by iteratively searching for an optimal feasible aperture (referred to as the pricing problem, PP) and optimizing aperture intensities (referred to as the master problem, MP). The PP requires access to the large DDC matrix, which is implemented on a multi-GPU system. Each GPU stores a DDC sub-matrix corresponding to one fraction of the beam angles and is only responsible for calculations related to those angles. Broadcast and parallel-reduction schemes are adopted for inter-GPU data transfer. The MP is a relatively small-scale problem and is implemented on one GPU. One head-and-neck cancer case was used for testing. Three different strategies for VMAT optimization on a single GPU were also implemented for comparison: (S1) truncating the DDC matrix to ignore its small-value entries during optimization; (S2) transferring the DDC matrix part by part to the GPU during optimization whenever needed; (S3) moving DDC-matrix-related calculations onto the CPU. Results: Our multi-GPU-based implementation reaches a good plan within 1 minute. Although S1 was 10 seconds faster than our method, the obtained plan quality is worse. Both S2 and S3 handle the full DDC matrix and hence yield the same plan as our method. However, the computation times are longer, namely 4 minutes and 30 minutes, respectively. Conclusion: Our multi-GPU-based VMAT optimization can effectively solve the limited-memory issue with good plan quality and high efficiency, making GPU-based ultra-fast VMAT planning practical for real clinical use.
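
    The data layout described for the pricing problem, splitting the DDC matrix by beam angle across devices and reducing the partial results, can be mimicked on a single machine with plain numpy, as in the rough sketch below. Array sizes and the absence of real GPU code are assumptions made purely for illustration.

    ```python
    # Partition the (sparse-ish) DDC matrix by beamlet block, compute partial doses
    # per "device", then reduce; a single-process stand-in for the multi-GPU scheme.
    import numpy as np

    rng = np.random.default_rng(0)
    n_voxels, n_beamlets, n_devices = 5000, 1200, 4

    ddc = rng.random((n_voxels, n_beamlets)) * (rng.random((n_voxels, n_beamlets)) < 0.05)
    intensities = rng.random(n_beamlets)

    # split the beamlet (beam-angle) dimension across devices
    chunks = np.array_split(np.arange(n_beamlets), n_devices)
    partials = [ddc[:, c] @ intensities[c] for c in chunks]   # one partial dose per device

    dose = np.sum(partials, axis=0)                           # parallel-reduction step
    assert np.allclose(dose, ddc @ intensities)
    ```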

  15. Building an international network for a primary care research program: reflections on challenges and solutions in the set-up and delivery of a prospective observational study of acute cough in 13 European countries

    PubMed Central

    2011-01-01

    Background Implementing a primary care clinical research study in several countries can make it possible to recruit sufficient patients in a short period of time that allows important clinical questions to be answered. Large multi-country studies in primary care are unusual and are typically associated with challenges requiring innovative solutions. We conducted a multi-country study and through this paper, we share reflections on the challenges we faced and some of the solutions we developed with a special focus on the study set up, structure and development of Primary Care Networks (PCNs). Method GRACE-01 was a multi-European country, investigator-driven prospective observational study implemented by 14 Primary Care Networks (PCNs) within 13 European Countries. General Practitioners (GPs) recruited consecutive patients with an acute cough. GPs completed a case report form (CRF) and the patient completed a daily symptom diary. After study completion, the coordinating team discussed the phases of the study and identified challenges and solutions that they considered might be interesting and helpful to researchers setting up a comparable study. Results The main challenges fell within three domains as follows: i) selecting, setting up and maintaining PCNs; ii) designing local context-appropriate data collection tools and efficient data management systems; and iii) gaining commitment and trust from all involved and maintaining enthusiasm. The main solutions for each domain were: i) appointing key individuals (National Network Facilitator and Coordinator) with clearly defined tasks, involving PCNs early in the development of study materials and procedures. ii) rigorous back translations of all study materials and the use of information systems to closely monitor each PCNs progress; iii) providing strong central leadership with high level commitment to the value of the study, frequent multi-method communication, establishing a coherent ethos, celebrating achievements, incorporating social events and prizes within meetings, and providing a framework for exploitation of local data. Conclusions Many challenges associated with multi-country primary care research can be overcome by engendering strong, effective communication, commitment and involvement of all local researchers. The practical solutions identified and the lessons learned in implementing the GRACE-01 study may assist in establishing other international primary care clinical research platforms. Trial registration ClinicalTrials.gov Identifier: NCT00353951 PMID:21794112

  16. Detecting Water Bodies in LANDSAT8 Oli Image Using Deep Learning

    NASA Astrophysics Data System (ADS)

    Jiang, W.; He, G.; Long, T.; Ni, Y.

    2018-04-01

    Identifying water bodies is critical for studies of climate change, water resources, ecosystem services and the hydrological cycle. The multi-layer perceptron (MLP) is a popular and classic method under the deep learning framework for target detection and image classification. This study therefore adopts the MLP to identify water bodies in Landsat 8 imagery. To compare classification performance, maximum likelihood and water-index methods are employed for each study area. The classification results are evaluated using accuracy indices and local comparisons. The evaluation shows that the multi-layer perceptron achieves better performance than the other two methods; moreover, thin water bodies can also be clearly identified by the multi-layer perceptron. The proposed method has application potential for mapping global-scale surface water with multi-source medium-high resolution satellite data.
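
    A minimal per-pixel MLP classification of a multi-band scene might look like the following scikit-learn sketch; the image array, training mask, and network size are placeholders rather than the study's actual data or architecture.

    ```python
    # Pixel-wise MLP water/non-water classification on a placeholder multi-band image.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rows, cols, bands = 200, 200, 7                      # stand-in for a real OLI subset
    scene = np.random.rand(rows, cols, bands)            # replace with the actual image
    labels = (scene[..., 4] > 0.7).astype(int)           # replace with a digitised water mask

    X = scene.reshape(-1, bands)
    y = labels.ravel()

    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=300,
                                      random_state=0))
    clf.fit(X, y)

    water_map = clf.predict(X).reshape(rows, cols)       # per-pixel water / non-water map
    print("predicted water fraction:", water_map.mean())
    ```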

  17. Improved Holistic Analysis of Rayleigh Waves for Single- and Multi-Offset Data: Joint Inversion of Rayleigh-Wave Particle Motion and Vertical- and Radial-Component Velocity Spectra

    NASA Astrophysics Data System (ADS)

    Dal Moro, Giancarlo; Moustafa, Sayed S. R.; Al-Arifi, Nassir S.

    2018-01-01

    Rayleigh waves often propagate according to complex mode excitation so that the proper identification and separation of specific modes can be quite difficult or, in some cases, just impossible. Furthermore, the analysis of a single component (i.e., an inversion procedure based on just one objective function) necessarily prevents solving the problems related to the non-uniqueness of the solution. To overcome these issues and define a holistic analysis of Rayleigh waves, we implemented a procedure to acquire data that are useful to define and efficiently invert the three objective functions defined from the three following "objects": the velocity spectra of the vertical- and radial-components and the Rayleigh-wave particle motion (RPM) frequency-offset data. Two possible implementations are presented. In the first case we consider classical multi-offset (and multi-component) data, while in a second possible approach we exploit the data recorded by a single three-component geophone at a fixed offset from the source. Given the simple field procedures, the method could be particularly useful for the unambiguous geotechnical exploration of large areas, where more complex acquisition procedures, based on the joint acquisition of Rayleigh and Love waves, would not be economically viable. After illustrating the different kinds of data acquisition and the data processing, the results of the proposed methodology are illustrated in a case study. Finally, a series of theoretical and practical aspects are discussed to clarify some issues involved in the overall procedure (data acquisition and processing).

  18. Differentiation of Glioblastoma and Lymphoma Using Feature Extraction and Support Vector Machine.

    PubMed

    Yang, Zhangjing; Feng, Piaopiao; Wen, Tian; Wan, Minghua; Hong, Xunning

    2017-01-01

    Differentiation of glioblastoma multiforme (GBM) and lymphoma using multi-sequence magnetic resonance imaging (MRI) is an important task that is valuable for treatment planning. However, this task is challenging because GBMs and lymphomas may have a similar appearance in MRI images. This similarity may lead to misclassification and could affect treatment results. In this paper, we propose a semi-automatic method based on multi-sequence MRI to differentiate these two types of brain tumors. Our method consists of three steps: 1) the key slice is selected from the 3D MRI and regions of interest (ROIs) are drawn around the tumor region; 2) different features are extracted based on prior clinical knowledge and validated using a t-test; and 3) features that are helpful for classification are used to build a feature vector, and a support vector machine is applied to perform classification. In total, 58 GBM cases and 37 lymphoma cases are used to validate our method. A leave-one-out cross-validation strategy is adopted in our experiments. The global accuracy of our method was determined as 96.84%, which indicates that our method is effective for the differentiation of GBM and lymphoma and can be applied in clinical diagnosis. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
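
    The three-step pipeline (features, t-test screening, SVM with leave-one-out cross-validation) can be outlined as below. The feature matrix is synthetic and the feature-extraction step is omitted, so this is a schematic of the workflow rather than the authors' code.

    ```python
    # Schematic t-test feature screening followed by SVM with leave-one-out CV.
    import numpy as np
    from scipy import stats
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneOut
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.standard_normal((95, 20))          # 58 "GBM" + 37 "lymphoma" cases, 20 candidate features
    y = np.array([0] * 58 + [1] * 37)
    X[y == 1, :5] += 1.0                       # make a few features genuinely discriminative

    # keep features whose group means differ significantly (two-sample t-test)
    # (for a rigorous estimate the t-test should be re-run inside each training fold)
    _, p = stats.ttest_ind(X[y == 0], X[y == 1], axis=0)
    selected = p < 0.05

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    correct = 0
    for train, test in LeaveOneOut().split(X):
        clf.fit(X[np.ix_(train, selected)], y[train])
        correct += int(clf.predict(X[np.ix_(test, selected)])[0] == y[test][0])
    print("LOO accuracy:", correct / len(y))
    ```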

  19. Application of Wavelet-Based Methods for Accelerating Multi-Time-Scale Simulation of Bistable Heterogeneous Catalysis

    DOE PAGES

    Gur, Sourav; Frantziskonis, George N.; Univ. of Arizona, Tucson, AZ; ...

    2017-02-16

    Here, we report results from a numerical study of multi-time-scale bistable dynamics for CO oxidation on a catalytic surface in a flowing, well-mixed gas stream. The problem is posed in terms of surface and gas-phase submodels that dynamically interact in the presence of stochastic perturbations, reflecting the impact of molecular-scale fluctuations on the surface and turbulence in the gas. Wavelet-based methods are used to encode and characterize the temporal dynamics produced by each submodel and detect the onset of sudden state shifts (bifurcations) caused by nonlinear kinetics. When impending state shifts are detected, a more accurate but computationally expensive integration scheme can be used. This appears to make it possible, at least in some cases, to decrease the net computational burden associated with simulating multi-time-scale, nonlinear reacting systems by limiting the amount of time in which the more expensive integration schemes are required. Critical to achieving this is being able to detect unstable temporal transitions such as the bistable shifts in the example problem considered here. Lastly, our results indicate that a unique wavelet-based algorithm based on the Lipschitz exponent is capable of making such detections, even under noisy conditions, and may find applications in critical transition detection problems beyond catalysis.
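
    One simple way to flag sudden state shifts from wavelet coefficients is to threshold the energy of the fine-scale detail coefficients, as in the PyWavelets sketch below; this is a crude stand-in for the paper's Lipschitz-exponent algorithm, and the synthetic signal and threshold rule are assumptions.

    ```python
    # Crude change detection from single-level wavelet detail coefficients (PyWavelets).
    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    t = np.arange(4096)
    signal = np.where(t < 2048, 0.2, 0.9) + 0.05 * rng.standard_normal(t.size)  # one state shift

    # fine-scale detail coefficients respond strongly to the abrupt transition
    _, detail = pywt.dwt(signal, "db4")
    energy = detail ** 2
    flag = energy > 10 * np.median(energy)            # simple threshold rule (assumed)

    shift_locations = 2 * np.flatnonzero(flag)        # approximate mapping back to samples
    print("possible state shifts near samples:", shift_locations[:5])
    ```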

  20. A General Formulation for Robust and Efficient Integration of Finite Differences and Phase Unwrapping on Sparse Multidimensional Domains

    NASA Astrophysics Data System (ADS)

    Costantini, Mario; Malvarosa, Fabio; Minati, Federico

    2010-03-01

    Phase unwrapping and the integration of finite differences are key problems in several technical fields. In SAR interferometry, and in differential and persistent scatterer interferometry, digital elevation models and displacement measurements can be obtained after unambiguously determining the phase values and reconstructing the mean velocities and elevations of the observed targets, which can be done by integrating differential estimates of these quantities (finite differences between neighboring points). In this paper we propose a general formulation for robust and efficient integration of finite differences and phase unwrapping, which includes standard techniques as sub-cases. The proposed approach allows more reliable and accurate solutions to be obtained by exploiting redundant differential estimates (not only between nearest neighboring points), multi-dimensional information (e.g. multi-temporal, multi-frequency, multi-baseline observations), and external data (e.g. GPS measurements). The proposed approach requires the solution of linear or quadratic programming problems, for which computationally efficient algorithms exist. Validation tests on real SAR data confirm the validity of the method, which has been integrated into our production chain and successfully used in massive production runs.
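
    The least-squares special case of integrating redundant finite differences reduces to solving one sparse linear system. The sketch below, assuming equal edge weights and a synthetic 1-D point set, illustrates that sub-case only; the paper's general formulation uses linear or quadratic programming instead.

    ```python
    # Weighted least-squares integration of redundant pairwise differences (L2 analogue).
    import numpy as np
    from scipy.sparse import coo_matrix
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(0)
    n_points = 50
    truth = rng.standard_normal(n_points)

    # redundant differential estimates: nearest neighbours plus some longer "arcs"
    edges = [(i, i + 1) for i in range(n_points - 1)] + \
            [(i, i + 5) for i in range(n_points - 5)]
    d = np.array([truth[j] - truth[i] + 0.01 * rng.standard_normal() for i, j in edges])
    w = np.ones(len(edges))                       # per-edge weights (all equal here)

    # one row per edge: -w at node i, +w at node j
    rows = np.repeat(np.arange(len(edges)), 2)
    cols = np.array([[i, j] for i, j in edges]).ravel()
    vals = np.tile([-1.0, 1.0], len(edges)) * np.repeat(w, 2)
    A = coo_matrix((vals, (rows, cols)), shape=(len(edges), n_points))

    x = lsqr(A, w * d)[0]                         # values known up to a constant offset
    x -= x[0] - truth[0]                          # fix the reference point
    print("max reconstruction error:", np.abs(x - truth).max())
    ```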

  1. Car-to-pedestrian collision reconstruction with injury as an evaluation index.

    PubMed

    Weng, Yiliu; Jin, Xianlong; Zhao, Zhijie; Zhang, Xiaoyun

    2010-07-01

    Reconstruction of accidents is currently considered as a useful means in the analysis of accidents. By multi-body dynamics and numerical methods, and by adopting vehicle and pedestrian models, the scenario of the crash can often be simulated. When reconstructing the collisions, questions often arise regarding the criteria for the evaluation of simulation results. This paper proposes a reconstruction method for car-to-pedestrian collisions based on injuries of the pedestrians. In this method, pedestrian injury becomes a critical index in judging the correctness of the reconstruction result and guiding the simulation process. Application of this method to a real accident case is also presented in this paper. The study showed a good agreement between injuries obtained by numerical simulation and that by forensic identification. Copyright 2010 Elsevier Ltd. All rights reserved.

  2. Hermite WENO limiting for multi-moment finite-volume methods using the ADER-DT time discretization for 1-D systems of conservation laws

    DOE PAGES

    Norman, Matthew R.

    2014-11-24

    New Hermite Weighted Essentially Non-Oscillatory (HWENO) interpolants are developed and investigated within the Multi-Moment Finite-Volume (MMFV) formulation using the ADER-DT time discretization. Whereas traditional WENO methods interpolate pointwise, function-based WENO methods explicitly form a non-oscillatory, high-order polynomial over the cell in question. This study chooses a function-based approach and details how fast convergence to optimal weights for smooth flow is ensured. Methods of sixth-, eighth-, and tenth-order accuracy are developed. We compare these against traditional single-moment WENO methods of fifth-, seventh-, ninth-, and eleventh-order accuracy to provide a comparison with more familiar methods from the literature. The new HWENO methods improve upon existing HWENO methods (1) by giving a better resolution of unreinforced contact discontinuities and (2) by only needing a single HWENO polynomial to update both the cell mean value and the cell mean derivative. Test cases to validate and assess these methods include 1-D linear transport, the 1-D inviscid Burgers' equation, and the 1-D inviscid Euler equations. Smooth and non-smooth flows are used for evaluation. These HWENO methods performed better than comparable literature-standard WENO methods for all regimes of discontinuity and smoothness in all tests herein. They exhibit improved optimal accuracy due to the use of derivatives, and they collapse to solutions similar to typical WENO methods when limiting is required. The study concludes that the new HWENO methods are robust and effective when used in the ADER-DT MMFV framework. Finally, these results are intended to demonstrate capability rather than exhaust all possible implementations.
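
    For readers less familiar with the literature-standard baseline, the sketch below implements a classic fifth-order single-moment WENO reconstruction (Jiang-Shu smoothness indicators and ideal weights); it is the kind of scheme the HWENO methods are compared against, not the Hermite variant developed in the paper.

    ```python
    # Classic WENO5 left-biased reconstruction at the i+1/2 interface from cell averages.
    import numpy as np

    def weno5_left(u, eps=1e-6):
        """u holds the five cell averages u[i-2..i+2]; returns the interface value."""
        um2, um1, u0, up1, up2 = u
        # candidate third-order reconstructions on the three sub-stencils
        p0 = (2 * um2 - 7 * um1 + 11 * u0) / 6.0
        p1 = (-um1 + 5 * u0 + 2 * up1) / 6.0
        p2 = (2 * u0 + 5 * up1 - up2) / 6.0
        # smoothness indicators
        b0 = 13 / 12 * (um2 - 2 * um1 + u0) ** 2 + 0.25 * (um2 - 4 * um1 + 3 * u0) ** 2
        b1 = 13 / 12 * (um1 - 2 * u0 + up1) ** 2 + 0.25 * (um1 - up1) ** 2
        b2 = 13 / 12 * (u0 - 2 * up1 + up2) ** 2 + 0.25 * (3 * u0 - 4 * up1 + up2) ** 2
        # nonlinear weights built from the ideal weights (0.1, 0.6, 0.3)
        a = np.array([0.1, 0.6, 0.3]) / (eps + np.array([b0, b1, b2])) ** 2
        w = a / a.sum()
        return w @ np.array([p0, p1, p2])

    print(weno5_left(np.array([1.0, 1.0, 1.0, 0.0, 0.0])))   # near a discontinuity: stays ~1
    print(weno5_left(np.array([1.0, 2.0, 3.0, 4.0, 5.0])))   # smooth data: ~3.5
    ```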

  3. WIND Validation Cases: Computational Study of Thermally-perfect Gases

    NASA Technical Reports Server (NTRS)

    DalBello, Teryn; Georgiadis, Nick (Technical Monitor)

    2002-01-01

    The ability of the WIND Navier-Stokes code to predict the physics of multi-species gases is investigated in support of future high-speed, high-temperature propulsion applications relevant to NASA's Space Transportation efforts. Three benchmark cases are investigated to evaluate the capability of the WIND chemistry model to accurately predict the aerodynamics of multi-species, chemically non-reacting (frozen) gases. Case 1 represents turbulent mixing of a sonic hydrogen jet and a supersonic vitiated air crossflow. Case 2 consists of heated and unheated round supersonic jets exhausting to ambient conditions. Case 3 represents 2-D flow through a converging-diverging Mach 2 nozzle. For Case 1, the WIND results agree fairly well with experimental data and show that significant mixing occurs downstream of the hydrogen injection point. For Case 2, the results show that the Wilke and Sutherland viscosity laws give similar results, and that the available SST turbulence model does not predict round supersonic nozzle flows accurately. For Case 3, the results show that the experimental, frozen, and 1-D gas results agree fairly well, and that frozen, homogeneous, multi-species gas calculations can be approximated by running in perfect-gas mode while specifying the mixture gas constant and ratio of specific heats.

  4. Proposing integrated Shannon's entropy-inverse data envelopment analysis methods for resource allocation problem under a fuzzy environment

    NASA Astrophysics Data System (ADS)

    Çakır, Süleyman

    2017-10-01

    In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select the input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. To demonstrate the practicality of the proposed fuzzy model, a real case application was conducted involving 16 cement firms listed on Borsa Istanbul. The results of the case application indicate that the proposed hybrid model is a viable procedure for handling input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
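
    The crisp Shannon's entropy weighting step can be written in a few lines: normalise each criterion column, compute its entropy, and weight criteria by their degree of diversification. The decision matrix below is synthetic, and the imprecise/interval extension used in the paper is not reproduced.

    ```python
    # Crisp Shannon's entropy weighting of candidate variables (synthetic decision matrix).
    import numpy as np

    X = np.array([[8.0, 120.0, 0.62],      # rows: firms (DMUs), columns: candidate variables
                  [6.5,  95.0, 0.71],
                  [9.1, 140.0, 0.55],
                  [7.2, 110.0, 0.66]])

    P = X / X.sum(axis=0)                              # normalise each column to a distribution
    k = 1.0 / np.log(X.shape[0])
    entropy = -k * (P * np.log(P)).sum(axis=0)         # Shannon entropy per variable
    diversity = 1.0 - entropy                          # degree of diversification
    weights = diversity / diversity.sum()
    print("entropy:", entropy.round(3), "weights:", weights.round(3))
    ```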

  5. Scheduling and Pricing for Expected Ramp Capability in Real-Time Power Markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ela, Erik; O'Malley, Mark

    2016-05-01

    Higher variable renewable generation penetrations are occurring throughout the world on different power systems. These resources increase the variability and uncertainty on the system, which must be accommodated by an increase in the flexibility of the system resources in order to maintain reliability. Many scheduling strategies have been discussed and introduced to ensure that this flexibility is available at multiple timescales. To meet variability, that is, the expected changes in system conditions, two recent strategies have been introduced: time-coupled multi-period market clearing models and the incorporation of ramp capability constraints. To appropriately evaluate these methods, it is important to assess both efficiency and reliability. But it is also important to assess the incentive structure to ensure that resources asked to perform in different ways have the proper incentives to follow these directions, which is a step often ignored in simulation studies. We find that there are advantages and disadvantages to both approaches. We also find that look-ahead horizon length in multi-period market models can impact incentives. This paper proposes scheduling and pricing methods that ensure expected ramps are met reliably, efficiently, and with associated prices based on true marginal costs that incentivize resources to do as directed by the market. Case studies show improvements of the new method.
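
    A toy version of ramp-aware scheduling can be posed as a small linear program: minimise dispatch cost over several periods subject to energy balance and per-unit ramp limits. The generator data, demand profile, and use of SciPy's linprog below are illustrative assumptions, not the paper's market model.

    ```python
    # Toy multi-period economic dispatch with ramp-capability constraints as an LP.
    import numpy as np
    from scipy.optimize import linprog

    costs = np.array([20.0, 50.0])           # $/MWh: a cheap slow unit and an expensive fast unit
    p_max = np.array([100.0, 80.0])          # MW capacity
    ramp = np.array([10.0, 60.0])            # MW of ramp capability per period
    demand = np.array([120.0, 150.0, 110.0]) # MW load per period
    T, G = len(demand), len(costs)           # decision vector: p[t, g], flattened t-major

    c = np.tile(costs, T)
    A_eq = np.kron(np.eye(T), np.ones((1, G)))       # energy balance: sum_g p[t, g] = demand[t]
    b_eq = demand

    rows = []                                        # ramp limits: |p[t, g] - p[t-1, g]| <= ramp[g]
    for t in range(1, T):
        for g in range(G):
            r = np.zeros(T * G)
            r[t * G + g], r[(t - 1) * G + g] = 1.0, -1.0
            rows.extend([r, -r])
    A_ub = np.array(rows)
    b_ub = np.tile(np.repeat(ramp, 2), T - 1)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0.0, p_max[g]) for _ in range(T) for g in range(G)])
    print("dispatch (MW):\n", res.x.reshape(T, G).round(1))
    if hasattr(res, "eqlin"):                        # duals of the balance constraints
        print("balance duals (prices up to sign convention):", res.eqlin.marginals.round(1))
    ```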

  6. Applying the algorithm "assessing quality using image registration circuits" (AQUIRC) to multi-atlas segmentation

    NASA Astrophysics Data System (ADS)

    Datteri, Ryan; Asman, Andrew J.; Landman, Bennett A.; Dawant, Benoit M.

    2014-03-01

    Multi-atlas registration-based segmentation is a popular technique in the medical imaging community, used to transform anatomical and functional information from a set of atlases onto a new patient that lacks this information. The accuracy of the projected information on the target image depends on the quality of the registrations between the atlas images and the target image. Recently, we developed a technique called AQUIRC that aims at estimating the error of a non-rigid registration at the local level and has been shown to correlate with error in a simulated case. Herein, we extend this work by applying AQUIRC to atlas selection at the local level across multiple structures in cases in which non-rigid registration is difficult. AQUIRC is applied to six structures: the brainstem, optic chiasm, left and right optic nerves, and left and right eyes. We compare the results of AQUIRC to those of popular techniques, including Majority Vote, STAPLE, Non-Local STAPLE, and Locally-Weighted Vote. We show that AQUIRC can be used as a method to combine multiple segmentations and increase the accuracy of the projected information on a target image, and that it is comparable to cutting-edge methods in the multi-atlas segmentation field.
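
    Majority vote, the simplest of the fusion baselines listed above, simply takes the most frequent label across the registered atlases at each voxel; the numpy sketch below uses synthetic label maps in place of warped atlas segmentations.

    ```python
    # Majority-vote label fusion across synthetic atlas label maps.
    import numpy as np

    rng = np.random.default_rng(0)
    n_atlases, shape = 7, (4, 4, 4)
    true_labels = rng.integers(0, 3, size=shape)                 # 3 hypothetical structures
    atlas_labels = np.stack([np.where(rng.random(shape) < 0.8, true_labels,
                                      rng.integers(0, 3, size=shape))
                             for _ in range(n_atlases)])         # each atlas ~80% correct

    # count votes for each label across the atlas axis, then take the winner per voxel
    counts = np.stack([(atlas_labels == k).sum(axis=0) for k in range(3)])
    fused = counts.argmax(axis=0)

    print("voxel agreement with the reference:", (fused == true_labels).mean())
    ```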

  7. Voxel-based Gaussian naïve Bayes classification of ischemic stroke lesions in individual T1-weighted MRI scans.

    PubMed

    Griffis, Joseph C; Allendorfer, Jane B; Szaflarski, Jerzy P

    2016-01-15

    Manual lesion delineation by an expert is the standard for lesion identification in MRI scans, but it is time-consuming and can introduce subjective bias. Alternative methods often require multi-modal MRI data, user interaction, scans from a control population, and/or arbitrary statistical thresholding. We present an approach for automatically identifying stroke lesions in individual T1-weighted MRI scans using naïve Bayes classification. Probabilistic tissue segmentation and image algebra were used to create feature maps encoding information about missing and abnormal tissue. Leave-one-case-out training and cross-validation was used to obtain out-of-sample predictions for each of 30 cases with left hemisphere stroke lesions. Our method correctly predicted lesion locations for 30/30 un-trained cases. Post-processing with smoothing (8mm FWHM) and cluster-extent thresholding (100 voxels) was found to improve performance. Quantitative evaluations of post-processed out-of-sample predictions on 30 cases revealed high spatial overlap (mean Dice similarity coefficient=0.66) and volume agreement (mean percent volume difference=28.91; Pearson's r=0.97) with manual lesion delineations. Our automated approach agrees with manual tracing. It provides an alternative to automated methods that require multi-modal MRI data, additional control scans, or user interaction to achieve optimal performance. Our fully trained classifier has applications in neuroimaging and clinical contexts. Copyright © 2015 Elsevier B.V. All rights reserved.
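
    The classification core, voxel-wise Gaussian naive Bayes with leave-one-case-out prediction, can be sketched as follows; the random feature maps and the Dice computation here are placeholders standing in for the actual missing/abnormal-tissue feature maps and evaluation pipeline.

    ```python
    # Voxel-wise Gaussian naive Bayes with leave-one-case-out prediction (placeholder data).
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(0)
    n_cases, n_voxels, n_features = 30, 2000, 2
    features = rng.random((n_cases, n_voxels, n_features))
    lesion = (features[..., 0] > 0.85).astype(int)        # stand-in manual delineations

    dice = []
    for held_out in range(n_cases):
        train = [i for i in range(n_cases) if i != held_out]
        X_train = features[train].reshape(-1, n_features)
        y_train = lesion[train].ravel()
        clf = GaussianNB().fit(X_train, y_train)

        pred = clf.predict(features[held_out])            # out-of-sample voxel labels
        inter = np.logical_and(pred, lesion[held_out]).sum()
        dice.append(2 * inter / (pred.sum() + lesion[held_out].sum()))

    print("mean Dice over held-out cases:", np.mean(dice).round(3))
    ```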

  8. Outbreak of E. coli O157:H7 associated with lettuce served at fast food chains in the Maritimes and Ontario, Canada, Dec 2012

    PubMed Central

    Tataryn, J; Morton, V; Cutler, J; McDonald, L; Whitfield, Y; Billard, B; Gad, RR; Hexemer, A

    2014-01-01

    Background Identification and control of multi-jurisdictional foodborne illness outbreaks can be complex because of their multidisciplinary nature and the number of investigative partners involved. Objective To describe the multi-jurisdictional outbreak response to an E. coli O157:H7 outbreak in Canada that highlights the importance of early notification and collaboration and the value of centralized interviewing. Methods Investigators from local, provincial and federal jurisdictions, using a national outbreak response protocol to clarify roles and responsibilities and facilitate collaboration, conducted a rapid investigation that included centralized re-interview of cases, descriptive methods, binomial probability, and traceback findings to identify the source of the outbreak. Results There were 31 laboratory confirmed cases identified in New Brunswick, Nova Scotia, and Ontario. Thirteen cases (42%) were hospitalized and one case (3%) developed hemolytic uremic syndrome; there were no deaths. Due to early notification a coordinated investigation was initiated before laboratory subtyping was available. Re-interview of cases identified 10 cases who had not initially reported exposure to the source of the outbreak. Less than one week after the Outbreak Investigation Coordinating Committee was formed, consumption of shredded lettuce from a fast food chain was identified as the likely source of the illnesses and the implicated importer/processor initiated a precautionary recall the same day. Conclusion This outbreak investigation highlights the importance of early notification, prompt re-interviewing and collaboration to rapidly identify the source of an outbreak. PMID:29769900

  9. Nonlinear dynamic simulation of single- and multi-spool core engines

    NASA Technical Reports Server (NTRS)

    Schobeiri, T.; Lippke, C.; Abouelkheir, M.

    1993-01-01

    In this paper a new computational method for accurate simulation of the nonlinear dynamic behavior of single- and multi-spool core engines, turbofan engines, and power generation gas turbine engines is presented. In order to perform the simulation, a modularly structured computer code has been developed which includes individual mathematical modules representing various engine components. The generic structure of the code enables the dynamic simulation of arbitrary engine configurations ranging from single-spool thrust generation to multi-spool thrust/power generation engines under adverse dynamic operating conditions. For precise simulation of turbine and compressor components, row-by-row calculation procedures were implemented that account for the specific turbine and compressor cascade and blade geometry and characteristics. The dynamic behavior of the subject engine is calculated by solving a number of systems of partial differential equations, which describe the unsteady behavior of the individual components. In order to ensure the capability, accuracy, robustness, and reliability of the code, comprehensive critical performance assessment and validation tests were performed. As representatives, three different transient cases with single- and multi-spool thrust and power generation engines were simulated. The transient cases range from operating with a prescribed fuel schedule, to extreme load changes, to generator and turbine shut down.

  10. The Aircraft Electric Taxi System: A Qualitative Multi Case Study

    NASA Astrophysics Data System (ADS)

    Johnson, Thomas Frank

    The problem this research addresses is the airline industry's apparent unwillingness to adopt ways of taxiing aircraft without utilizing thrust from the main engines. The purpose of the study was to better understand the decision-making process of airline executives with respect to investing in cost-saving technology. A qualitative research method is used, drawing on personal interviews with 24 airline executives from two major U.S. airlines, related industry journal articles, and aircraft performance data. The following three research questions are addressed. RQ1. Does the cost of jet fuel influence airline executives' decision to adopt the aircraft electric taxi system (ETS) technology? RQ2. Does the measurable payback period for a return on investment influence airline executives' decision to adopt ETS technology? RQ3. Does the amount of government assistance influence airline executives' decision to adopt ETS technology? A multi-case research study design is used with a triangulation technique. The participant perceptions indicate a need to reduce operating costs, concerns about investment risk, and support for future government-sponsored performance improvement projects. Based on the framework, findings and implications of this study, future research could focus on the positive environmental effects of the ETS application, for example a study of current airport-area air quality and the effects that taxiing under main-engine thrust has on it.

  11. Cervical cancer: a qualitative study on subjectivity, family, gender and health services

    PubMed Central

    Pelcastre-Villafuerte, Blanca E; Tirado-Gómez, Laura L; Mohar-Betancourt, Alejandro; López-Cervantes, Malaquías

    2007-01-01

    Background In 2002, cervical cancer was one of the leading causes of death in Mexico. Quantitative techniques have allowed for the identification of socioeconomic, behavioral and biological characteristics that are part of its etiology. However, such characteristics are inadequate to sufficiently explain the role that emotions, family networks and socially constructed categories such as gender play in the demand for and utilization of health services for cervical cancer diagnosis and treatment, or in the timely undertaking of preventive actions, such as getting a PAP smear or seeking adequate and continuous treatment. Methods A qualitative study was carried out to analyze the role of different social and cultural factors in the timely detection of cervical cancer. As part of a multi-level, multi-method research effort, this particular study was based on individual interviews with women diagnosed with cervical cancer (identified as the "cases"), their female friends and relatives (identified as the "controls") and the cases' husbands. Results The results showed that denial and fear are two important components that regulate the behavior of both the women and their partners. Women with a small support network may have limited opportunities for taking action in favor of their own health and wellbeing. Conclusion Women tend not to worry about their health in general, or about cervical cancer in particular, as a consequence of their conceptualizations of their body and feminine identity – both of which are socially determined. Furthermore, it is necessary to improve the quality of information provided in health services. PMID:17331256

  12. GAMETES: a fast, direct algorithm for generating pure, strict, epistatic models with random architectures.

    PubMed

    Urbanowicz, Ryan J; Kiralis, Jeff; Sinnott-Armstrong, Nicholas A; Heberling, Tamra; Fisher, Jonathan M; Moore, Jason H

    2012-10-01

    Geneticists who look beyond single locus disease associations require additional strategies for the detection of complex multi-locus effects. Epistasis, a multi-locus masking effect, presents a particular challenge, and has been the target of bioinformatic development. Thorough evaluation of new algorithms calls for simulation studies in which known disease models are sought. To date, the best methods for generating simulated multi-locus epistatic models rely on genetic algorithms. However, such methods are computationally expensive, difficult to adapt to multiple objectives, and unlikely to yield models with a precise form of epistasis which we refer to as pure and strict. Purely and strictly epistatic models constitute the worst-case in terms of detecting disease associations, since such associations may only be observed if all n-loci are included in the disease model. This makes them an attractive gold standard for simulation studies considering complex multi-locus effects. We introduce GAMETES, a user-friendly software package and algorithm which generates complex biallelic single nucleotide polymorphism (SNP) disease models for simulation studies. GAMETES rapidly and precisely generates random, pure, strict n-locus models with specified genetic constraints. These constraints include heritability, minor allele frequencies of the SNPs, and population prevalence. GAMETES also includes a simple dataset simulation strategy which may be utilized to rapidly generate an archive of simulated datasets for given genetic models. We highlight the utility and limitations of GAMETES with an example simulation study using MDR, an algorithm designed to detect epistasis. GAMETES is a fast, flexible, and precise tool for generating complex n-locus models with random architectures. While GAMETES has a limited ability to generate models with higher heritabilities, it is proficient at generating the lower heritability models typically used in simulation studies evaluating new algorithms. In addition, the GAMETES modeling strategy may be flexibly combined with any dataset simulation strategy. Beyond dataset simulation, GAMETES could be employed to pursue theoretical characterization of genetic models and epistasis.
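
    GAMETES itself searches for pure, strict penetrance tables; as a rough, hedged sketch of only the downstream dataset-simulation step, the snippet below samples case/control status from a given (invented) two-locus penetrance table under Hardy-Weinberg genotype frequencies. It is not the published algorithm.

    ```python
    # Sketch: simulate case/control status from a *given* two-locus penetrance
    # table (rows/columns indexed by genotype counts 0/1/2 of the minor allele).
    # The table values are illustrative, not a pure/strict GAMETES model.
    import numpy as np

    rng = np.random.default_rng(0)
    maf = (0.3, 0.3)                      # minor allele frequencies of the two SNPs
    penetrance = np.array([[0.10, 0.20, 0.10],
                           [0.20, 0.05, 0.20],
                           [0.10, 0.20, 0.10]])  # P(disease | genotype pair)

    def simulate(n_samples):
        # Hardy-Weinberg genotype sampling for each SNP
        g1, g2 = (rng.binomial(2, maf[j], n_samples) for j in range(2))
        p_disease = penetrance[g1, g2]
        status = rng.binomial(1, p_disease)
        return np.column_stack([g1, g2, status])

    data = simulate(2000)
    print("cases:", data[:, 2].sum(), "of", len(data))
    ```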

  13. Responsible innovation in port development: the Rotterdam Maasvlakte 2 and the Dalian Dayao Bay extension projects.

    PubMed

    Ravesteijn, Wim; Liu, Yi; Yan, Ping

    2015-01-01

    The paper outlines and specifies 'responsible port innovation', introducing the development of a methodological and procedural step-by-step plan for the implementation and evaluation of (responsible) innovations. Subsequently, it uses this as a guideline for the analysis and evaluation of two case-studies. The construction of the Rotterdam Maasvlakte 2 Port meets most of the formulated requirements, though making values more explicit and treating it as a process right from the start could have benefitted the project. The Dalian Dayao Port could improve its decision-making procedures in several respects, including the introduction of new methods to handle value tensions. Both projects show that public support is crucial in responsible port innovation and that it should be not only a multi-faceted but also a multi-level strategy.

  14. Aspects of effective supersymmetric theories

    NASA Astrophysics Data System (ADS)

    Tziveloglou, Panteleimon

    This work consists of two parts. In the first part we construct the complete extension of the Minimal Supersymmetric Standard Model by higher-dimensional effective operators and then study its phenomenology. These operators encapsulate the effects on LHC physics of any kind of new degrees of freedom at the multi-TeV scale. The effective analysis includes the case where the multi-TeV physics is the supersymmetry-breaking sector itself. In that case the appropriate framework is nonlinear supersymmetry. We choose to realize the nonlinear symmetry by the method of constrained superfields. Beyond the new effective couplings, the analysis suggests an interpretation of the 'little hierarchy problem' as an indication of new physics at the multi-TeV scale. In the second part we explore the power of constrained superfields in extended supersymmetry. It is known that in N = 2 supersymmetry the gauge kinetic function cannot depend on hypermultiplet scalars. However, it is also known that the low-energy effective action of a D-brane in an N = 2 supersymmetric bulk includes the DBI action, where the gauge kinetic function does depend on the dilaton. We show how the nonlinearization of the second SUSY (imposed by the presence of the D-brane) opens this possibility, by constructing the global N = 1 linear + 1 nonlinear invariant coupling of a hypermultiplet with a gauge multiplet. The constructed theory enjoys interesting features, including a novel super-Higgs mechanism without gravity.

  15. Multi-View Interaction Modelling of human collaboration processes: a business process study of head and neck cancer care in a Dutch academic hospital.

    PubMed

    Stuit, Marco; Wortmann, Hans; Szirbik, Nick; Roodenburg, Jan

    2011-12-01

    In the healthcare domain, human collaboration processes (HCPs), which consist of interactions between healthcare workers from different (para)medical disciplines and departments, are of growing importance as healthcare delivery becomes increasingly integrated. Existing workflow-based process modelling tools for healthcare process management, which are the most commonly applied, are not suited for healthcare HCPs mainly due to their focus on the definition of task sequences instead of the graphical description of human interactions. This paper uses a case study of a healthcare HCP at a Dutch academic hospital to evaluate a novel interaction-centric process modelling method. The HCP under study is the care pathway performed by the head and neck oncology team. The evaluation results show that the method brings innovative, effective, and useful features. First, it collects and formalizes the tacit domain knowledge of the interviewed healthcare workers in individual interaction diagrams. Second, the method automatically integrates these local diagrams into a single global interaction diagram that reflects the consolidated domain knowledge. Third, the case study illustrates how the method utilizes a graphical modelling language for effective tree-based description of interactions, their composition and routing relations, and their roles. A process analysis of the global interaction diagram is shown to identify HCP improvement opportunities. The proposed interaction-centric method has wider applicability since interactions are the core of most multidisciplinary patient-care processes. A discussion argues that, although (multidisciplinary) collaboration is in many cases not optimal in the healthcare domain, it is increasingly considered a necessity to improve integration, continuity, and quality of care. The proposed method is helpful to describe, analyze, and improve the functioning of healthcare collaboration. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chuchu, E-mail: chenchuchu@lsec.cc.ac.cn; Hong, Jialin, E-mail: hjl@lsec.cc.ac.cn; Zhang, Liying, E-mail: lyzhang@lsec.cc.ac.cn

    Stochastic Maxwell equations with additive noise are intrinsically a system of stochastic Hamiltonian partial differential equations, possessing the stochastic multi-symplectic conservation law. It is shown that the averaged energy increases linearly with respect to the evolution of time and that the flow of the stochastic Maxwell equations with additive noise preserves the divergence in the sense of expectation. Moreover, we propose three novel stochastic multi-symplectic methods to discretize the stochastic Maxwell equations in order to investigate the preservation of these properties numerically. We make theoretical discussions and comparisons of all three methods and observe that all of them preserve the corresponding discrete version of the averaged divergence. Meanwhile, we obtain the corresponding dissipative property of the discrete averaged energy satisfied by each method. In particular, the evolution rates of the averaged energies for all three methods are derived and are in accordance with the continuous case. Numerical experiments are performed to verify our theoretical results.

  17. The Multi-Literacy Development of a Young Trilingual Child: Four Leading Literacy Activities from Birth to Age Six

    ERIC Educational Resources Information Center

    Kim, Mi Song

    2014-01-01

    This study examines the multiplicity of literacies while incorporating multiple modes of meaning to understand a young trilingual child's meaning-making processes. This qualitative study reports the results of a combination of ethnographic observations and a longitudinal case study of one child's multi-literacy development from birth to…

  18. The National Birth Defects Prevention Study: a review of the methods

    PubMed Central

    Reefhuis, Jennita; Gilboa, Suzanne M.; Anderka, Marlene; Browne, Marilyn L.; Feldkamp, Marcia L.; Hobbs, Charlotte A.; Jenkins, Mary M.; Langlois, Peter H.; Newsome, Kimberly B.; Olshan, Andrew F.; Romitti, Paul A.; Shapira, Stuart K.; Shaw, Gary M.; Tinker, Sarah C.; Honein, Margaret A.

    2015-01-01

    Background The National Birth Defects Prevention Study (NBDPS) is a large population-based multi-center case-control study of major birth defects in the United States. Methods Data collection took place from 1998 through 2013 on pregnancies ending between October 1997 and December 2011. Cases could be live born, stillborn or induced terminations, and were identified from birth defects surveillance programs in Arkansas, California, Georgia, Iowa, Massachusetts, New Jersey, New York, North Carolina, Texas and Utah. Controls were live born infants without major birth defects identified from the same geographical regions and time periods as cases via either vital records or birth hospitals. Computer-assisted telephone interviews were completed with women between 6 weeks and 24 months after the estimated date of delivery. After completion of interviews, families received buccal cell collection kits for the mother, father and infant (if living). Results There were 47,832 eligible cases and 18,272 eligible controls. Among these, 32,187 (67%) and 11,814 (65%) respectively, provided interview information about their pregnancies. Buccal cell collection kits with a cytobrush for at least one family member were returned by 19,065 case and 6,211 control families (65% and 59% of those who were sent a kit). More than 500 projects have been proposed by the collaborators and over 200 manuscripts published using data from the NBDPS through December 2014. Conclusion The NBDPS has made substantial contributions to the field of birth defects epidemiology through its rigorous design, including case classification, detailed questionnaire and specimen collection, large study population, and collaborative activities across Centers. PMID:26033852

  19. Multi-Core Processor Memory Contention Benchmark Analysis Case Study

    NASA Technical Reports Server (NTRS)

    Simon, Tyler; McGalliard, James

    2009-01-01

    Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.

  20. Feasibility study analysis for multi-function dual energy oven (case study: tapioca crackers small medium enterprise)

    NASA Astrophysics Data System (ADS)

    Soraya, N. W.; El Hadi, R. M.; Chumaidiyah, E.; Tripiawan, W.

    2017-12-01

    The conventional drying process is constrained by weather (cloudy/rainy), requires a wide drying area, and yields a low-quality product. A multi-function dual-energy oven is an appropriate technology to solve these problems. The oven uses solar thermal or gas heat for drying various types of products, including tapioca crackers. Investment analysis of the technical, operational, and financial aspects shows that the multi-function dual-energy oven is feasible to implement for a small and medium enterprise (SME) processing tapioca crackers.

  1. Multi-label literature classification based on the Gene Ontology graph.

    PubMed

    Jin, Bo; Muller, Brian; Zhai, Chengxiang; Lu, Xinghua

    2008-12-08

    The Gene Ontology is a controlled vocabulary for representing knowledge related to genes and proteins in a computable form. The current effort of manually annotating proteins with the Gene Ontology is outpaced by the rate of accumulation of biomedical knowledge in literature, which urges the development of text mining approaches to facilitate the process by automatically extracting the Gene Ontology annotation from literature. The task is usually cast as a text classification problem, and contemporary methods are confronted with unbalanced training data and the difficulties associated with multi-label classification. In this research, we investigated the methods of enhancing automatic multi-label classification of biomedical literature by utilizing the structure of the Gene Ontology graph. We have studied three graph-based multi-label classification algorithms, including a novel stochastic algorithm and two top-down hierarchical classification methods for multi-label literature classification. We systematically evaluated and compared these graph-based classification algorithms to a conventional flat multi-label algorithm. The results indicate that, through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods can significantly improve predictions of the Gene Ontology terms implied by the analyzed text. Furthermore, the graph-based multi-label classifiers are capable of suggesting Gene Ontology annotations (to curators) that are closely related to the true annotations even if they fail to predict the true ones directly. A software package implementing the studied algorithms is available for the research community. Through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods have better potential than the conventional flat multi-label classification approach to facilitate protein annotation based on the literature.
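
    The graph-based classifiers studied in the paper are more elaborate; as a minimal illustration of why the ontology graph matters, the sketch below shows a "true-path"-style post-processing step that propagates predicted GO terms to all ancestors in a toy DAG so that predictions remain consistent with the hierarchy. The term names and edges are invented.

    ```python
    # Sketch: enforce ontology consistency by propagating predicted labels to all
    # ancestors in a toy GO-like DAG (child -> parents). This is an illustrative
    # post-processing step, not the classifiers studied in the paper.
    from collections import deque

    parents = {                      # toy DAG, child term -> parent terms
        "GO:c1": ["GO:root"],
        "GO:c2": ["GO:root"],
        "GO:c3": ["GO:c1", "GO:c2"],
        "GO:c4": ["GO:c3"],
    }

    def propagate(predicted_terms):
        """Return the predicted terms plus all of their ancestors."""
        closed = set(predicted_terms)
        queue = deque(predicted_terms)
        while queue:
            term = queue.popleft()
            for parent in parents.get(term, []):
                if parent not in closed:
                    closed.add(parent)
                    queue.append(parent)
        return closed

    print(sorted(propagate({"GO:c4"})))   # includes GO:c3, GO:c1, GO:c2, GO:root
    ```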

  2. The multi-layer multi-configuration time-dependent Hartree method for bosons: theory, implementation, and applications.

    PubMed

    Cao, Lushuai; Krönke, Sven; Vendrell, Oriol; Schmelcher, Peter

    2013-10-07

    We develop the multi-layer multi-configuration time-dependent Hartree method for bosons (ML-MCTDHB), a variational numerically exact ab initio method for studying the quantum dynamics and stationary properties of general bosonic systems. ML-MCTDHB takes advantage of the permutation symmetry of identical bosons, which allows for investigations of the quantum dynamics from few to many-body systems. Moreover, the multi-layer feature enables ML-MCTDHB to describe mixed bosonic systems consisting of arbitrary many species. Multi-dimensional as well as mixed-dimensional systems can be accurately and efficiently simulated via the multi-layer expansion scheme. We provide a detailed account of the underlying theory and the corresponding implementation. We also demonstrate the superior performance by applying the method to the tunneling dynamics of bosonic ensembles in a one-dimensional double well potential, where a single-species bosonic ensemble of various correlation strengths and a weakly interacting two-species bosonic ensemble are considered.

  3. Alcohol Warning Label Awareness and Attention: A Multi-method Study.

    PubMed

    Pham, Cuong; Rundle-Thiele, Sharyn; Parkinson, Joy; Li, Shanshi

    2018-01-01

    Evaluation of alcohol warning labels requires careful consideration to ensure that research captures more than awareness, given that labels may not be prominent enough to attract attention. This study investigates attention to current in-market alcohol warning labels and examines whether attention can be enhanced through theoretically informed design. Attention scores obtained through self-report methods are compared to objective measures (eye-tracking). A multi-method experimental design was used delivering four conditions, namely control, colour, size, and colour and size. The first study (n = 559) involved a self-report survey to measure attention. The second study (n = 87) utilized eye-tracking to measure fixation count, fixation duration and time to first fixation. Analysis of Variance (ANOVA) was utilized. Eye-tracking identified that 60% of participants looked at the current in-market alcohol warning label while 81% looked at the optimized design (larger and red). In line with observed attention, self-reported attention increased for the optimized design. The current study casts doubt on dominant practices (largely self-report), which have been used to evaluate alcohol warning labels. Awareness cannot be used to assess warning label effectiveness in isolation in cases where attention does not occur 100% of the time. Mixed methods permit objective data collection methodologies to be triangulated with surveys to assess warning label effectiveness. Attention should be incorporated as a measure in warning label effectiveness evaluations. Colour and size changes to the existing Australian warning labels, aided by theoretically informed design, increased attention. © The Author 2017. Medical Council on Alcohol and Oxford University Press. All rights reserved.

  4. [A novel method of multi-channel feature extraction combining multivariate autoregression and multiple-linear principal component analysis].

    PubMed

    Wang, Jinjia; Zhang, Yanna

    2015-02-01

    Brain-computer interface (BCI) systems identify brain signals by extracting features from them. In view of the limitations of the autoregressive-model feature extraction method and of traditional principal component analysis in dealing with multichannel signals, this paper presents a multichannel feature extraction method that combines a multivariate autoregressive (MVAR) model with multilinear principal component analysis (MPCA), applied to the recognition of magnetoencephalography (MEG) and electroencephalography (EEG) signals. Firstly, we calculated the MVAR model coefficient matrix of the MEG/EEG signals using this method, and then reduced the dimensionality using MPCA. Finally, we recognized the brain signals with a Bayes classifier. The key innovation of our investigation is the extension of the traditional single-channel feature extraction method to the multi-channel case. We then carried out experiments using the datasets IV-III and IV-I. The experimental results proved that the method proposed in this paper is feasible.
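
    As a hedged sketch of the two stages described above — estimating multivariate autoregressive coefficients per trial and then reducing their dimensionality — the snippet below fits an MVAR(p) model by ordinary least squares and applies plain PCA in place of the authors' multilinear PCA (MPCA). Array shapes, the model order, and the synthetic data are assumptions.

    ```python
    # Sketch: MVAR(p) coefficients per trial -> flattened feature vector -> PCA.
    # Plain PCA stands in for the multilinear PCA (MPCA) used in the paper.
    import numpy as np
    from sklearn.decomposition import PCA

    def mvar_features(trial, order=3):
        """trial: (channels, samples). Fit X_t = sum_k A_k X_{t-k} by least squares."""
        c, n = trial.shape
        Y = trial[:, order:]                                   # (c, n-order)
        Z = np.vstack([trial[:, order - k:n - k] for k in range(1, order + 1)])
        A, *_ = np.linalg.lstsq(Z.T, Y.T, rcond=None)          # (c*order, c)
        return A.T.ravel()                                     # flattened coefficients

    rng = np.random.default_rng(1)
    trials = rng.standard_normal((40, 8, 200))                 # 40 trials, 8 channels
    features = np.array([mvar_features(t) for t in trials])
    reduced = PCA(n_components=10).fit_transform(features)     # low-dimensional features
    print(reduced.shape)                                       # (40, 10)
    ```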

  5. Accurate Prediction of Motor Failures by Application of Multi CBM Tools: A Case Study

    NASA Astrophysics Data System (ADS)

    Dutta, Rana; Singh, Veerendra Pratap; Dwivedi, Jai Prakash

    2018-02-01

    Motor failures are very difficult to predict accurately with a single condition-monitoring tool, as the electrical and the mechanical systems are closely related. Electrical problems, like phase unbalance and stator winding insulation failures, can at times lead to vibration problems, and at the same time mechanical failures, like bearing failure, lead to rotor eccentricity. In this case study of a 550 kW blower motor it has been shown that a rotor bar crack was detected by current signature analysis and vibration monitoring confirmed the same. In later months, in a similar motor, vibration monitoring predicted a bearing failure and current signature analysis confirmed it. In both cases, after dismantling the motor, the predictions were found to be accurate. In this paper we discuss the accurate prediction of motor failures through the use of multiple condition-monitoring tools, with two case studies.

  6. Real-time electroholography using a multiple-graphics processing unit cluster system with a single spatial light modulator and the InfiniBand network

    NASA Astrophysics Data System (ADS)

    Niwase, Hiroaki; Takada, Naoki; Araki, Hiromitsu; Maeda, Yuki; Fujiwara, Masato; Nakayama, Hirotaka; Kakue, Takashi; Shimobaba, Tomoyoshi; Ito, Tomoyoshi

    2016-09-01

    Parallel calculations of large-pixel-count computer-generated holograms (CGHs) are suitable for multiple-graphics processing unit (multi-GPU) cluster systems. However, it is not easy for a multi-GPU cluster system to accomplish fast CGH calculations when CGH transfers between PCs are required. In these cases, the CGH transfer between the PCs becomes a bottleneck. Usually, this problem occurs only in multi-GPU cluster systems with a single spatial light modulator. To overcome this problem, we propose a simple method using the InfiniBand network. The computational speed of the proposed method using 13 GPUs (NVIDIA GeForce GTX TITAN X) was more than 3000 times faster than that of a CPU (Intel Core i7 4770) when the number of three-dimensional (3-D) object points exceeded 20,480. In practice, we achieved ~40 tera floating point operations per second (TFLOPS) when the number of 3-D object points exceeded 40,960. Our proposed method was able to reconstruct a real-time movie of a 3-D object comprising 95,949 points.

  7. Predicting Teacher Emotional Labour Based on Multi-Frame Leadership Orientations: A Case from Turkey

    ERIC Educational Resources Information Center

    Özdemir, Murat; Koçak, Seval

    2018-01-01

    Human behaviours in organisations are closely associated with leadership styles. The main purpose of this study is to find out the relationship between teachers' perception about multi-frame leadership orientations of principals and teachers' emotional labour. The study is based on Bolman and Deal's Four Frames Model, and, therefore, the…

  8. Can bipolar disorder be viewed as a multi-system inflammatory disease?

    PubMed Central

    Leboyer, Marion; Soreca, Isabella; Scott, Jan; Frye, Mark; Henry, Chantal; Tamouza, Ryad; Kupfer, David J.

    2012-01-01

    Background Patients with bipolar disorder are known to be at high risk of premature death. Comorbid cardio-vascular diseases are a leading cause of excess mortality, well above the risk associated with suicide. In this review, we explore comorbid medical disorders, highlighting evidence that bipolar disorder can be effectively conceptualized as a multi-systemic inflammatory disease. Methods We conducted a systematic PubMed search of all English-language articles recently published with bipolar disorder cross-referenced with the following terms: mortality and morbidity, cardio-vascular, diabetes, obesity, metabolic syndrome, inflammation, auto-antibody, retro-virus, stress, sleep and circadian rhythm. Results Evidence gathered so far suggests that multi-system involvement is present from the early stages, and therefore requires proactive screening and diagnostic procedures, as well as comprehensive treatment to reduce progression and premature mortality. Exploring the biological pathways that could account for the observed link shows that a dysregulated inflammatory background could be a common factor underlying cardio-vascular and bipolar disorders. Viewing bipolar disorder as a multi-system disorder should help us to re-conceptualize disorders of the mind as “disorders of the brain and the body”. Limitations The current literature substantially lacks longitudinal and mechanistic studies, as well as comparison studies exploring the magnitude of the medical burden in bipolar disorder compared to major mood disorders and psychotic disorders. It is also necessary to look for subgroups of bipolar disorder based on their rates of comorbid disorders. Conclusions Comorbid medical illnesses in bipolar disorder might be viewed not only as the consequence of health behaviors and of psychotropic medications, but rather as an early manifestation of a multi-systemic disorder. Medical monitoring is thus a critical component of case assessment. Exploring common biological pathways of inflammation should help biomarker discovery, ultimately leading to innovative diagnostic tools, new methods of prevention and personalized treatments. PMID:22497876

  9. Research on Multi - Person Parallel Modeling Method Based on Integrated Model Persistent Storage

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper studies a multi-person parallel modeling method based on integrated model persistent storage. The integrated model refers to a set of MDDT modeling graphics systems, which can carry out multi-angle, multi-level and multi-stage description of general aerospace embedded software. Persistent storage refers to converting the data model in memory into a storage model and converting the storage model back into a data model in memory, where the data model refers to the object model and the storage model is a binary stream. Multi-person parallel modeling refers to the need for multi-person collaboration, separation of roles, and even real-time remote synchronized modeling.

  10. Hybrid PV/diesel solar power system design using multi-level factor analysis optimization

    NASA Astrophysics Data System (ADS)

    Drake, Joshua P.

    Solar power systems represent a large area of interest across a spectrum of organizations at a global level. It was determined that a clear understanding of current state of the art software and design methods, as well as optimization methods, could be used to improve the design methodology. Solar power design literature was researched for an in depth understanding of solar power system design methods and algorithms. Multiple software packages for the design and optimization of solar power systems were analyzed for a critical understanding of their design workflow. In addition, several methods of optimization were studied, including brute force, Pareto analysis, Monte Carlo, linear and nonlinear programming, and multi-way factor analysis. Factor analysis was selected as the most efficient optimization method for engineering design as it applied to solar power system design. The solar power design algorithms, software work flow analysis, and factor analysis optimization were combined to develop a solar power system design optimization software package called FireDrake. This software was used for the design of multiple solar power systems in conjunction with an energy audit case study performed in seven Tibetan refugee camps located in Mainpat, India. A report of solar system designs for the camps, as well as a proposed schedule for future installations was generated. It was determined that there were several improvements that could be made to the state of the art in modern solar power system design, though the complexity of current applications is significant.

  11. Optimizing Safety, Fidelity and Usability of an Intelligent Clinical Support Tool (ICST) For Acute Hospital Care: an Australian Case Study Using a Multi-Method Delphi Process.

    PubMed

    Botti, Mari; Redley, Bernice; Nguyen, Lemai; Coleman, Kimberley; Wickramasinghe, Nilmini

    2015-01-01

    This research focuses on a major health priority for Australia by addressing existing gaps in the implementation of nursing informatics solutions in healthcare. It serves to inform the successful deployment of IT solutions designed to support patient-centered, frontline acute healthcare delivery by multidisciplinary care teams. The outcomes can guide future evaluations of the contribution of IT solutions to the efficiency, safety and quality of care delivery in acute hospital settings.

  12. Thermally induced optical deformation of a Nd:YVO4 active disk under the action of multi-beam spatially periodic diode pumping

    NASA Astrophysics Data System (ADS)

    Guryev, D. A.; Nikolaev, D. A.; Tsvetkov, V. B.; Shcherbakov, I. A.

    2018-05-01

    A study of how the transverse distribution of the optical path changes in a Nd:YVO4 active disk was carried out under ten-beam spatially periodic diode pumping in the one-dimensional case. The transverse dimensions of the pumping beams were comparable with the distances between them. The investigations were carried out using laser interferometry methods. It was found that the change in optical thickness of the active disk along the line of pumping spots was well described by a Gaussian function.

  13. A hybrid method for X-ray optics simulation: combining geometric ray-tracing and wavefront propagation

    DOE PAGES

    Shi, Xianbo; Reininger, Ruben; Sanchez del Rio, Manuel; ...

    2014-05-15

    A new method for beamline simulation combining ray-tracing and wavefront propagation is described. The 'Hybrid Method' computes diffraction effects when the beam is clipped by an aperture or mirror length and can also simulate the effect of figure errors in the optical elements when diffraction is present. The effect of different spatial frequencies of figure errors on the image is compared with SHADOW results, pointing to the limitations of the latter. The code has been benchmarked against the multi-electron version of SRW in one dimension to show its validity in the case of fully, partially and non-coherent beams. The results demonstrate that the code is considerably faster than the multi-electron version of SRW and is therefore a useful tool for beamline design and optimization.

  14. Real-Time Multi-Target Localization from Unmanned Aerial Vehicles

    PubMed Central

    Wang, Xuan; Liu, Jinghong; Zhou, Qianfei

    2016-01-01

    In order to improve the reconnaissance efficiency of unmanned aerial vehicle (UAV) electro-optical stabilized imaging systems, a real-time multi-target localization scheme based on an UAV electro-optical stabilized imaging system is proposed. First, a target location model is studied. Then, the geodetic coordinates of multi-targets are calculated using the homogeneous coordinate transformation. On the basis of this, two methods which can improve the accuracy of the multi-target localization are proposed: (1) the real-time zoom lens distortion correction method; (2) a recursive least squares (RLS) filtering method based on UAV dead reckoning. The multi-target localization error model is established using Monte Carlo theory. In an actual flight, the UAV flight altitude is 1140 m. The multi-target localization results are within the range of allowable error. After we use a lens distortion correction method in a single image, the circular error probability (CEP) of the multi-target localization is reduced by 7%, and 50 targets can be located at the same time. The RLS algorithm can adaptively estimate the location data based on multiple images. Compared with multi-target localization based on a single image, CEP of the multi-target localization using RLS is reduced by 25%. The proposed method can be implemented on a small circuit board to operate in real time. This research is expected to significantly benefit small UAVs which need multi-target geo-location functions. PMID:28029145

  15. Real-Time Multi-Target Localization from Unmanned Aerial Vehicles.

    PubMed

    Wang, Xuan; Liu, Jinghong; Zhou, Qianfei

    2016-12-25

    In order to improve the reconnaissance efficiency of unmanned aerial vehicle (UAV) electro-optical stabilized imaging systems, a real-time multi-target localization scheme based on an UAV electro-optical stabilized imaging system is proposed. First, a target location model is studied. Then, the geodetic coordinates of multi-targets are calculated using the homogeneous coordinate transformation. On the basis of this, two methods which can improve the accuracy of the multi-target localization are proposed: (1) the real-time zoom lens distortion correction method; (2) a recursive least squares (RLS) filtering method based on UAV dead reckoning. The multi-target localization error model is established using Monte Carlo theory. In an actual flight, the UAV flight altitude is 1140 m. The multi-target localization results are within the range of allowable error. After we use a lens distortion correction method in a single image, the circular error probability (CEP) of the multi-target localization is reduced by 7%, and 50 targets can be located at the same time. The RLS algorithm can adaptively estimate the location data based on multiple images. Compared with multi-target localization based on a single image, CEP of the multi-target localization using RLS is reduced by 25%. The proposed method can be implemented on a small circuit board to operate in real time. This research is expected to significantly benefit small UAVs which need multi-target geo-location functions.
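
    As an illustrative sketch of the recursive least squares (RLS) smoothing idea mentioned above (treating the target's position as a constant parameter estimated from repeated noisy per-frame fixes), the snippet below runs a standard RLS update with a forgetting factor. It omits the dead-reckoning and lens-distortion components of the paper and uses invented numbers.

    ```python
    # Sketch: recursive least squares (RLS) smoothing of repeated geolocation
    # measurements of a stationary target (east/north in metres). Illustrative
    # only; the paper combines RLS with UAV dead reckoning, not shown here.
    import numpy as np

    rng = np.random.default_rng(2)
    true_pos = np.array([120.0, -45.0])
    measurements = true_pos + rng.normal(0, 15.0, size=(50, 2))   # noisy fixes

    lam = 0.98                      # forgetting factor
    x = measurements[0].copy()      # state: estimated target position
    P = np.eye(2) * 1e3             # estimate covariance (large initial uncertainty)
    H = np.eye(2)                   # direct position measurements

    for z in measurements[1:]:
        S = lam * np.eye(2) + H @ P @ H.T
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (P - K @ H @ P) / lam

    print("RLS estimate:", x, " truth:", true_pos)
    ```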

  16. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    PubMed

    Fan, Yunzhou; Wang, Ying; Jiang, Hongbo; Yang, Wenwen; Yu, Miao; Yan, Weirong; Diwan, Vinod K; Xu, Biao; Dong, Hengjin; Palm, Lars; Nie, Shaofa

    2014-01-01

    Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, performance on outbreak detection, particularly in cases of multi-stream surveillance, has scarcely been evaluated in rural areas. This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behavior model. We then superimposed the converted syndromic datasets onto the baselines obtained to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, frequency of over-the-counter drug purchases, and school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operation curves. In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp <90%). The temporal simulation model based on healthcare-seeking behaviors offers an accessible method for evaluating the performance of multi-stream surveillance.
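
    A compact, purely illustrative deterministic SEIR recursion of the kind used to generate simulated outbreak curves that can be superimposed on surveillance baselines; the parameter values below are placeholders, not those of the Hubei simulations.

    ```python
    # Sketch: deterministic SEIR outbreak curve (daily new infectious onsets)
    # that could be injected into a syndromic baseline. Parameters are
    # illustrative only.
    import numpy as np

    def seir_daily_incidence(days=60, n=100_000, beta=0.6, sigma=1/2, gamma=1/3, i0=5):
        s, e, i, r = n - i0, 0.0, float(i0), 0.0
        incidence = []
        for _ in range(days):
            new_exposed = beta * s * i / n
            new_infectious = sigma * e
            new_recovered = gamma * i
            s -= new_exposed
            e += new_exposed - new_infectious
            i += new_infectious - new_recovered
            r += new_recovered
            incidence.append(new_infectious)        # daily onsets entering I
        return np.array(incidence)

    curve = seir_daily_incidence()
    print("peak daily onsets:", round(curve.max()), "on day", int(curve.argmax()))
    ```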

  17. A scale-entropy diffusion equation to describe the multi-scale features of turbulent flames near a wall

    NASA Astrophysics Data System (ADS)

    Queiros-Conde, D.; Foucher, F.; Mounaïm-Rousselle, C.; Kassem, H.; Feidt, M.

    2008-12-01

    Multi-scale features of turbulent flames near a wall display two kinds of scale-dependent fractal features. In scale-space, a unique fractal dimension cannot be defined and the fractal dimension of the front is scale-dependent. Moreover, when the front approaches the wall, this dependency changes: the fractal dimension also depends on the wall distance. Our aim here is to propose a general geometrical framework that makes it possible to integrate these two cases, in order to describe the multi-scale structure of turbulent flames interacting with a wall. Based on the scale-entropy quantity, which is simply linked to the roughness of the front, we thus introduce a general scale-entropy diffusion equation. We define the notion of “scale-evolutivity”, which characterises the deviation of a multi-scale system from pure fractal behaviour. The specific case of a constant “scale-evolutivity” over the scale range is studied. In this case, called “parabolic scaling”, the fractal dimension is a linear function of the logarithm of scale. The case of a constant scale-evolutivity in wall-distance space implies that the fractal dimension depends linearly on the logarithm of the wall distance. We then verified experimentally that parabolic scaling represents a good approximation of the real multi-scale features of turbulent flames near a wall.
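
    As a rough numerical illustration of a scale-dependent fractal dimension (the quantity behind the "parabolic scaling" picture, where the dimension varies linearly with the logarithm of scale), the sketch below coarse-grains a synthetic rough front, estimates a local dimension from the measured length at neighbouring scales, and fits that dimension against log(scale). The synthetic front and the simple estimator are assumptions, not the authors' scale-entropy formalism.

    ```python
    # Sketch: scale-local fractal dimension of a rough 1-D front via a simple
    # coarse-graining (divider-like) length estimate, then a linear fit of D
    # against log(scale) as a "parabolic scaling" check. Synthetic data only.
    import numpy as np

    rng = np.random.default_rng(3)
    x = np.linspace(0.0, 1.0, 2**14)
    y = np.cumsum(rng.standard_normal(x.size)) * (x[1] - x[0]) ** 0.5  # rough front

    def curve_length(step):
        xs, ys = x[::step], y[::step]
        return np.hypot(np.diff(xs), np.diff(ys)).sum()

    steps = 2 ** np.arange(1, 8)                      # coarse-graining factors
    scales = steps * (x[1] - x[0])
    lengths = np.array([curve_length(s) for s in steps])

    # L(l) ~ l^(1-D)  =>  D(l) = 1 - d ln L / d ln l, estimated between scales
    d_local = 1.0 - np.diff(np.log(lengths)) / np.diff(np.log(scales))
    mid_log_scale = 0.5 * (np.log(scales[1:]) + np.log(scales[:-1]))
    slope, intercept = np.polyfit(mid_log_scale, d_local, 1)
    print("D(l) ~ %.3f + %.3f * ln(l)  (linear-in-log-scale fit)" % (intercept, slope))
    ```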

  18. A Successful Replication of the River Visitor Inventory and Monitoring Process for Capacity Management

    Treesearch

    Kenneth Chilman; James Vogel; Greg Brown; John H. Burde

    2004-01-01

    This paper has 3 purposes: to discuss 1. case study research and its utility for recreation management decisionmaking, 2. the recreation visitor inventory and monitoring process developed from case study research, and 3. a successful replication of the process in a large-scale, multi-year application. Although case study research is discussed in research textbooks as...

  19. Orthodontic aligners and root resorption: A systematic review.

    PubMed

    Elhaddaoui, Rajae; Qoraich, Halima Saadia; Bahije, Loubna; Zaoui, Fatima

    2017-03-01

    Root resorption is one of the leading problems in orthodontic treatment. Most earlier studies have assessed the incidence and severity of root resorption following orthodontic treatment using fixed appliances, as well as associated factors. However, few studies have assessed these parameters in the context of orthodontic treatment using thermoplastic splints or aligners. The aim of this systematic review was to assess the incidence and severity of root resorption following orthodontic treatment using aligners and associated factors. A comparative analysis was also made with fixed multi-bracket treatments. The databases consulted were: Medline, Embase, EBSCO Host, Cochrane Library and Science Direct. Our search included meta-analyses, randomized and non-randomized controlled trials, cohort studies and descriptive studies published before December 2015 and evidencing a connection with the incidence and severity of root resorption following orthodontic treatment using aligners alone or compared with fixed multi-bracket treatments. Among the 93 selected references, only 3 studies met our selection criteria. The incidence of root resorption ranged between 0 and 46%, of which 6% were severe cases. Relative to fixed multi-bracket non-extraction treatments to correct the same malocclusions, the incidence of resorption ranged between 2% and 50%, of which 22% were severe cases. In both techniques, the incidence of resorption was higher for the maxillary incisors and was not influenced by either age or sex. In malocclusion cases not requiring extractions, orthodontic aligner treatment is possibly associated with a lower incidence of resorption than fixed multi-bracket treatment. Further research encompassing extraction cases is needed to better assess the incidence and severity of root resorption following the use of these removable appliances. Copyright © 2016 CEO. Published by Elsevier Masson SAS. All rights reserved.

  20. [Individual growth modeling of the penshell Atrina maura (Bivalvia: Pinnidae) using a multi model inference approach].

    PubMed

    Aragón-Noriega, Eugenio Alberto

    2013-09-01

    Growth models of marine animals, for fisheries and/or aquaculture purposes, are usually based on the popular von Bertalanffy model. This tool is mostly used because its parameters feed other fisheries models, such as yield per recruit; nevertheless, there are other alternatives (such as Gompertz, logistic, Schnute) not yet used by fishery scientists that may prove useful depending on the studied species. The penshell Atrina maura has been studied for fisheries and aquaculture purposes, but its individual growth has not been studied before. The aim of this study was to model the absolute growth of the penshell A. maura using length-at-age data. For this, five models were assessed to obtain growth parameters: von Bertalanffy, Gompertz, logistic, Schnute case 1, and Schnute and Richards. The criteria used to select the best models were the Akaike information criterion, as well as the residual sum of squares and adjusted R2. To obtain the average asymptotic length, the multi-model inference approach was used. According to the Akaike information criterion, the Gompertz model better described the absolute growth of A. maura. Following the multi-model inference approach, the average asymptotic shell length was 218.9 mm (CI 212.3-225.5). I conclude that the use of the multi-model approach and the Akaike information criterion represents the most robust method for growth parameter estimation of A. maura, and that the von Bertalanffy growth model should not be selected a priori as the true model to describe absolute growth in bivalve mollusks such as the species studied in this paper.
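
    A small sketch of the model-selection workflow the abstract describes: fitting von Bertalanffy, Gompertz and logistic curves to length-at-age data, ranking them by AIC, and Akaike-weight averaging the asymptotic length. The data are synthetic placeholders and the AIC form assumes Gaussian residuals.

    ```python
    # Sketch: fit three growth models to synthetic length-at-age data, rank by
    # AIC, and average the asymptotic length L_inf with Akaike weights.
    import numpy as np
    from scipy.optimize import curve_fit

    def von_bertalanffy(t, linf, k, t0):
        return linf * (1.0 - np.exp(-k * (t - t0)))

    def gompertz(t, linf, k, t0):
        return linf * np.exp(-np.exp(-k * (t - t0)))

    def logistic(t, linf, k, t0):
        return linf / (1.0 + np.exp(-k * (t - t0)))

    rng = np.random.default_rng(4)
    age = np.repeat(np.arange(1, 9), 6).astype(float)
    length = gompertz(age, 220.0, 0.55, 1.2) + rng.normal(0, 8.0, age.size)  # mm

    models = {"von Bertalanffy": von_bertalanffy, "Gompertz": gompertz,
              "logistic": logistic}
    results = {}
    for name, f in models.items():
        p, _ = curve_fit(f, age, length, p0=(200.0, 0.5, 0.5), maxfev=10_000)
        rss = np.sum((length - f(age, *p)) ** 2)
        n, k = age.size, 3 + 1                     # parameters + error variance
        aic = n * np.log(rss / n) + 2 * k          # Gaussian-residual AIC
        results[name] = (aic, p[0])                # (AIC, L_inf)

    aics = np.array([v[0] for v in results.values()])
    weights = np.exp(-0.5 * (aics - aics.min()))
    weights /= weights.sum()
    linf_avg = sum(w * v[1] for w, v in zip(weights, results.values()))
    for (name, (aic, linf)), w in zip(results.items(), weights):
        print(f"{name:15s} AIC={aic:7.2f}  L_inf={linf:6.1f}  weight={w:.2f}")
    print(f"model-averaged L_inf = {linf_avg:.1f} mm")
    ```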

  1. Assortativity Patterns in Multi-dimensional Inter-organizational Networks: A Case Study of the Humanitarian Relief Sector

    NASA Astrophysics Data System (ADS)

    Zhao, Kang; Ngamassi, Louis-Marie; Yen, John; Maitland, Carleen; Tapia, Andrea

    We use computational tools to study assortativity patterns in multi-dimensional inter-organizational networks on the basis of different node attributes. In the case study of an inter-organizational network in the humanitarian relief sector, we consider not only macro-level topological patterns, but also assortativity on the basis of micro-level organizational attributes. Unlike assortative social networks, this inter-organizational network exhibits disassortative or random patterns on three node attributes. We believe that organizations' pursuit of complementarity is one of the main reasons for these special patterns. Our analysis also provides insights into how to promote collaboration among humanitarian relief organizations.
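
    A minimal example of the kind of attribute-based assortativity measurement described, computed with NetworkX on a toy inter-organizational graph; the nodes, edges and the 'sector' attribute are invented.

    ```python
    # Sketch: degree- and attribute-assortativity on a toy inter-organizational
    # network. Nodes, edges, and the 'sector' attribute are invented examples.
    import networkx as nx

    g = nx.Graph()
    g.add_nodes_from([
        ("A", {"sector": "logistics"}), ("B", {"sector": "medical"}),
        ("C", {"sector": "logistics"}), ("D", {"sector": "faith-based"}),
        ("E", {"sector": "medical"}),   ("F", {"sector": "faith-based"}),
    ])
    g.add_edges_from([("A", "B"), ("A", "D"), ("B", "C"),
                      ("C", "E"), ("D", "E"), ("E", "F")])

    print("degree assortativity :", nx.degree_assortativity_coefficient(g))
    print("sector assortativity :", nx.attribute_assortativity_coefficient(g, "sector"))
    ```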

  2. Motion reconstruction of animal groups: From schooling fish to swarming mosquitoes

    NASA Astrophysics Data System (ADS)

    Butail, Sachit

    The long-term goal of this research is to provide kinematic data for the design and validation of spatial models of collective behavior in animal groups. The specific research objective of this dissertation is to apply methods from nonlinear estimation and computer vision to construct multi-target tracking systems that process multi-view calibrated video to reconstruct the three-dimensional movement of animals in a group. We adapt the tracking systems for the study of two animal species: Danio aequipinnatus, a common species of schooling fish, and Anopheles gambiae, the most important vector of malaria in sub-Saharan Africa. Together these tracking systems span variability in target size on image, density, and movement. For tracking fish, we automatically initialize, predict, and reconstruct shape trajectories of multiple fish through occlusions. For mosquitoes, which appear as faded streaks on in-field footage, we provide methods to extract velocity information from the streaks, adaptively seek missing measurements, and resolve occlusions within a multi-hypothesis framework. In each case the research has yielded an unprecedented volume of trajectory data for subsequent analysis. We present kinematic data of fast-start response in fish schools and first-ever trajectories of wild mosquito swarming and mating events. The broader impact of this work is to advance the understanding of animal groups for the design of bio-inspired robotic systems, where, similar to the animal groups we study, the collective is able to perform tasks far beyond the capabilities of a single inexpensive robot.

  3. Experimental injury study of children seated behind collapsing front seats in rear impacts.

    PubMed

    Saczalski, Kenneth J; Sances, Anthony; Kumaresan, Srirangam; Burton, Joseph L; Lewis, Paul R

    2003-01-01

    In the mid-1990s the U.S. Department of Transportation made recommendations to place children and infants in the rear seating areas of motor vehicles to avoid front-seat airbag-induced injuries and fatalities. In most rear impacts, however, the adult-occupied front seats will collapse into the rear occupant area and pose another potentially serious injury hazard to rear-seated children. Since rear impacts involve a wide range of speeds, impact severity, and various sizes of adults in collapsing front seats, a multi-variable experimental method was employed in conjunction with a multi-level "factorial analysis" technique to study the injury potential for rear-seated children. Various sizes of Hybrid III adult surrogates, seated in a "typical" average-strength collapsing type of front seat, and a three-year-old Hybrid III child surrogate, seated on a built-in booster seat located directly behind the front adult occupant, were tested at various impact severity levels in a popular "minivan" sled-buck test set-up. A total of five test configurations were utilized in this study. Three levels of velocity change ranging from 22.5 to 42.5 kph were used. The average of peak accelerations on the sled-buck tests ranged from approximately 8.2 G's up to about 11.1 G's, with absolute peak values of just over 14 G's at the higher velocity change. The parameters of the test configuration enabled the experimental data to be combined into a polynomial "injury" function of the two primary independent variables (i.e. front-seat adult occupant weight and velocity change) so that the "likelihood" of rear child "injury potential" could be determined over a wide range of the key parameters. The experimentally derived head injury data were used to obtain a preliminary HIC (Head Injury Criterion) polynomial fit at the 900 level for the rear-seated child. Several actual accident cases were compared with the preliminary polynomial fit. This study provides a test-efficient, multi-variable method to compare the injury biomechanics data with actual accident cases.

  4. A multistage stochastic programming model for a multi-period strategic expansion of biofuel supply chain under evolving uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fei; Huang, Yongxi

    Here, we develop a multistage, stochastic mixed-integer model to support biofuel supply chain expansion under evolving uncertainties. By utilizing the block-separable recourse property, we reformulate the multistage program in an equivalent two-stage program and solve it using an enhanced nested decomposition method with maximal non-dominated cuts. We conduct extensive numerical experiments and demonstrate the application of the model and algorithm in a case study based on the South Carolina settings. The value of multistage stochastic programming method is also explored by comparing the model solution with the counterparts of an expected value based deterministic model and a two-stage stochastic model.

  5. A multistage stochastic programming model for a multi-period strategic expansion of biofuel supply chain under evolving uncertainties

    DOE PAGES

    Xie, Fei; Huang, Yongxi

    2018-02-04

    Here, we develop a multistage, stochastic mixed-integer model to support biofuel supply chain expansion under evolving uncertainties. By utilizing the block-separable recourse property, we reformulate the multistage program in an equivalent two-stage program and solve it using an enhanced nested decomposition method with maximal non-dominated cuts. We conduct extensive numerical experiments and demonstrate the application of the model and algorithm in a case study based on the South Carolina settings. The value of multistage stochastic programming method is also explored by comparing the model solution with the counterparts of an expected value based deterministic model and a two-stage stochastic model.

  6. Control system of the inspection robots group applying auctions and multi-criteria analysis for task allocation

    NASA Astrophysics Data System (ADS)

    Panfil, Wawrzyniec; Moczulski, Wojciech

    2017-10-01

    This paper presents a control system for a group of mobile robots intended to carry out inspection missions. The main research problem was to define a control system that facilitates cooperation among the robots, resulting in completion of the committed inspection tasks. Many well-known control systems use auctions for task allocation, where the subject of an auction is a task to be allocated. It seems that, in the case of missions characterized by a much larger number of tasks than robots, it is better if robots (instead of tasks) are the subjects of auctions. The second identified problem concerns one-sided robot-to-task fitness evaluation. Simultaneous assessment of the robot-to-task fitness and of the task's attractiveness for the robot should positively affect the overall effectiveness of the multi-robot system's performance. The elaborated system allows tasks to be assigned to robots using various methods for evaluating the fitness between robots and tasks, and using several task allocation methods. A method for multi-criteria analysis is proposed, composed of two assessments, i.e. the robot's competitive position for a task among the other robots and the task's attractiveness for the robot among the other tasks. Furthermore, task allocation methods applying the mentioned multi-criteria analysis are proposed. Verification of both the elaborated system and the proposed task allocation methods was carried out with the help of simulated experiments. The object under test was a group of inspection mobile robots that is a virtual counterpart of a real mobile-robot group.
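
    As a loose illustration of combining the two assessments described above (a robot's competitive position for a task among the other robots, and the task's attractiveness to that robot among the other tasks) inside an auction-style allocation, the sketch below scores robot-task pairs and greedily awards tasks. The scoring function and all numbers are assumptions, not the authors' formulation.

    ```python
    # Sketch: one auction round allocating inspection tasks to robots using a
    # combined score = robot-vs-others competitive position * task attractiveness.
    # Fitness here is simply inverse distance; all numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(5)
    robots = rng.uniform(0, 100, size=(3, 2))      # robot positions
    tasks = rng.uniform(0, 100, size=(8, 2))       # inspection task positions
    priority = rng.uniform(0.5, 1.5, size=8)       # task priorities

    dist = np.linalg.norm(robots[:, None, :] - tasks[None, :, :], axis=2)
    fitness = 1.0 / (dist + 1e-9)                  # robot-to-task fitness

    # competitive position: robot's fitness relative to the best competitor
    competition = fitness / fitness.max(axis=0, keepdims=True)
    # attractiveness: priority-weighted fitness relative to the robot's best task
    attractiveness = priority * fitness / fitness.max(axis=1, keepdims=True)
    score = competition * attractiveness

    assignment = {r: [] for r in range(len(robots))}
    for t in np.argsort(-score.max(axis=0)):       # auction tasks, best bids first
        winner = int(np.argmax(score[:, t]))
        assignment[winner].append(int(t))
    print(assignment)
    ```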

  7. A multi-objective programming model for assessment the GHG emissions in MSW management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mavrotas, George, E-mail: mavrotas@chemeng.ntua.gr; Skoulaxinou, Sotiria; Gakis, Nikos

    2013-09-15

    Highlights: • The multi-objective multi-period optimization model. • The solution approach for the generation of the Pareto front with mathematical programming. • The very detailed description of the model (decision variables, parameters, equations). • The use of IPCC 2006 guidelines for landfill emissions (first-order decay model) in the mathematical programming formulation. - Abstract: In this study a multi-objective mathematical programming model is developed for taking GHG emissions into account in Municipal Solid Waste (MSW) management. Mathematical programming models are often used for structure, design and operational optimization of various systems (energy, supply chain, processes, etc.). Over the last twenty years they have been used all the more often in Municipal Solid Waste (MSW) management in order to provide optimal solutions, with the cost objective being the usual driver of the optimization. In our work we consider GHG emissions as an additional criterion, aiming at a multi-objective approach. The Pareto front (cost vs. GHG emissions) of the system is generated using an appropriate multi-objective method. This information is essential to the decision maker, who can explore the trade-offs in the Pareto curve and select the most preferred among the Pareto optimal solutions. In the present work a detailed multi-objective, multi-period mathematical programming model is developed in order to describe the waste management problem. Apart from the bi-objective approach, the major innovations of the model are (1) the detailed modeling considering 34 materials and 42 technologies, (2) the detailed calculation of the energy content of the various streams based on the detailed material balances, and (3) the incorporation of the IPCC guidelines for the CH4 generated in landfills (first-order decay model). The equations of the model are described in full detail. Finally, the whole approach is illustrated with a case study referring to the application of the model in a Greek region.
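
    A toy epsilon-constraint generation of a cost-vs-GHG Pareto front for a two-option waste-allocation LP, solved with SciPy; the coefficients are invented and the model in the paper is far more detailed (34 materials, 42 technologies, multi-period).

    ```python
    # Sketch: epsilon-constraint method on a toy waste-allocation LP with two
    # objectives (cost, GHG). x1/x2 = tonnes sent to two treatment options.
    # All coefficients are invented for illustration.
    import numpy as np
    from scipy.optimize import linprog

    waste = 100.0                      # tonnes to be treated
    cost = np.array([40.0, 90.0])      # $/tonne for option 1 (landfill), 2 (WtE)
    ghg = np.array([1.2, 0.3])         # tCO2e/tonne

    pareto = []
    for eps in np.linspace(ghg.min() * waste, ghg.max() * waste, 11):
        res = linprog(c=cost,
                      A_ub=[ghg], b_ub=[eps],          # GHG treated as a constraint
                      A_eq=[[1.0, 1.0]], b_eq=[waste], # all waste must be treated
                      bounds=[(0, None), (0, None)], method="highs")
        if res.success:
            pareto.append((res.fun, float(ghg @ res.x)))

    for total_cost, total_ghg in pareto:
        print(f"cost = {total_cost:8.1f} $   GHG = {total_ghg:6.1f} tCO2e")
    ```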

  8. A Multi-Scale Settlement Matching Algorithm Based on ARG

    NASA Astrophysics Data System (ADS)

    Yue, Han; Zhu, Xinyan; Chen, Di; Liu, Lingjia

    2016-06-01

    Homonymous entity matching is an important part of multi-source spatial data integration, automatic updating and change detection. Considering the low accuracy of existing methods in matching multi-scale settlement data, an algorithm based on the Attributed Relational Graph (ARG) is proposed. The algorithm first divides two settlement scenes at different scales into blocks using the small-scale road network and constructs local ARGs in each block. Then, it ascertains candidate sets by merging procedures and obtains the optimal matching pairs by iteratively comparing the similarity of the ARGs. Finally, the corresponding relations between settlements at the large and small scales are identified. At the end of this article, a demonstration is presented and the results indicate that the proposed algorithm is capable of handling sophisticated cases.

  9. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization

    NASA Astrophysics Data System (ADS)

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-01

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximate density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description of multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of the proposed method using the examples of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.

  10. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization.

    PubMed

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-28

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximate density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description of multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of the proposed method using the examples of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.

  11. Analytical solutions for coupling fractional partial differential equations with Dirichlet boundary conditions

    NASA Astrophysics Data System (ADS)

    Ding, Xiao-Li; Nieto, Juan J.

    2017-11-01

    In this paper, we consider the analytical solutions of coupling fractional partial differential equations (FPDEs) with Dirichlet boundary conditions on a finite domain. Firstly, the method of successive approximations is used to obtain the analytical solutions of coupling multi-term time fractional ordinary differential equations. Then, the technique of spectral representation of the fractional Laplacian operator is used to convert the coupling FPDEs to the coupling multi-term time fractional ordinary differential equations. By applying the obtained analytical solutions to the resulting multi-term time fractional ordinary differential equations, the desired analytical solutions of the coupling FPDEs are given. Our results are applied to derive the analytical solutions of some special cases to demonstrate their applicability.

  12. Strategies for the structural analysis of multi-protein complexes: lessons from the 3D-Repertoire project.

    PubMed

    Collinet, B; Friberg, A; Brooks, M A; van den Elzen, T; Henriot, V; Dziembowski, A; Graille, M; Durand, D; Leulliot, N; Saint André, C; Lazar, N; Sattler, M; Séraphin, B; van Tilbeurgh, H

    2011-08-01

    Structural studies of multi-protein complexes, whether by X-ray diffraction, scattering, NMR spectroscopy or electron microscopy, require stringent quality control of the component samples. The inability to produce 'keystone' subunits in a soluble and correctly folded form is a serious impediment to the reconstitution of the complexes. Co-expression of the components offers a valuable alternative to the expression of single proteins as a route to obtain sufficient amounts of the sample of interest. Even in cases where milligram-scale quantities of the purified complex of interest become available, there is still no guarantee that good quality crystals can be obtained. At this step, protein engineering of one or more components of the complex is frequently required to improve solubility, yield or the ability to crystallize the sample. Subsequent characterization of these constructs may be performed by solution techniques such as Small Angle X-ray Scattering and Nuclear Magnetic Resonance to identify 'well behaved' complexes. Herein, we recount our experiences gained in protein production and complex assembly during the European 3D Repertoire project (3DR). The goal of this consortium was to obtain structural information on multi-protein complexes from yeast by combining crystallography, electron microscopy, NMR and in silico modeling methods. We present here a representative set of case studies of complexes that were produced and analyzed within the 3DR project. Our experience provides useful insight into strategies that are more generally applicable for the structural analysis of protein complexes. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. Firefighter Workplace Learning: An Exploratory Case Study

    ERIC Educational Resources Information Center

    Tracey, Edward A.

    2014-01-01

    Despite there being a significant amount of research investigating workplace learning, research exploring firefighter workplace learning is almost nonexistent. The purpose of this qualitative multi-case study was to explore how firefighters conceptualize, report, and practice workplace learning. The researcher also investigated how firefighters…

  14. Nuclear quantum effects and kinetic isotope effects in enzyme reactions.

    PubMed

    Vardi-Kilshtain, Alexandra; Nitoker, Neta; Major, Dan Thomas

    2015-09-15

    Enzymes are extraordinarily effective catalysts evolved to perform well-defined and highly specific chemical transformations. Studying the nature of rate enhancements and the mechanistic strategies in enzymes is very important, both from a basic scientific point of view and in order to improve the rational design of biomimetics. The kinetic isotope effect (KIE) is a very important tool in the study of chemical reactions and has been used extensively in the field of enzymology. Theoretically, the prediction of KIEs in condensed phase environments such as enzymes is challenging due to the need to include nuclear quantum effects (NQEs). Herein we describe recent progress in our group in the development of multi-scale simulation methods for the calculation of NQEs and accurate computation of KIEs. We also describe their application to several enzyme systems. In particular we describe the use of combined quantum mechanics/molecular mechanics (QM/MM) methods in classical and quantum simulations. The development of various novel path-integral methods is reviewed. These methods are tailored to enzyme systems, where only a few degrees of freedom involved in the chemistry need to be quantized. The application of the hybrid QM/MM quantum-classical simulation approach to three case studies is presented. The first case involves the proton transfer in alanine racemase. The second case involves orotidine 5'-monophosphate decarboxylase, where multidimensional free-energy simulations together with kinetic isotope effects are combined in the study of the reaction mechanism. Finally, we discuss the proton transfer in nitroalkane oxidase, where the enzyme employs tunneling as a catalytic fine-tuning tool. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Postbuckling analysis of multi-layered graphene sheets under non-uniform biaxial compression

    NASA Astrophysics Data System (ADS)

    Farajpour, Ali; Arab Solghar, Alireza; Shahidi, Alireza

    2013-01-01

    In this article, the nonlinear buckling characteristics of multi-layered graphene sheets are investigated. The graphene sheet is modeled as an orthotropic nanoplate with size-dependent material properties. The graphene film is subjected to a non-uniformly distributed in-plane load through its thickness. To include the small-scale and geometrical nonlinearity effects, the governing differential equations are derived based on the nonlocal elasticity theory in conjunction with the von Karman geometrical model. Explicit expressions for the postbuckling loads of single- and double-layered graphene sheets with simply supported edges under biaxial compression are obtained. For numerical results, six types of armchair and zigzag graphene sheets with different aspect ratios are considered. The present formulation and method of solution are validated by comparing the results, in the limit cases, with those available in the open literature. Excellent agreement between the obtained and available results is observed. Finally, the effects of the nonlocal parameter, buckling mode number, compression ratio and non-uniform parameter on the postbuckling behavior of multi-layered graphene sheets are studied.

  16. Spatial resolution recovery utilizing multi-ray tracing and graphic processing unit in PET image reconstruction.

    PubMed

    Liang, Yicheng; Peng, Hao

    2015-02-07

    Depth-of-interaction (DOI) poses a major challenge for a PET system to achieve uniform spatial resolution across the field-of-view, particularly for small animal and organ-dedicated PET systems. In this work, we implemented an analytical method to model system matrix for resolution recovery, which was then incorporated in PET image reconstruction on a graphical processing unit platform, due to its parallel processing capacity. The method utilizes the concepts of virtual DOI layers and multi-ray tracing to calculate the coincidence detection response function for a given line-of-response. The accuracy of the proposed method was validated for a small-bore PET insert to be used for simultaneous PET/MR breast imaging. In addition, the performance comparisons were studied among the following three cases: 1) no physical DOI and no resolution modeling; 2) two physical DOI layers and no resolution modeling; and 3) no physical DOI design but with a different number of virtual DOI layers. The image quality was quantitatively evaluated in terms of spatial resolution (full-width-half-maximum and position offset), contrast recovery coefficient and noise. The results indicate that the proposed method has the potential to be used as an alternative to other physical DOI designs and achieve comparable imaging performances, while reducing detector/system design cost and complexity.

  17. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation.

    PubMed

    Huang, Dongmei; Xu, Chenyixuan; Zhao, Danfeng; Song, Wei; He, Qi

    2017-09-21

    Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events that span multiple sea areas, the current network structure needs to retrieve data from multiple data centers, which severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as vertices, establish edges based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which will eventually provide an efficient data access service for marine events.
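
    The balanced-partition idea can be illustrated with a standard heuristic. The sketch below bisects a tiny correlation-weighted sensor graph with NetworkX's Kernighan-Lin routine; the buoy names, edge weights, and the choice of plain bisection (rather than the paper's multi-objective method with incremental optimization) are assumptions made purely for illustration.

    ```python
    import networkx as nx
    from networkx.algorithms.community import kernighan_lin_bisection

    # Edge weights stand in for the degree of event correlation between sensors.
    G = nx.Graph()
    edges = [("buoy1", "buoy2", 0.9), ("buoy2", "buoy3", 0.8),
             ("buoy3", "buoy1", 0.7), ("buoy4", "buoy5", 0.85),
             ("buoy5", "buoy6", 0.75), ("buoy3", "buoy4", 0.1)]  # weak cross-region link
    G.add_weighted_edges_from(edges)

    # Balanced bisection that tries to keep highly correlated sensors together.
    part_a, part_b = kernighan_lin_bisection(G, weight="weight", seed=0)
    print(sorted(part_a), sorted(part_b))
    ```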

  18. A multi-level rapid prototyping drill guide template reduces the perforation risk of pedicle screw placement in the lumbar and sacral spine.

    PubMed

    Merc, Matjaz; Drstvensek, Igor; Vogrin, Matjaz; Brajlih, Tomaz; Recnik, Gregor

    2013-07-01

    The method of free-hand pedicle screw placement is generally safe although it carries potential risks. For this reason, several highly accurate computer-assisted systems were developed and are currently on the market. However, these devices have certain disadvantages. We have developed a method of pedicle screw placement in the lumbar and sacral region using a multi-level drill guide template, created with the rapid prototyping technology and have validated it in a clinical study. The aim of the study was to manufacture and evaluate the accuracy of a multi-level drill guide template for lumbar and first sacral pedicle screw placement and to compare it with the free-hand technique under fluoroscopy supervision. In 2011 and 2012, a randomized clinical trial was performed on 20 patients. 54 screws were implanted in the trial group using templates and 54 in the control group using the fluoroscopy-supervised free-hand technique. Furthermore, applicability for the first sacral level was tested. Preoperative CT-scans were taken and templates were designed using the selective laser sintering method. Postoperative evaluation and statistical analysis of pedicle violation, displacement, screw length and deviation were performed for both groups. The incidence of cortex perforation was significantly reduced in the template group; likewise, the deviation and displacement level of screws in the sagittal plane. In both groups there was no significantly important difference in deviation and displacement level in the transversal plane as not in pedicle screw length. The results for the first sacral level resembled the main investigated group. The method significantly lowers the incidence of cortex perforation and is therefore potentially applicable in clinical practice, especially in some selected cases. The applied method, however, carries a potential for errors during manufacturing and practical usage and therefore still requires further improvements.

  19. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    NASA Astrophysics Data System (ADS)

    Teichert, K.; Süss, P.; Serna, J. I.; Monz, M.; Küfer, K. H.; Thieke, C.

    2011-06-01

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.
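
    A minimal sketch of the comparison step is shown below: for two sampled Pareto sets of a minimization problem, it flags the points of one set that are dominated by the competitor and measures their distance to the competitor in objective space. The synthetic two-objective fronts stand in for the clinical plans, and the plain Euclidean distance is an assumption rather than the paper's exact metric.

    ```python
    import numpy as np

    def dominated_by(points, competitors):
        """True where a point is dominated by at least one competitor point
        (all objectives no worse, at least one strictly better; minimization)."""
        no_worse = (competitors[None, :, :] <= points[:, None, :]).all(axis=2)
        strictly = (competitors[None, :, :] < points[:, None, :]).any(axis=2)
        return (no_worse & strictly).any(axis=1)

    def distance_to_set(points, competitors):
        """Euclidean distance from each point to the nearest competitor point."""
        return np.linalg.norm(points[:, None, :] - competitors[None, :, :],
                              axis=2).min(axis=1)

    rng = np.random.default_rng(1)
    x = np.sort(rng.uniform(0.0, 1.0, 20))
    pareto_a = np.column_stack([x, 1.0 - x])                        # e.g. 7-beam plan
    pareto_b = pareto_a + rng.normal(0.0, 0.05, pareto_a.shape)     # e.g. 9-beam plan

    print("A points dominated by B:", int(dominated_by(pareto_a, pareto_b).sum()))
    print("mean distance A -> B:", float(distance_to_set(pareto_a, pareto_b).mean()))
    ```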

  20. Design Multi-Sides System Unmanned Surface Vehicle (USV) Rocket

    NASA Astrophysics Data System (ADS)

    Syam, Rafiudin; Sutresman, Onny; Mappaita, Abdullah; Amiruddin; Wiranata, Ardi

    2018-02-01

    This study aims to design and test multi-side USV forms. This system is excellent for maneuvering in the x-y-z coordinates. The disadvantage of a single-side USV is that it is very difficult to maneuver toward highly dynamic targets, whereas a multi-side system is easily maneuvered through the x-y-z coordinates. In addition to security and defense purposes, the multi-side system is also well suited for maritime intelligence and surveillance. In this case, an electric ducted fan is combined with the multi-side system so that the vehicle can still operate even in an inverted condition. Multi-side USV experiments have been carried out with good results. The USV in this study was designed to use two propulsion units.

  1. Exploiting the capabilities of the Sentinel-2 multi spectral instrument for predicting growing stock volume in forest ecosystems

    NASA Astrophysics Data System (ADS)

    Mura, Matteo; Bottalico, Francesca; Giannetti, Francesca; Bertani, Remo; Giannini, Raffaello; Mancini, Marco; Orlandini, Simone; Travaglini, Davide; Chirici, Gherardo

    2018-04-01

    The spatial prediction of growing stock volume is one of the most frequent applications of remote sensing in support of the sustainable management of forest ecosystems. For this purpose, data from active or passive sensors are used as predictor variables in combination with measurements taken in field sampling plots. The Sentinel-2 (S2) satellites are equipped with a Multi Spectral Instrument (MSI) capable of acquiring 13 bands in the visible and infrared domains with a spatial resolution varying between 10 and 60 m. The present study aimed at evaluating the performance of S2-MSI imagery for estimating the growing stock volume of forest ecosystems. To do so we used 240 plots measured in two study areas in Italy. The imputation was carried out with eight k-Nearest Neighbours (k-NN) methods available in the open source YaImpute R package. In order to evaluate the S2-MSI performance we repeated the experimental protocol with two other sets of images acquired by well-known satellites equipped with multispectral instruments: Landsat 8 OLI and the RapidEye scanner. We found that S2 performed better than Landsat in 37.5% of the cases and better than RapidEye in 62.5% of the cases. In one study area the best performance was obtained with Landsat OLI (RMSD = 6.84%) and in the other with S2 (RMSD = 22.94%), both with the k-NN system based on a distance matrix calculated with the Random Forest algorithm. The results confirmed that S2 images are suitable for predicting growing stock volume, with good performance (average RMSD of less than 19% for the two test areas).
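
    The imputation step can be mimicked in a few lines. The study used the YaImpute R package with a Random Forest distance matrix; the sketch below is a simplified Python analogue with a plain Euclidean k-NN regressor, synthetic band values and plot volumes, and a cross-validated RMSD% as the accuracy figure.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(42)
    n_plots, n_bands = 240, 10
    bands = rng.uniform(0, 1, (n_plots, n_bands))                  # S2-like reflectances
    volume = 150 + 400 * bands[:, 3] + rng.normal(0, 30, n_plots)  # m3/ha, synthetic

    # k-NN imputation of plot volume from the spectral predictors.
    knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
    pred = cross_val_predict(knn, bands, volume, cv=10)

    rmsd = np.sqrt(np.mean((pred - volume) ** 2))
    print(f"RMSD% = {100 * rmsd / volume.mean():.1f}")
    ```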

  2. Coupled multi-disciplinary simulation of composite engine structures in propulsion environment

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Singhal, Surendra N.

    1992-01-01

    A computational simulation procedure is described for the coupled response of multi-layered multi-material composite engine structural components which are subjected to simultaneous multi-disciplinary thermal, structural, vibration, and acoustic loadings, including the effect of hostile environments. The simulation is based on a three-dimensional finite element analysis technique in conjunction with structural mechanics codes and with acoustic analysis methods. The composite material behavior is assessed at the various composite scales, i.e., the laminate/ply/constituents (fiber/matrix), via a nonlinear material characterization model. Sample cases exhibiting nonlinear geometrical, material, loading, and environmental behavior of aircraft engine fan blades are presented. Results for deformed shape, vibration frequency, mode shapes, and acoustic noise emitted from the fan blade are discussed for their coupled effect in hot and humid environments. Results such as acoustic noise for coupled composite-mechanics/heat transfer/structural/vibration/acoustic analyses demonstrate the effectiveness of coupled multi-disciplinary computational simulation and the various advantages of composite materials compared to metals.

  3. Singular value decomposition based impulsive noise reduction in multi-frequency phase-sensitive demodulation of electrical impedance tomography

    NASA Astrophysics Data System (ADS)

    Hao, Zhenhua; Cui, Ziqiang; Yue, Shihong; Wang, Huaxiang

    2018-06-01

    As an important means in electrical impedance tomography (EIT), multi-frequency phase-sensitive demodulation (PSD) can be viewed as a matched filter for measurement signals and as an optimal linear filter in the case of Gaussian-type noise. However, the additive noise usually possesses impulsive noise characteristics, so it is a challenging task to reduce the impulsive noise in multi-frequency PSD effectively. In this paper, an approach for impulsive noise reduction in multi-frequency PSD of EIT is presented. Instead of linear filters, a singular value decomposition filter is employed as the pre-stage filtering module prior to PSD, which has advantages of zero phase shift, little distortion, and a high signal-to-noise ratio (SNR) in digital signal processing. Simulation and experimental results demonstrated that the proposed method can effectively eliminate the influence of impulsive noise in multi-frequency PSD, and it was capable of achieving a higher SNR and smaller demodulation error.
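
    The two-stage idea can be sketched on synthetic data: first suppress impulsive spikes by truncating the singular value decomposition of a Hankel matrix built from the measurement, then recover amplitude and phase by phase-sensitive (lock-in) demodulation. The excitation frequency, window length, and retained rank below are arbitrary choices for illustration, not the paper's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    fs, f0, n = 100_000, 10_000, 2048            # sampling rate, excitation, samples (assumed)
    t = np.arange(n) / fs
    clean = np.sin(2 * np.pi * f0 * t + 0.3)     # unit amplitude, 0.3 rad phase
    noisy = clean + np.where(rng.random(n) < 0.01, 5.0, 0.0)   # sparse impulsive spikes

    # (1) SVD filter: Hankel embedding, keep only the dominant pair of components
    L = 128
    hankel = np.lib.stride_tricks.sliding_window_view(noisy, L)
    U, sv, Vt = np.linalg.svd(hankel, full_matrices=False)
    sv[2:] = 0.0                                  # a single sine spans two components
    denoised_hankel = (U * sv) @ Vt
    # Average the anti-diagonals back into a 1-D signal
    filtered = np.array([np.diag(denoised_hankel[:, ::-1], k).mean()
                         for k in range(L - 1, L - 1 - n, -1)])

    # (2) Phase-sensitive demodulation (lock-in) at the excitation frequency
    i_comp = 2 * np.mean(filtered * np.sin(2 * np.pi * f0 * t))
    q_comp = 2 * np.mean(filtered * np.cos(2 * np.pi * f0 * t))
    print("amplitude:", np.hypot(i_comp, q_comp),
          "phase (rad):", np.arctan2(q_comp, i_comp))
    ```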

  4. Designing an optimal software intensive system acquisition: A game theoretic approach

    NASA Astrophysics Data System (ADS)

    Buettner, Douglas John

    The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of the strong desire by contractors to skip or severely reduce software development design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to fully address these issues at numerous levels. However, the observations lead us to investigate modeling and theoretical methods to fundamentally understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality, schedule and cost-driven strategies demonstrate that the higher cost and effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game theory reasoning for schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model to describe the observed phenomena. Game theory concepts are then used to argue that the source of the problem and hence the solution to developers cutting corners on quality for schedule-driven system acquisitions ultimately lies with the government. The game theory arguments also lead to the suggestion that the use of a multi-player dynamic Nash bargaining game provides a solution for our observed lack of quality game between the government (the acquirer) and "large-corporation" software developers. A note is provided that argues this multi-player dynamic Nash bargaining game also provides the solution to Freeman Dyson's problem, for a way to place a label of good or bad on systems.

  5. Meeting Indigenous peoples' objectives in environmental flow assessments: Case studies from an Australian multi-jurisdictional water sharing initiative

    NASA Astrophysics Data System (ADS)

    Jackson, Sue; Pollino, Carmel; Maclean, Kirsten; Bark, Rosalind; Moggridge, Bradley

    2015-03-01

    The multi-dimensional relationships that Indigenous peoples have with water are only recently gaining recognition in water policy and management activities. Although Australian water policy stipulates that the native title interests of Indigenous peoples and their social, cultural and spiritual objectives be included in water plans, improved rates of Indigenous access to water have been slow to eventuate, particularly in those regions where the water resource is fully developed or allocated. Experimentation in techniques and approaches to both identify and determine Indigenous water requirements will be needed if environmental assessment processes and water sharing plans are to explicitly account for Indigenous water values. Drawing on two multidisciplinary case studies conducted in Australia's Murray-Darling Basin, we engage Indigenous communities to (i) understand their values and explore the application of methods to derive water requirements to meet those values; (ii) assess the impact of alternative water planning scenarios designed to address over-allocation to irrigation; and (iii) define additional volumes of water and potential works needed to meet identified Indigenous requirements. We provide a framework where Indigenous values can be identified and certain water needs quantified and advance a methodology to integrate Indigenous social, cultural and environmental objectives into environmental flow assessments.

  6. Development and Evaluation of a Multi-Institutional Case Studies-Based Course in Food Safety

    ERIC Educational Resources Information Center

    Pleitner, Aaron M.; Chapin, Travis K.; Hammons, Susan R.; Stelten, Anna Van; Nightingale, Kendra K.; Wiedmann, Martin; Johnston, Lynette M.; Oliver, Haley F.

    2015-01-01

    Developing novel, engaging courses in food safety is necessary to train professionals in this discipline. Courses that are interactive and case-based encourage development of critical thinking skills necessary for identifying and preventing foodborne disease outbreaks. The purpose of this study was to assess the efficacy of a case study…

  7. Forest fuel treatment detection using multi-temporal airborne Lidar data and high resolution aerial imagery ---- A case study at Sierra Nevada, California

    NASA Astrophysics Data System (ADS)

    Su, Y.; Guo, Q.; Collins, B.; Fry, D.; Kelly, M.

    2014-12-01

    Forest fuel treatments (FFT) are often employed in the Sierra Nevada forests (located in California, US) to enhance forest health, regulate stand density, and reduce wildfire risk. However, there have been concerns that FFTs may have negative impacts on certain protected wildlife species. Due to the constraints and protection of resources (e.g., perennial streams, cultural resources, wildlife habitat, etc.), the actual FFT extents are usually different from the planned extents. Identifying the actual extent of treated areas is of primary importance for understanding the environmental influence of FFTs. Light detection and ranging (Lidar) is a powerful remote sensing technique that can provide accurate forest structure measurements and therefore offers great potential for monitoring forest changes. This study used canopy height model (CHM) and canopy cover (CC) products derived from multi-temporal airborne Lidar data to detect FFTs with an approach combining a pixel-wise thresholding method and an object-of-interest segmentation method. We also investigated forest change following the implementation of landscape-scale FFT projects through the use of the normalized difference vegetation index (NDVI) and standardized principal component analysis (PCA) applied to multi-temporal high-resolution aerial imagery. The same FFT detection routine was applied to the Lidar data and the aerial imagery in order to compare their capability for FFT detection. Our results demonstrated that FFT detection using Lidar-derived CC products produced both the highest total accuracy and the highest kappa coefficient, and was more robust at identifying areas with light FFTs. The accuracy using Lidar-derived CHM products was significantly lower than that obtained using Lidar-derived CC, but still slightly higher than that using aerial imagery. FFT detection using NDVI and standardized PCA from multi-temporal aerial imagery produced almost identical total accuracy and kappa coefficients. Both methods showed relatively limited capacity to detect lightly treated areas and had a higher false detection rate (recognizing untreated areas as treated) compared to the methods using Lidar-derived parameters.
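
    A minimal sketch of the pixel-wise thresholding idea on multi-temporal imagery is given below: it flags pixels whose NDVI drops by more than a chosen amount between the pre- and post-treatment acquisitions. The synthetic reflectance grids and the -0.1 threshold are assumptions; the study's actual routine also involved object-of-interest segmentation and Lidar-derived metrics.

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized difference vegetation index, guarded against division by zero."""
        return (nir - red) / (nir + red + 1e-9)

    rng = np.random.default_rng(0)
    red_pre = rng.uniform(0.05, 0.15, (100, 100))
    nir_pre = rng.uniform(0.30, 0.50, (100, 100))
    red_post, nir_post = red_pre.copy(), nir_pre.copy()
    nir_post[40:60, 40:60] *= 0.6            # simulated canopy removal in a treated patch

    delta_ndvi = ndvi(nir_post, red_post) - ndvi(nir_pre, red_pre)
    treated = delta_ndvi < -0.1              # assumed change threshold
    print("treated pixels detected:", int(treated.sum()))
    ```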

  8. A Multi-Case Study of Professional Ethics in Alternative Education: Exploring Perspectives of Alternative School Administrators

    ERIC Educational Resources Information Center

    Duke, Richard T. RT, IV

    2017-01-01

    This qualitative case study explored perspectives of alternative school leaders regarding professional ethics and standards. The study researched two components of alternative school leadership: effective alternative school characteristics based on professional standards and making decisions around the best interests of students. This study…

  9. Harmonization Process and Reliability Assessment of Anthropometric Measurements in the Elderly EXERNET Multi-Centre Study

    PubMed Central

    Gómez-Cabello, Alba; Vicente-Rodríguez, Germán; Albers, Ulrike; Mata, Esmeralda; Rodriguez-Marroyo, Jose A.; Olivares, Pedro R.; Gusi, Narcis; Villa, Gerardo; Aznar, Susana; Gonzalez-Gross, Marcela; Casajús, Jose A.; Ara, Ignacio

    2012-01-01

    Background The elderly EXERNET multi-centre study aims to collect normative anthropometric data for old functionally independent adults living in Spain. Purpose To describe the standardization process and reliability of the anthropometric measurements carried out in the pilot study and during the final workshop, examining both intra- and inter-rater errors for measurements. Materials and Methods A total of 98 elderly participants from five different regions took part in the intra-rater error assessment, and 10 different seniors living in the city of Toledo (Spain) participated in the inter-rater assessment. We examined both intra- and inter-rater errors for heights and circumferences. Results For height, intra-rater technical errors of measurement (TEMs) were smaller than 0.25 cm. For circumferences and knee height, TEMs were smaller than 1 cm, except for waist circumference in the city of Cáceres. Reliability for heights and circumferences was greater than 98% in all cases. Inter-rater TEMs were 0.61 cm for height, 0.75 cm for knee height and ranged between 2.70 and 3.09 cm for the circumferences measured. Inter-rater reliabilities for anthropometric measurements were always higher than 90%. Conclusion The harmonization process, including the workshop and pilot study, guarantees the quality of the anthropometric measurements in the elderly EXERNET multi-centre study. High reliability and low TEMs may be expected when assessing anthropometry in the elderly population. PMID:22860013
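
    The figures reported above rest on two standard formulas: the technical error of measurement for duplicate measurements, TEM = sqrt(sum of squared differences / 2N), and the reliability coefficient R = 1 - TEM^2 / SD^2. A short sketch with synthetic duplicate height measurements is shown below; the simulated measurement noise is an assumption for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    true_height = rng.normal(160, 7, 98)               # cm, 98 synthetic participants
    m1 = true_height + rng.normal(0, 0.15, 98)         # first measurement
    m2 = true_height + rng.normal(0, 0.15, 98)         # repeat by the same rater

    d = m1 - m2
    tem = np.sqrt(np.sum(d ** 2) / (2 * len(d)))       # intra-rater TEM (cm)
    sd = np.std(np.concatenate([m1, m2]), ddof=1)      # between-subject spread
    reliability = 1 - tem ** 2 / sd ** 2               # R = 1 - TEM^2 / SD^2

    print(f"TEM = {tem:.2f} cm, reliability = {100 * reliability:.1f}%")
    ```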

  10. An algorithm for retrieving rock-desertification from multispectral remote sensing images

    NASA Astrophysics Data System (ADS)

    Xia, Xueqi; Tian, Qingjiu; Liao, Yan

    2009-06-01

    Rock-desertification is a typical environmental and ecological problem in Southwest China. As remote sensing is an important means of monitoring the spatial variation of rock-desertification, a method is developed for the measurement and retrieval of rock-desertification information from multi-spectral high-resolution remote sensing images. An MNF transform is applied to 4-band IKONOS multi-spectral remotely sensed data to reduce the number of spectral dimensions to three. In this three-dimensional space, endmembers are extracted and analyzed. It is found that the various vegetation types group along a line, defined as the "vegetation line", in which "dark vegetation", such as coniferous and broadleaf forest, changes continuously into "bright vegetation", such as grasses. It is presumed that this is caused by the different proportions of shadow mixed with leaves or branches in the various vegetation types. The normalized distance between the rock endmember and the vegetation line is defined as the Geometric Rock-desertification Index (GRI), which is used to scale rock-desertification. A case study with ground-truth validation in Puding, Guizhou province demonstrated the success and advantages of this method.
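
    Geometrically, the index is a distance from the rock endmember to the line fitted through the vegetation endmembers in the reduced feature space. The sketch below computes such a normalized distance for made-up two-dimensional endmember coordinates; the least-squares line fit and the particular normalization are assumptions, since the abstract does not spell them out.

    ```python
    import numpy as np

    veg = np.array([[0.1, 0.9], [0.3, 0.7],
                    [0.5, 0.5], [0.7, 0.3]])   # vegetation endmembers (made up)
    rock = np.array([0.9, 0.8])                # rock endmember (made up)

    # Fit the "vegetation line" through the vegetation endmember cloud.
    p0 = veg.mean(axis=0)
    direction = np.linalg.svd(veg - p0)[2][0]  # principal direction of the cloud

    # Perpendicular distance of the rock endmember to that line.
    offset = rock - p0
    dist = np.linalg.norm(offset - (offset @ direction) * direction)

    # Assumed normalization: divide by the spread of the vegetation cloud.
    gri = dist / np.linalg.norm(veg.max(axis=0) - veg.min(axis=0))
    print(f"GRI = {gri:.3f}")
    ```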

  11. Super-resolved Parallel MRI by Spatiotemporal Encoding

    PubMed Central

    Schmidt, Rita; Baishya, Bikash; Ben-Eliezer, Noam; Seginer, Amir; Frydman, Lucio

    2016-01-01

    Recent studies described an alternative “ultrafast” scanning method based on spatiotemporal (SPEN) principles. SPEN demonstrates numerous potential advantages over EPI-based alternatives, at no additional expense in experimental complexity. An important aspect that SPEN still needs to achieve for providing a competitive acquisition alternative entails exploiting parallel imaging algorithms, without compromising its proven capabilities. The present work introduces a combination of multi-band frequency-swept pulses simultaneously encoding multiple, partial fields-of-view; together with a new algorithm merging a Super-Resolved SPEN image reconstruction and SENSE multiple-receiving methods. The ensuing approach enables one to reduce both the excitation and acquisition times of ultrafast SPEN acquisitions by the customary acceleration factor R, without compromises in either the ensuing spatial resolution, SAR deposition, or the capability to operate in multi-slice mode. The performance of these new single-shot imaging sequences and their ancillary algorithms were explored on phantoms and human volunteers at 3T. The gains of the parallelized approach were particularly evident when dealing with heterogeneous systems subject to major T2/T2* effects, as is the case upon single-scan imaging near tissue/air interfaces. PMID:24120293

  12. Integrating optical finger motion tracking with surface touch events.

    PubMed

    MacRitchie, Jennifer; McPherson, Andrew P

    2015-01-01

    This paper presents a method of integrating two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study. Piano technique requires both precise small-scale motion of fingers on the key surfaces and planned large-scale movement of the hands and arms. Where studies of performance often focus on one of these scales in isolation, this paper investigates the relationship between them. Two sensor systems were installed on an acoustic grand piano: a monocular high-speed camera tracking the position of painted markers on the hands, and capacitive touch sensors attached to the key surfaces, which measure the location of finger-key contacts. This paper highlights a method of fusing the data from these systems, including temporal and spatial alignment, segmentation into notes and automatic fingering annotation. Three case studies demonstrate the utility of the multi-sensor data: analysis of finger flexion or extension based on touch and camera marker location, timing analysis of finger-key contact preceding and following key presses, and characterization of individual finger movements in the transitions between successive key presses. Piano performance is the focus of this paper, but the sensor method could equally apply to other fine motor control scenarios, with applications to human-computer interaction.
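
    One ingredient of the data fusion, temporal alignment, can be sketched simply: resample the camera marker track onto the timestamps of the touch events by interpolation. The frame rate, the sinusoidal marker trajectory, and the contact times below are placeholders, not data from the study.

    ```python
    import numpy as np

    cam_t = np.arange(0.0, 2.0, 1 / 250)                 # 250 fps camera timeline (s), assumed
    cam_y = np.sin(2 * np.pi * 1.5 * cam_t)              # vertical marker position (a.u.)

    touch_t = np.array([0.41, 0.74, 1.06, 1.39])         # finger-key contact times (s), assumed
    marker_at_touch = np.interp(touch_t, cam_t, cam_y)   # marker height at each contact

    for t, y in zip(touch_t, marker_at_touch):
        print(f"contact at {t:.2f}s -> marker position {y:+.3f}")
    ```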

  13. Integrating optical finger motion tracking with surface touch events

    PubMed Central

    MacRitchie, Jennifer; McPherson, Andrew P.

    2015-01-01

    This paper presents a method of integrating two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study. Piano technique requires both precise small-scale motion of fingers on the key surfaces and planned large-scale movement of the hands and arms. Where studies of performance often focus on one of these scales in isolation, this paper investigates the relationship between them. Two sensor systems were installed on an acoustic grand piano: a monocular high-speed camera tracking the position of painted markers on the hands, and capacitive touch sensors attached to the key surfaces, which measure the location of finger-key contacts. This paper highlights a method of fusing the data from these systems, including temporal and spatial alignment, segmentation into notes and automatic fingering annotation. Three case studies demonstrate the utility of the multi-sensor data: analysis of finger flexion or extension based on touch and camera marker location, timing analysis of finger-key contact preceding and following key presses, and characterization of individual finger movements in the transitions between successive key presses. Piano performance is the focus of this paper, but the sensor method could equally apply to other fine motor control scenarios, with applications to human-computer interaction. PMID:26082732

  14. Characteristics of Behavior of Robots with Emotion Model

    NASA Astrophysics Data System (ADS)

    Sato, Shigehiko; Nozawa, Akio; Ide, Hideto

    A cooperative multi-robot system has many advantages over a single-robot system: it can adapt to various circumstances and offers flexibility with respect to variations in the tasks. However, controlling each robot remains a problem, even though methods for controlling multi-robot systems have been studied. Recently, robots have been entering real-world scenes, and the emotion and sensitivity of robots have been widely studied. In this study, a human emotion model based on psychological interaction was adapted to a multi-robot system in order to obtain methods for organizing multiple robots. The behavioral characteristics of the multi-robot system obtained through computer simulation were analyzed. As a result, very complex and interesting behavior emerged even though the system has a rather simple configuration, and it showed flexibility in various circumstances. An additional experiment with actual robots will be conducted based on the emotion model.

  15. Investigation of upwind, multigrid, multiblock numerical schemes for three dimensional flows. Volume 1: Runge-Kutta methods for a thin layer Navier-Stokes solver

    NASA Technical Reports Server (NTRS)

    Cannizzaro, Frank E.; Ash, Robert L.

    1992-01-01

    A state-of-the-art computer code has been developed that incorporates a modified Runge-Kutta time integration scheme, upwind numerical techniques, multigrid acceleration, and multi-block capabilities (RUMM). A three-dimensional thin-layer formulation of the Navier-Stokes equations is employed. For turbulent flow cases, the Baldwin-Lomax algebraic turbulence model is used. Two different upwind techniques are available: van Leer's flux-vector splitting and Roe's flux-difference splitting. Full approximation multi-grid plus implicit residual and corrector smoothing were implemented to enhance the rate of convergence. Multi-block capabilities were developed to provide geometric flexibility. This feature allows the developed computer code to accommodate any grid topology or grid configuration with multiple topologies. The results shown in this dissertation were chosen to validate the computer code and display its geometric flexibility, which is provided by the multi-block structure.
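
    For readers unfamiliar with the time-integration family the solver builds on, the sketch below shows one classical four-stage Runge-Kutta step applied to a scalar test equation; the actual code uses a modified multi-stage scheme with multigrid acceleration and residual smoothing, which this toy example does not reproduce.

    ```python
    import numpy as np

    def rk4_step(f, u, t, dt):
        """Advance du/dt = f(t, u) by one classical RK4 step of size dt."""
        k1 = f(t, u)
        k2 = f(t + 0.5 * dt, u + 0.5 * dt * k1)
        k3 = f(t + 0.5 * dt, u + 0.5 * dt * k2)
        k4 = f(t + dt, u + dt * k3)
        return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Example: exponential decay du/dt = -u, integrated to t = 1
    u = 1.0
    for n in range(10):
        u = rk4_step(lambda t, u: -u, u, n * 0.1, 0.1)
    print(u, np.exp(-1.0))   # numerical result vs. exact solution
    ```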

  16. Analytical study and numerical solution of the inverse source problem arising in thermoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Holman, Benjamin R.

    In recent years, revolutionary "hybrid" or "multi-physics" methods of medical imaging have emerged. By combining two or three different types of waves these methods overcome limitations of classical tomography techniques and deliver otherwise unavailable, potentially life-saving diagnostic information. Thermoacoustic (and photoacoustic) tomography is the most developed multi-physics imaging modality. Thermo- and photo- acoustic tomography require reconstructing initial acoustic pressure in a body from time series of pressure measured on a surface surrounding the body. For the classical case of free space wave propagation, various reconstruction techniques are well known. However, some novel measurement schemes place the object of interest between reflecting walls that form a de facto resonant cavity. In this case, known methods cannot be used. In chapter 2 we present a fast iterative reconstruction algorithm for measurements made at the walls of a rectangular reverberant cavity with a constant speed of sound. We prove the convergence of the iterations under a certain sufficient condition, and demonstrate the effectiveness and efficiency of the algorithm in numerical simulations. In chapter 3 we consider the more general problem of an arbitrarily shaped resonant cavity with a non constant speed of sound and present the gradual time reversal method for computing solutions to the inverse source problem. It consists in solving back in time on the interval [0, T] the initial/boundary value problem for the wave equation, with the Dirichlet boundary data multiplied by a smooth cutoff function. If T is sufficiently large one obtains a good approximation to the initial pressure; in the limit of large T such an approximation converges (under certain conditions) to the exact solution.

  17. Multiple network alignment via multiMAGNA+.

    PubMed

    Vijayan, Vipin; Milenkovic, Tijana

    2017-08-21

    Network alignment (NA) aims to find a node mapping that identifies topologically or functionally similar network regions between molecular networks of different species. Analogous to genomic sequence alignment, NA can be used to transfer biological knowledge from well- to poorly-studied species between aligned network regions. Pairwise NA (PNA) finds similar regions between two networks, while multiple NA (MNA) can align more than two networks. We focus on MNA. Existing MNA methods aim to maximize total similarity over all aligned nodes (node conservation). Then, they evaluate alignment quality by measuring the amount of conserved edges, but only after the alignment is constructed. Directly optimizing edge conservation during alignment construction, in addition to node conservation, may result in superior alignments. Thus, we present a novel MNA method called multiMAGNA++ that can achieve this. Indeed, multiMAGNA++ outperforms or is on par with existing MNA methods, while often completing faster. In addition, multiMAGNA++ scales well to larger network data and can be parallelized effectively. During method evaluation, we also introduce new MNA quality measures to allow for a fairer comparison of MNA methods than the existing alignment quality measures. MultiMAGNA++ code is available on the method's web page at http://nd.edu/~cone/multiMAGNA++/.

  18. Regional fringe analysis for improving depth measurement in phase-shifting fringe projection profilometry

    NASA Astrophysics Data System (ADS)

    Chien, Kuang-Che Chang; Tu, Han-Yen; Hsieh, Ching-Huang; Cheng, Chau-Jern; Chang, Chun-Yen

    2018-01-01

    This study proposes a regional fringe analysis (RFA) method to detect the regions of a target object in captured shifted images to improve depth measurement in phase-shifting fringe projection profilometry (PS-FPP). In the RFA method, region-based segmentation is exploited to segment the de-fringed image of a target object, and a multi-level fuzzy-based classification with five presented features is used to analyze and discriminate the regions of an object from the segmented regions, which were associated with explicit fringe information. Then, in the experiment, the performance of the proposed method is tested and evaluated on 26 test cases made of five types of materials. The qualitative and quantitative results demonstrate that the proposed RFA method can effectively detect the desired regions of an object to improve depth measurement in the PS-FPP system.

  19. Sensor fusion for antipersonnel landmine detection: a case study

    NASA Astrophysics Data System (ADS)

    den Breejen, Eric; Schutte, Klamer; Cremer, Frank

    1999-08-01

    In this paper the multi-sensor fusion results obtained within the European research project GEODE are presented. The layout of the test lane and the individual sensors used are described. The implementation of the SCOOP algorithm improves the ROC curves, as both the false-alarm surface and the number of false alarms are taken into account. The confidence grids produced by the sensor manufacturers are used as input for the different sensor fusion methods implemented. The multi-sensor fusion methods implemented are Bayes, Dempster-Shafer, fuzzy probabilities and rules. The mapping of the confidence grids to the input parameters of the fusion methods is an important step. Due to the limited amount of available data, the entire test lane is used for both training and evaluation. All four sensor fusion methods provide better detection results than the individual sensors.
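
    Of the four fusion rules listed, the Bayesian one is the easiest to sketch: assuming conditionally independent sensors and a common prior, the per-sensor confidences are converted to likelihood ratios and multiplied. The 0.1 prior and the three example confidences below are arbitrary assumptions, not values from the GEODE project.

    ```python
    import numpy as np

    def bayes_fuse(confidences, prior=0.1):
        """Fuse per-sensor mine probabilities for one grid cell, assuming each
        sensor reports a posterior based on the same prior and the sensors are
        conditionally independent."""
        prior_odds = prior / (1 - prior)
        odds = prior_odds
        for p in confidences:
            p = float(np.clip(p, 1e-6, 1 - 1e-6))
            odds *= (p / (1 - p)) / prior_odds      # likelihood ratio of this sensor
        return odds / (1 + odds)

    # Three sensors (e.g., metal detector, GPR, IR) reporting on the same cell
    print(bayes_fuse([0.8, 0.6, 0.3]))
    ```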

  20. Optimal clustering of MGs based on droop controller for improving reliability using a hybrid of harmony search and genetic algorithms.

    PubMed

    Abedini, Mohammad; Moradi, Mohammad H; Hosseinian, S M

    2016-03-01

    This paper proposes a novel method to address reliability and technical problems of microgrids (MGs) by designing a number of self-adequate autonomous sub-MGs through an MG clustering approach. In doing so, a multi-objective optimization problem is developed in which power loss reduction, voltage profile improvement and reliability enhancement are considered as the objective functions. To solve the optimization problem a hybrid algorithm, named HS-GA, is provided, based on genetic and harmony search algorithms, and a load flow method is given to model different types of DGs as droop controllers. The performance of the proposed method is evaluated in two case studies. The results provide support for the performance of the proposed method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Simulation analysis of impulse characteristics of space debris irradiated by multi-pulse laser

    NASA Astrophysics Data System (ADS)

    Lin, Zhengguo; Jin, Xing; Chang, Hao; You, Xiangyu

    2018-02-01

    Cleaning space debris with lasers is a hot topic in the field of space security research, and impulse characteristics are the basis of laser-based debris removal. In order to study the impulse characteristics of rotating irregular space debris irradiated by a multi-pulse laser, an impulse calculation method for rotating space debris under multi-pulse irradiation is established based on the area matrix method. The calculation method for impulse and impulsive moment under multi-pulse irradiation is given, and the calculation process for the total impulse under multi-pulse irradiation is analyzed. Taking a typical non-planar piece of space debris (a cube) as an example, the impulse characteristics of space debris irradiated by a multi-pulse laser are simulated and analyzed. The effects of initial angular velocity, spot size and pulse frequency on the impulse characteristics are investigated.

  2. Nonlinear transient analysis of multi-mass flexible rotors - theory and applications

    NASA Technical Reports Server (NTRS)

    Kirk, R. G.; Gunter, E. J.

    1973-01-01

    The equations of motion necessary to compute the transient response of multi-mass flexible rotors are formulated to include unbalance, rotor acceleration, and flexible damped nonlinear bearing stations. A method of calculating the unbalance response of flexible rotors from a modified Myklestad-Prohl technique is discussed in connection with the method of solution for the transient response. Several special cases of simplified rotor-bearing systems are presented and analyzed for steady-state response, stability, and transient behavior. These simplified rotor models produce extensive design information necessary to ensure stable performance of elastically mounted rotor-bearing systems under varying levels and forms of excitation. The nonlinear journal bearing force expressions derived from the short bearing approximation are utilized in the study of the stability and transient response of the floating bush squeeze damper support system. Both rigid and flexible rotor models are studied, and results indicate that the stability of flexible rotors supported by journal bearings can be greatly improved by the use of squeeze damper supports. Results from linearized stability studies of flexible rotors indicate that a tuned support system can greatly improve the performance of the units from the standpoint of unbalance response and impact loading. Extensive stability and design charts may be readily produced for given rotor specifications by the computer codes presented in this analysis.

  3. Selection of Thermal Worst-Case Orbits via Modified Efficient Global Optimization

    NASA Technical Reports Server (NTRS)

    Moeller, Timothy M.; Wilhite, Alan W.; Liles, Kaitlin A.

    2014-01-01

    Efficient Global Optimization (EGO) was used to select orbits with worst-case hot and cold thermal environments for the Stratospheric Aerosol and Gas Experiment (SAGE) III. The SAGE III system thermal model has changed substantially since the previous selection of worst-case orbits (which did not use the EGO method), so the selections were revised to ensure the worst cases are being captured. The EGO method consists of first conducting an initial set of parametric runs, generated with a space-filling Design of Experiments (DoE) method, then fitting a surrogate model to the data and searching for points of maximum Expected Improvement (EI) to conduct additional runs. The general EGO method was modified by using a multi-start optimizer to identify multiple new test points at each iteration. This modification facilitates parallel computing and decreases the burden of user interaction when the optimizer code is not integrated with the model. Thermal worst-case orbits for SAGE III were successfully identified and shown by direct comparison to be more severe than those identified in the previous selection. The EGO method is a useful tool for this application and can result in computational savings if the initial DoE is selected appropriately.
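
    The expected-improvement criterion at the heart of EGO can be written down in a few lines from a surrogate's predictive mean and standard deviation. The sketch below evaluates EI for a maximization problem at three candidate orbits; the means, standard deviations, and incumbent value are placeholders, not SAGE III results.

    ```python
    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, best_so_far):
        """EI for maximizing an objective, given the surrogate's predictive
        mean mu and standard deviation sigma at each candidate point."""
        sigma = np.maximum(sigma, 1e-12)
        z = (mu - best_so_far) / sigma
        return (mu - best_so_far) * norm.cdf(z) + sigma * norm.pdf(z)

    mu = np.array([71.0, 74.5, 69.8])      # surrogate mean at candidate orbits (placeholder)
    sigma = np.array([0.5, 2.0, 3.5])      # surrogate std. dev. (placeholder)
    print(expected_improvement(mu, sigma, best_so_far=73.0))
    ```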

  4. Pareto frontier analyses based decision making tool for transportation of hazardous waste.

    PubMed

    Das, Arup; Mazumder, T N; Gupta, A K

    2012-08-15

    Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to the population exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional Hazardous Waste Management scheme should incorporate a comprehensive framework for hazardous waste transportation that involves the various stakeholders in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing hazardous wastes between the generating units and the disposal facilities through a capacity-constrained network. The proposed methodology uses an a posteriori method with a multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of the transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to elucidate the methodology. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. Study and Application of a Multi Magnetoresistor Sensor System to Detect Corrosion in Suspension Cables

    NASA Astrophysics Data System (ADS)

    Torres, V.; Quek, S.; Gaydecki, P.

    2010-02-01

    Aging and deterioration of the main functional parts of civil structures is one of the biggest problems that private and governmental institutions dedicated to operating and maintaining such structures face nowadays. In the case of relatively old suspension bridges, problems emerge due to corrosion and breakage of wires in the main cables. Decisive information and reliable monitoring and evaluation are of great relevance for preventing significant or catastrophic damage to the structure and, more importantly, to people. The main challenge for NDE inspection methods arises in dealing with the steel wrapping barrier of the suspension cable, whose main function is to shield, shape and hold the bundles. The following work presents a study of a multi-magnetoresistive sensor system aiming to support the monitoring and evaluation of suspension cables at some of their stages. Modelling, signal acquisition, signal processing, experiments and the initial phases of implementation are presented and discussed in detail.

  6. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  7. [Clinical observation of apatinib mesylate for the treatment of multi-drug resistant advanced breast cancer].

    PubMed

    Lü, H M; Zhang, M W; Niu, L M; Zeng, H A; Yan, M

    2018-04-24

    Objective: To assess the clinical efficacy and adverse outcomes of apatinib mesylate for the treatment of multi-drug-resistant advanced breast cancer. Methods: A total of 24 patients with multi-drug-resistant advanced breast cancer who underwent apatinib mesylate treatment at the Diagnosis and Treatment Center for Breast Cancer of Henan Cancer Hospital were retrospectively analyzed. Patients were reviewed every 4 weeks after initial treatment and then every 8 weeks after stable disease. The objective response rate (ORR), progression-free survival (PFS), overall survival (OS), toxicity and adverse outcomes of apatinib mesylate treatment were evaluated by imaging examinations. Results: In total, 24 patients received apatinib mesylate at a dose of 500 mg QD. Of the 24 patients treated, complete remission (CR) occurred in none, partial remission (PR) in 10 cases, stable disease (SD) in 10 cases, and progressive disease (PD) in 4 cases, with drug withdrawal in 2 cases due to adverse outcomes. Treatment with apatinib mesylate resulted in an ORR of 41.7% (10/24), a disease control rate (DCR) of 83.3%, a PFS of 4.7 months, and an OS of 8.0 months. Adverse outcomes included proteinuria, high blood pressure, fatigue, hand-foot skin reaction (HFSR), hyperbilirubinemia, leukopenia, and decreased hair/skin pigmentation. Most of the adverse events were tolerable and could be controlled with symptomatic management. Conclusions: Single-agent apatinib mesylate demonstrated good short-term efficacy for multi-drug-resistant advanced breast cancer in patients with multiple previous treatment failures. Adverse effects were controllable with symptomatic management. Treatment with apatinib mesylate may be a viable option when other treatment modalities have failed.

  8. Guideline adaptation and implementation planning: a prospective observational study

    PubMed Central

    2013-01-01

    Background Adaptation of high-quality practice guidelines for local use has been advanced as an efficient means to improve acceptability and applicability of evidence-informed care. In a pan-Canadian study, we examined how cancer care groups adapted pre-existing guidelines to their unique context and began implementation planning. Methods Using a mixed-methods, case-study design, five cases were purposefully sampled from self-identified groups and followed as they used a structured method and resources for guideline adaptation. Cases received the ADAPTE Collaboration toolkit, facilitation, methodological and logistical support, resources and assistance as required. Documentary and primary data collection methods captured individual case experience, including monthly summaries of meeting and field notes, email/telephone correspondence, and project records. Site visits, process audits, interviews, and a final evaluation forum with all cases contributed to a comprehensive account of participant experience. Results Study cases took 12 to >24 months to complete guideline adaptation. Although participants appreciated the structure, most found the ADAPTE method complex and lacking practical aspects. They needed assistance establishing individual guideline mandate and infrastructure, articulating health questions, executing search strategies, appraising evidence, and achieving consensus. Facilitation was described as a multi-faceted process, a team effort, and an essential ingredient for guideline adaptation. While front-line care providers implicitly identified implementation issues during adaptation, they identified a need to add an explicit implementation planning component. Conclusions Guideline adaptation is a positive initial step toward evidence-informed care, but adaptation (vs. ‘de novo’ development) did not meet expectations for reducing time or resource commitments. Undertaking adaptation is as much about the process (engagement and capacity building) as it is about the product (adapted guideline). To adequately address local concerns, cases found it necessary to also search and appraise primary studies, resulting in hybrid (adaptation plus de novo) guideline development strategies that required advanced methodological skills. Adaptation was found to be an action element in the knowledge translation continuum that required integration of an implementation perspective. Accordingly, the adaptation methodology and resources were reformulated and substantially augmented to provide practical assistance to groups not supported by a dedicated guideline panel and to provide more implementation planning support. The resulting framework is called CAN-IMPLEMENT. PMID:23656884

  9. A Case Study of Collaboration with Multi-Robots and Its Effect on Children's Interaction

    ERIC Educational Resources Information Center

    Hwang, Wu-Yuin; Wu, Sheng-Yi

    2014-01-01

    Learning how to carry out collaborative tasks is critical to the development of a student's capacity for social interaction. In this study, a multi-robot system was designed for students. In three different scenarios, students controlled robots in order to move dice; we then examined their collaborative strategies and their behavioral…

  10. Extending multi-tenant architectures: a database model for a multi-target support in SaaS applications

    NASA Astrophysics Data System (ADS)

    Rico, Antonio; Noguera, Manuel; Garrido, José Luis; Benghazi, Kawtar; Barjis, Joseph

    2016-05-01

    Multi-tenant architectures (MTAs) are considered a cornerstone in the success of Software as a Service as a new application distribution formula. Multi-tenancy allows multiple customers (i.e. tenants) to be consolidated into the same operational system. This way, tenants run and share the same application instance as well as costs, which are significantly reduced. Functional needs vary from one tenant to another: either companies from different sectors run different types of applications or, although deploying the same functionality, they differ in the extent of its complexity. In any case, MTAs leave one major concern regarding the companies' data, their privacy and security, which requires special attention to the data layer. In this article, we propose an extended data model that enhances traditional MTAs with respect to this concern. This extension - called multi-target - allows MT applications to host, manage and serve multiple functionalities within the same multi-tenant (MT) environment. The practical deployment of this approach will allow SaaS vendors to target multiple markets or address different levels of functional complexity and yet commercialise just one single MT application. The applicability of the approach is demonstrated via a case study of a real multi-tenancy multi-target (MT2) implementation, called Globalgest.
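
    A common way to realise such a data layer is to discriminate rows by both a tenant identifier and a target (functionality) identifier, so that one shared schema serves many customers and many functionalities at once. The SQLite sketch below is purely illustrative; the table and column names are hypothetical and are not taken from the Globalgest implementation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tenant (tenant_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE target (target_id INTEGER PRIMARY KEY, name TEXT NOT NULL);  -- a functionality hosted by the MT2 application
CREATE TABLE app_record (
    record_id  INTEGER PRIMARY KEY,
    tenant_id  INTEGER NOT NULL REFERENCES tenant(tenant_id),
    target_id  INTEGER NOT NULL REFERENCES target(target_id),
    payload    TEXT
);
CREATE INDEX idx_record_tenant_target ON app_record(tenant_id, target_id);
""")

def records_for(conn, tenant_id, target_id):
    """Every query is scoped by tenant AND target, so data from different
    customers and different functionalities never mixes."""
    return conn.execute(
        "SELECT record_id, payload FROM app_record WHERE tenant_id=? AND target_id=?",
        (tenant_id, target_id)).fetchall()

conn.execute("INSERT INTO tenant VALUES (1, 'Acme')")
conn.execute("INSERT INTO target VALUES (1, 'invoicing')")
conn.execute("INSERT INTO app_record VALUES (1, 1, 1, '{\"total\": 42}')")
print(records_for(conn, tenant_id=1, target_id=1))
```

    Row-level scoping is only one of several options; separate schemas or databases per tenant trade stronger isolation against higher operational cost, which is precisely the kind of data-layer concern the article addresses.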

  11. Uncovering One Trilingual Child's Multi-Literacies Development across Informal and Formal Learning Contexts

    ERIC Educational Resources Information Center

    Kim, Mi Song

    2016-01-01

    Due to globalisation and rapid technological change, today's educators need to help students develop multi-literacy competencies to enable them to function successfully in our culturally and linguistically diverse (CLD) and increasingly connected global and digital society. A qualitative, longitudinal case study attempted to uncover the…

  12. Hybrid E-Learning Tool TransLearning: Video Storytelling to Foster Vicarious Learning within Multi-Stakeholder Collaboration Networks

    ERIC Educational Resources Information Center

    van der Meij, Marjoleine G.; Kupper, Frank; Beers, Pieter J.; Broerse, Jacqueline E. W.

    2016-01-01

    E-learning and storytelling approaches can support informal vicarious learning within geographically widely distributed multi-stakeholder collaboration networks. This case study evaluates hybrid e-learning and video-storytelling approach "TransLearning" by investigation into how its storytelling e-tool supported informal vicarious…

  13. Minneapolis Multi-Ethnic Curriculum Project--Acculturation Unit.

    ERIC Educational Resources Information Center

    Skjervold, Christian K.; And Others

    The student booklet presents short case studies illustrating the acculturation unit of the Minneapolis Multi-Ethnic Curriculum Project for secondary schools. It is presented in nine chapters. Chapter I provides background information on immigration and points out ways acculturation takes place. Chapter II, "Barrio Boy," tells of life in…

  14. Multi-Tiered Systems of Support Preservice Residency: A Pilot Undergraduate Teacher Preparation Model

    ERIC Educational Resources Information Center

    Ross, Scott Warren; Lignugaris-Kraft, Ben

    2015-01-01

    This case study examined the implementation of a novel nontraditional teacher preparation program, "Multi-Tiered Systems of Support Preservice Residency Project" (MTSS-PR). The two-year program placed general and special education composite undergraduate majors full time in high-need schools implementing evidence-based systems of…

  15. Intuitionistic fuzzy analytical hierarchical processes for selecting the paradigms of mangroves in municipal wastewater treatment.

    PubMed

    Ouyang, Xiaoguang; Guo, Fen

    2018-04-01

    Municipal wastewater discharge is widespread and one of the sources of coastal eutrophication, and is especially uncontrolled in developing and undeveloped coastal regions. Mangrove forests are natural filters of pollutants in wastewater. There are three paradigms of mangroves for municipal wastewater treatment, and the selection of the optimal one is a multi-criteria decision-making problem. Combining intuitionistic fuzzy theory, the Fuzzy Delphi Method and the fuzzy analytical hierarchical process (AHP), this study develops an intuitionistic fuzzy AHP (IFAHP) method. For the Fuzzy Delphi Method, the judgments of experts and representatives on criterion weights are made with linguistic variables and quantified by intuitionistic fuzzy theory, which is also used to weight the importance of the experts and representatives. This process generates the entropy weights of the criteria, which are combined with index values and weights to rank the alternatives by the fuzzy AHP method. The IFAHP method was used to select the optimal paradigm of mangroves for treating municipal wastewater. The entropy weights were derived from the valid evaluations of 64 experts and representatives collected via an online survey. Natural mangroves were found to be the optimal paradigm for municipal wastewater treatment. By assigning different weights to the criteria, sensitivity analysis shows that natural mangroves remain the optimal paradigm under most scenarios. This study stresses the importance of mangroves for wastewater treatment. Decision-makers need to contemplate mangrove reforestation projects, especially where mangroves are highly deforested but wastewater discharge is uncontrolled. The IFAHP method is expected to be applied in other multi-criteria decision-making cases. Copyright © 2017 Elsevier Ltd. All rights reserved.
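
    The entropy-weighting step can be illustrated as follows. The sketch assumes expert ratings already quantified as intuitionistic fuzzy (membership, non-membership) pairs, uses one common intuitionistic fuzzy entropy measure, and averages across experts; it illustrates the general entropy-weight construction rather than reproducing the authors' exact formulas, and all numbers are invented.

```python
import numpy as np

def if_entropy(mu, nu):
    """One common entropy measure for an intuitionistic fuzzy value (mu, nu):
    E = (min(mu, nu) + pi) / (max(mu, nu) + pi), with hesitancy pi = 1 - mu - nu."""
    pi = 1.0 - mu - nu
    return (np.minimum(mu, nu) + pi) / (np.maximum(mu, nu) + pi)

def entropy_weights(ratings):
    """ratings[k, j] = (mu, nu) given by expert k for criterion j (illustrative layout).
    Returns one weight per criterion: less fuzzy (lower entropy) criteria weigh more."""
    mu = ratings[..., 0].mean(axis=0)      # simple average aggregation across experts
    nu = ratings[..., 1].mean(axis=0)
    e = if_entropy(mu, nu)
    return (1.0 - e) / np.sum(1.0 - e)

# Three criteria rated by four hypothetical experts as (membership, non-membership) pairs:
ratings = np.array([
    [(0.8, 0.1), (0.6, 0.3), (0.5, 0.4)],
    [(0.7, 0.2), (0.6, 0.2), (0.4, 0.5)],
    [(0.9, 0.0), (0.5, 0.4), (0.5, 0.3)],
    [(0.8, 0.1), (0.7, 0.2), (0.4, 0.4)],
])
print(entropy_weights(ratings))   # criterion weights, summing to 1
```

    The resulting weights then enter the fuzzy AHP ranking of the candidate mangrove paradigms, as described in the abstract.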

  16. Abstracting of suspected illegal land use in urban areas using case-based classification of remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Fulong; Wang, Chao; Yang, Chengyun; Zhang, Hong; Wu, Fan; Lin, Wenjuan; Zhang, Bo

    2008-11-01

    This paper proposes a method that uses case-based classification of remote sensing images and applies it to abstract information on suspected illegal land use in urban areas. Because the cases used for imagery classification are discrete, the proposed method handles the oscillation of spectrum or backscatter within the same land use category; it not only overcomes the deficiency of maximum likelihood classification (that the prior probability of land use cannot be obtained) but also inherits the advantages of knowledge-based classification systems, such as artificial intelligence and automatic operation. Consequently, the proposed method classifies more accurately. An object-oriented technique was then used for shadow removal in highly dense city zones. With multi-temporal SPOT 5 images at a resolution of 2.5×2.5 meters, the researchers found that the method can abstract suspected illegal land use information in urban areas using a post-classification comparison technique.

  17. Applying Asynchronous Solutions to the Multi-Tasking Realities of a Teacher Education Faculty Unit: Case Study

    ERIC Educational Resources Information Center

    Moffett, David W.; Claxton, Melba S.; Jordan, Skye L.; Mercer, Patricia P.; Reid, Barbara K.

    2007-01-01

    The case study describes the early stages of building and using a learning management system (LMS) to aid in the productivity of an education faculty unit. Little to no research exists regarding teacher education units using LMSs to create an online web group for work purposes. The literature review preceding the case study illuminated some of the…

  18. DEEP SPACE: High Resolution VR Platform for Multi-user Interactive Narratives

    NASA Astrophysics Data System (ADS)

    Kuka, Daniela; Elias, Oliver; Martins, Ronald; Lindinger, Christopher; Pramböck, Andreas; Jalsovec, Andreas; Maresch, Pascal; Hörtner, Horst; Brandl, Peter

    DEEP SPACE is a large-scale platform for interactive, stereoscopic and high-resolution content. The spatial and system design of DEEP SPACE addresses the constraints of CAVE-like systems with respect to multi-user interactive storytelling. To serve as both a research platform and a public exhibition space for many people, DEEP SPACE is capable of processing interactive, stereoscopic applications on two projection walls with a size of 16 by 9 meters and a resolution of four times 1080p (4K) each. The processed applications range from Virtual Reality (VR) environments to 3D movies to computationally intensive 2D productions. In this paper, we describe DEEP SPACE as an experimental VR platform for multi-user interactive storytelling. We focus on the system design relevant to the platform, including the integration of the Apple iPod Touch technology as a VR control, and a special case study that demonstrates the research efforts in the field of multi-user interactive storytelling. The described case study, entitled "Papyrate's Island", provides a prototypical scenario of how physical drawings may impact digital narratives. In this special case, DEEP SPACE helps us explore the hypothesis that drawing, a primordial human creative skill, gives us access to entirely new creative possibilities in the domain of interactive storytelling.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Hongyi; Li, Yang; Zeng, Danielle

    Process integration and optimization is the key enabler of the Integrated Computational Materials Engineering (ICME) of carbon fiber composites. In this paper, automated workflows are developed for two types of composites: Sheet Molding Compound (SMC) short fiber composites, and multi-layer unidirectional (UD) composites. For SMC, the proposed workflow integrates material processing simulation, microstructure representative volume element (RVE) models, material property prediction and structure preformation simulation to enable multiscale, multidisciplinary analysis and design. Processing parameters, microstructure parameters and vehicle subframe geometry parameters are defined as the design variables; the stiffness and weight of the structure are defined as the responses. For the multi-layer UD structure, this work focuses on the discussion of different design representation methods and their impacts on the optimization performance. Challenges in ICME process integration and optimization are also summarized and highlighted. Two case studies are conducted to demonstrate the integrated process and its application in optimization.

  20. [Emergency Doctor Training for Psychiatric Emergencies: Evaluation of an Interactive Training Program].

    PubMed

    Flüchter, Peter; Müller, Vincent; Bischof, Felix; Pajonk, Frank-Gerald Bernhard

    2017-03-01

    Aim Emergency physicians are often confronted with psychiatric emergencies, but are not well trained for them and often feel unable to cope with them sufficiently. The aim of this investigation was to examine whether multisensory training may improve learning effects in the training of emergency physicians with regard to psychiatric emergencies. Method Participation in a multi-modal, multi-media training program with video case histories and subsequent evaluation by questionnaire. Results 66 emergency physicians assessed their learning effects; 75 % or 73 % rated them as "rather high" or "very high". In particular, in comparison with classical training/self-study, 89 % assessed the learning effects as "rather high" or "very high". Conclusion This training receives a high level of acceptance. Using videos, learning content can be made more practice-related. Thus, emergency physicians are able to develop a greater understanding of psychiatric emergencies. © Georg Thieme Verlag KG Stuttgart · New York.

  1. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  2. Generalized analytical solutions to sequentially coupled multi-species advective-dispersive transport equations in a finite domain subject to an arbitrary time-dependent source boundary condition

    NASA Astrophysics Data System (ADS)

    Chen, Jui-Sheng; Liu, Chen-Wuing; Liang, Ching-Ping; Lai, Keng-Hsin

    2012-08-01

    Multi-species advective-dispersive transport equations sequentially coupled with first-order decay reactions are widely used to describe the transport and fate of decay chain contaminants such as radionuclides, chlorinated solvents, and nitrogen. Although researchers have presented various methods for analytically solving this transport equation system, the currently available solutions are mostly limited to an infinite or a semi-infinite domain. A generalized analytical solution for the coupled multi-species transport problem in a finite domain associated with an arbitrary time-dependent source boundary is not available in the published literature. In this study, we first derive generalized analytical solutions for this transport problem in a finite domain involving an arbitrary number of species subject to an arbitrary time-dependent source boundary. Subsequently, we adopt these derived generalized analytical solutions to obtain explicit analytical solutions for a special-case transport scenario involving an exponentially decaying Bateman-type time-dependent source boundary. We test the derived special-case solutions against the previously published coupled 4-species transport solution and the corresponding numerical solution with coupled 10-species transport to verify the solutions. Finally, we compare the new analytical solutions derived for a finite domain against the published analytical solutions derived for a semi-infinite domain to illustrate the effect of the exit boundary condition on coupled multi-species transport with an exponentially decaying source boundary. The results show noticeable discrepancies between the breakthrough curves of all the species in the immediate vicinity of the exit boundary obtained from the analytical solutions for a finite domain and a semi-infinite domain for the dispersion-dominated condition.
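
    For orientation, a common one-dimensional form of such a sequentially coupled system, with an exponentially decaying Bateman-type inlet and a finite-domain exit boundary, is sketched below; the notation is generic rather than the authors'.

```latex
% Generic sequentially coupled 1-D transport chain with first-order decay
\frac{\partial C_i}{\partial t}
  = D\,\frac{\partial^2 C_i}{\partial x^2}
  - v\,\frac{\partial C_i}{\partial x}
  - k_i C_i + k_{i-1} C_{i-1},
  \qquad i = 1,\dots,N, \quad k_0 \equiv 0, \quad 0 \le x \le L,
\\[4pt]
% Exponentially decaying (Bateman-type) inlet source and a zero-gradient exit boundary
C_1(0,t) = C_0\,e^{-\lambda t}, \qquad C_i(0,t) = 0 \ \ (i > 1), \qquad
\left.\frac{\partial C_i}{\partial x}\right|_{x=L} = 0 .
```

    The finite-domain character enters through the exit condition at x = L, which is exactly where the abstract reports the largest discrepancies from the semi-infinite-domain solutions.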

  3. Generalized entanglement constraints in multi-qubit systems in terms of Tsallis entropy

    NASA Astrophysics Data System (ADS)

    Kim, Jeong San

    2016-10-01

    We provide generalized entanglement constraints in multi-qubit systems in terms of Tsallis entropy. Using quantum Tsallis entropy of order q, we first provide a generalized monogamy inequality of multi-qubit entanglement for q = 2 or 3. This generalization encapsulates the multi-qubit CKW-type inequality as a special case. We further provide a generalized polygamy inequality of multi-qubit entanglement in terms of Tsallis-q entropy for 1 ≤ q ≤ 2 or 3 ≤ q ≤ 4, which also contains the multi-qubit polygamy inequality as a special case.
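
    For reference, the Tsallis-q entropy of a state ρ and the schematic form of the monogamy and polygamy constraints discussed here are shown below; the precise Tsallis-q entanglement quantities and the admissible ranges of q are as stated in the abstract, so the inequalities are indicative only.

```latex
% Tsallis-q entropy of a quantum state \rho (reduces to the von Neumann entropy as q -> 1)
T_q(\rho) = \frac{1 - \operatorname{Tr}\rho^{\,q}}{q - 1}, \qquad q > 0,\; q \neq 1,
\\[4pt]
% Schematic CKW-type monogamy constraint and its reversed (polygamy) counterpart for n qubits
\mathcal{E}_q\!\left(\rho_{A|B_1\cdots B_{n-1}}\right) \;\ge\; \sum_{i=1}^{n-1} \mathcal{E}_q\!\left(\rho_{A|B_i}\right),
\qquad
\mathcal{E}_q^{a}\!\left(\rho_{A|B_1\cdots B_{n-1}}\right) \;\le\; \sum_{i=1}^{n-1} \mathcal{E}_q^{a}\!\left(\rho_{A|B_i}\right),
```

    where \mathcal{E}_q denotes a Tsallis-q-based entanglement measure and \mathcal{E}_q^{a} the corresponding entanglement of assistance.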

  4. Climate change impacts: The challenge of quantifying multi-factor causation, multi-component responses, and leveraging from extremes

    NASA Astrophysics Data System (ADS)

    Field, C. B.

    2012-12-01

    Modeling climate change impacts is challenging for a variety of reasons. Some of these are related to causation. A weather or climate event is rarely the sole cause of an impact, and, for many impacts, social, economic, cultural, or ecological factors may play a larger role than climate. Other challenges are related to outcomes. Consequences of an event are often most severe when several kinds of responses interact, typically in unexpected ways. Many kinds of consequences are difficult to quantify, especially when they include a mix of market, cultural, personal, and ecological values. In addition, scale can be tremendously important. Modest impacts over large areas present very different challenges than severe but very local impacts. Finally, impacts may respond non-linearly to forcing, with behavior that changes qualitatively at one or more thresholds and with unexpected outcomes in extremes. Modeling these potentially complex interactions between drivers and impacts presents one set of challenges. Evaluating the models presents another. At least five kinds of approaches can contribute to the evaluation of impact models designed to provide insights in multi-driver, multi-responder, multi-scale, and extreme-driven contexts, even though none of these approaches is a complete or "silver-bullet" solution. The starting point for much of the evaluation in this space is case studies. Case studies can help illustrate links between processes and scales. They can highlight factors that amplify or suppress sensitivity to climate drivers, and they can suggest the consequences of intervening at different points. While case studies rarely provide concrete evidence about mechanisms, they can help move a mechanistic case from circumstantial to sound. Novel approaches to data collection, including crowd sourcing, can potentially provide tools and the number of relevant examples to develop case studies as statistically robust data sources. A critical condition for progress in this area is the ability to utilize data of uneven quality and standards. Novel approaches to meta-analysis provide other options for taking advantage of diverse case studies. Techniques for summarizing responses across impacts, drivers, and scales can play a huge role in increasing the value of information from case studies. In some cases, expert elicitation may provide alternatives for identifying mechanisms or for interpreting multi-factor drivers or responses. Especially when designed to focus on a well-defined set of observations, a sophisticated elicitation can establish formal confidence limits on responses that are otherwise difficult to constrain. A final possible approach involves a focus on the mechanisms contributing to an impact, rather than the impact itself. Approaches based on quantified mechanisms are especially appealing in the context of models where the number of interactions makes it difficult to intuitively understand the chain of connections from cause to effect, when actors differ in goals or sensitivities, or when scale affects parts of the system differently. With all of these approaches, useful evidence may not conform to traditional levels of statistical confidence. Some of the biggest challenges in taking advantage of the potential tools will involve defining what constitutes a meaningful evaluation.

  5. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    NASA Astrophysics Data System (ADS)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geo portal designed to provide online geoinformation services for government, enterprises and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data therefore come from these nodes, and the different datasets are heterogeneous. Based on the results of an analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: (1) location precision; (2) geometric representation; (3) up-to-date state; (4) attribute values; and (5) spatial relationships. The technical procedure is then researched, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrated the applicability of the principles, procedure and method of multi-source geospatial data integration.

  6. Mentorship in Practice: A Multi-Method Approach.

    ERIC Educational Resources Information Center

    Schreck, Timothy J.; And Others

    This study was conducted to evaluate a field-based mentorship program using a multi-method approach. It explored the use of mentorship as practiced in the Florida Compact, a business education partnership established in Florida in 1987. The study was designed to identify differences between mentors and mentorees, as well as differences within…

  7. A Multi-Method Approach to Studying the Relationship between Character Strengths and Vocational Interests in Adolescents

    ERIC Educational Resources Information Center

    Proyer, Rene T.; Sidler, Nicole; Weber, Marco; Ruch, Willibald

    2012-01-01

    The relationship between character strengths and vocational interests was tested. In an online study, 197 thirteen- to eighteen-year-olds completed a questionnaire measuring character strengths and a multi-method measure of interests (questionnaire, nonverbal test, and objective personality tests). The main findings were that intellectual…

  8. Accidental degeneracies in nonlinear quantum deformed systems

    NASA Astrophysics Data System (ADS)

    Aleixo, A. N. F.; Balantekin, A. B.

    2011-09-01

    We construct a multi-parameter nonlinear deformed algebra for quantum confined systems that includes many other deformed models as particular cases. We demonstrate that such systems exhibit the property of accidental pairwise energy level degeneracies. We also study, as a special case of our multi-parameter deformation formalism, the extension of the Tamm-Dancoff cutoff deformed oscillator and the occurrence of accidental pairwise degeneracy in the energy levels of the deformed system. As an application, we discuss the case of a trigonometric Rosen-Morse potential, which is successfully used in models for quantum confined systems, ranging from electrons in quantum dots to quarks in hadrons.

  9. A New Computational Technique for the Generation of Optimised Aircraft Trajectories

    NASA Astrophysics Data System (ADS)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto

    2017-12-01

    A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after the PSD method is described, the adaptive bisection ɛ-constraint method is presented to allow the efficient solution of problems in which two or more performance indices are to be minimized simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in-depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
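
    The ɛ-constraint idea at the core of the method, minimising one performance index while constraining the other and then sweeping (or bisecting) the constraint level to trace the Pareto front, can be sketched on a toy bi-objective problem. The sketch below uses SciPy with a plain sweep of ɛ instead of the paper's adaptive bisection and pseudospectral transcription; the objective functions and tolerances are made up for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Toy conflicting objectives: their minimisers differ, so a Pareto front exists.
f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2

def eps_constraint_front(n_points=9):
    """Trace the Pareto front by repeatedly solving: minimise f1 subject to f2 <= eps."""
    x_best_f2 = minimize(f2, x0=[0.0, 0.0]).x          # anchor: best attainable f2
    x_best_f1 = minimize(f1, x0=[0.0, 0.0]).x          # anchor: unconstrained f1 optimum
    eps_values = np.linspace(f2(x_best_f2), f2(x_best_f1), n_points)
    front = []
    for eps in eps_values:
        cons = [{"type": "ineq", "fun": lambda x, e=eps: e - f2(x)}]   # enforce f2(x) <= eps
        res = minimize(f1, x0=x_best_f2, constraints=cons, method="SLSQP")
        front.append((f1(res.x), f2(res.x)))
    return front

for v1, v2 in eps_constraint_front():
    print(f"f1 = {v1:.3f}   f2 = {v2:.3f}")
```

    In the trajectory optimisation setting, f1 and f2 would be the flight performance indices evaluated on the pseudospectrally discretised trajectory, and the bisection on ɛ concentrates the solves where the Pareto front is most informative.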

  10. Incorporating ecosystem function concept in environmental planning and decision making by means of multi-criteria evaluation: the case-study of Kalloni, Lesbos, Greece.

    PubMed

    Oikonomou, Vera; Dimitrakopoulos, Panayiotis G; Troumbis, Andreas Y

    2011-01-01

    Nature provides life-support services which do not merely constitute the basis for ecosystem integrity but also benefit human societies. The importance of such multiple outputs is often ignored or underestimated in environmental planning and decision making. The economic valuation of ecosystem functions or services has been widely used to make these benefits economically visible and thus address this deficiency. Alternatively, the relative importance of the components of ecosystem value can be identified and compared by means of multi-criteria evaluation. Hereupon, this article proposes a conceptual framework that couples ecosystem function analysis, multi criteria evaluation and social research methodologies for introducing an ecosystem function-based planning and management approach. The framework consists of five steps providing the structure of a participative decision making process which is then tested and ratified, by applying the discrete multi-criteria method NAIADE, in the Kalloni Natura 2000 site, on Lesbos, Greece. Three scenarios were developed and evaluated with regard to their impacts on the different types of ecosystem functions and the social actors' value judgements. A conflict analysis permitted the better elaboration of the different views, outlining the coalitions formed in the local community and shaping the way towards reaching a consensus.

  11. Incorporating Ecosystem Function Concept in Environmental Planning and Decision Making by Means of Multi-Criteria Evaluation: The Case-Study of Kalloni, Lesbos, Greece

    NASA Astrophysics Data System (ADS)

    Oikonomou, Vera; Dimitrakopoulos, Panayiotis G.; Troumbis, Andreas Y.

    2011-01-01

    Nature provides life-support services which do not merely constitute the basis for ecosystem integrity but also benefit human societies. The importance of such multiple outputs is often ignored or underestimated in environmental planning and decision making. The economic valuation of ecosystem functions or services has been widely used to make these benefits economically visible and thus address this deficiency. Alternatively, the relative importance of the components of ecosystem value can be identified and compared by means of multi-criteria evaluation. Hereupon, this article proposes a conceptual framework that couples ecosystem function analysis, multi criteria evaluation and social research methodologies for introducing an ecosystem function-based planning and management approach. The framework consists of five steps providing the structure of a participative decision making process which is then tested and ratified, by applying the discrete multi-criteria method NAIADE, in the Kalloni Natura 2000 site, on Lesbos, Greece. Three scenarios were developed and evaluated with regard to their impacts on the different types of ecosystem functions and the social actors' value judgements. A conflict analysis permitted the better elaboration of the different views, outlining the coalitions formed in the local community and shaping the way towards reaching a consensus.

  12. Subjective scaling of mental workload in a multi-task environment

    NASA Technical Reports Server (NTRS)

    Daryanian, B.

    1982-01-01

    Those factors in a multi-task environment that contribute to the operators' "sense" of mental workload were identified. Subjective judgment, as the conscious experience of mental effort, was chosen as the appropriate method of measurement. Thurstone's law of comparative judgment was employed in order to construct interval scales of subjective mental workload from paired-comparison data. An experimental paradigm (Simulated Multi-Task Decision-Making Environment) was employed to represent the ideal experimentally controlled environment in which human operators were asked to "attend" to different cases of Tulga's decision-making tasks. Through various statistical analyses it was found that, in general, a lower number of tasks to be processed per unit time (a condition associated with longer interarrival times) results in a lower mental workload, a higher consistency of judgments within a subject, a higher degree of agreement among the subjects, and larger distances between the cases on the Thurstone scale of subjective mental workload. The effects of various control variables and their interactions, and of the different characteristics of the subjects, on the variation of subjective mental workload are demonstrated.
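
    Thurstone's Case V solution for paired-comparison data reduces to a simple computation: convert the observed preference proportions into unit-normal deviates and average them. The sketch below is a generic illustration with a made-up proportion matrix, not the study's original analysis.

```python
import numpy as np
from scipy.stats import norm

# p[i, j] = proportion of comparisons in which condition j was judged to impose
# a higher mental workload than condition i (hypothetical data, 4 task conditions).
p = np.array([
    [0.50, 0.65, 0.80, 0.90],
    [0.35, 0.50, 0.70, 0.85],
    [0.20, 0.30, 0.50, 0.75],
    [0.10, 0.15, 0.25, 0.50],
])

# Case V of the law of comparative judgment: z[i, j] = Phi^{-1}(p[i, j]);
# the scale value of condition j is the column mean of z. The result is an
# interval scale, so only differences between values are meaningful.
z = norm.ppf(p)
scale = z.mean(axis=0)
print(scale - scale.min())   # shift so the least demanding condition sits at zero
```

    Larger gaps between the resulting scale values correspond to the "larger distances between the cases" that the abstract reports for longer interarrival times.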

  13. Consensus for second-order multi-agent systems with position sampled data

    NASA Astrophysics Data System (ADS)

    Wang, Rusheng; Gao, Lixin; Chen, Wenhai; Dai, Dameng

    2016-10-01

    In this paper, the consensus problem with position sampled data for second-order multi-agent systems is investigated. The interaction topology among the agents is depicted by a directed graph. Full-order and reduced-order observers with position sampled data are proposed, by which two kinds of sampled-data-based consensus protocols are constructed. With the provided sampled protocols, the consensus convergence analysis of a continuous-time multi-agent system is equivalently transformed into that of a discrete-time system. Then, by using matrix theory and a sampled control analysis method, some sufficient and necessary consensus conditions based on the coupling parameters, the spectrum of the Laplacian matrix and the sampling period are obtained. As the sampling period tends to zero, the established necessary and sufficient conditions degenerate to those of the continuous-time protocol case, consistent with the existing results for the continuous-time case. Finally, the effectiveness of the established results is illustrated by a simple simulation example. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LY13F030005) and the National Natural Science Foundation of China (Grant No. 61501331).
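
    The transformation from a continuous-time to a discrete-time analysis can be seen, for instance, by holding each agent's input constant over a sampling interval of length h (zero-order hold). For double-integrator agents this yields the exact discrete-time model below; it is a generic illustration of the mechanism, not the specific protocol or observer of the paper.

```latex
% Exact zero-order-hold discretisation of a double-integrator agent over [t_k, t_{k+1}], h = t_{k+1} - t_k
x_i(t_{k+1}) = x_i(t_k) + h\,v_i(t_k) + \tfrac{h^2}{2}\,u_i(t_k), \qquad
v_i(t_{k+1}) = v_i(t_k) + h\,u_i(t_k),
```

    with u_i(t_k) built from the sampled relative positions x_i(t_k) - x_j(t_k) of neighbouring agents (and observer estimates of the velocities), so that consensus of the continuous-time system can be analysed through the eigenvalues of the resulting discrete-time iteration.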

  14. Tomographic Imaging of a Forested Area By Airborne Multi-Baseline P-Band SAR.

    PubMed

    Frey, Othmar; Morsdorf, Felix; Meier, Erich

    2008-09-24

    In recent years, various attempts have been undertaken to obtain information about the structure of forested areas from multi-baseline synthetic aperture radar data. Tomographic processing of such data has been demonstrated for airborne L-band data, but the quality of the focused tomographic images is limited by several factors. In particular, the common Fourier-based focusing methods are susceptible to irregular and sparse sampling, two problems that are unavoidable in the case of multi-pass, multi-baseline SAR data acquired by an airborne system. In this paper, a tomographic focusing method based on the time-domain back-projection algorithm is proposed, which maintains the geometric relationship between the original sensor positions and the imaged target and is therefore able to cope with irregular sampling without introducing any approximations with respect to the geometry. The tomographic focusing quality is assessed by analysing the impulse responses of simulated point targets and of an in-scene corner reflector. In particular, several tomographic slices of a volume representing a forested area are given. The respective P-band tomographic data set, consisting of eleven flight tracks, was acquired by the airborne E-SAR sensor of the German Aerospace Center (DLR).
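
    The essence of time-domain back-projection, coherently summing for every voxel the range-compressed echoes of all tracks after compensating the track-to-voxel range phase, can be sketched generically. The function below is a didactic single-pass sketch with made-up argument names and a generic two-way phase term; it is not the processor used for the E-SAR data.

```python
import numpy as np

def backproject(echoes, ant_pos, r0, dr, voxels, wavelength):
    """Generic time-domain back-projection onto a set of voxels.

    echoes[m, n]  : complex range-compressed samples of track m (n = range bin)
    ant_pos[m, :] : antenna position (x, y, z) of track m at this azimuth position
    r0, dr        : range of the first bin and the range-bin spacing
    voxels[p, :]  : (x, y, z) coordinates of the voxels to focus
    """
    image = np.zeros(len(voxels), dtype=complex)
    for m in range(echoes.shape[0]):
        # Distance from this track's antenna position to every voxel (exact geometry).
        r = np.linalg.norm(voxels - ant_pos[m], axis=1)
        bins = np.clip((r - r0) / dr, 0.0, echoes.shape[1] - 1.001)
        idx = bins.astype(int)
        frac = bins - idx
        # Linear interpolation of the echo at the exact voxel range.
        sample = (1 - frac) * echoes[m, idx] + frac * echoes[m, idx + 1]
        # Compensate the two-way propagation phase before coherent summation.
        image += sample * np.exp(1j * 4 * np.pi * r / wavelength)
    return image
```

    Because each track contributes through its true antenna-to-voxel distance, irregular and sparse track spacing degrades the sidelobe level but introduces no geometric approximation, which is the property the abstract emphasises.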

  15. Investigation of orifice aeroacoustics by means of multi-port methods

    NASA Astrophysics Data System (ADS)

    Sack, Stefan; Åbom, Mats

    2017-10-01

    Comprehensive methods to cascade active multi-ports, e.g., for acoustic network prediction, have until now only been available for plane waves. This paper presents procedures to combine multi-ports with an arbitrary number of considered duct modes. A multi-port method is used to extract complex mode amplitudes from experimental data of single and tandem in-duct orifice plates for Helmholtz numbers up to around 4 and, hence, beyond the cut-on of several higher order modes. The theory of connecting single multi-ports to linear cascades is derived for the passive properties (the scattering of the system) and the active properties (the source cross-spectrum matrix of the system). One scope of this paper is to investigate the influence of the hydrodynamic near field on the accuracy of both the passive and the active predictions in multi-port cascades. The scattering and the source cross-spectrum matrix of tandem orifice configurations are measured for three cases, namely, with a distance between the plates of 10 duct diameters, for which the downstream orifice is outside the jet of the upstream orifice, 4 duct diameters, and 2 duct diameters (both inside the jet). The results are compared with predictions from single orifice measurements. It is shown that the scattering is only sensitive to disturbed inflow in certain frequency ranges where coupling between the flow and the sound field exists, whereas the source cross-spectrum matrix is very sensitive to disturbed inflow at all frequencies. An important part of the analysis is based on an eigenvalue analysis of the scattering matrix and the source cross-spectrum matrix to evaluate the potential for sound amplification and the dominant source mechanisms.
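
    For the plane-wave special case mentioned in the opening sentence, cascading two acoustic two-ports reduces to the Redheffer star product of their scattering matrices; with matrix blocks in place of scalars the same formulas cascade multi-ports mode by mode. The sketch below is a generic textbook construction (the reflection r and transmission t values are made up), covers only the passive scattering part, and is not the authors' processing chain.

```python
import numpy as np

def cascade(SA, SB):
    """Redheffer star product: scattering matrix of element A followed by element B.
    Each S is given as the blocks (S11, S12, S21, S22); blocks may be scalars
    (plane-wave case) or square matrices (one row/column per propagating duct mode)."""
    A11, A12, A21, A22 = map(np.atleast_2d, SA)
    B11, B12, B21, B22 = map(np.atleast_2d, SB)
    eye = np.eye(A22.shape[0])
    inv1 = np.linalg.inv(eye - A22 @ B11)   # multiple reflections trapped between A and B
    inv2 = np.linalg.inv(eye - B11 @ A22)
    S11 = A11 + A12 @ B11 @ inv1 @ A21
    S12 = A12 @ inv2 @ B12
    S21 = B21 @ inv1 @ A21
    S22 = B22 + B21 @ A22 @ inv2 @ B12
    return S11, S12, S21, S22

# Plane-wave example: two identical orifices, each with reflection r and transmission t.
r, t = 0.3 + 0.1j, 0.9 + 0.0j
single = (r, t, t, r)
print(cascade(single, single))
```

    The hydrodynamic near-field effects studied in the paper show up precisely where this purely acoustic cascading assumption breaks down, i.e. when the downstream element sits inside the jet of the upstream one.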

  16. Detection and identification of 700 drugs by multi-target screening with a 3200 Q TRAP LC-MS/MS system and library searching.

    PubMed

    Dresen, S; Ferreirós, N; Gnann, H; Zimmermann, R; Weinmann, W

    2010-04-01

    The multi-target screening method described in this work allows the simultaneous detection and identification of 700 drugs and metabolites in biological fluids using a hybrid triple-quadrupole linear ion trap mass spectrometer in a single analytical run. After standardization of the method, the retention times of 700 compounds were determined and transitions for each compound were selected by a "scheduled" survey MRM scan, followed by an information-dependent acquisition using the sensitive enhanced product ion scan of a Q TRAP hybrid instrument. The identification of the compounds in the samples analyzed was accomplished by searching the tandem mass spectrometry (MS/MS) spectra against the library we developed, which contains electrospray ionization-MS/MS spectra of over 1,250 compounds. The multi-target screening method together with the library was included in a software program for routine screening and quantitation to achieve automated acquisition and library searching. With the help of this software application, the time for evaluation and interpretation of the results could be drastically reduced. This new multi-target screening method has been successfully applied for the analysis of postmortem and traffic offense samples as well as proficiency testing, and complements screening with immunoassays, gas chromatography-mass spectrometry, and liquid chromatography-diode-array detection. Other possible applications are analysis in clinical toxicology (for intoxication cases), in psychiatry (antidepressants and other psychoactive drugs), and in forensic toxicology (drugs and driving, workplace drug testing, oral fluid analysis, drug-facilitated sexual assault).

  17. Parabens determination in cosmetic and personal care products exploiting a multi-syringe chromatographic (MSC) system and chemiluminescent detection.

    PubMed

    Rodas, Melisa; Portugal, Lindomar A; Avivar, Jessica; Estela, José Manuel; Cerdà, Víctor

    2015-10-01

    Parabens are widely used in daily-use products such as cosmetics and personal care products. Thus, in this work a multi-syringe chromatographic (MSC) system is proposed for the first time for the determination of four parabens: methylparaben (MP), ethylparaben (EP), propylparaben (PP) and butylparaben (BP) in cosmetics and personal care products, as a simpler, practical, and low-cost alternative to HPLC methods. Separation was achieved using a 5 mm-long precolumn of reversed phase C18 and multi-isocratic separation, i.e. using two consecutive mobile phases, 12:88 acetonitrile:water and 28:72 acetonitrile:water. The use of a multi-syringe buret allowed the easy implementation of chemiluminescent (CL) detection after separation. The chemiluminescent detection is based on the reduction of Ce(IV) by p-hydroxybenzoic acid, the product of the acid hydrolysis of parabens, to excite rhodamine 6G (Rho 6G) and measure the resulting light emission. Multivariate designs combined with the concepts of multiple response treatments and desirability functions have been employed to simultaneously optimize and evaluate the responses. The optimized method has proved to be sensitive and precise, with limits of detection between 20 and 40 µg L-1 and RSD <4.9% in all cases. The method was satisfactorily applied to cosmetics and personal care products, with no significant differences at a confidence level of 95% compared with the HPLC reference method. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Motor carrier case study evaluation report : appendix C, Vol. 2 : case study field notes, the Crescent Evaluation

    DOT National Transportation Integrated Search

    1994-02-01

    The Crescent Project element of the HELP Program is a bi-national multi-jurisdictional cooperative research and demonstration initiative involving the public and private sectors in an application of advanced technologies for the creation of an in...

  19. A Multi-Scale Method for Dynamics Simulation in Continuum Solvent Models I: Finite-Difference Algorithm for Navier-Stokes Equation.

    PubMed

    Xiao, Li; Cai, Qin; Li, Zhilin; Zhao, Hongkai; Luo, Ray

    2014-11-25

    A multi-scale framework is proposed for more realistic molecular dynamics simulations in continuum solvent models by coupling a molecular mechanics treatment of solute with a fluid mechanics treatment of solvent. This article reports our initial efforts to formulate the physical concepts necessary for coupling the two mechanics and develop a 3D numerical algorithm to simulate the solvent fluid via the Navier-Stokes equation. The numerical algorithm was validated with multiple test cases. The validation shows that the algorithm is effective and stable, with observed accuracy consistent with our design.
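
    For orientation, the solvent flow in such a multi-scale framework is governed by the incompressible Navier-Stokes equations, which the finite-difference algorithm discretises (written here in a generic form; the coupling to the solute is as described in the article):

```latex
% Incompressible Navier-Stokes equations for the solvent velocity field u and pressure p
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad \nabla\cdot\mathbf{u} = 0,
```

    where ρ is the solvent density, μ its viscosity, and f collects the forces exerted on the fluid, here by the solute treated with molecular mechanics.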

  20. Simulations of Ground and Space-Based Oxygen Atom Experiments

    NASA Technical Reports Server (NTRS)

    Finchum, A. (Technical Monitor); Cline, J. A.; Minton, T. K.; Braunstein, M.

    2003-01-01

    A low-earth orbit (LEO) materials erosion scenario and the ground-based experiment designed to simulate it are compared using the direct-simulation Monte Carlo (DSMC) method. The DSMC model provides a detailed description of the interactions between the hyperthermal gas flow and a normally oriented flat plate for each case. We find that while the general characteristics of the LEO exposure are represented in the ground-based experiment, multi-collision effects can potentially alter the impact energy and directionality of the impinging molecules in that experiment. Multi-collision phenomena also affect downstream flux measurements.
