Multi-criteria evaluation methods in the production scheduling
NASA Astrophysics Data System (ADS)
Kalinowski, K.; Krenczyk, D.; Paprocka, I.; Kempa, W.; Grabowik, C.
2016-08-01
The paper presents a discussion on the practical application of different methods of multi-criteria evaluation in the process of scheduling in manufacturing systems. Among the methods two main groups are specified: methods based on a distance function (using a metacriterion) and methods that create a Pareto set of possible solutions. The basic criteria used for scheduling are also described, and the overall procedure of the evaluation process in production scheduling is presented. It takes into account the actions in the whole scheduling process and the participation of the human decision maker (HDM). The specified HDM decisions relate to creating and editing the set of evaluation criteria, selecting the multi-criteria evaluation method, interacting in the search process, applying informal criteria and making final changes in the schedule before implementation. Depending on need, the scheduling process may be completely or partially automated. Full automation is possible in the case of a metacriterion-based objective function; if a Pareto set is generated, the final decision has to be made by the HDM.
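A minimal sketch of the two evaluation routes the abstract contrasts, in Python with invented schedule scores (the criteria, weights, and values are illustrative assumptions, not the paper's data):

```python
import numpy as np

# Hypothetical scores for five candidate schedules on three criteria
# (e.g. makespan, tardiness, setup cost); lower is better for all three.
scores = np.array([
    [120.0, 14.0, 300.0],
    [110.0, 20.0, 340.0],
    [135.0,  9.0, 280.0],
    [118.0, 15.0, 310.0],
    [140.0, 22.0, 290.0],
])

# Route 1, distance function (metacriterion): normalise each criterion and
# rank schedules by a weighted score; weights stand in for HDM preferences.
weights = np.array([0.5, 0.3, 0.2])
norm = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
metacriterion = norm @ weights
print("metacriterion ranking (best first):", np.argsort(metacriterion))

# Route 2, Pareto set: keep schedules not dominated by any other; the final
# choice among them is left to the human decision maker (HDM).
def pareto_set(s):
    dominated = [np.any(np.all(s <= row, axis=1) & np.any(s < row, axis=1))
                 for row in s]
    return np.where(~np.array(dominated))[0]

print("Pareto-optimal schedules:", pareto_set(scores))
```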
NASA Technical Reports Server (NTRS)
Pototzky, Anthony; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek
1991-01-01
Described here is the development and implementation of an on-line, near real time controller performance evaluation (CPE) capability. Briefly discussed are the structure of the data flow, the signal processing methods used to process the data, and the software developed to generate the transfer functions. This methodology is generic in nature and can be used in any type of multi-input/multi-output (MIMO) digital controller application, including digital flight control systems, digitally controlled spacecraft structures, and actively controlled wind tunnel models. Results of applying the CPE methodology to evaluate (in near real time) MIMO digital flutter suppression systems being tested on the Rockwell Active Flexible Wing (AFW) wind tunnel model are presented to demonstrate the CPE capability.
USDA-ARS?s Scientific Manuscript database
The cross-site process evaluation plan for the Childhood Obesity Research Demonstration (CORD) project is described here. The CORD project comprises 3 unique demonstration projects designed to integrate multi-level, multi-setting health care and public health interventions over a 4-year funding peri...
Evaluating the compatibility of multi-functional and intensive urban land uses
NASA Astrophysics Data System (ADS)
Taleai, M.; Sharifi, A.; Sliuzas, R.; Mesgari, M.
2007-12-01
This research is aimed at developing a model for assessing land use compatibility in densely built-up urban areas. In this process, a new model was developed by combining a suite of existing methods and tools: geographical information systems, Delphi methods, and spatial decision support tools, namely multi-criteria evaluation analysis, the analytical hierarchy process and the ordered weighted average method. The developed model has the potential to calculate land use compatibility in both horizontal and vertical directions. Furthermore, the compatibility between the use of each floor in a building and its neighboring land uses can be evaluated. The method was tested in a built-up urban area located in Tehran, the capital city of Iran. The results show that the model is robust in clarifying different levels of physical compatibility between neighboring land uses. This paper describes the various steps and processes of developing the proposed land use compatibility evaluation model (CEM).
The Role of Attention in Somatosensory Processing: A Multi-trait, Multi-method Analysis
Wodka, Ericka L.; Puts, Nicolaas A. J.; Mahone, E. Mark; Edden, Richard A. E.; Tommerdahl, Mark; Mostofsky, Stewart H.
2016-01-01
Sensory processing abnormalities in autism have largely been described by parent report. This study used a multi-method (parent-report and measurement), multi-trait (tactile sensitivity and attention) design to evaluate somatosensory processing in ASD. Results showed multiple significant within-method (e.g., parent report of different traits)/cross-trait (e.g., attention and tactile sensitivity) correlations, suggesting that parent-reported tactile sensory dysfunction and performance-based tactile sensitivity describe different behavioral phenomena. Additionally, both parent-reported tactile functioning and performance-based tactile sensitivity measures were significantly associated with measures of attention. Findings suggest that sensory (tactile) processing abnormalities in ASD are multifaceted, and may partially reflect a more global deficit in behavioral regulation (including attention). Challenges of relying solely on parent-report to describe sensory difficulties faced by children/families with ASD are also highlighted. PMID:27448580
Evaluation of laser ablation crater relief by white light micro interferometer
NASA Astrophysics Data System (ADS)
Gurov, Igor; Volkov, Mikhail; Zhukova, Ekaterina; Ivanov, Nikita; Margaryants, Nikita; Potemkin, Andrey; Samokhvalov, Andrey; Shelygina, Svetlana
2017-06-01
A multi-view scanning method is suggested for assessing a complicated surface relief with a white light interferometer. Peculiarities of the method are demonstrated on a special object in the form of a quadrangular pyramidal cavity, which is formed during micro-hardness measurement of materials using a hardness gauge. An algorithm for the joint processing of multi-view scanning results is developed that allows correct relief values to be recovered. Laser ablation craters were studied experimentally, and their relief was recovered using the developed method. It is shown that multi-view scanning reduces ambiguity when determining the local depth of the micro-relief of laser ablation craters. Results of experimental studies of the multi-view scanning method and data processing algorithm are presented.
Chung, Seungjoon; Seo, Chang Duck; Choi, Jae-Hoon; Chung, Jinwook
2014-01-01
Membrane distillation (MD) is an emerging desalination technology and an energy-saving alternative to conventional distillation and reverse osmosis methods. The selection of an appropriate membrane is a prerequisite for the design of an optimized MD process. We proposed a simple approximation method to evaluate the performance of membranes for the MD process. Three hollow fibre-type commercial membranes with different thicknesses and pore sizes were tested. Experimental results showed that one membrane was advantageous due to the highest flux, whereas another was advantageous due to the lowest feed temperature drop. Regression analyses and multi-stage calculations were used to account for the trade-off effects of flux and feed temperature drop. The most desirable membrane was selected from the tested membranes in terms of the mean flux in a multi-stage process. This method should be useful for selecting membranes without complicated simulation techniques.
Ding, Wen-jie; Chen, Wen-he; Deng, Ming-jia; Luo, Hui; Li, Lin; Liu, Jun-xin
2016-02-15
Co-processing of sewage sludge in cement kilns can achieve harmless treatment, quantity reduction, stabilization and reutilization of sludge. The moisture content should be reduced to below 30% to meet the requirements of combustion. Thermal drying is an effective way to desiccate sludge. Odors and volatile organic compounds are generated and released during the sludge drying process, which can lead to odor pollution. The main odor pollutants were selected by a multi-index integrated assessment method. The concentration, olfactory threshold, threshold limit value, smell security level and saturated vapor pressure were considered as indexes, based on the related regulations in China and foreign countries. Taking pollution potential as the evaluation target, and the risk index and odor emission intensity as evaluation indexes, a rated evaluation model of the odor pollution potential of the pollutants was built according to the Weber-Fechner law. The aim of the present study is to establish a rating evaluation method for odor pollution potential suitable for the direct drying process of sludge.
METHODS FOR EVALUATING THE SUSTAINABILITY OF GREEN PROCESSES
A methodology, called GREENSCOPE (Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator), has been developed in the U.S. EPA's Office of Research and Development to directly compare the sustainability of proces...
Governance for public health and health equity: The Trøndelag model for public health work.
Lillefjell, Monica; Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Horghagen, Sissel; Espnes, Geir Arild; Maass, Ruca; Anthun, Kirsti Sarheim
2018-06-01
Multi-sectoral governance of population health is linked to the realization that health is the property of many societal systems. This study aims to contribute knowledge and methods that can strengthen the capacities of municipalities to work more systematically, knowledge-based and multi-sectorally in promoting health and health equity in the population. A process evaluation was conducted, applying a mixed-methods research design combining qualitative and quantitative data collection. Processes strengthening the systematic and multi-sectoral development, implementation and evaluation of research-based measures to promote health, quality of life, and health equity in, for and with municipalities were revealed. A step-by-step model has been developed that emphasizes the promotion of knowledge-based, systematic, multi-sectoral public health work, as well as joint ownership of local resources, initiatives and policies. Implementation of systematic, knowledge-based and multi-sectoral governance of public health measures in municipalities demands a shared understanding of the challenges, an updated overview of population health and its impact factors, anchoring in plans, new skills and methods for the selection and implementation of measures, as well as the development of trust, ownership, and shared ethics and goals among those involved.
Kusters, Koen; Buck, Louise; de Graaf, Maartje; Minang, Peter; van Oosten, Cora; Zagt, Roderick
2018-07-01
Integrated landscape initiatives typically aim to strengthen landscape governance by developing and facilitating multi-stakeholder platforms. These are institutional coordination mechanisms that enable discussions, negotiations, and joint planning between stakeholders from various sectors in a given landscape. Multi-stakeholder platforms tend to involve complex processes with diverse actors, whose objectives and focus may be subject to periodic re-evaluation, revision or reform. In this article we propose a participatory method to aid planning, monitoring, and evaluation of such platforms, and we report on experiences from piloting the method in Ghana and Indonesia. The method comprises three components. The first can be used to look ahead, identifying priorities for future multi-stakeholder collaboration in the landscape. It is based on the identification of four aspirations that are common across multi-stakeholder platforms in integrated landscape initiatives. The second can be used to look inward. It focuses on the processes within an existing multi-stakeholder platform in order to identify areas for possible improvement. The third can be used to look back, identifying the main outcomes of an existing platform and comparing them to the original objectives. The three components can be implemented together or separately. They can be used to inform planning and adaptive management of the platform, as well as to demonstrate performance and inform the design of new interventions.
NASA Astrophysics Data System (ADS)
Aleksanyan, Grayr; Shcherbakov, Ivan; Kucher, Artem; Sulyz, Andrew
2018-04-01
Continuous monitoring of a patient's breathing by the method of multi-angle electrical impedance tomography makes it possible to obtain images of conductivity change in the chest cavity during monitoring. Direct analysis of the images is difficult due to the large amount of information and the low resolution of images obtained by multi-angle electrical impedance tomography. This work presents a method for obtaining a graph of the respiratory activity of the lungs based on the results of continuous lung monitoring using the multi-angle electrical impedance tomography method. The method makes it possible to obtain graphs of the respiratory activity of the left and right lungs separately, as well as a summary graph, to which standard spirography processing methods can be applied.
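A minimal sketch of the final processing step under an assumed data layout: `frames` stands in for a time series of reconstructed conductivity-change images, and the lung regions are given as boolean masks (all values are synthetic; nothing here reproduces the authors' reconstruction):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(600) / 10.0                    # 60 s of frames at 10 Hz
frames = rng.normal(0.0, 0.05, (600, 32, 32))
breath = np.sin(2 * np.pi * 0.25 * t)        # 15 breaths per minute
frames[:, :, :16] += breath[:, None, None]         # left half of image
frames[:, :, 16:] += 0.8 * breath[:, None, None]   # right half of image

left_mask = np.zeros((32, 32), bool)
left_mask[:, :16] = True
right_mask = ~left_mask

# Respiratory activity graphs: total conductivity change inside each lung
# region per frame, plus the summary (whole-chest) graph.
left_activity = frames[:, left_mask].sum(axis=1)
right_activity = frames[:, right_mask].sum(axis=1)
total_activity = left_activity + right_activity
print(left_activity.shape, total_activity[:3].round(1))
```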
NASA Astrophysics Data System (ADS)
Jamróz, Dariusz; Niedoba, Tomasz; Surowiak, Agnieszka; Tumidajski, Tadeusz; Szostek, Roman; Gajer, Mirosław
2017-09-01
Methods of multi-parameter data visualization that transform a multidimensional space into a two-dimensional one make it possible to show multi-parameter data on a computer screen. Thanks to that, it is possible to conduct a qualitative analysis of this data in the most natural way for a human being, i.e. by the sense of sight. An example of such a method of multi-parameter visualization is multidimensional scaling. This method was used in this paper to present and analyze a set of seven-dimensional data obtained from the Janina Mining Plant and the Wieczorek Coal Mine. It was decided to examine whether this method of multi-parameter data visualization allows the sample space to be divided into areas of various applicability to the fluidal gasification process. The "Technological applicability card for coals" was used for this purpose [Sobolewski et al., 2012; 2017], in which the key parameters, important parameters and additional ones affecting the gasification process are described.
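A sketch of the visualization step with scikit-learn's multidimensional scaling; the two synthetic clusters of seven-dimensional vectors stand in for the coal-sample data, which is not reproduced here:

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
group_a = rng.normal(0.0, 1.0, (40, 7))   # stand-in for one applicability class
group_b = rng.normal(3.0, 1.0, (40, 7))   # stand-in for another class
X = np.vstack([group_a, group_b])

# Project the 7-D points onto a plane while preserving pairwise distances.
embedding = MDS(n_components=2, random_state=1).fit_transform(X)
print(embedding.shape)   # (80, 2); plotting these points reveals the areas
```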
NASA Astrophysics Data System (ADS)
Vikram, K. Arun; Ratnam, Ch; Lakshmi, VVK; Kumar, A. Sunny; Ramakanth, RT
2018-02-01
Meta-heuristic multi-response optimization methods are widely used to solve multi-objective problems and obtain Pareto optimal solutions. This work focuses on optimal multi-response evaluation of process parameters in generating responses such as surface roughness (Ra), surface hardness (H) and tool vibration displacement amplitude (Vib) while performing tangential and orthogonal turn-mill processes on an A-axis Computer Numerical Control vertical milling center. Tool speed, feed rate and depth of cut are considered as process parameters for machining brass material under dry conditions with high speed steel end milling cutters, using a Taguchi design of experiments (DOE). A meta-heuristic, the dragonfly algorithm, is used to optimize the multiple objectives Ra, H and Vib and identify the optimal multi-response process parameter combination. The results obtained from the multi-objective dragonfly algorithm (MODA) are then compared with those of another multi-response optimization technique, grey relational analysis (GRA).
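For reference, a compact sketch of the grey relational analysis (GRA) baseline the paper compares against, with invented response values for four runs (Ra and Vib smaller-better, H larger-better) and equal weights:

```python
import numpy as np

runs = np.array([
    # Ra (um), H (HV), Vib (um)
    [1.8, 160.0, 12.0],
    [1.2, 150.0, 15.0],
    [2.1, 175.0, 10.0],
    [1.5, 168.0, 11.0],
])
smaller_better = np.array([True, False, True])

# Grey relational normalisation to [0, 1] (1 = best observed value).
lo, hi = runs.min(0), runs.max(0)
norm = np.where(smaller_better, (hi - runs) / (hi - lo), (runs - lo) / (hi - lo))

# Grey relational coefficients against the ideal sequence of ones, with the
# customary distinguishing coefficient zeta = 0.5.
delta = 1.0 - norm
zeta = 0.5
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Grade = mean coefficient (equal weights assumed; a weighted mean would
# incorporate response importance instead).
grade = coeff.mean(axis=1)
print("best run:", grade.argmax(), grade.round(3))
```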
Improved NSGA model for multi objective operation scheduling and its evaluation
NASA Astrophysics Data System (ADS)
Li, Weining; Wang, Fuyu
2017-09-01
Reasonable operation scheduling can increase hospital income and improve patient satisfaction. In this paper, a multi-objective operation scheduling method with an improved NSGA algorithm is used to shorten operation time, reduce operation cost and lower operation risk. A multi-objective optimization model is established for flexible operation scheduling, and the Pareto solution set is obtained through MATLAB simulation after standardization of the data. The optimal scheduling scheme is then selected using the combined entropy weight-TOPSIS method. The results show that the algorithm is feasible for solving the multi-objective operation scheduling problem, and provide a reference for hospital operation scheduling.
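A sketch of the selection stage: entropy weighting followed by TOPSIS over a few Pareto-optimal schedules, with invented objective values (time, cost, risk; all smaller-better):

```python
import numpy as np

X = np.array([
    [420.0, 8.1, 0.30],
    [450.0, 7.2, 0.35],
    [480.0, 6.8, 0.25],
])

# Entropy weights: objectives that vary more across schedules carry more
# information and therefore receive more weight.
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))
w = (1 - E) / (1 - E).sum()

# TOPSIS: relative closeness to the ideal solution.
V = w * X / np.linalg.norm(X, axis=0)
ideal, anti = V.min(axis=0), V.max(axis=0)   # min is ideal: smaller-better
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
print("chosen schedule:", closeness.argmax(), closeness.round(3))
```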
Developing a Mind-Body Exercise Programme for Stressed Children
ERIC Educational Resources Information Center
Wang, Claudia; Seo, Dong-Chul; Geib, Roy W
2017-01-01
Objective: To describe the process of developing a Health Qigong programme for stressed children using a formative evaluation approach. Methods: A multi-step formative evaluation method was utilised. These steps included (1) identifying programme content and drafting the curriculum, (2) synthesising effective and age-appropriate pedagogies, (3)…
NASA Astrophysics Data System (ADS)
Wu, Linqin; Xu, Sheng; Jiang, Dezhi
2015-12-01
Industrial wireless networked control systems are widely used, and how to evaluate the performance of the wireless network is of great significance. In this paper, considering the shortcomings of existing performance evaluation methods, a comprehensive network performance evaluation method, the multi-index fuzzy analytic hierarchy process (MFAHP), which combines fuzzy mathematics with the traditional analytic hierarchy process (AHP), is presented. The method overcomes performance evaluations that are incomplete and subjective. Experiments show that the method reflects actual network performance. It provides direct guidance for protocol selection, network cabling and node placement, and can meet the requirements of different occasions by modifying the underlying parameters.
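A sketch of the crisp AHP core that MFAHP builds on: priority weights from a pairwise comparison matrix via the principal eigenvector, plus Saaty's consistency check (the three network indexes and the judgements are invented):

```python
import numpy as np

# Pairwise comparisons of three indexes, e.g. delay vs. packet loss vs.
# throughput; A[i, j] says how much more important index i is than index j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # priority weights of the indexes

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)  # consistency index
RI = 0.58                             # Saaty's random index for n = 3
print("weights:", w.round(3), "CR:", round(CI / RI, 3))  # CR < 0.1 is acceptable
```

The fuzzy extension replaces the crisp judgements in `A` with fuzzy numbers; the eigenvector and consistency machinery stays conceptually the same.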
Zheng, Chunli; Wang, Jinan; Liu, Jianling; Pei, Mengjie; Huang, Chao; Wang, Yonghua
2014-08-01
The term systems pharmacology describes a field of study that uses computational and experimental approaches to broaden the view of drug actions rooted in molecular interactions and advance the process of drug discovery. The aim of this work is to highlight the role that systems pharmacology plays across multi-target drug discovery from natural products for cardiovascular diseases (CVDs). Firstly, based on network pharmacology methods, we reconstructed the drug-target and target-target networks to determine the putative protein target set of multi-target drugs for CVD treatment. Secondly, we reintegrated a compound dataset of natural products and then obtained a subset of multi-target compounds through a virtual screening process. Thirdly, a drug-likeness evaluation was applied to find the ADME-favorable compounds in this subset. Finally, we conducted in vitro experiments to evaluate the reliability of the selected chemicals and targets. We found that four of the five randomly selected natural molecules can effectively act on the target set for CVDs, indicating the reasonability of our systems-based method. This strategy may serve as a new model for multi-target drug discovery for complex diseases.
Improved Topographic Mapping Through Multi-Baseline SAR Interferometry with MAP Estimation
NASA Astrophysics Data System (ADS)
Dong, Yuting; Jiang, Houjun; Zhang, Lu; Liao, Mingsheng; Shi, Xuguo
2015-05-01
There is an inherent contradiction between the sensitivity of height measurement and the accuracy of phase unwrapping in SAR interferometry (InSAR) over rough terrain. This contradiction can be resolved by multi-baseline InSAR analysis, which exploits multiple phase observations with different normal baselines to improve phase unwrapping accuracy, or even to avoid phase unwrapping. In this paper we propose a maximum a posteriori (MAP) estimation method assisted by SRTM DEM data for multi-baseline InSAR topographic mapping. Based on our method, a data processing flow is established and applied to a multi-baseline ALOS/PALSAR dataset. The accuracy of the resultant DEMs is evaluated by using a standard Chinese national DEM of scale 1:10,000 as reference. The results show that multi-baseline InSAR can improve DEM accuracy compared with the single-baseline case. It is noteworthy that phase unwrapping is avoided and the quality of the multi-baseline InSAR DEM can meet the DTED-2 standard.
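A conceptual per-pixel sketch of the MAP idea: grid-search the height that maximises a wrapped-phase likelihood across baselines times a Gaussian prior centred on the SRTM height. The phase model, constants, and noise model below are simplified assumptions, not the paper's exact geometry:

```python
import numpy as np

lam, R, theta = 0.236, 850e3, np.deg2rad(39)        # ALOS/PALSAR-like values
baselines = np.array([350.0, 620.0, 910.0])         # perpendicular baselines (m)
k = 4 * np.pi * baselines / (lam * R * np.sin(theta))  # phase per metre of height

h_true, h_srtm, sigma_prior, kappa = 137.0, 130.0, 15.0, 8.0
rng = np.random.default_rng(2)
obs = np.angle(np.exp(1j * (k * h_true + rng.vonmises(0.0, kappa, 3))))

h_grid = np.arange(0.0, 300.0, 0.1)
# Von Mises log-likelihood of the wrapped phases, summed over baselines;
# cos() of a phase difference is insensitive to 2*pi wrapping.
log_like = (kappa * np.cos(obs - np.outer(h_grid, k))).sum(axis=1)
log_prior = -0.5 * ((h_grid - h_srtm) / sigma_prior) ** 2
h_map = h_grid[(log_like + log_prior).argmax()]
print("MAP height estimate (m):", round(h_map, 1))
```

The prior resolves the 2π ambiguities of the individual baselines, which is why explicit phase unwrapping can be avoided.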
NASA Astrophysics Data System (ADS)
Qi, Peng; Du, Mei
2018-06-01
China's southeast coastal areas frequently suffer from storm surge due to the attack of tropical cyclones (TCs) every year. Hazards induced by TCs are complex: strong wind, huge waves, storm surge, heavy rain, floods, and so on. These atmospheric and oceanic hazards cause serious disasters and substantial economic losses. This paper, from the perspective of a hazard group, sets up a multi-factor evaluation method for the risk assessment of TC hazards using historical extreme data of the relevant atmospheric and oceanic elements. Based on the natural hazard dynamic process, the multi-factor indicator system is composed of nine natural hazard factors representing intensity and frequency. Contributing to the indicator system, in order of importance, are maximum wind speed of TCs, attack frequency of TCs, maximum surge height, maximum wave height, frequency of gusts ≥ Scale 8, rainstorm intensity, maximum tidal range, rainstorm frequency, and sea-level rise rate. The first four factors are the most important; their weights exceed 10% in the indicator system. After normalization, all the single-hazard factors are multiplied by their weights and superposed to generate a composite TC hazard. The multi-factor evaluation indicator method was applied to the risk assessment of the typhoon-induced atmospheric and oceanic hazard group in the typhoon-prone southeast coastal cities of China.
New Tools and Methods for Assessing Risk-Management Strategies
2004-03-01
The study applied Expected Value Theory and Multi-Attribute Utility Theory to evaluate the risks and benefits of various acquisition alternatives, and allowed researchers to monitor the process subjects used to arrive at their decisions; the results revealed distinct risk-management strategies. Subject terms: risk management, acquisition process, expected value theory, multi-attribute utility theory.
Balasubramanian, Bijal A; Cohen, Deborah J; Davis, Melinda M; Gunn, Rose; Dickinson, L Miriam; Miller, William L; Crabtree, Benjamin F; Stange, Kurt C
2015-03-10
In healthcare change interventions, on-the-ground learning about the implementation process is often lost because of a primary focus on outcome improvements. This paper describes the Learning Evaluation, a methodological approach that blends quality improvement and implementation research methods to study healthcare innovations. Learning Evaluation is an approach to multi-organization assessment. Qualitative and quantitative data are collected to conduct real-time assessment of implementation processes while also assessing changes in context, facilitating quality improvement using run charts and audit and feedback, and generating transportable lessons. Five principles are the foundation of this approach: (1) gather data to describe changes made by healthcare organizations and how changes are implemented; (2) collect process and outcome data relevant to healthcare organizations and to the research team; (3) assess multi-level contextual factors that affect implementation, process, outcome, and transportability; (4) assist healthcare organizations in using data for continuous quality improvement; and (5) operationalize common measurement strategies to generate transportable results. Learning Evaluation principles are applied across organizations by the following: (1) establishing a detailed understanding of the baseline implementation plan; (2) identifying target populations and tracking relevant process measures; (3) collecting and analyzing real-time quantitative and qualitative data on important contextual factors; (4) synthesizing data and emerging findings and sharing with stakeholders on an ongoing basis; and (5) harmonizing and fostering learning from process and outcome data. Application to a multi-site program focused on primary care and behavioral health integration shows the feasibility and utility of Learning Evaluation for generating real-time insights into evolving implementation processes. Learning Evaluation generates systematic and rigorous cross-organizational findings about implementing healthcare innovations while also enhancing organizational capacity and accelerating translation of findings by facilitating continuous learning within individual sites. Researchers evaluating change initiatives and healthcare organizations implementing improvement initiatives may benefit from a Learning Evaluation approach.
Using multi-attribute decision-making approaches in the selection of a hospital management system.
Arasteh, Mohammad Ali; Shamshirband, Shahaboddin; Yee, Por Lip
2018-01-01
Choosing the most appropriate organizational software is always a real challenge for managers, especially IT directors. The term "enterprise software selection" refers to purchasing, creating, or ordering software that, first, is best adapted to the requirements of the organization and, second, has a suitable price and technical support. Specifying selection criteria and ranking them is the primary prerequisite for this action. This article provides a method to evaluate, rank, and compare the available enterprise software in order to choose the apt one. The method consists of a three-stage process. First, the method identifies and assesses the organizational requirements. Second, it selects the best approach from three possibilities: in-house production, buying software, or ordering custom software. Third, the method evaluates, compares and ranks the alternative software. The third stage uses different methods of multi-attribute decision making (MADM) and compares their results. Based on different characteristics of the problem, several methods were tested, namely the Analytic Hierarchy Process (AHP), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Elimination and Choice Expressing Reality (ELECTRE), and a simple weighting method. Finally, we propose the most practical method for such problems.
NASA Astrophysics Data System (ADS)
Zheng, Xianwei; Xiong, Hanjiang; Gong, Jianya; Yue, Linwei
2017-07-01
Virtual globes play an important role in representing three-dimensional models of the Earth. To extend the functioning of a virtual globe beyond that of a "geobrowser", the accuracy of the geospatial data in the processing and representation should be of special concern for the scientific analysis and evaluation. In this study, we propose a method for the processing of large-scale terrain data for virtual globe visualization and analysis. The proposed method aims to construct a morphologically preserved multi-resolution triangulated irregular network (TIN) pyramid for virtual globes to accurately represent the landscape surface and simultaneously satisfy the demands of applications at different scales. By introducing cartographic principles, the TIN model in each layer is controlled with a data quality standard to formulize its level of detail generation. A point-additive algorithm is used to iteratively construct the multi-resolution TIN pyramid. The extracted landscape features are also incorporated to constrain the TIN structure, thus preserving the basic morphological shapes of the terrain surface at different levels. During the iterative construction process, the TIN in each layer is seamlessly partitioned based on a virtual node structure, and tiled with a global quadtree structure. Finally, an adaptive tessellation approach is adopted to eliminate terrain cracks in the real-time out-of-core spherical terrain rendering. The experiments undertaken in this study confirmed that the proposed method performs well in multi-resolution terrain representation, and produces high-quality underlying data that satisfy the demands of scientific analysis and evaluation.
The Evaluation and Research of Multi-Project Programs: Program Component Analysis.
ERIC Educational Resources Information Center
Baker, Eva L.
1977-01-01
It is difficult to base evaluations on concepts irrelevant to state policy making. Evaluation of a multiproject program requires both time and differentiation of method. Data from the California Early Childhood Program illustrate process variables for program component analysis, and research questions for intraprogram comparison. (CP)
Optimization of the coherence function estimation for multi-core central processing unit
NASA Astrophysics Data System (ADS)
Cheremnov, A. G.; Faerman, V. A.; Avramchuk, V. S.
2017-02-01
The paper considers the use of parallel processing on a multi-core central processing unit for optimization of the coherence function evaluation arising in digital signal processing. The coherence function, along with other methods of spectral analysis, is commonly used for vibration diagnosis of rotating machinery and its particular nodes. An algorithm is given for evaluating the function for signals represented by digital samples. The algorithm is analyzed with respect to its software implementation and computational problems. Optimization measures are described, including algorithmic, architectural and compiler optimization, and their results are assessed for multi-core processors from different manufacturers. Speed-up of parallel execution with respect to sequential execution was studied, and results are presented for Intel Core i7-4720HQ and AMD FX-9590 processors. The results show comparatively high efficiency of the optimization measures taken. In particular, acceleration indicators and average CPU utilization were significantly improved, showing a high degree of parallelism in the constructed functions. The developed software underwent state registration and will be used as part of a software and hardware solution for rotating machinery fault diagnosis and pipeline leak location with the acoustic correlation method.
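The underlying quantity is standard; here is a single-threaded sketch of a Welch-averaged coherence estimate for two sensor channels sharing a 40 Hz component (the paper's contribution is accelerating exactly this kind of FFT pipeline on multiple cores):

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0
t = np.arange(0.0, 10.0, 1 / fs)
rng = np.random.default_rng(3)
common = np.sin(2 * np.pi * 40 * t)          # shared vibration component
x = common + rng.normal(0.0, 1.0, t.size)
y = 0.7 * common + rng.normal(0.0, 1.0, t.size)

# Magnitude-squared coherence, estimated with Welch segment averaging.
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
print("coherence near 40 Hz:", Cxy[np.argmin(np.abs(f - 40.0))].round(2))
```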
Multi-Touch Tabletop System Using Infrared Image Recognition for User Position Identification.
Suto, Shota; Watanabe, Toshiya; Shibusawa, Susumu; Kamada, Masaru
2018-05-14
A tabletop system can facilitate multi-user collaboration in a variety of settings, including small meetings, group work, and education and training exercises. The ability to identify the users touching the table and their positions can promote collaborative work among participants, so methods have been studied that involve attaching sensors to the table, chairs, or to the users themselves. An effective method of recognizing user actions without placing a burden on the user would be some type of visual process, so the development of a method that processes multi-touch gestures by visual means is desired. This paper describes the development of a multi-touch tabletop system using infrared image recognition for user position identification and presents the results of touch-gesture recognition experiments and a system-usability evaluation. Using an inexpensive FTIR touch panel and infrared light, this system picks up the touch areas and the shadow area of the user's hand by an infrared camera to establish an association between the hand and table touch points and estimate the position of the user touching the table. The multi-touch gestures prepared for this system include an operation to change the direction of an object to face the user and a copy operation in which two users generate duplicates of an object. The system-usability evaluation revealed that prior learning was easy and that system operations could be easily performed.
NASA Astrophysics Data System (ADS)
Mohammed, Habiba Ibrahim; Majid, Zulkepli; Yusof, Norhakim Bin; Bello Yamusa, Yamusa
2018-03-01
Landfilling remains the most common systematic technique of solid waste disposal in most developed and developing countries. Finding a suitable site for a landfill is a very challenging task. The landfill site selection process aims to identify suitable areas that will protect the environment and public health from pollution and hazards. Therefore, various factors such as environmental, physical, socio-economic, and geological criteria must be considered before siting any landfill. This makes the site selection process rigorous and tedious, because it involves processing large amounts of spatial data, rules and regulations from different agencies, and policies from decision makers. Multi-criteria evaluation allows the incorporation of these conflicting objectives and decision-maker preferences into spatial decision models. This paper analyzes the multi-criteria evaluation (MCE) method of landfill site selection for solid waste management by means of literature reviews and surveys. The study will help decision makers and waste management authorities to choose the most effective method when considering landfill site selection.
NASA Astrophysics Data System (ADS)
Chen, Yen-Luan; Chang, Chin-Chih; Sheu, Dwan-Fang
2016-04-01
This paper proposes generalised random and age replacement policies for a multi-state system composed of multi-state elements. The degradation of a multi-state element is assumed to follow a non-homogeneous continuous-time Markov process, which is a continuous-time, discrete-state process. A recursive approach is presented to efficiently compute the time-dependent state probability distribution of the multi-state element. The state and performance distribution of the entire multi-state system is evaluated via the combination of the stochastic process and the Lz-transform method. The concept of a customer-centred reliability measure is developed based on the system performance and the customer demand. We develop random and age replacement policies for an aging multi-state system subject to imperfect maintenance in a failure (or unacceptable) state. For each policy, the optimum replacement schedule which minimises the mean cost rate is derived analytically and discussed numerically.
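For a time-homogeneous special case, the state probabilities of one element follow from the matrix exponential of the generator; a minimal sketch with an invented three-state degradation chain (the paper instead integrates time-varying rates recursively):

```python
import numpy as np
from scipy.linalg import expm

# Generator for states {0: nominal, 1: degraded, 2: failed}; rows sum to 0.
Q = np.array([
    [-0.10, 0.08, 0.02],
    [0.00, -0.05, 0.05],
    [0.00, 0.00, 0.00],   # failed state is absorbing
])
p0 = np.array([1.0, 0.0, 0.0])   # element starts in the nominal state

for t in (10.0, 50.0, 100.0):
    print(t, (p0 @ expm(Q * t)).round(3))   # state distribution at time t
```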
Risk Evaluation of Railway Coal Transportation Network Based on Multi Level Grey Evaluation Model
NASA Astrophysics Data System (ADS)
Niu, Wei; Wang, Xifu
2018-01-01
Rail transport is currently the most important mode of coal transportation, and China's railway coal transportation network has become increasingly complete, but problems remain, such as insufficient capacity and some lines operating close to saturation. In this paper, risk assessment theory, the analytic hierarchy process and a multi-level grey evaluation model are applied to the risk evaluation of the coal railway transportation network in China. An example analysis of the Shanxi railway coal transportation network is presented, with the aim of improving the internal structure of the network and its market competitiveness.
Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework
NASA Astrophysics Data System (ADS)
Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.
2016-03-01
A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberà in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic label assessment that is able to address uncertainty and deal with different levels of precision. The method draws on qualitative reasoning, an artificial intelligence technique, to assess and rank multi-attribute alternatives with linguistic labels in order to handle uncertainty. It is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors and involve a high level of complexity and uncertainty. The method is compared with an existing approach, an outranking method based on Condorcet's original method, which has previously been applied to the wind farm location problem. The results obtained by both approaches are analysed and their performance in the selection of the wind farm location is compared across aggregation procedures. Although the results show that both methods lead to similar alternative rankings, the study highlights both their advantages and drawbacks.
A heuristic method for consumable resource allocation in multi-class dynamic PERT networks
NASA Astrophysics Data System (ADS)
Yaghoubi, Saeed; Noori, Siamak; Mazdeh, Mohammad Mahdavi
2013-06-01
This investigation presents a heuristic method for the consumable resource allocation problem in multi-class dynamic Project Evaluation and Review Technique (PERT) networks, where new projects from different classes (types) arrive at the system according to independent Poisson processes with different arrival rates. Each activity of any project is performed at a dedicated service station located in a node of the network, with exponentially distributed service time according to its class. Each project arrives at the first service station and continues its routing according to the precedence network of its class. Such a system can be represented as a queuing network in which the queue discipline is first come, first served. In the presented method, the multi-class system is decomposed into several single-class dynamic PERT networks, and each class is considered separately as a minisystem. In modeling the single-class dynamic PERT network, we use a Markov process and a multi-objective model investigated by Azaron and Tavakkoli-Moghaddam in 2007. Then, after obtaining the resources allocated to service stations in every minisystem, the final resources allocated to activities are calculated by the proposed method.
Milosevic, Igor; Naunovic, Zorana
2013-10-01
This article presents a process of evaluation and selection of the most favourable location for a sanitary landfill facility from three alternative locations, by applying a multi-criteria decision-making (MCDM) method. An incorrect choice of location for a landfill facility can have significant negative economic and environmental impacts, such as the pollution of air, ground and surface waters. The aim of this article is to present several improvements in the practical process of landfill site selection using the VIKOR MCDM compromise ranking method, integrated with a fuzzy analytic hierarchy process approach for determining the weighting coefficients of the evaluation criteria. The VIKOR method focuses on ranking and selecting from a set of alternatives in the presence of conflicting and non-commensurable (different units) criteria, and on proposing a compromise solution that is closest to the ideal solution. The work shows that valuable site ranking lists can be obtained using the VIKOR method, which is a suitable choice when there is a large number of relevant input parameters.
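A compact VIKOR sketch with three invented candidate sites, three benefit-type criteria, and weights standing in for the fuzzy-AHP stage:

```python
import numpy as np

F = np.array([
    [7.0, 5.0, 8.0],
    [8.0, 6.0, 5.0],
    [6.0, 8.0, 7.0],
])
w = np.array([0.5, 0.3, 0.2])

f_best, f_worst = F.max(0), F.min(0)
D = w * (f_best - F) / (f_best - f_worst)    # weighted normalised distances
S, R = D.sum(1), D.max(1)                    # group utility / individual regret
v = 0.5                                      # weight of the majority strategy
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))
for name, m in (("S", S), ("R", R), ("Q", Q)):
    print(name, "ranking (best first):", np.argsort(m))
```

Q proposes the compromise ranking; lower values of S, R, and Q are better.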
NASA Astrophysics Data System (ADS)
Widianta, M. M. D.; Rizaldi, T.; Setyohadi, D. P. S.; Riskiawan, H. Y.
2018-01-01
The right decision in placing employees in appropriate positions in a company will support the quality of management and will have an impact on improving the quality of the company's human resources. Such decision-making can be assisted by a Decision Support System (DSS) approach to improve accuracy in the employee placement process. The purpose of this paper is to compare four Multi Criteria Decision Making (MCDM) methods, i.e. the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Simple Additive Weighting (SAW), the Analytic Hierarchy Process (AHP) and the Preference Ranking Organization Method for Enrichment of Evaluations (PROMETHEE), for employee placement in accordance with predetermined criteria. The ranking results and the accuracy levels obtained from the methods differ, depending on the different scaling and weighting processes in each method.
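A sketch of one of the four compared methods, PROMETHEE II, with invented employee scores, weights, and linear preference thresholds:

```python
import numpy as np

X = np.array([          # rows: candidates; columns: benefit criteria
    [80.0, 70.0, 90.0],
    [75.0, 85.0, 80.0],
    [90.0, 60.0, 70.0],
])
w = np.array([0.4, 0.4, 0.2])
p = np.array([15.0, 15.0, 15.0])   # preference thresholds per criterion

n = len(X)
pi = np.zeros((n, n))              # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        d = X[a] - X[b]
        pi[a, b] = (w * np.clip(d / p, 0.0, 1.0)).sum()

phi = (pi.sum(1) - pi.sum(0)) / (n - 1)   # net flow = leaving - entering
print("PROMETHEE II ranking:", np.argsort(-phi), phi.round(3))
```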
NASA Astrophysics Data System (ADS)
Kang, Chao; Shi, Yaoyao; He, Xiaodong; Yu, Tao; Deng, Bo; Zhang, Hongji; Sun, Pengcheng; Zhang, Wenbin
2017-09-01
This study investigates the multi-objective optimization of quality characteristics for a T300/epoxy prepreg tape-wound cylinder. The method integrates the Taguchi method, grey relational analysis (GRA) and response surface methodology, and is adopted to improve tensile strength and reduce residual stress. In the winding process, the main process parameters involving winding tension, pressure, temperature and speed are selected to evaluate the parametric influences on tensile strength and residual stress. Experiments are conducted using the Box-Behnken design. Based on principal component analysis, the grey relational grades are properly established to convert multi-responses into an individual objective problem. Then the response surface method is used to build a second-order model of grey relational grade and predict the optimum parameters. The predictive accuracy of the developed model is proved by two test experiments with a low prediction error of less than 7%. The following process parameters, namely winding tension 124.29 N, pressure 2000 N, temperature 40 °C and speed 10.65 rpm, have the highest grey relational grade and give better quality characteristics in terms of tensile strength and residual stress. The confirmation experiment shows that better results are obtained with GRA improved by the proposed method than with ordinary GRA. The proposed method is proved to be feasible and can be applied to optimize the multi-objective problem in the filament winding process.
Hancerliogullari, Gulsah; Hancerliogullari, Kadir Oymen; Koksalmis, Emrah
2017-01-23
Determining the most suitable anesthesia method for circumcision surgery plays a fundamental role in pediatric surgery. This study aims to present pediatric surgeons' perspective on the relative importance of the criteria for selecting the anesthesia method for circumcision surgery by utilizing multi-criteria decision making methods. Fuzzy set theory offers a useful tool for transforming linguistic terms into numerical assessments. Since the evaluation of anesthesia methods requires linguistic terms, we utilize the fuzzy Analytic Hierarchy Process (AHP) and the fuzzy Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Both mathematical decision-making methods are based on individual judgements of qualitative factors using the pair-wise comparison matrix. Our model uses four main criteria, eight sub-criteria and three alternatives. To assess the relative priorities, an online questionnaire was completed by three experts, pediatric surgeons who had experience with circumcision surgery. Discussion of the results with the experts indicates that time-related factors are the most important criteria, followed by psychology, convenience and duration. Moreover, general anesthesia with penile block for circumcision surgery is the preferred choice of anesthesia compared to general anesthesia without penile block, which in turn has a greater priority than local anesthesia under the discussed main criteria and sub-criteria. The results presented in this study highlight the need to integrate surgeons' criteria into the decision making process for selecting anesthesia methods. This is the first study in which multi-criteria decision making tools, specifically fuzzy AHP and fuzzy TOPSIS, are used to evaluate anesthesia methods for a pediatric surgical procedure.
Self-correcting multi-atlas segmentation
NASA Astrophysics Data System (ADS)
Gao, Yi; Wilford, Andrew; Guo, Liang
2016-03-01
In multi-atlas segmentation, one typically registers several atlases to the new image, and their respective segmented label images are transformed and fused to form the final segmentation. After each registration, the quality of the registration is reflected by a single global value: the final registration cost. Ideally, if the quality of the registration could be evaluated at each point, independent of the registration process, in a way that also provides a direction in which the deformation can be further improved, the overall segmentation performance could be improved. We propose such a self-correcting multi-atlas segmentation method. The method is applied to hippocampus segmentation from brain images, and a statistically significant improvement is observed.
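For orientation, a minimal sketch of the plain label-fusion baseline that multi-atlas segmentation rests on, here a per-voxel majority vote over synthetic binary atlas labels (the paper's contribution, pointwise quality evaluation and correction, would replace the uniform vote):

```python
import numpy as np

rng = np.random.default_rng(4)
n_atlases, shape = 5, (4, 4)
labels = rng.integers(0, 2, size=(n_atlases, *shape))  # warped atlas labels

votes = labels.sum(axis=0)                    # count of label 1 per voxel
fused = (votes > n_atlases / 2).astype(int)   # majority-vote segmentation
print(fused)
```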
NASA Astrophysics Data System (ADS)
Deng, Feiyue; Yang, Shaopu; Tang, Guiji; Hao, Rujiang; Zhang, Mingliang
2017-04-01
Wheel bearings are essential mechanical components of trains, and fault detection of wheel bearings is of great significance for effectively avoiding economic loss and casualties. However, given the operating conditions, detection and extraction of the fault features hidden in the heavy noise of the vibration signal is a challenging task. Therefore, a novel method called the adaptive multi-scale AVG-Hat morphology filter (MF) is proposed to address this problem. The morphological AVG-Hat operator not only greatly suppresses the interference of the strong background noise, but also enhances the ability to extract fault features. The improved envelope spectrum sparsity (IESS) is proposed as a new evaluation index to select the optimal filtered signal processed by the multi-scale AVG-Hat MF. It provides a comprehensive evaluation of the intensity of the fault impulses relative to the background noise. The weighted coefficients of the different-scale structural elements (SEs) in the multi-scale MF are adaptively determined by the particle swarm optimization (PSO) algorithm. The effectiveness of the method is validated by analyzing real wheel bearing fault vibration signals (e.g. outer race fault, inner race fault and rolling element fault). The results show that the proposed method improves the extraction of fault features effectively compared with the multi-scale combined morphological filter (CMF) and multi-scale morphology gradient filter (MGF) methods.
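A heavily hedged sketch of the filtering idea: assuming the AVG-Hat operator averages the white and black top-hats, which reduces to (closing - opening)/2, it is applied at several structuring-element lengths, and the best scale is chosen by a kurtosis-style impulsiveness index standing in for the paper's IESS:

```python
import numpy as np
from scipy.ndimage import grey_closing, grey_opening

fs = 12000
t = np.arange(0.0, 1.0, 1 / fs)
rng = np.random.default_rng(5)
# Periodic fault impulses (about 107 Hz) buried in heavy noise.
impulses = np.sin(2 * np.pi * 3000 * t) * (np.mod(t, 1 / 107.0) < 0.0005)
signal = impulses + rng.normal(0.0, 0.4, t.size)

def avg_hat(x, size):
    # Assumed AVG-Hat: mean of white and black top-hats = (closing - opening)/2.
    return (grey_closing(x, size=size) - grey_opening(x, size=size)) / 2.0

def impulsiveness(x):
    # Kurtosis as a simple stand-in for the IESS selection index.
    return ((x - x.mean()) ** 4).mean() / (x.var() ** 2 + 1e-12)

scales = [3, 5, 9, 15, 25]               # flat structuring element lengths
filtered = [avg_hat(signal, s) for s in scales]
best = int(np.argmax([impulsiveness(f) for f in filtered]))
print("selected SE length:", scales[best])
```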
A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits
NASA Astrophysics Data System (ADS)
Moradi, Behzad; Mirzaei, Abdolreza
2016-11-01
A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian-type evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated in the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that makes a distinction between high- and low-fitness areas in the design space. The learning process can detect the right directions of the evolution and lead to large steps in the evolution of the individuals. The learning phase shortens the evolution process and yields a remarkable reduction in the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process in order to reduce the design space as well as the design time. The circuit evaluation is performed by the HSPICE simulator. In order to improve the design accuracy, the bsim3v3 CMOS transistor model is adopted in this proposed design method. The proposed design method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with the evolutionary strategy algorithm and other similar methods.
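A toy sketch of the LEM idea on a two-parameter sizing problem: label the population as high/low fitness, let a decision tree learn the high-fitness region, and sample the next generation from that region. The analytic fitness replaces HSPICE evaluation; everything here is illustrative:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)
fitness = lambda X: -((X[:, 0] - 2.0) ** 2 + (X[:, 1] + 1.0) ** 2)

X = rng.uniform(-5.0, 5.0, (60, 2))        # initial population
for gen in range(15):
    f = fitness(X)
    labels = f >= np.quantile(f, 0.7)      # top 30% marked high-fitness
    tree = DecisionTreeClassifier(max_depth=4).fit(X, labels)
    cand = rng.uniform(-5.0, 5.0, (600, 2))
    good = cand[tree.predict(cand)]        # keep candidates in learned region
    if len(good) >= 60:
        X = good[rng.choice(len(good), 60, replace=False)]
print("best point found:", X[fitness(X).argmax()].round(2))
```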
Multi-level tree analysis of pulmonary artery/vein trees in non-contrast CT images
NASA Astrophysics Data System (ADS)
Gao, Zhiyun; Grout, Randall W.; Hoffman, Eric A.; Saha, Punam K.
2012-02-01
Diseases like pulmonary embolism and pulmonary hypertension are associated with vascular dystrophy. Identifying such pulmonary artery/vein (A/V) tree dystrophy in terms of quantitative measures via CT imaging significantly facilitates early detection of disease and treatment monitoring. A tree structure, consisting of nodes and connected arcs, linked to the volumetric representation allows multi-level geometric and volumetric analysis of A/V trees. Here, a new theory and method is presented to generate multi-level A/V tree representations of volumetric data and to compute quantitative measures of A/V tree geometry and topology at various tree hierarchies. The method is primarily designed around arc skeleton computation followed by tree construction based on topologic and geometric analysis of the skeleton. The method starts with a volumetric A/V representation as input and generates its topologic and multi-level volumetric tree representations along with different multi-level morphometric measures. New recursive merging and pruning algorithms are introduced to detect bad junctions and noisy branches often associated with digital geometric and topologic analysis. Also, a new notion of the shortest axial path is introduced to improve the skeletal arc joining two junctions. The accuracy of the multi-level tree analysis algorithm has been evaluated using computer-generated phantoms and pulmonary CT images of a pig vessel cast phantom, while the reproducibility of the method is evaluated using multi-user A/V separation of in vivo contrast-enhanced CT images of a pig lung at different respiratory volumes.
Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration
NASA Astrophysics Data System (ADS)
Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.
2017-12-01
Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model comes with inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty, which can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and interpret the results mechanistically, because it is not simple to work out exactly why a model produces the results it does and to identify which model assumptions are key: models combine many sub-systems and processes, each of which may be conceptualised and represented mathematically in various ways. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We argue that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PEcAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid robust data-model integration and enhance our predictive understanding of biological systems.
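A minimal sketch of the multi-assumption idea: represent each process by alternative hypotheses (functions) and run the full factorial ensemble, so hypothesis uncertainty can be examined process by process. The two processes and their variants are invented toy stand-ins:

```python
from itertools import product

photosynthesis = {
    "A1": lambda light: 0.9 * light,                   # linear response
    "A2": lambda light: 12.0 * light / (8.0 + light),  # saturating response
}
respiration = {
    "B1": lambda temp: 0.5 * 2.0 ** ((temp - 25.0) / 10.0),  # Q10 form
    "B2": lambda temp: 0.04 * temp,                          # linear form
}

light, temp = 10.0, 20.0
for (a_name, a), (r_name, r) in product(photosynthesis.items(),
                                        respiration.items()):
    npp = a(light) - r(temp)   # net flux under this combination of hypotheses
    print(f"{a_name}+{r_name}: NPP = {npp:.2f}")
```

The spread of outputs across the ensemble is a direct, if crude, measure of MHU for this toy system.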
[Multi-mathematical modelings for compatibility optimization of Jiangzhi granules].
Yang, Ming; Zhang, Li; Ge, Yingli; Lu, Yanliu; Ji, Guang
2011-12-01
To investigate a method of multi-activity-index evaluation and combination optimization of multiple components for Chinese herbal formulas. Following a uniform experimental design, efficacy experiments, multi-index evaluation, least absolute shrinkage and selection operator (LASSO) modeling, an evolutionary optimization algorithm and a validation experiment, we optimized the combination of Jiangzhi granules based on the activity indexes of serum ALT, AST, TG, TC, HDL and LDL, the TG level of liver tissue, and the ratio of liver weight to body weight. The analytic hierarchy process (AHP) combined with criteria importance through intercriteria correlation (CRITIC) for multi-activity-index evaluation was more reasonable and objective, as it reflected both the ordering information of the activity indexes and the objective sample data. The LASSO algorithm model accurately reflected the relationship between the different combinations of Jiangzhi granules and the comprehensive activity indexes. The optimized combination of Jiangzhi granules showed better comprehensive activity index values than the original formula in the validation experiment. AHP combined with CRITIC can be used for multi-activity-index evaluation, and the LASSO algorithm is suitable for combination optimization of Chinese herbal formulas.
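A sketch of the CRITIC step used alongside AHP: objective index weights from the normalised activity-index matrix, combining each index's contrast (standard deviation) with its conflict with the other indexes (1 - correlation). Rows are formula combinations; all numbers are invented:

```python
import numpy as np

X = np.array([
    [0.82, 0.64, 0.71, 0.55],
    [0.61, 0.70, 0.66, 0.60],
    [0.90, 0.58, 0.80, 0.52],
    [0.70, 0.75, 0.62, 0.68],
])

Z = (X - X.min(0)) / (X.max(0) - X.min(0))   # min-max normalisation
sigma = Z.std(axis=0, ddof=1)                # contrast of each index
R = np.corrcoef(Z, rowvar=False)             # correlations between indexes
C = sigma * (1.0 - R).sum(axis=0)            # information carried by each index
w = C / C.sum()
print("CRITIC weights:", w.round(3))
```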
Liu, Min-Yin; Huang, Adam; Huang, Norden E.
2017-01-01
Sleep spindles are brief bursts of brain activity in the sigma frequency range (11–16 Hz) measured by electroencephalography (EEG), mostly during non-rapid eye movement (NREM) stage 2 sleep. These oscillations are of great biological and clinical interest because they potentially play an important role in identifying and characterizing the processes of various neurological disorders. Conventionally, sleep spindles are identified by expert sleep clinicians via visual inspection of EEG signals. The process is laborious and the results are inconsistent among different experts. To resolve the problem, numerous computerized methods have been developed to automate the process of sleep spindle identification. Still, the performance of these automated sleep spindle detection methods varies from study to study. There are two reasons: (1) the lack of common benchmark databases, and (2) the lack of commonly accepted evaluation metrics. In this study, we focus on tackling the second problem by proposing to evaluate the performance of a spindle detector in a multi-objective optimization context, and hypothesize that using the resultant Pareto fronts for deriving evaluation metrics will improve automatic sleep spindle detection. We use a popular multi-objective evolutionary algorithm (MOEA), the Strength Pareto Evolutionary Algorithm (SPEA2), to optimize six existing frequency-based sleep spindle detection algorithms: three Fourier-based, one continuous wavelet transform (CWT)-based, and two Hilbert-Huang transform (HHT)-based algorithms. We also explore three hybrid approaches. Trained and tested on the open-access DREAMS and MASS databases, two new hybrid methods combining Fourier with HHT algorithms show significant performance improvement, with F1-scores of 0.726–0.737. PMID:28572762
NASA Astrophysics Data System (ADS)
Fatrias, D.; Kamil, I.; Meilani, D.
2018-03-01
Coordinating business operations with suppliers becomes increasingly important for surviving and prospering in a dynamic business environment. A good partnership with suppliers not only increases efficiency, but also strengthens corporate competitiveness. In view of this, this study develops a practical approach to multi-criteria supplier evaluation using the combined methods of the Taguchi loss function (TLF), the best-worst method (BWM) and VIseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). A new integrative framework adopting these methods is our main contribution to the supplier evaluation literature. In this integrated approach, a compromise supplier ranking list based on the loss scores of suppliers is obtained using the efficient steps of a pairwise-comparison-based decision making process. Implementation on a case problem with real data from the crumb rubber industry shows the usefulness of the proposed approach. Finally, suitable managerial implications are presented.
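A sketch of the TLF stage: map each supplier's raw criterion values to Taguchi quality-loss scores before the BWM weighting and VIKOR ranking. The loss forms below are the standard smaller-better and larger-better variants scaled to 100% loss at the specification limit; suppliers, limits, and values are invented:

```python
def loss_smaller_better(y, usl, k=100.0):
    # Loss grows quadratically as y approaches the upper spec limit.
    return k * (y / usl) ** 2

def loss_larger_better(y, lsl, k=100.0):
    # Loss grows as y falls toward the lower spec limit.
    return k * (lsl / y) ** 2

suppliers = {"S1": (0.021, 96.0), "S2": (0.034, 99.0), "S3": (0.015, 92.0)}
for name, (defect_rate, on_time_pct) in suppliers.items():
    total = (loss_smaller_better(defect_rate, usl=0.05)
             + loss_larger_better(on_time_pct, lsl=90.0))
    print(name, "aggregate loss:", round(total, 1))
```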
Multi-Method Evaluation of College Teaching
ERIC Educational Resources Information Center
Algozzine, Bob; Beattie, John; Bray, Marty; Flowers, Claudia; Gretes, John; Mohanty, Ganesh; Spooner, Fred
2010-01-01
Student evaluation of instruction in college and university courses has been a routine and mandatory part of undergraduate and graduate education for some time. A major shortcoming of the process is that it relies exclusively on the opinions or qualitative judgments of students rather than on assessing the learning or transfer of knowledge that…
Fairley, C; Bleay, S M; Sears, V G; NicDaeid, N
2012-04-10
This paper reports a comparison of the effectiveness and practicality of different multi-metal deposition processes for finger mark development. The work investigates whether modifications can be made to improve the performance of the existing process published by Schnetz. Secondly, we compare the ability of different multi-metal deposition processes to develop finger marks on a range of surfaces with that of other currently used development processes. All published multi-metal deposition processes utilise an initial stage of colloidal gold deposition followed by enhancement of the marks using a physical developer. All possible combinations of colloidal gold and physical developer stages were tested. The method proposed by Schnetz was shown to be the most effective process; however, a modification which reduced the pH of the enhancement solution provided the best combination of effectiveness and practicality. In trials comparing the modified formulation with vacuum metal deposition, superglue and powder suspensions on surfaces which typically give low finger mark yields (cling film, plasticised vinyl, leather and masking tape), the modified method produced significantly better results than existing processes for cling film and plasticised vinyl. The modified formulation was found to be ineffective on both masking tape and leather. It is recommended that further tests be carried out on the modified multi-metal deposition formulation to establish whether it could be introduced for operational work, on cling film material in particular. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Kernel Regression Estimation of Fiber Orientation Mixtures in Diffusion MRI
Cabeen, Ryan P.; Bastin, Mark E.; Laidlaw, David H.
2016-01-01
We present and evaluate a method for kernel regression estimation of fiber orientations and associated volume fractions for diffusion MR tractography and population-based atlas construction in clinical imaging studies of brain white matter. This is a model-based image processing technique in which representative fiber models are estimated from collections of component fiber models in model-valued image data. This extends prior work in nonparametric image processing and multi-compartment processing to provide computational tools for image interpolation, smoothing, and fusion with fiber orientation mixtures. In contrast to related work on multi-compartment processing, this approach is based on directional measures of divergence and includes data-adaptive extensions for model selection and bilateral filtering. This is useful for reconstructing complex anatomical features in clinical datasets analyzed with the ball-and-sticks model, and our framework’s data-adaptive extensions are potentially useful for general multi-compartment image processing. We experimentally evaluate our approach with both synthetic data from computational phantoms and in vivo clinical data from human subjects. With synthetic data experiments, we evaluate performance based on errors in fiber orientation, volume fraction, compartment count, and tractography-based connectivity. With in vivo data experiments, we first show improved scan-rescan reproducibility and reliability of quantitative fiber bundle metrics, including mean length, volume, streamline count, and mean volume fraction. We then demonstrate the creation of a multi-fiber tractography atlas from a population of 80 human subjects. In comparison to single tensor atlasing, our multi-fiber atlas shows more complete features of known fiber bundles and includes reconstructions of the lateral projections of the corpus callosum and complex fronto-parietal connections of the superior longitudinal fasciculus I, II, and III. PMID:26691524
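The central operation is a kernel-weighted average of component models at a query location. A minimal sketch (Python; the positions and volume fractions are hypothetical, and the published method additionally handles fiber orientation sign alignment, model selection and bilateral terms):

    import numpy as np

    def kernel_regress(x0, xs, vs, h=1.0):
        """Nadaraya-Watson-style kernel estimate at position x0 from
        component values vs observed at positions xs (the core averaging
        step only; not the full fiber-mixture machinery)."""
        w = np.exp(-np.sum((xs - x0) ** 2, axis=1) / (2 * h ** 2))
        w /= w.sum()                      # normalize kernel weights
        return w @ vs

    # Hypothetical: interpolate volume fractions at a voxel midpoint
    xs = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
    vs = np.array([0.60, 0.40, 0.55])
    print(kernel_regress(np.array([0.5, 0.5, 0.]), xs, vs, h=0.8))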
1991-09-01
[Table-of-contents residue from the report.] Sections cover the Analytic Hierarchy Process (AHP) and the implementation of CERTS using AHP, including consistency checking and the user interface. The report incorporates the proposed technique into a Decision Support System built on Expert Choice, which implements the AHP, an approach to multi-criteria decision making.
NASA Astrophysics Data System (ADS)
Han, S. T.; Shu, X. D.; Shchukin, V.; Kozhevnikova, G.
2018-06-01
In order to achieve reasonable process parameters for forming a multi-step shaft by cross wedge rolling, this research studied the rolling-forming process of a multi-step shaft using the DEFORM-3D finite element software. An interactive orthogonal experiment was used to study the effect of eight parameters on the quality of the shaft end and the microstructure uniformity: the first section shrinkage rate φ1, the first forming angle α1, the first spreading angle β1, the first spreading length L1, the second section shrinkage rate φ2, the second forming angle α2, the second spreading angle β2 and the second spreading length L2. Using the fuzzy mathematics comprehensive evaluation method and extreme difference (range) analysis, the degree of influence of the process parameters on the quality of the multi-step shaft was obtained: β2 > φ2 > L1 > α1 > β1 > φ1 > α2 > L2. The results of the study can provide guidance for obtaining multi-step shafts with high mechanical properties and achieving near-net forming without a stub bar in cross wedge rolling.
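A minimal sketch of the range (extreme difference) analysis used to rank factor influence (Python; the level assignments and quality scores for one factor are hypothetical):

    import numpy as np

    def range_analysis(levels, score):
        """Extreme-difference analysis for one factor of an orthogonal
        design: average the quality score at each level and report the
        range; a larger range indicates a more influential factor."""
        means = {lv: score[levels == lv].mean() for lv in np.unique(levels)}
        return means, max(means.values()) - min(means.values())

    # Hypothetical: factor beta2 tested at three levels over nine runs
    beta2 = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3])
    score = np.array([0.71, 0.68, 0.70, 0.80, 0.83, 0.79, 0.60, 0.62, 0.61])
    means, R = range_analysis(beta2, score)
    print(means, R)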
Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers
ERIC Educational Resources Information Center
Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph
2015-01-01
In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…
Operator performance evaluation using multi criteria decision making methods
NASA Astrophysics Data System (ADS)
Rani, Ruzanita Mat; Ismail, Wan Rosmanira; Razali, Siti Fatihah
2014-06-01
Operator performance evaluation is a very important operation in labor-intensive manufacturing industry because the company's productivity depends on the performance of its operators. The aims of operator performance evaluation are to give feedback to operators on their performance, to increase the company's productivity, and to identify the strengths and weaknesses of each operator. In this paper, six multi-criteria decision making methods are used to evaluate and rank the operators: the Analytical Hierarchy Process (AHP), fuzzy AHP (FAHP), ELECTRE, PROMETHEE II, the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). The performance evaluation is based on six main criteria: competency, experience and skill, teamwork and time punctuality, personal characteristics, capability and outcome. The study was conducted at one of the SME food manufacturing companies in Selangor. From the study, it was found that both AHP and FAHP identified "outcome" as the most important criterion. The results of the operator performance evaluation showed that the same operator is ranked first by all six methods.
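A minimal sketch of one of these methods, TOPSIS (Python; the operator ratings and weights are hypothetical, and all criteria are treated as benefit criteria):

    import numpy as np

    def topsis(X, w):
        """Minimal TOPSIS sketch for benefit criteria: returns closeness
        to the ideal solution (higher = better rank)."""
        Z = X / np.linalg.norm(X, axis=0)       # vector-normalize columns
        V = Z * w                               # weighted normalized matrix
        ideal, anti = V.max(axis=0), V.min(axis=0)
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - anti, axis=1)
        return d_neg / (d_pos + d_neg)

    # Hypothetical operator ratings on four criteria
    X = np.array([[8., 7., 9., 6.], [7., 8., 8., 7.], [9., 6., 7., 8.]])
    print(topsis(X, w=np.array([0.4, 0.2, 0.2, 0.2])))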
NASA Astrophysics Data System (ADS)
Ariyarit, Atthaphon; Sugiura, Masahiko; Tanabe, Yasutada; Kanazaki, Masahiro
2018-06-01
A multi-fidelity optimization technique based on an efficient global optimization process with a hybrid surrogate model is investigated for solving real-world design problems. The model constructs the local deviation using the kriging method and the global model using a radial basis function. The expected improvement is computed to decide on additional samples that can improve the model. The approach was first investigated by solving mathematical test problems. The results were compared with optimization results from an ordinary kriging method and a co-kriging method, and the proposed method produced the best solution. The proposed method was also applied to the aerodynamic design optimization of helicopter blades to obtain the maximum blade efficiency. The optimal shape obtained by the proposed method achieved performance almost equivalent to that obtained by single-fidelity optimization based on high-fidelity evaluations. Of the three methods compared, the proposed method required the lowest total number of high-fidelity evaluation runs to obtain a converged solution.
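For reference, the expected improvement criterion for a minimization problem can be sketched as follows (Python with SciPy; mu and sigma stand for a surrogate's predictive mean and standard deviation, here filled with hypothetical values):

    import numpy as np
    from scipy.stats import norm

    def expected_improvement(mu, sigma, f_best):
        """EI for minimization: mu, sigma are the surrogate's predictive
        mean and standard deviation at candidate points; f_best is the
        incumbent best objective value."""
        sigma = np.maximum(sigma, 1e-12)        # guard against zero sigma
        z = (f_best - mu) / sigma
        return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    # Hypothetical surrogate predictions at three candidate designs
    print(expected_improvement(np.array([1.0, 0.8, 1.2]),
                               np.array([0.3, 0.1, 0.5]), f_best=0.9))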
Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong
2017-10-01
During manufacturing and storage, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability, or the pharmacokinetic and pharmacodynamic profile, and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which can pose difficulty when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides results comparable to those of the traditional assays. To ensure future application in the QC environment, the method was qualified according to the International Conference on Harmonization (ICH) guidelines and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impacts on multiple quality attributes, while being QC-friendly and cost-effective.
A Multi-level Fuzzy Evaluation Method for Smart Distribution Network Based on Entropy Weight
NASA Astrophysics Data System (ADS)
Li, Jianfang; Song, Xiaohui; Gao, Fei; Zhang, Yu
2017-05-01
Smart distribution networks are considered the future trend of distribution networks. In order to comprehensively evaluate the construction level of smart distribution networks and give guidance to the practice of smart distribution construction, a multi-level fuzzy evaluation method based on entropy weight is proposed. Firstly, focusing on both the conventional characteristics of distribution networks and new characteristics of smart distribution networks such as self-healing and interaction, a multi-level evaluation index system is established that covers power supply capability, power quality, economy, reliability and interaction. Then, a combination weighting method based on the Delphi method and the entropy weight method is put forward, which takes into account not only the importance of each evaluation index in the experts' subjective view, but also the objective, differentiating information contained in the index values. Thirdly, a multi-level evaluation method based on fuzzy theory is put forward. Lastly, a case study is conducted based on the statistical data of some cities' distribution networks, and the evaluation method is shown to be effective and rational.
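A minimal sketch of the entropy weight computation (Python; the index values for four hypothetical networks are illustrative, assumed positive and benefit-oriented):

    import numpy as np

    def entropy_weights(X):
        """Entropy weight method: X is alternatives x indices, all values
        positive. Indices with more dispersion receive larger weights."""
        P = X / X.sum(axis=0)                   # column-wise proportions
        k = 1.0 / np.log(X.shape[0])
        E = -k * (P * np.log(P)).sum(axis=0)    # entropy of each index
        d = 1 - E                               # degree of diversification
        return d / d.sum()

    # Hypothetical index values for four distribution networks
    X = np.array([[0.9, 0.6, 0.8], [0.7, 0.9, 0.6],
                  [0.8, 0.7, 0.9], [0.6, 0.8, 0.7]])
    print(entropy_weights(X))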
Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi
2016-10-01
The Balanced Scorecard (BSC) is a strategic evaluation tool using both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the BSC and multi-criteria decision making (MCDM) methods is proposed to evaluate the performance of the research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods for ranking the alternatives: Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods; weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
Multi-trait, multi-breed conception rate evaluations
USDA-ARS?s Scientific Manuscript database
Heifer and cow conception rates (HCR and CCR) were evaluated with multi-trait, multi-breed models including crossbred cows instead of the previous single-trait, single-breed models. Fertility traits benefit from multi-trait processing because of high genetic correlations and many missing observation...
Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A
2013-04-01
The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical both for the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform, followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded, generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high-quality decision outcome and a rational and transparent decision process. This development contributes to the stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision processes and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate the identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
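At its simplest, a Multi-Attribute Utility Theory score is a weighted aggregation of per-criterion utilities. A minimal additive sketch (Python; the criteria, utilities and weights below are hypothetical, and the framework's actual attribute structure is not reproduced):

    import numpy as np

    def additive_utility(scores, weights):
        """Additive multi-attribute utility: scores are per-criterion
        utilities in [0, 1] for one data stream; weights sum to 1."""
        return float(np.dot(scores, weights))

    # Hypothetical data-stream utilities on timeliness, coverage,
    # cost and quality, with hypothetical weights
    print(additive_utility([0.8, 0.6, 0.9, 0.7], [0.35, 0.25, 0.15, 0.25]))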
Pereira, Suzanne; Névéol, Aurélie; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J.
2008-01-01
Background: To assist with the development of a French online quality-controlled health gateway (CISMeF), an automatic indexing tool assigning MeSH descriptors to medical text in French was created. The French Multi-Terminology Indexer (F-MTI) relies on a multi-terminology approach involving four prominent medical terminologies and the mappings between them. Objective: In this paper, we compare lemmatization and stemming as methods to process French medical text for indexing. We also evaluate the multi-terminology approach implemented in F-MTI. Methods: The indexing strategies were assessed on a corpus of 18,814 resources indexed manually. Results: There is little difference in the indexing performance when lemmatization or stemming is used. However, the multi-terminology approach outperforms indexing relying on a single terminology in terms of recall. Conclusion: F-MTI will soon be used in the CISMeF production environment and in a Health MultiTerminology Server in French. PMID:18998933
An Empirical Investigation of Entrepreneurship Intensity in Iranian State Universities
ERIC Educational Resources Information Center
Mazdeh, Mohammad Mahdavi; Razavi, Seyed-Mostafa; Hesamamiri, Roozbeh; Zahedi, Mohammad-Reza; Elahi, Behin
2013-01-01
The purpose of this study is to propose a framework to evaluate the entrepreneurship intensity (EI) of Iranian state universities. In order to determine EI, a hybrid multi-method framework consisting of Delphi, Analytic Network Process (ANP), and VIKOR is proposed. The Delphi method is used to localize and reduce the number of criteria extracted…
Community Currency Trading Method through Partial Transaction Intermediary Process
NASA Astrophysics Data System (ADS)
Kido, Kunihiko; Hasegawa, Seiichi; Komoda, Norihisa
A community currency is local money issued by local governments or Non-Profit Organizations (NPOs) to support social services. The purpose of introducing community currencies is to regenerate communities by fostering mutual aid among community members. In this paper, we propose a community currency trading method with a partial transaction intermediary process, for operational environments in which coordinators are not present at all times. In this method, coordinators perform coordination between service users and service providers during the first several months of transactions. After the coordination period, participants spontaneously make transactions based on their trust area and on a trust evaluation method that uses the number of provided services and complaint information. This method is especially effective for communities with close social networks and low trustworthiness. The proposed method is evaluated through multi-agent simulation.
Sun, Jian-Ning; Sun, Wen-Yan; Dong, Shi-Fen
2017-03-01
Chinese herbal compound formula preparations are made based on the theory of Chinese medicine, are confirmed by long-term clinical application, and have multi-compound, multi-target characteristics. During the development of innovative medicines from Chinese herbal compound formulas, more attention should be paid to selecting, and speeding up the research and development of, drugs with clinical value; and, as required by the rules governing new drug research and development, whole-process management should be carried out, including project evaluation, determination of the manufacturing process, establishment of quality control standards, evaluation of pharmacological and toxic effects, and the new drug application process. This review aims to give proposals for the pharmacodynamics research methods involved in developing Chinese herbal compound formula preparations, including: ① the endpoint criteria should meet the intended clinical use of the new drug; ② pre-clinical pharmacodynamics evaluation should be carried out in appropriate animal models, chosen according to the diagnostic and therapeutic characteristics of Chinese medicine, with suitable observation indexes; ③ for drugs intended for infants and children, information on drug action conforming to the physiological characteristics of infants and children should be supplied, and pharmacodynamics and toxicology research should be conducted in immature rats dosed according to children's body weight. In summary, clinical application characteristics are the important criteria for evaluating the pharmacological effects of innovative medicines derived from Chinese herbal compound formulas. Copyright© by the Chinese Pharmaceutical Association.
[Modern research progress of traditional Chinese medicine based on integrative pharmacology].
Wang, Ping; Tang, Shi-Huan; Su, Jin; Zhang, Jia-Qi; Cui, Ru-Yi; Xu, Hai-Yu; Yang, Hong-Jun
2018-04-01
Integrative pharmacology (IP) is a discipline that studies the interaction, integration and principles of action of multiple components with the body, emphasizing multi-level, multi-link integration such as "whole and part", "in vivo and in vitro", and "in vivo process and activity evaluation". After four years of development and practice, the theory and methods of IP have received extensive attention and application. In order to better promote the development of IP, this paper systematically reviews its concepts, research contents, research methods and application fields. Copyright© by the Chinese Pharmaceutical Association.
HIPS: A new hippocampus subfield segmentation method.
Romero, José E; Coupé, Pierrick; Manjón, José V
2017-12-01
The importance of the hippocampus in the study of several neurodegenerative diseases such as Alzheimer's disease makes it a structure of great interest in neuroimaging. However, few segmentation methods have been proposed to measure its subfields due to its complex structure and the lack of high resolution magnetic resonance (MR) data. In this work, we present a new pipeline for automatic hippocampus subfield segmentation using two available hippocampus subfield delineation protocols that can work with both high and standard resolution data. The proposed method is based on multi-atlas label fusion technology that benefits from a novel multi-contrast patch match search process (using high resolution T1-weighted and T2-weighted images). The proposed method also includes as post-processing a new neural network-based error correction step to minimize systematic segmentation errors. The method has been evaluated on both high and standard resolution images and compared to other state-of-the-art methods showing better results in terms of accuracy and execution time. Copyright © 2017 Elsevier Inc. All rights reserved.
Multi-frame image processing with panning cameras and moving subjects
NASA Astrophysics Data System (ADS)
Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric
2014-06-01
Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. We also evaluated algorithm efficacy on field-test video processed with our commercially available surveillance product, demonstrating clear benefits. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.
Cultural and Linguistic Adaptation of a Healthy Diet Text Message Intervention for Hispanic Adults
Cameron, Linda D.; Durazo, Arturo; Ramirez, A. Susana; Corona, Roberto; Ultreras, Mayra; Piva, Sonia
2017-01-01
Hispanics represent a critical target for culturally adapted diet interventions. In this formative research, we translated HealthyYouTXT, an mHealth program developed by the U.S. National Cancer Institute, into HealthyYouTXT en Español, a linguistically and culturally appropriate version for Spanish speakers. We report a three-stage, mixed-methods process through which we culturally adapted the text messages, evaluated their acceptability, and revised the program based on the findings. In Stage 1, we conducted initial translations and adaptations of the text libraries using an iterative, principle-guided process. In Stage 2, we used mixed methods including focus groups and surveys with 109 Hispanic adults to evaluate the acceptability and cultural appropriateness of the program. Further, we used survey data to evaluate whether the Self-Determination Theory (SDT) factors used to develop HealthyYouTXT (autonomous motivation, controlled motivation, and amotivation) and Hispanic cultural beliefs about familism, fatalism, and destiny predict program interest and its perceived efficacy. Mixed-methods analyses revealed substantial interest in HealthyYouTXT, with most participants expressing strong interest in using it and viewing it as highly efficacious. Both cultural beliefs (i.e., beliefs in destiny and, for men, high familism) and SDT motivations (i.e., autonomy) predicted HealthyYouTXT evaluations, suggesting utility in emphasizing them in messages. Higher destiny beliefs predicted lower interest and perceived efficacy, suggesting they could impede program use. In Stage 3, we implemented the mixed-methods findings to generate a revised HealthyYouTXT en Español. The emergent linguistic principles and multi-stage, multi-method process can be applied beneficially in health communication adaptations. PMID:28248628
Single well tracer method to evaluate enhanced recovery
Sheely, Jr., Clyde Q.; Baldwin, Jr., David E.
1978-01-01
Data useful to evaluate the effectiveness of, or to design, an enhanced recovery process (a recovery process involving mobilizing and moving hydrocarbons through a hydrocarbon-bearing subterranean formation from an injection well to a production well by injecting a mobilizing fluid into the injection well) are obtained by a process which comprises, sequentially: determining hydrocarbon saturation in the formation in a volume near a well bore penetrating the formation, injecting sufficient mobilizing fluid to mobilize and move hydrocarbons from a volume in the formation near the well bore, and determining by the single well tracer method a hydrocarbon saturation profile in the volume from which hydrocarbons were moved. The single well tracer method employed is disclosed by U.S. Pat. No. 3,623,842. The process is useful to evaluate surfactant floods, water floods, polymer floods, CO2 floods, caustic floods, micellar floods, and the like in the reservoir in much less time and at greatly reduced cost compared to conventional multi-well pilot tests.
Multi-off-grid methods in multi-step integration of ordinary differential equations
NASA Technical Reports Server (NTRS)
Beaudet, P. R.
1974-01-01
Description of methods of solving first- and second-order systems of differential equations in which all derivatives are evaluated at off-grid locations in order to circumvent the Dahlquist stability limitation on the order of on-grid methods. The proposed multi-off-grid methods require off-grid state predictors for the evaluation of the n derivatives at each step. Progressing forward in time, the off-grid states are predicted using a linear combination of back on-grid state values and off-grid derivative evaluations. A comparison is made between the proposed multi-off-grid methods and the corresponding Adams and Cowell on-grid integration techniques in integrating systems of ordinary differential equations, showing a significant reduction in the error at larger step sizes in the case of the multi-off-grid integrator.
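Schematically, the off-grid approach can be written as a linear multistep update whose derivative evaluations sit at non-integer abscissae. The LaTeX below is a hedged, generic reconstruction; the paper's actual coefficients and orders are not reproduced:

    y_{n+1} = y_n + h \sum_{j=1}^{k} b_j \, f\!\left(t_n + c_j h,\ \hat{y}_{n,j}\right),
    \qquad
    \hat{y}_{n,j} = \sum_{i=0}^{m} \alpha_{ji}\, y_{n-i} + h \sum_{i=0}^{m} \beta_{ji}\, f_{n-i},

where the c_j are non-integer offsets (the off-grid locations), the \hat{y}_{n,j} are off-grid state predictors formed from back on-grid values y_{n-i} and derivative evaluations f_{n-i}, and the coefficients b_j, \alpha_{ji}, \beta_{ji} are chosen to maximize order while retaining stability.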
Error-Rate Estimation Based on Multi-Signal Flow Graph Model and Accelerated Radiation Tests
He, Wei; Wang, Yueke; Xing, Kefei; Deng, Wei; Zhang, Zelong
2016-01-01
A method of evaluating the single-event-effect soft-error vulnerability of space instruments before launch has been an active research topic in recent years. In this paper, a multi-signal flow graph model is introduced to analyze the fault diagnosis and mean time to failure (MTTF) of space instruments, and a model for the system functional error rate (SFER) is proposed. In addition, an experimental method and an accelerated radiation testing system for a signal processing platform based on a field programmable gate array (FPGA) are presented. Based on experimental results for different ions (O, Si, Cl, Ti) at the HI-13 Tandem Accelerator, the SFER of the signal processing platform is approximately 10−3 (errors/particle/cm2), while the MTTF is approximately 110.7 h. PMID:27583533
Fusion of multi-spectral and panchromatic images based on 2D-PWVD and SSIM
NASA Astrophysics Data System (ADS)
Tan, Dongjie; Liu, Yi; Hou, Ruonan; Xue, Bindang
2016-03-01
A combined method using the 2D pseudo Wigner-Ville distribution (2D-PWVD) and the structural similarity (SSIM) index is proposed for the fusion of low-resolution multi-spectral (MS) images and high-resolution panchromatic (PAN) images. First, the intensity component of the multi-spectral image is extracted with the generalized IHS (GIHS) transform. Then, the spectrum diagrams of the intensity component of the multi-spectral image and of the panchromatic image are obtained with the 2D-PWVD, and different fusion rules are designed for different frequency bands of the spectrum diagrams. The SSIM index is used to evaluate the high-frequency information of the spectrum diagrams so that the weights in the fusion process are assigned adaptively. After the new spectrum diagram is obtained according to the fusion rules, the final fused image is produced by the inverse 2D-PWVD and inverse GIHS transforms. Experimental results show that the proposed method can obtain high-quality fused images.
Progress in centralised ethics review processes: Implications for multi-site health evaluations.
Prosser, Brenton; Davey, Rachel; Gibson, Diane
2015-04-01
Increasingly, public sector programmes respond to complex social problems that intersect specific fields and individual disciplines. Such responses result in multi-site initiatives that can span nations, jurisdictions, sectors and organisations. The rigorous evaluation of public sector programmes is now a baseline expectation, and for evaluations of large and complex multi-site programme initiatives the processes of ethics review can present a significant challenge. In recent years, however, there have been new developments in centralised ethics review processes in many nations. This paper provides a case study of an evaluation of a national, inter-jurisdictional, cross-sector, aged care health initiative and its encounters with Australian centralised ethics review processes. Specifically, the paper considers progress against the key themes of a previous five-year, five-nation study (Fitzgerald and Phillips, 2006), which found that centralised ethics review processes would save time, money and effort, as well as contribute to more equitable workloads for researchers and evaluators. The paper concludes with insights for those charged with refining centralised ethics review processes, as well as recommendations for future evaluators of complex multi-site programme initiatives. Copyright © 2015 Elsevier Ltd. All rights reserved.
Data Processing And Machine Learning Methods For Multi-Modal Operator State Classification Systems
NASA Technical Reports Server (NTRS)
Hearn, Tristan A.
2015-01-01
This document is intended as an introduction to a set of common signal processing and machine learning methods that may be used in the software portion of a functional crew state monitoring system. It includes overviews of both the theory of the methods involved and examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also given for all of the discussed processing and classification methods.
Jabeen, Sumera
2018-06-01
Social development programmes are deliberate attempts to bring about change, and unintended outcomes can be considered inherent to any such intervention. There is now a solid consensus among the international evaluation community regarding the need to consider unintended outcomes as a key aspect of any evaluative study. However, this concern often amounts to little more than false piety: existing evaluation theory suffers from overlapping terminology, inadequate categorisation of unintended outcomes and a lack of guidance on how to study them. To advance evaluation theory, methods and practice, the author developed an evaluation approach to study unintended effects using a theory building, testing and refinement process. A comprehensive classification of unintended outcomes on the basis of knowability, value, distribution and temporality helped specify the types of unintended outcomes relevant to programme evaluation. Corresponding to this classification, a three-step evaluation process was proposed: (a) outlining programme intentions, (b) forecasting likely unintended effects, and (c) mapping the anticipated, and understanding the unanticipated, unintended outcomes. This unintended outcomes evaluation approach (UOEA) was then trialled in a multi-site, multi-method case study of a poverty alleviation programme in Pakistan, and refinements were made to the approach. The case study revealed that the programme was producing a number of unintended effects, mostly negative, affecting those already disadvantaged, such as the poorest, women and children. The trialling process demonstrated the effectiveness of the UOEA and suggests that it can serve as a useful guide for future evaluation practice. It also provides the discipline of evaluation with an empirically based reference point for further theoretical development in the study of unintended outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Metric Evaluation Pipeline for 3d Modeling of Urban Scenes
NASA Astrophysics Data System (ADS)
Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.
2017-05-01
Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state-of-the-art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high-resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger-scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline, developed as publicly available open source software, to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced with open source multi-view stereo methods. Data and software are made publicly available to enable further research and planned benchmarking activities.
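A minimal sketch of point-cloud completeness and correctness under a distance tolerance (Python with SciPy; the point sets and tolerance are hypothetical, and the pipeline's volumetric and perceptual metrics are not reproduced):

    import numpy as np
    from scipy.spatial import cKDTree

    def completeness_correctness(model_pts, truth_pts, tol=1.0):
        """Completeness: fraction of ground-truth points with a model
        point within tol. Correctness: fraction of model points with a
        ground-truth point within tol. Units follow the input data."""
        d_truth, _ = cKDTree(model_pts).query(truth_pts)
        d_model, _ = cKDTree(truth_pts).query(model_pts)
        return (d_truth <= tol).mean(), (d_model <= tol).mean()

    # Hypothetical 3D points (metres)
    model = np.random.rand(1000, 3) * 100
    truth = np.random.rand(1200, 3) * 100
    print(completeness_correctness(model, truth, tol=2.5))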
Multi-focus image fusion based on window empirical mode decomposition
NASA Astrophysics Data System (ADS)
Qin, Xinqiang; Zheng, Jiaoyue; Hu, Gang; Wang, Jiao
2017-09-01
In order to improve multi-focus image fusion quality, a novel fusion algorithm based on window empirical mode decomposition (WEMD) is proposed. WEMD is an improved form of bidimensional empirical mode decomposition (BEMD) whose decomposition process uses an adding-window principle, effectively resolving the signal concealment problem. We used WEMD for multi-focus image fusion and formulated different fusion rules for the bidimensional intrinsic mode function (BIMF) components and the residue component. For fusion of the BIMF components, the Sum-modified-Laplacian was used and a scheme based on visual feature contrast was adopted; when choosing the residue coefficients, pixel values were selected based on local visibility. We carried out four groups of multi-focus image fusion experiments and compared the proposed method against three other fusion methods using objective evaluation criteria. The experimental results show that the proposed fusion approach is effective and fuses multi-focus images better than some traditional methods.
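A minimal sketch of the Sum-modified-Laplacian focus measure with a winner-take-all fusion rule (Python with SciPy; the source images are hypothetical, and the published method applies this to BIMF coefficients rather than raw pixels):

    import numpy as np
    from scipy.ndimage import uniform_filter

    def sum_modified_laplacian(img, window=3):
        """Sum-modified-Laplacian focus measure: larger values indicate
        sharper (in-focus) regions. np.roll wraps at the borders, which
        is acceptable for this sketch."""
        ml = np.abs(2 * img - np.roll(img, 1, 0) - np.roll(img, -1, 0)) \
           + np.abs(2 * img - np.roll(img, 1, 1) - np.roll(img, -1, 1))
        return uniform_filter(ml, size=window)  # windowed sum, up to scale

    # Hypothetical source images; keep the coefficient from the sharper one
    a, b = np.random.rand(64, 64), np.random.rand(64, 64)
    fused = np.where(sum_modified_laplacian(a) >= sum_modified_laplacian(b),
                     a, b)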
A Novel Health Evaluation Strategy for Multifunctional Self-Validating Sensors
Shen, Zhengguang; Wang, Qi
2013-01-01
The performance evaluation of sensors is very important in actual applications. In this paper, a theory based on multi-variable information fusion is studied to evaluate the health level of multifunctional sensors. A novel concept of health reliability degree (HRD) is defined to indicate a quantitative health level, in contrast to traditional, qualitative fault diagnosis. To evaluate the health condition from both local and global perspectives, the HRD of a single sensitive component at multiple time points and the HRD of the overall multifunctional sensor at a single time point are defined, respectively. The HRD methodology uses multi-variable data fusion technology coupled with a grey comprehensive evaluation method; the information entropy and the analytic hierarchy process are used, respectively, to acquire the distinct importance of each sensitive unit and the sensitivity of different time points. In order to verify the feasibility of the proposed strategy, a health evaluation experimental system for multifunctional self-validating sensors was designed, and five different health-level situations were discussed. The results show that the proposed method is feasible, that the HRD can quantitatively indicate the health level, and that it responds quickly to performance changes of multifunctional sensors. PMID:23291576
Multi-objective optimization for generating a weighted multi-model ensemble
NASA Astrophysics Data System (ADS)
Lee, H.
2017-12-01
Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide reliable future projections.
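A minimal sketch of the single-metric baseline described in the passage, with weights inversely proportional to model error (Python; the RMSEs and simulations are hypothetical, and the study's multi-objective trade-off weighting is not reproduced):

    import numpy as np

    # Hypothetical RMSEs for five models against observations; weights
    # are inversely proportional to error, then normalized to sum to 1.
    rmse = np.array([1.2, 0.8, 1.0, 1.5, 0.9])
    w = (1.0 / rmse) / (1.0 / rmse).sum()

    sims = np.random.rand(5, 120)        # 5 models x 120 time steps
    ensemble = w @ sims                  # weighted multi-model ensemble
    print(w, ensemble.shape)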
Wilson, Annabelle M; Magarey, Anthea M; Dollman, James; Jones, Michelle; Mastersson, Nadia
2010-08-01
To describe the rationale, development and implementation of the quantitative component of the evaluation of a multi-setting, multi-strategy, community-based childhood obesity prevention project (the eat well be active (ewba) Community Programs), together with the challenges associated with this process and some potential solutions. ewba has a quasi-experimental design with intervention and comparison communities. Baseline data were collected in 2006 and post-intervention measures will be taken from a non-matched cohort in 2009. Schoolchildren aged 10-12 years were chosen as one litmus group for evaluation purposes. Thirty-nine primary schools in two metropolitan and two rural communities in South Australia took part. A total of 1732 10-12-year-old school students completed a nutrition and/or a physical activity questionnaire and 1637 had anthropometric measures taken; 983 parents, 286 teachers, thirty-six principals, twenty-six canteen workers and thirteen out-of-school-hours care (OSHC) workers completed Program-specific questionnaires developed for each of these target groups. The overall child response rate for the study was 49%. Sixty-five per cent, 43%, 90%, 90% and 68% of parents, teachers, principals, canteen workers and OSHC workers, respectively, completed and returned questionnaires. A number of practical, logistical and methodological challenges were experienced when undertaking this data collection. Lessons from the process of quantitative baseline data collection for the ewba Community Programs can provide insights for other researchers planning similar studies with similar methods, particularly those evaluating multi-strategy programmes across multiple settings.
Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis
NASA Astrophysics Data System (ADS)
Karlsson, Caroline S. J.; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W.
2017-11-01
Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Determining natural-hazard-susceptible areas and incorporating them in the initial planning process may therefore reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in inundation susceptibility between the two was around 4%. The results also showed that downscaling can negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning despite its limitations regarding the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis can be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
Kim, Ju-Won; Park, Seunghee
2018-01-02
In this study, a magnetic flux leakage (MFL) method, known to be a suitable non-destructive evaluation (NDE) method for continuum ferromagnetic structures, was used to detect local damage when inspecting steel wire ropes. To demonstrate the proposed damage detection method through experiments, a multi-channel MFL sensor head was fabricated using a Hall sensor array and magnetic yokes adapted to the wire rope. To prepare the damaged wire-rope specimens, several different amounts of artificial damage were inflicted on wire ropes. The MFL sensor head was used to scan the damaged specimens and measure the magnetic flux signals. After obtaining the signals, a series of signal processing steps, including an enveloping process based on the Hilbert transform (HT), was performed to better recognize the MFL signals by reducing unexpected noise. The enveloped signals were then analyzed for objective damage detection by comparing them with a threshold established from the generalized extreme value (GEV) distribution. The MFL signals that exceeded the threshold were analyzed quantitatively by extracting magnetic features from them. To improve the quantitative analysis, damage indexes based on the relationship between the enveloped MFL signal and the threshold value were utilized alongside a general damage index for the MFL method, and the detected MFL signals for each damage type were quantified using both. Finally, an artificial neural network (ANN)-based multi-stage pattern recognition method using the extracted multi-scale damage indexes was implemented to automatically estimate the severity of the damage. To analyze the reliability of the MFL-based automated wire rope NDE method, its accuracy and reliability were evaluated by comparing the repeatedly estimated damage sizes with the actual damage sizes.
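A minimal sketch of the enveloping and GEV-based thresholding steps (Python with SciPy; the signal is synthetic, and the block size and percentile are hypothetical rather than the study's parameters):

    import numpy as np
    from scipy.signal import hilbert
    from scipy.stats import genextreme

    # Hypothetical MFL channel: baseline noise plus a local leakage pulse
    x = 0.1 * np.random.randn(5000)
    x[2500:2520] += 1.0

    envelope = np.abs(hilbert(x))        # Hilbert-transform envelope

    # Fit a GEV distribution to block maxima of damage-free data and set
    # the detection threshold at a high percentile (here 99.9th).
    blocks = envelope[:2000].reshape(40, 50).max(axis=1)
    c, loc, scale = genextreme.fit(blocks)
    threshold = genextreme.ppf(0.999, c, loc=loc, scale=scale)
    detections = np.where(envelope > threshold)[0]
    print(threshold, detections[:5])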
Protein (multi-)location prediction: using location inter-dependencies in a probabilistic framework
Simha, Ramanuja; Shatkay, Hagit
2014-01-01
Motivation: Knowing the location of a protein within the cell is important for understanding its function, role in biological processes, and potential use as a drug target. Much progress has been made in developing computational methods that predict single locations for proteins. Most such methods are based on the over-simplifying assumption that proteins localize to a single location. However, it has been shown that proteins localize to multiple locations. While a few recent systems attempt to predict multiple locations of proteins, their performance leaves much room for improvement. Moreover, they typically treat locations as independent and do not attempt to utilize possible inter-dependencies among locations. Our hypothesis is that directly incorporating inter-dependencies among locations into both the classifier-learning and the prediction process can improve location prediction performance. Results: We present a new method and a preliminary system we have developed that directly incorporates inter-dependencies among locations into the location-prediction process of multiply-localized proteins. Our method is based on a collection of Bayesian network classifiers, where each classifier is used to predict a single location. Learning the structure of each Bayesian network classifier takes into account inter-dependencies among locations, and the prediction process uses estimates involving multiple locations. We evaluate our system on a dataset of single- and multi-localized proteins (the most comprehensive protein multi-localization dataset currently available, derived from the DBMLoc dataset). Our results, obtained by incorporating inter-dependencies, are significantly higher than those obtained by classifiers that do not use inter-dependencies. The performance of our system on multi-localized proteins is comparable to a top performing system (YLoc+), without being restricted only to location-combinations present in the training set. PMID:24646119
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haas, J.C.; Olivo, C.A.; Wilson, K.B.
1994-04-01
An experimental test plan has been prepared for DOE/METC review and approval to develop a filter media suitable for multi-contaminant control in granular-bed filter (GBF) applications. The plan includes identification, development, and demonstration of methods for enhanced media morphology, chemical reactivity, and mechanical strength. The test plan includes media preparation methods, physical and chemical characterization methods for fresh and reacted media, media evaluation criteria, details of test and analytical equipment, and the test matrix of the proposed media testing. A filter media composed of agglomerated limestone and clay was determined to be the best candidate for multi-contaminant control in GBF operation. The combined limestone/clay agglomerate has the potential to remove sulfur and alkali species, in addition to particulate, and possibly halogens and trace heavy metals from coal process streams.
Multi-projector auto-calibration and placement optimization for non-planar surfaces
NASA Astrophysics Data System (ADS)
Li, Dong; Xie, Jinghui; Zhao, Lu; Zhou, Lijing; Weng, Dongdong
2015-10-01
Non-planar projection has been widely applied in virtual reality and digital entertainment and exhibitions because of its flexible layout and immersive display effects. Compared with planar projection, a non-planar projection is more difficult to achieve because projector calibration and image distortion correction are difficult processes. This paper uses a cylindrical screen as an example to present a new method for automatically calibrating a multi-projector system in a non-planar environment without using 3D reconstruction. This method corrects the geometric calibration error caused by the screen's manufactured imperfections, such as an undulating surface or a slant in the vertical plane. In addition, based on actual projection demand, this paper presents overall performance evaluation criteria for the multi-projector system. According to these criteria, we determined the optimal placement for the projectors. The method also extends to surfaces that can be parameterized, such as spheres, ellipsoids, and paraboloids, demonstrating broad applicability.
Xie, Qiuju; Ni, Ji-Qin; Su, Zhongbin
2017-10-15
In confined swine buildings, temperature, humidity, and air quality are all important for animal health and productivity. However, current swine building environmental control is based only on temperature; evaluation and control methods based on multiple environmental factors are needed. In this paper, fuzzy comprehensive evaluation (FCE) theory was adopted for multi-factor assessment of environmental quality in two commercial swine buildings using real measurement data. An assessment index system and membership functions were established, and predetermined weights were given using the analytic hierarchy process (AHP) combined with expert knowledge. The results show that multiple factors such as temperature, humidity, and concentrations of ammonia (NH3), carbon dioxide (CO2), and hydrogen sulfide (H2S) can be successfully integrated in FCE for swine building environment assessment. The FCE method has a high correlation coefficient of 0.737 compared with the method of single-factor evaluation (SFE). The FCE method can significantly increase sensitivity and perform an effective and integrative assessment. It can be used as part of environmental control and warning systems for swine building environment management to improve swine production and welfare. Copyright © 2017 Elsevier B.V. All rights reserved.
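The FCE aggregation step lends itself to a compact illustration. The following is a minimal numpy sketch assuming invented AHP weights and membership values (none of the numbers are from the study): grade scores are the weighted combination of per-factor memberships.

```python
# Sketch: fuzzy comprehensive evaluation (FCE) with AHP-style weights.
import numpy as np

w = np.array([0.30, 0.20, 0.25, 0.15, 0.10])   # temperature, humidity, NH3, CO2, H2S
R = np.array([                                 # memberships in 3 grades: good/fair/poor
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.2, 0.5, 0.3],
    [0.7, 0.2, 0.1],
    [0.4, 0.4, 0.2],
])
b = w @ R                                      # weighted-average fuzzy operator M(*, +)
print(b, "->", ["good", "fair", "poor"][int(np.argmax(b))])
```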
Yang, Yong; Tong, Song; Huang, Shuying; Lin, Pan
2014-01-01
This paper presents a novel framework for the fusion of multi-focus images explicitly designed for visual sensor network (VSN) environments. Multi-scale fusion methods can often obtain fused images with good visual effect. However, because of the defects of the fusion rules, it is almost impossible to completely avoid the loss of useful information in the resulting fused images. The proposed fusion scheme can be divided into two processes: initial fusion and final fusion. The initial fusion is based on a dual-tree complex wavelet transform (DTCWT). The Sum-Modified-Laplacian (SML)-based visual contrast and SML are employed to fuse the low- and high-frequency coefficients, respectively, and an initial composite image is obtained. In the final fusion process, the image block residuals technique and consistency verification are used to detect the focused areas, yielding a decision map that guides construction of the final fused image. The performance of the proposed method was extensively tested on a number of multi-focus images, including no-reference images, reference images, and images with different noise levels. The experimental results clearly indicate that the proposed method outperformed various state-of-the-art fusion methods, in terms of both subjective and objective evaluations, and is more suitable for VSNs. PMID:25587878
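The SML focus measure at the core of the fusion rule is straightforward to compute. Here is a minimal sketch of SML on a 2-D grayscale array; the DTCWT stage and consistency verification of the paper are omitted, and edge handling via wrap-around is a simplification.

```python
# Sketch: Sum-Modified-Laplacian (SML) focus measure.
import numpy as np
from scipy.ndimage import uniform_filter

def sml(img, window=5):
    # Modified Laplacian: |2I - I(x-1) - I(x+1)| + |2I - I(y-1) - I(y+1)|
    ml = (np.abs(2 * img - np.roll(img, 1, axis=1) - np.roll(img, -1, axis=1))
          + np.abs(2 * img - np.roll(img, 1, axis=0) - np.roll(img, -1, axis=0)))
    # Local mean over the window equals the local sum up to a constant factor.
    return uniform_filter(ml, size=window)

# In a fusion rule, the source image with the larger SML at a pixel
# is treated as the in-focus one.
```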
Multi-objective decision-making under uncertainty: Fuzzy logic methods
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Selecting the best option among alternatives is often a difficult process. This process becomes even more difficult when the evaluation criteria are vague or qualitative, and when the objectives vary in importance and scope. Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process, and describes software developed at NASA Lewis Research Center for this purpose. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
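To make the idea of quantifying vague ratings concrete, here is a small illustrative sketch: qualitative ratings are mapped to membership values and alternatives are scored by a weighted aggregate. The ratings, criteria, and weights are invented for illustration and are not taken from the paper.

```python
# Sketch: fuzzy scoring of alternatives from qualitative ratings.
membership = {"poor": 0.2, "fair": 0.5, "good": 0.8, "excellent": 1.0}
weights = {"cost": 0.4, "reliability": 0.35, "environmental impact": 0.25}

alternatives = {
    "option A": {"cost": "good", "reliability": "fair", "environmental impact": "excellent"},
    "option B": {"cost": "excellent", "reliability": "poor", "environmental impact": "good"},
}

for name, ratings in alternatives.items():
    score = sum(weights[c] * membership[r] for c, r in ratings.items())
    print(name, round(score, 3))
```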
Wavepacket dynamics and the multi-configurational time-dependent Hartree approach
NASA Astrophysics Data System (ADS)
Manthe, Uwe
2017-06-01
Multi-configurational time-dependent Hartree (MCTDH) based approaches are efficient, accurate, and versatile methods for high-dimensional quantum dynamics simulations. Applications range from detailed investigations of polyatomic reaction processes in the gas phase to high-dimensional simulations studying the dynamics of condensed phase systems described by typical solid state physics model Hamiltonians. The present article presents an overview of the different areas of application and provides a comprehensive review of the underlying theory. The concepts and guiding ideas underlying the MCTDH approach and its multi-mode and multi-layer extensions are discussed in detail. The general structure of the equations of motion is highlighted. The representation of the Hamiltonian and the correlated discrete variable representation (CDVR), which provides an efficient multi-dimensional quadrature in MCTDH calculations, are discussed. Methods which facilitate the calculation of eigenstates, the evaluation of correlation functions, and the efficient representation of thermal ensembles in MCTDH calculations are described. Different schemes for the treatment of indistinguishable particles in MCTDH calculations and recent developments towards a unified multi-layer MCTDH theory for systems including bosons and fermions are discussed.
Multi-objective optimization of chromatographic rare earth element separation.
Knutson, Hans-Kristian; Holmqvist, Anders; Nilsson, Bernt
2015-10-16
The importance of rare earth elements in modern technological industry grows, and as a result the interest in developing separation processes increases. This work is part of developing chromatography as a rare earth element processing method. Process optimization is an important step in process development, and there are several competing objectives that need to be considered in a chromatographic separation process. Most studies are limited to evaluating the two competing objectives productivity and yield, and studies of scenarios with tri-objective optimizations are scarce. Tri-objective optimizations are much needed when evaluating the chromatographic separation of rare earth elements due to the importance of product pool concentration along with productivity and yield as process objectives. In this work, a multi-objective optimization strategy considering productivity, yield and pool concentration is proposed. This was carried out in the frame of a model-based optimization study on a batch chromatography separation of the rare earth elements samarium, europium and gadolinium. The findings from the multi-objective optimization were used to provide a general strategy for achieving desirable operation points, resulting in a productivity ranging between 0.61 and 0.75 kg Eu per m³ column per hour and a pool concentration between 0.52 and 0.79 kg Eu per m³, while maintaining a purity above 99% and never falling below an 80% yield for the main target component europium. Copyright © 2015 Elsevier B.V. All rights reserved.
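A tri-objective optimization like the one above yields a Pareto set rather than a single optimum. The following minimal sketch, with random placeholder data, extracts the non-dominated points from sampled operating points whose three objectives (productivity, yield, pool concentration) are all to be maximized.

```python
# Sketch: Pareto-front extraction for three maximized objectives.
import numpy as np

def pareto_front(points):
    """Return a boolean mask of non-dominated rows (all objectives maximized)."""
    n = points.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if mask[i]:
            # Rows dominated by point i: no better anywhere, strictly worse somewhere.
            dominated = (np.all(points <= points[i], axis=1)
                         & np.any(points < points[i], axis=1))
            mask[dominated] = False
    return mask

pts = np.random.default_rng(1).random((100, 3))  # placeholder operating points
print("Pareto-optimal points:", int(pareto_front(pts).sum()))
```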
NASA Astrophysics Data System (ADS)
Bascetin, A.
2007-04-01
The selection of an optimal reclamation method is one of the most important factors in open-pit design and production planning. It also affects economic considerations in open-pit design as a function of plan location and depth. Furthermore, the selection is a complex multi-person, multi-criteria decision problem. The group decision-making process can be improved by applying a systematic and logical approach to assess the priorities based on the inputs of several specialists from different functional areas within the mine company. The analytical hierarchy process (AHP) can be very useful in involving several decision makers with different conflicting objectives to arrive at a consensus decision. In this paper, the selection of an optimal reclamation method using an AHP-based model was evaluated for coal production in an open-pit coal mine located at Seyitomer region in Turkey. The use of the proposed model indicates that it can be applied to improve group decision making in selecting a reclamation method that satisfies optimal specifications. Also, it is found that the decision process is systematic and that using the proposed model can reduce the time taken to select an optimal method.
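The AHP weighting step referenced above reduces a pairwise comparison matrix to priority weights via its principal eigenvector, with a consistency check. A minimal sketch follows; the comparison values are illustrative, not from the study.

```python
# Sketch: AHP priority weights and consistency ratio from a pairwise matrix.
import numpy as np

A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]], dtype=float)

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                                   # priority weights

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)              # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```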
Multi-label literature classification based on the Gene Ontology graph.
Jin, Bo; Muller, Brian; Zhai, Chengxiang; Lu, Xinghua
2008-12-08
The Gene Ontology is a controlled vocabulary for representing knowledge related to genes and proteins in a computable form. The current effort of manually annotating proteins with the Gene Ontology is outpaced by the rate of accumulation of biomedical knowledge in literature, which urges the development of text mining approaches to facilitate the process by automatically extracting the Gene Ontology annotation from literature. The task is usually cast as a text classification problem, and contemporary methods are confronted with unbalanced training data and the difficulties associated with multi-label classification. In this research, we investigated the methods of enhancing automatic multi-label classification of biomedical literature by utilizing the structure of the Gene Ontology graph. We have studied three graph-based multi-label classification algorithms, including a novel stochastic algorithm and two top-down hierarchical classification methods for multi-label literature classification. We systematically evaluated and compared these graph-based classification algorithms to a conventional flat multi-label algorithm. The results indicate that, through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods can significantly improve predictions of the Gene Ontology terms implied by the analyzed text. Furthermore, the graph-based multi-label classifiers are capable of suggesting Gene Ontology annotations (to curators) that are closely related to the true annotations even if they fail to predict the true ones directly. A software package implementing the studied algorithms is available for the research community. Through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods have better potential than the conventional flat multi-label classification approach to facilitate protein annotation based on the literature.
A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems
NASA Astrophysics Data System (ADS)
Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.
2017-05-01
Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings in both cost and hours of work. This paper proposes a research study for the determination of a hierarchy between three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria in terms of the performance obtained through the implementation of the same web business application. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software, which is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the technology hierarchy.
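The TOPSIS ranking stage of such an AHP+TOPSIS pipeline is short enough to sketch. In the minimal version below, the decision matrix and weights are placeholders, and all criteria are treated as benefit criteria (higher is better).

```python
# Sketch: TOPSIS ranking with externally supplied (e.g. AHP) weights.
import numpy as np

X = np.array([[7.0, 9.0, 9.0],      # rows: alternatives, columns: criteria
              [8.0, 7.0, 8.0],
              [9.0, 6.0, 8.0]])
w = np.array([0.5, 0.3, 0.2])       # weights, e.g. from an AHP step

V = w * X / np.linalg.norm(X, axis=0)        # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)   # ideal and anti-ideal solutions
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)          # relative closeness to the ideal
print("ranking (best first):", np.argsort(-closeness))
```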
Hallock, Michael J.; Stone, John E.; Roberts, Elijah; Fry, Corey; Luthey-Schulten, Zaida
2014-01-01
Simulation of in vivo cellular processes with the reaction-diffusion master equation (RDME) is a computationally expensive task. Our previous software enabled simulation of inhomogeneous biochemical systems for small bacteria over long time scales using the MPD-RDME method on a single GPU. Simulations of larger eukaryotic systems exceed the on-board memory capacity of individual GPUs, and long time simulations of modest-sized cells such as yeast are impractical on a single GPU. We present a new multi-GPU parallel implementation of the MPD-RDME method based on a spatial decomposition approach that supports dynamic load balancing for workstations containing GPUs of varying performance and memory capacity. We take advantage of high-performance features of CUDA for peer-to-peer GPU memory transfers and evaluate the performance of our algorithms on state-of-the-art GPU devices. We present parallel efficiency and performance results for simulations using multiple GPUs as system size, particle counts, and number of reactions grow. We also demonstrate multi-GPU performance in simulations of the Min protein system in E. coli. Moreover, our multi-GPU decomposition and load balancing approach can be generalized to other lattice-based problems. PMID:24882911
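The load-balancing idea behind the spatial decomposition can be illustrated with a toy partitioner: slabs of a 3-D lattice are assigned to GPUs in proportion to a per-device performance weight. Device names and weights below are hypothetical, and the MPD-RDME kernels themselves are not shown.

```python
# Sketch: weight-proportional slab decomposition of a lattice across GPUs.
def partition_lattice(nz, weights):
    """Split nz lattice planes into contiguous slabs proportional to weights."""
    total = sum(weights.values())
    slabs, start = {}, 0
    for i, (dev, w) in enumerate(weights.items()):
        # Last device absorbs rounding remainder so all planes are covered.
        size = round(nz * w / total) if i < len(weights) - 1 else nz - start
        slabs[dev] = (start, start + size)
        start += size
    return slabs

print(partition_lattice(256, {"gpu0": 1.0, "gpu1": 0.5}))
# e.g. {'gpu0': (0, 171), 'gpu1': (171, 256)}
```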
Composite Characterization Using Laser Doppler Vibrometry and Multi-Frequency Wavenumber Analysis
NASA Technical Reports Server (NTRS)
Juarez, Peter; Leckey, Cara
2015-01-01
NASA has recognized the need for better characterization of composite materials to support advances in aeronautics and the next generation of space exploration vehicles. An area of related research is the evaluation of impact-induced delaminations. Presented is a non-contact method of measuring the ply depth of impact delamination damage in a composite through use of a Scanning Laser Doppler Vibrometer (SLDV), multi-frequency wavenumber analysis, and a wavenumber-ply correlation algorithm. A single acquisition of a chirp-excited Lamb wavefield in an impacted composite is post-processed into numerous single-frequency wavefields through a deconvolution process. A spatially windowed wavenumber analysis then extracts local wavenumbers from the wavefield, which are correlated to theoretical dispersion curves for ply depth determination. SLDV-based methods to characterize as-manufactured composite variation using wavefield analysis will also be discussed.
A fuzzy MCDM approach for evaluating school performance based on linguistic information
NASA Astrophysics Data System (ADS)
Musani, Suhaina; Jemain, Abdul Aziz
2013-11-01
Decision making is the process of finding the best option among the feasible alternatives. This process should consider a variety of criteria, but this study focuses only on academic achievement. The data used are the percentages of candidates who obtained the Malaysian Certificate of Education (SPM) in Melaka, based on school academic achievement for each subject. 57 secondary schools in Melaka, as listed by the Ministry of Education, were involved in this study, so school ranking can be done using MCDM (Multi Criteria Decision Making) methods. The objective of this study is to develop a rational method for evaluating school performance based on linguistic information. Since the information on academic achievement is provided in a linguistic manner, it may be incomplete or uncertain. To overcome this, the information can be represented as fuzzy numbers, since fuzzy sets capture the uncertainty in human perceptions. In this research, VIKOR (Multi Criteria Optimization and Compromise Solution) has been used as an MCDM tool for the school ranking process in a fuzzy environment. Results showed that fuzzy set theory can address the limitations of MCDM when uncertainty exists in the data.
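For readers unfamiliar with VIKOR, a crisp (non-fuzzy) version of its S, R and Q indices is sketched below; the paper applies a fuzzy variant, and the scores and weights here are invented placeholders for benefit criteria.

```python
# Sketch: crisp VIKOR compromise ranking.
import numpy as np

X = np.array([[80.0, 75.0, 90.0],   # rows: schools, columns: criteria (benefit)
              [85.0, 70.0, 80.0],
              [78.0, 88.0, 72.0]])
w = np.array([0.4, 0.35, 0.25])

f_best, f_worst = X.max(axis=0), X.min(axis=0)
D = w * (f_best - X) / (f_best - f_worst)   # weighted normalized regrets
S, R = D.sum(axis=1), D.max(axis=1)         # group utility and individual regret
v = 0.5                                     # weight of the "majority rule" strategy
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))
print("ranking (best first):", np.argsort(Q))  # lower Q is better
```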
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Eric C; Smith, Raymond; Ruiz-Mercado, Gerardo
This presentation examines different methods for analyzing manufacturing processes in the early stages of technical readiness. Before developers know much detail about their processes, it is valuable to apply various assessments to evaluate their performance. One type of assessment evaluates performance indicators to describe how closely processes approach desirable objectives. Another type of assessment determines the life cycle inventories (LCI) of inputs and outputs for processes, where for a functional unit of product, the user evaluates the resources used and the releases to the environment. These results can be compared to similar processes or combined with the LCI of other processes to examine up- and down-stream chemicals. The inventory also provides a listing of the up-stream chemicals, which permits study of the whole life cycle. Performance indicators are evaluated in this presentation with the U.S. Environmental Protection Agency's GREENSCOPE (Gauging Reaction Effectiveness for ENvironmental Sustainability with a multi-Objective Process Evaluator) methodology, which evaluates processes in four areas: Environment, Energy, Economics, and Efficiency. The method develops relative scores for indicators that allow comparisons across various technologies. In this contribution, two conversion pathways for producing cellulosic ethanol from biomass, via thermochemical and biochemical routes, are studied. The information developed from the indicators and LCI can be used to inform the process design and the potential life cycle effects of up- and down-stream chemicals.
Sach, Tracey H; Desborough, James; Houghton, Julie; Holland, Richard
2014-11-06
Economic methods are underutilised within pharmacy research resulting in a lack of quality evidence to support funding decisions for pharmacy interventions. The aim of this study is to illustrate the methods of micro-costing within the pharmacy context in order to raise awareness and use of this approach in pharmacy research. Micro-costing methods are particularly useful where a new service or intervention is being evaluated and for which no previous estimates of the costs of providing the service exist. This paper describes the rationale for undertaking a micro-costing study before detailing and illustrating the process involved. The illustration relates to a recently completed trial of multi-professional medication reviews as an intervention provided in care homes. All costs are presented in UK£2012. In general, costing methods involve three broad steps (identification, measurement and valuation); when using micro-costing, closer attention to detail is required within all three stages of this process. The mean (standard deviation; 95% confidence interval (CI)) cost per resident of the multi-professional medication review intervention was £104.80 (50.91; 98.72 to 109.45), such that the overall cost of providing the intervention to all intervention home residents was £36,221.29 (95% CI, £32,810.81 to £39,631.77). This study has demonstrated that micro-costing can be a useful method, not only for estimating the cost of a pharmacy intervention to feed into a pharmacy economic evaluation, but also as a source of information to help inform those designing pharmacy services about the potential time and costs involved in delivering such services. © 2014 Royal Pharmaceutical Society.
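The valuation step of micro-costing (resource use times unit costs, then a mean with a confidence interval) can be sketched in a few lines. All quantities below, including staff types, unit costs, and resource-use distributions, are invented for illustration and do not reproduce the trial's data.

```python
# Sketch: micro-costing valuation with a normal-approximation 95% CI.
import numpy as np

rng = np.random.default_rng(2)
pharmacist_hours = rng.normal(1.5, 0.4, size=340).clip(min=0)  # per-resident use
gp_hours = rng.normal(0.5, 0.2, size=340).clip(min=0)
unit_cost = {"pharmacist": 45.0, "gp": 120.0}                  # GBP/hour (illustrative)

cost = pharmacist_hours * unit_cost["pharmacist"] + gp_hours * unit_cost["gp"]
mean, se = cost.mean(), cost.std(ddof=1) / np.sqrt(cost.size)
print(f"mean £{mean:.2f} (95% CI £{mean - 1.96*se:.2f} to £{mean + 1.96*se:.2f})")
```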
Morell, Jonathan A
2018-06-01
This article argues that evaluators could better deal with unintended consequences if they improved their methods of systematically and methodically combining empirical data collection and model building over the life cycle of an evaluation. This process would be helpful because it can increase the timespan from when the need for a change in methodology is first suspected to the time when the new element of the methodology is operational. The article begins with an explanation of why logic models are so important in evaluation, and why the utility of models is limited if they are not continually revised based on empirical evaluation data. It sets the argument within the larger context of the value and limitations of models in the scientific enterprise. A discussion follows of various issues relevant to model development and revision. What is the relevance of complex system behavior for understanding predictable and unpredictable unintended consequences, and the methods needed to deal with them? How might understanding of unintended consequences be improved with an appreciation of generic patterns of change that are independent of any particular program or change effort? What are the social and organizational dynamics that make it rational and adaptive to design programs around single-outcome solutions to multi-dimensional problems? How does cognitive bias affect our ability to identify likely program outcomes? Why is it hard to discern change as a result of programs being embedded in multi-component, continually fluctuating, settings? The last part of the paper outlines a process for actualizing systematic iteration between model and methodology, and concludes with a set of research questions that speak to how the model/data process can be made efficient and effective. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
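Of the informal methods listed above, GLUE is the simplest to sketch: parameter sets whose likelihood (here, Nash-Sutcliffe efficiency) exceeds a threshold are kept as "behavioral", and prediction bounds are formed from their simulations. The arrays below are stand-ins, the threshold choice is subjective, and unweighted percentiles are used for brevity where weighted quantiles are more common.

```python
# Sketch: GLUE-style behavioral selection and prediction bounds.
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as an informal likelihood."""
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(3)
obs = np.sin(np.linspace(0, 6, 100)) + 1.5                 # placeholder hydrograph
sims = obs + rng.normal(0, 0.3, size=(500, 100))           # stand-in model runs

L = np.array([nse(s, obs) for s in sims])
behavioral = sims[L > 0.5]                                 # subjective threshold
lower, upper = np.percentile(behavioral, [5, 95], axis=0)  # 90% prediction bounds
print("behavioral runs:", len(behavioral), "mean bound width:", (upper - lower).mean())
```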
Analyzing gene expression time-courses based on multi-resolution shape mixture model.
Li, Ying; He, Ye; Zhang, Yu
2016-11-01
Biological processes are dynamic molecular processes that unfold over time. Time-course gene expression experiments provide opportunities to explore patterns of gene expression change over time and to understand the dynamic behavior of gene expression, which is crucial for the study of the development and progression of biology and disease. Analysis of gene expression time-course profiles has not been fully exploited so far and remains a challenging problem. We propose a novel shape-based mixture model clustering method for gene expression time-course profiles to explore significant gene groups. Based on multi-resolution fractal features and a mixture clustering model, we propose a multi-resolution shape mixture model algorithm. The multi-resolution fractal features are computed by wavelet decomposition, which explores patterns of change over time in gene expression at different resolutions. Our proposed multi-resolution shape mixture model algorithm is a probabilistic framework which offers a more natural and robust way of clustering time-course gene expression. We assessed the performance of our proposed algorithm using yeast time-course gene expression profiles, compared with several popular clustering methods for gene expression profiles. The grouped genes identified by the different methods are evaluated by enrichment analysis of biological pathways and known protein-protein interactions from experimental evidence. The grouped genes identified by our proposed algorithm have stronger biological significance. A novel multi-resolution shape mixture model algorithm based on multi-resolution fractal features is proposed. Our proposed model provides new horizons and an alternative tool for visualization and analysis of time-course gene expression profiles. The R and Matlab programs are available upon request. Copyright © 2016 Elsevier Inc. All rights reserved.
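The multi-resolution feature extraction step can be illustrated with a discrete wavelet decomposition (PyWavelets). This minimal sketch computes per-level energy of a single time-course as a simple shape descriptor; the fractal-feature computation and mixture-model clustering of the paper are not reproduced.

```python
# Sketch: multi-resolution features of a gene expression time-course.
import numpy as np
import pywt

profile = np.array([0.1, 0.4, 0.9, 1.2, 1.0, 0.5, 0.2, 0.1])  # one gene, 8 time points
coeffs = pywt.wavedec(profile, "haar", level=2)  # approximation + 2 detail levels

# Energy per resolution level serves as a simple shape descriptor.
features = [float(np.sum(c ** 2)) for c in coeffs]
print(features)
```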
Pereira, Suzanne; Névéol, Aurélie; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J
2008-11-06
To assist with the development of a French online quality-controlled health gateway (CISMeF), an automatic indexing tool assigning MeSH descriptors to medical text in French was created. The French Multi-Terminology Indexer (F-MTI) relies on a multi-terminology approach involving four prominent medical terminologies and the mappings between them. In this paper, we compare lemmatization and stemming as methods to process French medical text for indexing. We also evaluate the multi-terminology approach implemented in F-MTI. The indexing strategies were assessed on a corpus of 18,814 resources indexed manually. There is little difference in the indexing performance when lemmatization or stemming is used. However, the multi-terminology approach outperforms indexing relying on a single terminology in terms of recall. F-MTI will soon be used in the CISMeF production environment and in a Health MultiTerminology Server in French.
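As a concrete example of the text-normalization choice the paper compares, the sketch below stems French tokens with NLTK's Snowball stemmer before terminology lookup; the F-MTI terminologies and mappings themselves are not reproduced, and the tokens are invented.

```python
# Sketch: stemming French medical text prior to dictionary matching.
from nltk.stem.snowball import SnowballStemmer

stemmer = SnowballStemmer("french")
tokens = ["infections", "pulmonaires", "chroniques"]
# Stems collapse inflected variants so they match terminology entries.
print([stemmer.stem(t) for t in tokens])
```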
Pfeifer, Mischa D; Scholkmann, Felix; Labruyère, Rob
2017-01-01
Even though research in the field of functional near-infrared spectroscopy (fNIRS) has been performed for more than 20 years, consensus on signal processing methods is still lacking. A significant knowledge gap exists between established researchers and those entering the field. One major issue regularly observed in publications from researchers new to the field is the failure to consider possible signal contamination by hemodynamic changes unrelated to neurovascular coupling (i.e., scalp blood flow and systemic blood flow). This might be due to the fact that these researchers use the signal processing methods provided by the manufacturers of their measurement device without an advanced understanding of the performed steps. The aim of the present study was to investigate how different signal processing approaches (including and excluding approaches that partially correct for the possible signal contamination) affect the results of a typical functional neuroimaging study performed with fNIRS. In particular, we evaluated one standard signal processing method provided by a commercial company and compared it to three customized approaches. We thereby investigated the influence of the chosen method on the statistical outcome of a clinical data set (task-evoked motor cortex activity). No short-channels were used in the present study and therefore two types of multi-channel corrections based on multiple long-channels were applied. The choice of the signal processing method had a considerable influence on the outcome of the study. While methods that ignored the contamination of the fNIRS signals by task-evoked physiological noise yielded several significant hemodynamic responses over the whole head, the statistical significance of these findings disappeared when accounting for part of the contamination using a multi-channel regression. We conclude that adopting signal processing methods that correct for physiological confounding effects might yield more realistic results in cases where multi-distance measurements are not possible. Furthermore, we recommend using manufacturers' standard signal processing methods only in case the user has an advanced understanding of every signal processing step performed.
Cole, Donald C; Levin, Carol; Loechl, Cornelia; Thiele, Graham; Grant, Frederick; Girard, Aimee Webb; Sindi, Kirimi; Low, Jan
2016-06-01
Multi-sectoral programs that involve stakeholders in agriculture, nutrition and health care are essential for responding to nutrition problems such as vitamin A deficiency among pregnant and lactating women and their infants in many poor areas of lower income countries. Yet planning such multi-sectoral programs and designing appropriate evaluations, to respond to different disciplinary cultures of evidence, remain a challenge. We describe the context, program development process, and evaluation design of the Mama SASHA project (Sweetpotato Action for Security and Health in Africa) which promoted production and consumption of a bio-fortified, orange-fleshed sweetpotato (OFSP). In planning the program we drew upon information from needs assessments, stakeholder consultations, and a first round of the implementation evaluation of a pilot project. The multi-disciplinary team worked with partner organizations to develop a program theory of change and an impact pathway which identified aspects of the program that would be monitored and established evaluation methods. Responding to the growing demand for greater rigour in impact evaluations, we carried out quasi-experimental allocation by health facility catchment area, repeat village surveys for assessment of change in intervention and control areas, and longitudinal tracking of individual mother-child pairs. Mid-course corrections in program implementation were informed by program monitoring, regular feedback from implementers and partners' meetings. To assess economic efficiency and provide evidence for scaling we collected data on resources used and project expenses. Managing the multi-sectoral program and the mixed methods evaluation involved bargaining and trade-offs that were deemed essential to respond to the array of stakeholders, program funders and disciplines involved. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Study on Mosaic and Uniform Color Method of Satellite Image Fusion in Large Area
NASA Astrophysics Data System (ADS)
Liu, S.; Li, H.; Wang, X.; Guo, L.; Wang, R.
2018-04-01
Owing to improved satellite radiometric resolution, color differences among multi-temporal satellite remote sensing images, and the large volume of satellite image data, mosaicking satellite images and making their color uniform remains an important problem in image processing. First, using GXL's bundle uniform color method, least-squares mosaic method, and dodging function, a uniform transition of color and brightness can be achieved across large-area, multi-temporal satellite images. Second, Color Mapping software is used to transfer color from 16-bit mosaic images to 8-bit mosaic images, based on a uniform color method with low-resolution reference images. Finally, qualitative and quantitative analytical methods are used to evaluate the satellite imagery after mosaicking and color balancing. Tests show that the correlation between mosaic images before and after coloring is higher than 95%, the image information entropy increases, and texture features are enhanced, as confirmed by quantitative indexes such as the correlation coefficient and information entropy. Satellite image mosaicking and color processing over large areas is thus well implemented.
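The two quantitative indexes cited above are simple to compute. This minimal sketch evaluates the correlation coefficient between an image before and after color adjustment, and the Shannon information entropy of an 8-bit image; the images are random placeholders.

```python
# Sketch: correlation coefficient and information entropy of 8-bit images.
import numpy as np

def correlation(a, b):
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def entropy(img):
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(4)
before = rng.integers(0, 256, (512, 512))
after = np.clip(before + rng.integers(-5, 6, before.shape), 0, 255)
print("corr:", round(correlation(before, after), 4),
      "entropy:", round(entropy(after), 3))
```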
Digital Radiography Qualification of Tube Welding
NASA Technical Reports Server (NTRS)
Carl, Chad
2012-01-01
The Orion Project will be directing Lockheed Martin to perform orbital arc welding on commodity metallic tubing as part of the Multi Purpose Crew Vehicle assembly and integration process in the Operations and Checkout High Bay at Kennedy Space Center. The current method of nondestructive evaluation uses traditional film-based x-rays. Due to the high number of welds necessary to join the commodity tubing (approx 470), a more efficient and expeditious method of nondestructive evaluation is desired. Digital radiography will be qualified as part of a broader NNWG project scope.
A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas
2014-12-01
Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having "very high susceptibility", with the further 31% falling into zones classified as having "high susceptibility".
Multi-criteria evaluation of sources for self-help domestic water supply
NASA Astrophysics Data System (ADS)
Nnaji, C. C.; Banigo, A.
2018-03-01
Two multi-criteria decision analysis methods were employed to evaluate six water sources. The analytical hierarchy process (AHP) ranked borehole highest with a rank of 0.321, followed by water board with a rank of 0.284. The other sources ranked far below these two, as follows: water tanker (0.139), stream (0.130), rainwater harvesting (0.117) and shallow well (0.114). The Technique for Order Preference by Similarity to the Ideal Solution (TOPSIS) ranked water board highest with a rank of 0.865, followed by borehole with a value of 0.778. Quality and risk of contamination were found to be the most influential criteria, while seasonality was the least.
Lin, Xiao; Chyi, Chin Wun; Ruan, Ke-feng; Feng, Yi; Heng, Paul Wan Sia
2011-10-01
This work aimed to explore the potential of lactose as novel cushioning agents with suitable physicomechanical properties by micronization and co-spray drying with polymers for protecting coated multi-particulates from rupture when they are compressed into tablets. Several commercially available lactose grades, micronized lactose (ML) produced by jet milling, spray-dried ML (SML), and polymer-co-processed SMLs, were evaluated for their material characteristics and tableting properties. Hydroxypropylcellulose (HPC), hydroxypropylmethylcellulose (HPMC), and polyvinylpyrrolidone (PVP) at three different levels were evaluated as co-processed polymers for spray drying. Sugar multi-particulates layered with chlorpheniramine maleate followed by an ethylcellulose coat were tableted using various lactose types as fillers. Drug release from compacted multi-particulate tablets was used to evaluate the cushioning effect of the fillers. The results showed that the cushioning effect of lactose principally depended on its particle size. Micronization can effectively enhance the protective action of lactose. Although spray drying led to a small reduction in the cushioning effect of ML, it significantly improved the physicomechanical properties of ML. Co-spray drying with suitable polymers improved both the cushioning effect and the physicomechanical properties of SML to a certain degree. Among the three polymers studied, HPC was the most effective in terms of enhancing the cushioning effect of SML. This was achieved by reducing yield pressure, and enhancing compressibility and compactibility. The combination of micronization and co-spray drying with polymers is a promising method with which new applications for lactose can be developed. Copyright © 2011 Elsevier B.V. All rights reserved.
A new state evaluation method of oil pump unit based on AHP and FCE
NASA Astrophysics Data System (ADS)
Lin, Yang; Liang, Wei; Qiu, Zeyang; Zhang, Meng; Lu, Wenqing
2017-05-01
In order to make an accurate state evaluation of an oil pump unit, a comprehensive evaluation index should be established. A multi-parameter state evaluation method for oil pump units is proposed in this paper. The oil pump unit is analyzed by Failure Mode and Effect Analysis (FMEA), so that an evaluation index can be obtained based on the FMEA conclusions. The weights of the different parameters in the evaluation index are discussed using the Analytic Hierarchy Process (AHP) combined with expert experience. According to the evaluation index and the weight of each parameter, the state evaluation is carried out by Fuzzy Comprehensive Evaluation (FCE), and the state is divided into five levels depending on the status value, an approach inspired by human health grading. In order to verify the effectiveness and feasibility of the proposed method, a state evaluation of an oil pump used in a pump station is taken as an example.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.
2014-12-17
GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.
Using a fuzzy DEMATEL method for analyzing the factors influencing subcontractors selection
NASA Astrophysics Data System (ADS)
Kozik, Renata
2016-06-01
Subcontracting is a long-standing practice in the construction industry. This form of project organization, if managed properly, can provide better quality and reductions in project time and costs. Subcontractor selection is a multi-criteria problem and can be determined by many factors. Identifying the importance of each of them, as well as the direction of cause-effect relations between the various types of factors, can improve the management process. Their values can be evaluated on the basis of available expert opinions with the application of a fuzzy multi-stage grading scale. In this paper it is recommended to use the fuzzy DEMATEL method to analyze the relationships between factors affecting subcontractor selection.
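The cause-effect analysis at the heart of DEMATEL reduces to a small matrix computation. Below is a minimal crisp sketch (the paper applies a fuzzy variant): a direct-influence matrix is normalized and the total-relation matrix T = N(I - N)^-1 yields each factor's prominence (r + c) and net relation (r - c). The influence scores are illustrative.

```python
# Sketch: crisp DEMATEL total-relation matrix and cause-effect indices.
import numpy as np

A = np.array([[0, 3, 2],
              [1, 0, 3],
              [2, 1, 0]], dtype=float)     # expert direct-influence scores (0-4)

N = A / A.sum(axis=1).max()                # normalize by the largest row sum
T = N @ np.linalg.inv(np.eye(3) - N)       # total-relation matrix

r, c = T.sum(axis=1), T.sum(axis=0)
print("prominence r+c:", (r + c).round(3))
print("relation   r-c:", (r - c).round(3))  # positive = net cause factor
```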
Multiview face detection based on position estimation over multicamera surveillance system
NASA Astrophysics Data System (ADS)
Huang, Ching-chun; Chou, Jay; Shiu, Jia-Hou; Wang, Sheng-Jyh
2012-02-01
In this paper, we propose a multi-view face detection system that locates head positions and indicates the direction of each face in 3-D space over a multi-camera surveillance system. To locate 3-D head positions, conventional methods relied on face detection in 2-D images and projected the face regions back to 3-D space for correspondence. However, inevitable false face detections and rejections usually degrade system performance. Instead, our system searches for heads and face directions over the 3-D space using a sliding cube. Each searched 3-D cube is projected onto the 2-D camera views to determine the existence and direction of human faces. Moreover, a pre-process to estimate the locations of candidate targets is introduced to speed up the search over the 3-D space. In summary, our proposed method can efficiently fuse multi-camera information and suppress the ambiguity caused by detection errors. Our evaluation shows that the proposed approach can efficiently indicate the head position and face direction in real video sequences even under serious occlusion.
Application fuzzy multi-attribute decision analysis method to prioritize project success criteria
NASA Astrophysics Data System (ADS)
Phong, Nguyen Thanh; Quyen, Nguyen Le Hoang Thuy To
2017-11-01
Project success is a foundation for the project owner to manage and control not only the current project but also future potential projects in construction companies. However, identifying the key success criteria for evaluating a particular project in real practice is a challenging task. Normally, it depends on many factors, such as the expectations of the project owner and stakeholders, the triple constraints of the project (cost, time, quality), and the company's mission, vision, and objectives. Traditional decision-making methods for measuring project success are usually based on subjective opinions of panel experts, resulting in irrational and inappropriate decisions. Therefore, this paper introduces a multi-attribute decision analysis method (MADAM) for weighting project success criteria using a fuzzy Analytical Hierarchy Process approach. It is found that this method is useful when dealing with imprecise and uncertain human judgments in evaluating project success criteria. Moreover, this research also suggests that although cost, time, and quality are three core project success criteria, the satisfaction of the project owner and the acceptance of project stakeholders with the completed project are the most important criteria for project success evaluation in Vietnam.
High Cycle Fatigue (HCF) Science and Technology Program 2002 Annual Report
2003-08-01
Table-of-contents excerpt: Probabilistic Design of Turbine Engine Airfoils, Phase I; 4.3 Probabilistic Design of Turbine Engine Airfoils, Phase II; 4.4 Probabilistic Blade Design System; 4.5 ...; XTL17/SE2; 7.4 Conclusion; 8.0 Test and Evaluation; 8.1 Characterization Test Protocol; 8.2 Demonstration Test Protocol; 8.3 Development of Multi ... transparent and opaque overlays for processing. The objective of the SBIR Phase I program was to identify and evaluate promising methods for
Development of a multi-criteria evaluation system to assess growing pig welfare.
Martín, P; Traulsen, I; Buxadé, C; Krieter, J
2017-03-01
The aim of this paper was to present an alternative multi-criteria evaluation model to assess animal welfare on farms based on the Welfare Quality® (WQ) project, using an example of welfare assessment of growing pigs. The WQ assessment protocol follows a three-step aggregation process. Measures are aggregated into criteria, criteria into principles and principles into an overall assessment. This study focussed on the first step of the aggregation. Multi-attribute utility theory (MAUT) was used to produce a value of welfare for each criterion. The utility functions and the aggregation function were constructed in two separated steps. The Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) method was used for utility function determination and the Choquet Integral (CI) was used as an aggregation operator. The WQ decision-makers' preferences were fitted in order to construct the utility functions and to determine the CI parameters. The methods were tested with generated data sets for farms of growing pigs. Using the MAUT, similar results were obtained to the ones obtained applying the WQ protocol aggregation methods. It can be concluded that due to the use of an interactive approach such as MACBETH, this alternative methodology is more transparent and more flexible than the methodology proposed by WQ, which allows the possibility to modify the model according, for instance, to new scientific knowledge.
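The Choquet integral used in the criterion-level aggregation can be sketched directly. In the minimal version below, the capacity (fuzzy measure) over subsets of two welfare criteria is illustrative, whereas WQ derives it from decision-makers' preferences via MACBETH; the criterion names and scores are invented.

```python
# Sketch: aggregating criterion scores with a Choquet integral.
criteria = ["absence of hunger", "absence of thirst"]
# Capacity: monotone set function with mu(empty)=0 and mu(all criteria)=1.
mu = {frozenset(): 0.0,
      frozenset(["absence of hunger"]): 0.6,
      frozenset(["absence of thirst"]): 0.5,
      frozenset(criteria): 1.0}

def choquet(scores, mu):
    items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending scores
    total, prev = 0.0, 0.0
    remaining = set(scores)
    for name, value in items:
        # Increment weighted by the capacity of criteria still at or above value.
        total += (value - prev) * mu[frozenset(remaining)]
        prev = value
        remaining.discard(name)
    return total

print(choquet({"absence of hunger": 70.0, "absence of thirst": 40.0}, mu))  # 58.0
```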
Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong
2017-01-01
Automated fiber placement (AFP) process includes a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method aiming at optimizing processing parameters in an AFP process, where multi-scale effect, energy consumption, energy utilization efficiency and mechanical properties of micro-system could be taken into account synthetically. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties of macro–meso–scale are obtained by Finite Element Method (FEM). A multi-scale energy transfer model is then established to input the macroscopic results into the microscopic system as its boundary condition, which can communicate with different scales. Furthermore, microscopic characteristics, mainly micro-scale adsorption energy, diffusion coefficient entropy–enthalpy values, are calculated under different processing parameters based on molecular dynamics method. Low-entropy region is then obtained in terms of the interrelation among entropy–enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity) and processing parameters to guarantee better fluidity, stronger adsorption, lower energy consumption and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively, and further improve the mechanical properties of laminates. PMID:28869520
Collaborative simulation method with spatiotemporal synchronization process control
NASA Astrophysics Data System (ADS)
Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian
2016-10-01
When designing a complex mechatronics system, such as high speed trains, it is relatively difficult to effectively simulate the entire system's dynamic behaviors because it involves multi-disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal unsynchronization among the multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for the coupled simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can certainly be used to simulate the sub-systems' interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high speed train design and development processes, demonstrating that it can be applied in a wide range of engineering systems design and simulation with improved efficiency and effectiveness.
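The temporal side of such coupling can be pictured with the toy coordinator below: two placeholder subsystem solvers advance on a shared macro time grid and exchange interface values at every synchronization point. The `Subsystem` class and its relaxation dynamics are invented for illustration; the paper's coupler design, spatial synchronization, and cross-platform interfacing are not modeled here.

```python
class Subsystem:
    """Placeholder for a disciplinary solver wrapped behind a coupling interface."""
    def __init__(self, name, state=0.0):
        self.name, self.state = name, state

    def step(self, t, dt, coupling_input):
        # Stand-in dynamics: relax toward the coupling input.
        self.state += dt * (coupling_input - self.state)
        return self.state  # value exposed at the interface

def cosimulate(sub_a, sub_b, t_end, dt_sync):
    """March both subsystems on a shared macro grid, exchanging interface
    values at every synchronization point (Jacobi-style coupling)."""
    t, out_a, out_b = 0.0, sub_a.state, sub_b.state
    while t < t_end:
        # Each subsystem advances with the other's last interface value,
        # so both stay synchronized in time at every macro step.
        new_a = sub_a.step(t, dt_sync, out_b)
        new_b = sub_b.step(t, dt_sync, out_a)
        out_a, out_b, t = new_a, new_b, t + dt_sync
    return out_a, out_b

print(cosimulate(Subsystem("vehicle", 1.0), Subsystem("track", 0.0), 1.0, 0.1))
```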
Ghaisas, N. S.; Subramaniam, A.; Lele, S. K.; ...
2017-12-31
We report that high energy-density solids undergoing elastic-plastic deformation coupled to compressible fluids are a common occurrence in engineering applications. Examples include problems involving high-velocity impact and penetration, cavitation, and several manufacturing processes, such as cold forming. Numerical simulations of such phenomena require the ability to handle the interaction of shock waves with multi-material interfaces that can undergo large deformations and severe distortions. As opposed to Lagrangian (Benson 1992) and arbitrary Lagrangian-Eulerian (ALE) methods (Donea et al. 2004), fully Eulerian methods use grids that do not change in time. Consequently, Eulerian methods do not suffer from difficulties on account of mesh entanglement, and do not require periodic, expensive remap operations.
Ding, Shuai; Xia, Chen-Yi; Zhou, Kai-Le; Yang, Shan-Lin; Shang, Jennifer S.
2014-01-01
Facing a customer market with rising demands for cloud service dependability and security, trustworthiness evaluation techniques are becoming essential to cloud service selection. But these methods are out of reach for most customers, as they require considerable expertise. Additionally, since cloud service evaluation is often a costly and time-consuming process, it is not practical to measure the trustworthy attributes of all candidates for each customer. Many existing models cannot easily deal with cloud services which have very few historical records. In this paper, we propose a novel service selection approach in which missing value prediction and multi-attribute trustworthiness evaluation are jointly taken into account. By simply collecting limited historical records, the approach is able to support personalized trustworthy service selection. The experimental results also show that our approach performs much better than competing ones with respect to customer preference and expectation in trustworthiness assessment.
NASA Astrophysics Data System (ADS)
Fan, Shu-Kai S.; Tsai, Du-Ming; Chuang, Wei-Che
2017-04-01
Solar power has become an attractive alternative source of energy. The multi-crystalline solar cell has been widely accepted in the market because it has a relatively low manufacturing cost. Multi-crystalline solar wafers with larger grain sizes and fewer grain boundaries are of higher quality and convert energy more efficiently than mono-crystalline solar cells. In this article, a new image processing method is proposed for assessing wafer quality. An adaptive segmentation algorithm based on region growing is developed to separate the closed regions of individual grains. Using the proposed method, the shape and size of each grain in the wafer image can be precisely evaluated. Two measures of average grain size are taken from the literature and modified to estimate the average grain size. The resulting average grain size estimate dictates the quality of the crystalline solar wafers and can be considered a viable quantitative indicator of conversion efficiency.
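As a rough illustration of the segmentation idea, the sketch below grows a region from a seed pixel, absorbing 4-connected neighbors whose intensity stays close to the running region mean. The tolerance and the toy image are assumptions; the paper's algorithm is adaptive and tailored to grain-boundary contrast in wafer images.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from `seed`, absorbing 4-connected pixels whose
    intensity stays within `tol` of the running region mean."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    total, count = float(image[seed]), 1
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(image[ny, nx]) - total / count) <= tol:
                    mask[ny, nx] = True
                    total += float(image[ny, nx])
                    count += 1
                    queue.append((ny, nx))
    return mask

img = np.array([[10, 12, 50], [11, 13, 52], [60, 61, 62]], dtype=float)
print(region_grow(img, (0, 0), tol=5).astype(int))  # isolates the dark grain
```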
NASA Astrophysics Data System (ADS)
Yoon, Kyungho; Lee, Wonhye; Croce, Phillip; Cammalleri, Amanda; Yoo, Seung-Schik
2018-05-01
Transcranial focused ultrasound (tFUS) is emerging as a non-invasive brain stimulation modality. Complicated interactions between acoustic pressure waves and osseous tissue introduce many challenges in the accurate targeting of an acoustic focus through the cranium. Image-guidance accompanied by a numerical simulation is desired to predict the intracranial acoustic propagation through the skull; however, such simulations typically demand heavy computation, which warrants an expedited processing method to provide on-site feedback for the user in guiding the acoustic focus to a particular brain region. In this paper, we present a multi-resolution simulation method based on the finite-difference time-domain formulation to model the transcranial propagation of acoustic waves from a single-element transducer (250 kHz). The multi-resolution approach improved computational efficiency by providing the flexibility in adjusting the spatial resolution. The simulation was also accelerated by utilizing parallelized computation through the graphics processing unit. To evaluate the accuracy of the method, we measured the actual acoustic fields through ex vivo sheep skulls with different sonication incident angles. The measured acoustic fields were compared to the simulation results in terms of focal location, dimensions, and pressure levels. The computational efficiency of the presented method was also assessed by comparing simulation speeds at various combinations of resolution grid settings. The multi-resolution grids consisting of 0.5 and 1.0 mm resolutions gave acceptable accuracy (under 3 mm in terms of focal position and dimension, less than 5% difference in peak pressure ratio) with a speed compatible with semi real-time user feedback (within 30 s). The proposed multi-resolution approach may serve as a novel tool for simulation-based guidance for tFUS applications.
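For intuition about the underlying numerics, here is a single-resolution, 1-D analogue of the finite-difference time-domain update for the acoustic wave equation, driven by a 250 kHz source. The grid spacing, medium speed, and step count are assumptions; the paper's multi-resolution gridding, 3-D skull model, and GPU parallelization are beyond this sketch.

```python
import numpy as np

def fdtd_1d_pressure(c=1500.0, dx=0.5e-3, steps=400, f0=250e3):
    """Leapfrog FDTD update for the 1-D acoustic wave equation
    p_tt = c^2 p_xx, driven by a 250 kHz source at the left end."""
    dt = 0.9 * dx / c                       # CFL-stable time step
    n = 512
    p_prev, p, p_next = (np.zeros(n) for _ in range(3))
    for step in range(steps):
        t = step * dt
        p[0] = np.sin(2 * np.pi * f0 * t)   # single-element source
        # Second-order central differences in time and space.
        p_next[1:-1] = (2 * p[1:-1] - p_prev[1:-1]
                        + (c * dt / dx) ** 2 * (p[2:] - 2 * p[1:-1] + p[:-2]))
        p_prev, p, p_next = p, p_next, p_prev
    return p

field = fdtd_1d_pressure()
print(float(np.max(np.abs(field))))  # ~1: the propagating wave amplitude
```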
Automatic Road Gap Detection Using Fuzzy Inference System
NASA Astrophysics Data System (ADS)
Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.
2011-09-01
Automatic feature extraction from aerial and satellite images is a high-level data processing task which is still one of the most important research topics of the field. In this area, most research is focused on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are among the mature methods in this field. Although most research is focused on detection algorithms, none of them can extract road networks perfectly. On the other hand, post-processing algorithms, aimed at refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of the following main steps: 1) Short gap coverage: a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system; for this purpose, a knowledge base consisting of expert rules is designed and fired on gap candidates from the road detection results. 3) Long gap coverage: detected long gaps are compensated by two strategies, linear and polynomial; shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with truly compensated ones produced by a human expert. The complete evaluation of the obtained results with their technical discussion is the material of the full paper.
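A simplified stand-in for the short-gap-coverage step might look like the following: closing the binary road mask with growing linear structuring elements, so a gap is bridged at the smallest scale that spans it. The scales and the axis-aligned elements are assumptions; the paper's hierarchical multi-scale scheme is more general.

```python
import numpy as np
from scipy import ndimage

def close_short_gaps(road_mask, scales=(3, 5, 7)):
    """Hierarchical closing with growing linear structuring elements in the
    horizontal and vertical directions, so short gaps are bridged at the
    smallest scale that spans them (a simplified stand-in for the paper's
    multi-scale morphological scheme)."""
    result = road_mask.copy()
    for length in scales:
        for selem in (np.ones((1, length), bool), np.ones((length, 1), bool)):
            # OR keeps the original pixels; closing adds the bridged ones.
            result |= ndimage.binary_closing(result, structure=selem)
    return result

mask = np.zeros((7, 15), dtype=bool)
mask[3, :6] = mask[3, 9:] = True           # a road with a 3-pixel gap
print(close_short_gaps(mask)[3].astype(int))  # the gap row is now continuous
```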
Oikonomou, Vera; Dimitrakopoulos, Panayiotis G; Troumbis, Andreas Y
2011-01-01
Nature provides life-support services which do not merely constitute the basis for ecosystem integrity but also benefit human societies. The importance of such multiple outputs is often ignored or underestimated in environmental planning and decision making. The economic valuation of ecosystem functions or services has been widely used to make these benefits economically visible and thus address this deficiency. Alternatively, the relative importance of the components of ecosystem value can be identified and compared by means of multi-criteria evaluation. Hereupon, this article proposes a conceptual framework that couples ecosystem function analysis, multi-criteria evaluation and social research methodologies for introducing an ecosystem function-based planning and management approach. The framework consists of five steps providing the structure of a participative decision-making process, which is then tested and ratified, by applying the discrete multi-criteria method NAIADE, in the Kalloni Natura 2000 site on Lesbos, Greece. Three scenarios were developed and evaluated with regard to their impacts on the different types of ecosystem functions and the social actors' value judgements. A conflict analysis permitted better elaboration of the different views, outlining the coalitions formed in the local community and shaping the way towards reaching a consensus.
SEM evaluation of metallization on semiconductors. [Scanning Electron Microscope
NASA Technical Reports Server (NTRS)
Fresh, D. L.; Adolphsen, J. W.
1974-01-01
A test method for the evaluation of metallization on semiconductors is presented and discussed. The method has been prepared in MIL-STD format for submittal as a proposed addition to MIL-STD-883. It is applicable to discrete devices and to integrated circuits and specifically addresses batch-process oriented defects. Quantitative accept/reject criteria are given for contact windows, other oxide steps, and general interconnecting metallization. Figures are provided that illustrate typical types of defects. Apparatus specifications, sampling plans, and specimen preparation and examination requirements are described. Procedures for glassivated devices and for multi-metal interconnection systems are included.
Multislice spiral CT simulator for dynamic cardiopulmonary studies
NASA Astrophysics Data System (ADS)
De Francesco, Silvia; Ferreira da Silva, Augusto M.
2002-04-01
We have developed a Multi-slice Spiral CT Simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardio-pulmonary studies (with single-/multi-slice as well as ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single and multi-slice). The reconstruction of the section is then performed through filtered back-projection (FBP). The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object. The developed simulating environment allows us to investigate the nature of this distortion, characterizing it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on the determination of adequate temporal sampling and sinogram regularization techniques. At the moment, the simulator is limited to the multi-slice tomograph case; extension to cone-beam or area detectors is planned as the next development step.
Liu, Wanli; Bian, Zhengfu; Liu, Zhenguo; Zhang, Qiuzhao
2015-01-01
Differential interferometric synthetic aperture radar has been shown to be effective for monitoring subsidence in coal mining areas. Phase unwrapping can have a dramatic influence on the monitoring result. In this paper, a filtering-based phase unwrapping algorithm in combination with path-following is introduced to unwrap differential interferograms with high noise in mining areas. It can perform simultaneous noise filtering and phase unwrapping, so that the pre-filtering steps can be omitted, thus usually retaining more detail and improving the detectable deformation. In the method, the nonlinear measurement model of phase unwrapping is processed using a simplified Cubature Kalman filter, an effective and efficient tool used in many nonlinear fields. Three case studies are designed to evaluate the performance of the method. In Case 1, two tests are designed to evaluate the performance of the method under different factors, including the number of multi-looks and path-guiding indexes. The results demonstrate that the unwrapped results are sensitive to the number of multi-looks and that the Fisher Distance is the most suitable path-guiding index for our study. Two further case studies are then designed to evaluate the feasibility of the proposed phase unwrapping method based on Cubature Kalman filtering. The results indicate that, compared with the popular Minimum Cost Flow method, Cubature Kalman filtering-based phase unwrapping can achieve promising results without pre-filtering and is an appropriate method for coal mining areas with high noise.
System and Method for Multi-Wavelength Optical Signal Detection
NASA Technical Reports Server (NTRS)
McGlone, Thomas D. (Inventor)
2017-01-01
The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.
Developing criteria to establish Trusted Digital Repositories
Faundeen, John L.
2017-01-01
This paper details the drivers, methods, and outcomes of the U.S. Geological Survey’s quest to establish criteria by which to judge its own digital preservation resources as Trusted Digital Repositories. Drivers included recent U.S. legislation focused on data and asset management conducted by federal agencies spending $100M USD or more annually on research activities. The methods entailed seeking existing evaluation criteria from national and international organizations such as International Standards Organization (ISO), U.S. Library of Congress, and Data Seal of Approval upon which to model USGS repository evaluations. Certification, complexity, cost, and usability of existing evaluation models were key considerations. The selected evaluation method was derived to allow the repository evaluation process to be transparent, understandable, and defensible; factors that are critical for judging competing, internal units. Implementing the chosen evaluation criteria involved establishing a cross-agency, multi-disciplinary team that interfaced across the organization.
Optimal External Wrench Distribution During a Multi-Contact Sit-to-Stand Task.
Bonnet, Vincent; Azevedo-Coste, Christine; Robert, Thomas; Fraisse, Philippe; Venture, Gentiane
2017-07-01
This paper aims at developing and evaluating a new practical method for the real-time estimation of joint torques and external wrenches during a multi-contact sit-to-stand (STS) task using kinematics data only. The proposed method also allows identifying the subject-specific body segment inertial parameters that are required to perform inverse dynamics. The identification phase is performed using simple and repeatable motions. Thanks to an accurately identified model, the estimate of the total external wrench can be used as an input to solve an under-determined multi-contact problem. It is solved using a constrained quadratic optimization process minimizing a hybrid human-like energetic criterion. The weights of this hybrid cost function are adjusted and a sensitivity analysis is performed in order to reproduce the human external wrench distribution robustly. The results showed that the proposed method could successfully estimate the external wrenches under the buttocks, feet, and hands during STS tasks (RMS error lower than 20 N and 6 N·m). The simplicity and generalization abilities of the proposed method pave the way for future diagnostic solutions and rehabilitation applications, including in-home use.
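The under-determined distribution step can be pictured with a small constrained quadratic program: split a known total vertical force among three contacts while penalizing each contact differently. The weights below are hypothetical; the paper minimizes a hybrid human-like energetic criterion over full 6-D wrenches.

```python
import numpy as np
from scipy.optimize import minimize

def distribute_force(total_force, weights):
    """Split a known total vertical force among contacts (e.g. buttocks,
    feet, hands) by minimizing a weighted quadratic cost, subject to the
    contact forces summing to the total and staying non-negative
    (unilateral contact)."""
    n = len(weights)

    def cost(f):
        return float(np.sum(weights * f ** 2))

    constraints = [{"type": "eq", "fun": lambda f: np.sum(f) - total_force}]
    bounds = [(0.0, None)] * n
    x0 = np.full(n, total_force / n)        # start from an even split
    res = minimize(cost, x0, bounds=bounds, constraints=constraints,
                   method="SLSQP")
    return res.x

# Hypothetical weights penalizing hand loading more than buttocks or feet.
print(distribute_force(700.0, np.array([1.0, 1.0, 4.0])))
```

Analytically, this cost yields forces proportional to the inverse weights (about 311, 311 and 78 N here), which the solver reproduces.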
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaffner, Michael
2014-06-01
The current downward trend in funding for U.S. defense systems seems to be on a collision course with the state of the practice in systems engineering, which typically results in an increased pace and scale of capabilities and a correspondingly increased cost of complex national defense systems. Recent advances in the state of the art in systems engineering methodology can be leveraged to address this growing challenge. The present work leverages advanced constructs and methods for early-phase conceptual design of complex systems, when committed costs are still low and management influence is still high. First, a literature review is presented of the topics relevant to this work, including approaches to the design of affordable systems, assumptions and methods of exploratory modeling, and enabling techniques to help mitigate the computational challenges involved. The types, purposes, and limits of early-phase, exploratory models are then elucidated. The RSC-based Method for Affordable Concept Selection (RMACS) is described, which comprises nine processes in the three main thrusts of information gathering, evaluation, and analysis. The method is then applied to a naval ship case example, described as the Next-Generation Combat Ship, with representational information outputs and discussions of affordability with respect to each process. The ninth process, Multi-Era Analysis (MERA), is introduced and explicated, including required and optional informational components, temporal and change-related considerations, required and optional activities involved, and the potential types of outputs from the process. The MERA process is then applied to a naval ship case example similar to that of the RMACS application, but with discrete change options added to enable a tradespace network. The seven activities of the MERA process are demonstrated, with the salient outputs of each given and discussed. Additional thoughts are presented on MERA and RMACS, and 8 distinct areas are identified for further research in the MERA process, along with a brief description of the directions that such research might take. It is concluded that the affordability of complex systems can be better enabled through a conceptual design method that incorporates MERA as well as metrics such as Multi-Attribute Expense, Max Expense, and Expense Stability. It is also found that the affordability of changeable systems can be better enabled through the use of existing path-planning algorithms in efficient evaluation and analysis of long-term strategies. Finally, it is found that MERA enables the identification and analysis of path-dependent considerations related to designs, epochs, strategies, and change options, in many possible futures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Li; He, Ya-Ling; Kang, Qinjun
2013-12-15
A coupled (hybrid) simulation strategy spatially combining the finite volume method (FVM) and the lattice Boltzmann method (LBM), called CFVLBM, is developed to simulate coupled multi-scale multi-physicochemical processes. In the CFVLBM, the computational domain of multi-scale problems is divided into two sub-domains, i.e., an open, free fluid region and a region filled with porous materials. The FVM and LBM are used for these two regions, respectively, with information exchanged at the interface between the two sub-domains. A general reconstruction operator (RO) is proposed to derive the distribution functions in the LBM from the corresponding macro scalar, the governing equation of which obeys the convection-diffusion equation. The CFVLBM and the RO are validated in several typical physicochemical problems and then are applied to simulate complex multi-scale coupled fluid flow, heat transfer, mass transport, and chemical reaction in a wall-coated micro reactor. The maximum ratio of the grid size between the FVM and LBM regions is explored and discussed. -- Highlights: •A coupled simulation strategy for simulating multi-scale phenomena is developed. •Finite volume method and lattice Boltzmann method are coupled. •A reconstruction operator is derived to transfer information at the sub-domains interface. •Coupled multi-scale multiple physicochemical processes in micro reactor are simulated. •Techniques to save computational resources and improve the efficiency are discussed.
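A minimal version of a reconstruction operator for a scalar obeying a convection-diffusion equation is sketched below: the macroscopic value and local velocity from the FVM side are mapped to lattice Boltzmann distribution functions using the equilibrium distribution on a D2Q9 lattice. The paper's general RO may include non-equilibrium corrections; this equilibrium-only mapping is an assumption.

```python
import numpy as np

# D2Q9 lattice: weights and discrete velocities (cs^2 = 1/3 in lattice units).
W = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def reconstruct_distributions(phi, u):
    """Reconstruction-operator sketch: build LBM distribution functions at
    the FVM/LBM interface from a macroscopic scalar phi (governed by a
    convection-diffusion equation) and the local velocity u, using the
    equilibrium distribution as the reconstruction."""
    cs2 = 1.0 / 3.0
    eu = E @ u                         # e_i . u for every lattice direction
    return W * phi * (1.0 + eu / cs2)

f = reconstruct_distributions(phi=0.8, u=np.array([0.05, 0.0]))
print(f.sum())                         # zeroth moment recovers phi = 0.8
```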
Research on Multi-Temporal PolInSAR Modeling and Applications
NASA Astrophysics Data System (ADS)
Hong, Wen; Pottier, Eric; Chen, Erxue
2014-11-01
In the study of theory and processing methodology, we apply the accurate topographic phase to the Freeman-Durden decomposition of PolInSAR data. On the other hand, we present a TomoSAR imaging method based on convex-optimization regularization theory. The target decomposition and reconstruction performance will be evaluated by multi-temporal L- and P-band fully polarimetric images acquired in the BioSAR campaigns. In the study of hybrid Quad-Pol system performance, we analyse the expression of the range ambiguity to signal ratio (RASR) in this architecture. Simulations are used to verify its advantage in the improvement of range ambiguities.
Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining
Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin
2016-01-01
Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.
A multi-layer MRI description of Parkinson's disease
NASA Astrophysics Data System (ADS)
La Rocca, M.; Amoroso, N.; Lella, E.; Bellotti, R.; Tangaro, S.
2017-09-01
Magnetic resonance imaging (MRI) combined with complex network analysis is currently one of the most widely adopted techniques for the detection of structural changes in neurological diseases, such as Parkinson's Disease (PD). In this paper, we present a digital image processing study, within the multi-layer network framework, combining multiple classifiers to evaluate the informative power of MRI features for the discrimination of normal controls (NC) and PD subjects. We define a network for each MRI scan; the nodes are the sub-volumes (patches) the images are divided into, and the links are defined using the Pearson pairwise correlation between patches. We obtain a multi-layer network whose most important network features, obtained with different feature selection methods, are used to feed a supervised multi-level random forest classifier which exploits this base of knowledge for accurate classification. Method evaluation was carried out using T1 MRI scans of 354 individuals, including 177 PD subjects and 177 NC from the Parkinson's Progression Markers Initiative (PPMI) database. The experimental results demonstrate that the features obtained from multiplex networks are able to accurately describe PD patterns. Moreover, even if a privileged scale for studying PD exists, exploring the informative content of multiple scales leads to a significant improvement in the discrimination between diseased and healthy subjects. In particular, this method gives a comprehensive overview of the brain regions statistically affected by the disease, an added value of the presented study.
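One layer of such a network can be sketched as follows: the volume is tiled into patches, each patch's voxel intensities form a node vector, and edges link patch pairs whose Pearson correlation exceeds a threshold. The patch size and threshold are assumptions; in the multi-layer setting, each patch size (scale) would contribute its own layer.

```python
import numpy as np

def patch_correlation_network(volume, patch=16, threshold=0.3):
    """Build one network layer from an MRI volume: nodes are cubic patches,
    edges link patch pairs whose voxel-intensity vectors have an absolute
    Pearson correlation above a threshold (threshold value is an assumption)."""
    ps = patch
    nx, ny, nz = (s // ps for s in volume.shape)
    vectors = [volume[i*ps:(i+1)*ps, j*ps:(j+1)*ps, k*ps:(k+1)*ps].ravel()
               for i in range(nx) for j in range(ny) for k in range(nz)]
    vectors = np.array(vectors, dtype=float)
    corr = np.corrcoef(vectors)              # Pearson correlation per patch pair
    adjacency = np.abs(corr) > threshold
    np.fill_diagonal(adjacency, False)       # no self-loops
    return adjacency

rng = np.random.default_rng(0)
adj = patch_correlation_network(rng.normal(size=(32, 32, 32)), patch=16)
print(adj.shape, int(adj.sum()) // 2, "edges")
```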
NASA Astrophysics Data System (ADS)
Uhde, Britta; Andreas Hahn, W.; Griess, Verena C.; Knoke, Thomas
2015-08-01
Multi-criteria decision analysis (MCDA) is a decision aid frequently used in the field of forest management planning. It includes the evaluation of multiple criteria, such as the production of timber and non-timber forest products, and tangible as well as intangible values of ecosystem services (ES). Hence, it is beneficial compared to methods that take a purely financial perspective. Accordingly, MCDA methods are increasingly popular in the wide field of sustainability assessment. Hybrid approaches allow MCDA to be combined with, potentially, other decision-making techniques to make use of their individual benefits, leading to a more holistic view of the actual consequences that come with certain decisions. This review provides a comprehensive overview of the hybrid approaches used in forest management planning. Today, the scientific world is facing increasing challenges regarding the evaluation of ES and the trade-offs between them, for example between provisioning and regulating services. As the preferences of multiple stakeholders are essential to improving the decision process in multi-purpose forestry, participatory and hybrid approaches turn out to be of particular importance. Accordingly, hybrid methods show great potential for becoming most relevant in future decision making. Based on the review presented here, the development of models for use in planning processes should focus on participatory modeling and the consideration of uncertainty regarding available information.
Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun
2014-01-01
Differences exist among the analysis results of agricultural monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Theories of statistics were used to extract the statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, theories of the Gaussian distribution were used to correct the multiple surface reflectance datasets based on the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, obtained over two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences between surface reflectance datasets at multiple spatial scales can be effectively corrected over non-homogeneous underlying surfaces, providing a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and the corresponding consistency analysis and evaluation.
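The Gaussian-based correction can be illustrated in its simplest form: standardize one reflectance dataset, then rescale it to the mean and standard deviation of the baseline (small-scale) dataset. The paper additionally accounts for spatial variations of these statistics across scales, which this whole-image sketch ignores.

```python
import numpy as np

def match_to_baseline(reflectance, baseline):
    """Correct one reflectance dataset toward a baseline by matching the
    first two moments (a Gaussian-distribution assumption): standardize,
    then rescale to the baseline's mean and standard deviation."""
    z = (reflectance - reflectance.mean()) / reflectance.std()
    return z * baseline.std() + baseline.mean()

rng = np.random.default_rng(1)
coarse = rng.normal(0.30, 0.05, size=1000)   # e.g. a coarse-scale sensor
fine = rng.normal(0.25, 0.03, size=1000)     # baseline at the small scale
corrected = match_to_baseline(coarse, fine)
print(round(corrected.mean(), 3), round(corrected.std(), 3))  # ~0.25, ~0.03
```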
The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process
2013-03-01
The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within an installation sub-section by altering the physical layout of facilities. One methodology was developed based on applying network analysis concepts to ... layouts. The alternative layout scoring process, based on multi-criteria evaluation, returns a quantitative score for each alternative layout and a ...
Boström, Jan; Elger, Christian E.; Mormann, Florian
2016-01-01
Recording extracellularly from neurons in the brains of animals in vivo is among the most established experimental techniques in neuroscience, and has recently become feasible in humans. Many interesting scientific questions can be addressed only when extracellular recordings last several hours, and when individual neurons are tracked throughout the entire recording. Such questions regard, for example, neuronal mechanisms of learning and memory consolidation, and the generation of epileptic seizures. Several difficulties have so far limited the use of extracellular multi-hour recordings in neuroscience: datasets become huge, and data are necessarily noisy in clinical recording environments. No methods for spike sorting of such recordings have been available. Spike sorting refers to the process of identifying the contributions of several neurons to the signal recorded at one electrode. To overcome these difficulties, we developed Combinato: a complete data-analysis framework for spike sorting in noisy recordings lasting twelve hours or more. Our framework includes software for artifact rejection, automatic spike sorting, manual optimization, and efficient visualization of results. Our completely automatic framework excels at two tasks: it outperforms existing methods when tested on simulated and real data, and it enables researchers to analyze multi-hour recordings. We evaluated our methods on both short and multi-hour simulated datasets. To evaluate the performance of our methods in an actual neuroscientific experiment, we used data from neurosurgical patients, recorded in order to identify visually responsive neurons in the medial temporal lobe. These neurons responded to the semantic content, rather than to the visual features, of a given stimulus. To test our methods with multi-hour recordings, we made use of neurons in the human medial temporal lobe that respond selectively to the same stimulus in the evening and the next morning.
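A common first stage of such a pipeline is spike detection by robust thresholding, sketched below: the noise level is estimated from the median absolute deviation, and negative-going threshold crossings are kept subject to a refractory period. The multiplier and refractory time are conventional assumptions, not Combinato's specific settings.

```python
import numpy as np

def detect_spikes(trace, fs, k=5.0, refractory_ms=1.0):
    """Detect candidate spikes by thresholding at k times the noise level,
    with the noise std estimated robustly from the median absolute
    deviation (a common convention in spike sorting; k = 5 is an assumption)."""
    sigma = np.median(np.abs(trace)) / 0.6745   # robust noise estimate
    crossings = np.where(trace < -k * sigma)[0]  # negative-going spikes
    refractory = int(refractory_ms * 1e-3 * fs)
    spikes, last = [], -np.inf
    for idx in crossings:
        if idx - last > refractory:              # enforce refractory period
            spikes.append(idx)
            last = idx
    return np.array(spikes)

rng = np.random.default_rng(2)
trace = rng.normal(0, 1, 30000)
trace[[5000, 12000, 25000]] -= 12.0              # three synthetic spikes
print(detect_spikes(trace, fs=30000))            # -> [ 5000 12000 25000]
```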
Rationale, design and methods for process evaluation in the HEALTHY study.
Schneider, M; Hall, W J; Hernandez, A E; Hindes, K; Montez, G; Pham, T; Rosen, L; Sleigh, A; Thompson, D; Volpe, S L; Zeveloff, A; Steckler, A
2009-08-01
The HEALTHY study was a multi-site randomized trial designed to determine whether a 3-year school-based intervention targeting nutrition and physical activity behaviors could effectively reduce risk factors associated with type 2 diabetes in middle school children. Pilot and formative studies were conducted to inform the development of the intervention components and the process evaluation methods for the main trial. During the main trial, both qualitative and quantitative assessments monitored the fidelity of the intervention and motivated modifications to improve intervention delivery. Structured observations of physical education classes, total school food environments, classroom-based educational modules, and communications and promotional campaigns provided verification that the intervention was delivered as intended. Interviews and focus groups yielded a multidimensional assessment of how the intervention was delivered and received, as well as identifying the barriers to and facilitators of the intervention across and within participating schools. Interim summaries of process evaluation data were presented to the study group as a means of ensuring standardization and quality of the intervention across the seven participating centers. Process evaluation methods and procedures documented the fidelity with which the HEALTHY study was implemented across 21 intervention schools and identified ways in which the intervention delivery might be enhanced throughout the study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amoroso, J.; Dandeneau, C.
FY16 efforts were focused on direct comparison of multi-phase ceramic waste forms produced via melt processing and HIP methods. Based on promising waste form compositions previously devised at SRNL, simulant material was prepared at SRNL and a portion was sent to the Australian Nuclear Science and Technology Organization (ANSTO) for HIP treatments, while the remainder of the material was melt processed at SRNL. Characterization of the microstructure, phase formation, elemental speciation, leach behavior, and radiation stability of the fabricated ceramics was performed. In addition, melt-processed ceramics designed with different fractions of hollandite, zirconolite, perovskite, and pyrochlore phases were investigated for performance and properties.
Bowers, Gillian; Bowers, John
2018-01-01
Digital services are often regarded as a solution to the growing demands on primary care services. Provision of a tool offering advice to support self-management as well as the ability to digitally consult with a General Practitioner (GP) has the potential to alleviate some of the pressure on primary care. This paper reports on a Phase II, 6-month evaluation of eConsult, a web-based triage and consultation system that was piloted across 11 GP practices across Scotland. Through a multi-method approach, the evaluation explored eConsult use across practices, exposing both barriers and facilitators to its adoption. Findings suggest that expectations that eConsult would offer an additional and alternative method of accessing GP services were largely met. However, there is less certainty that it has fulfilled expectations of promoting self-help. In addition, low uptake meant that practices found it difficult to quantify its current effectiveness. The presence of an eConsult champion(s) within a practice was seen to be a significant factor in ensuring successful integration of the tool. A lack of patient and staff engagement, insufficient support and a lack of protocols around processes were seen as barriers to its success.
An algorithm for pavement crack detection based on multiscale space
NASA Astrophysics Data System (ADS)
Liu, Xiang-long; Li, Qing-quan
2006-10-01
Conventional human-visual, manual field pavement crack detection methods are very costly, time-consuming, dangerous, labor-intensive and subjective. They possess various drawbacks, such as a high degree of variability in the measured results, an inability to provide meaningful quantitative information, near-inevitable inconsistencies in crack details over space and across evaluations, and long measurement cycles. With the development of public transportation and the growth of the Material Flow System, conventional methods fall far short of meeting these demands; automatic pavement-state data gathering and analysis systems have therefore come to the focus of the field's attention, and developments in computer technology, digital image acquisition, image processing and multi-sensor technology have made such systems possible. However, the complexity of the image processing has always made data processing and analysis the bottleneck of the whole system. Accordingly, a robust and highly efficient parallel pavement crack detection algorithm based on Multi-Scale Space is proposed in this paper. The proposed method is based on two facts: (1) the crack pixels in pavement images are darker than their surroundings and continuous; (2) the threshold values of gray-level pavement images are strongly related to the mean value and standard deviation of the pixel-grey intensities. The Multi-Scale Space method is used to improve data processing speed and minimize the effects of image noise. Experimental results demonstrate that the advantages are remarkable: (1) it can correctly discover tiny cracks, even in very noisy pavement images; (2) the efficiency and accuracy of the proposed algorithm are superior; (3) its application-dependent nature can simplify the design of the entire system.
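Fact (2) above suggests a minimal thresholding rule: mark as crack candidates the pixels darker than the image mean minus a multiple of the standard deviation. The constant k is an assumed tuning parameter, and the multi-scale-space filtering the algorithm uses to suppress noise is omitted here.

```python
import numpy as np

def crack_threshold(gray, k=1.5):
    """Segment candidate crack pixels: cracks are darker than their
    surroundings, so threshold at mean - k * std of the pixel intensities
    (k is an assumed tuning constant)."""
    t = gray.mean() - k * gray.std()
    return gray < t

rng = np.random.default_rng(3)
pavement = rng.normal(150, 10, size=(64, 64))
pavement[32, 10:50] = 90                 # a dark, continuous crack
candidates = crack_threshold(pavement)
print(int(candidates.sum()), "candidate crack pixels")
```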
Green material selection for sustainability: A hybrid MCDM approach.
Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng
2017-01-01
Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in the process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern of the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines the decision-making trial and evaluation laboratory (DEMATEL), analytic network process (ANP), grey relational analysis (GRA) and technique for order preference by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints is proposed to obtain the integrated closeness index. Subsequently, an empirical application to rubbish bins is used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods are employed to validate the accuracy and stability of the final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation and strategy selection.
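Of the four techniques combined in the hybrid approach, the TOPSIS step is the easiest to sketch: vector-normalize the decision matrix, apply the criteria weights, and score each alternative by its closeness to the ideal and anti-ideal solutions. The materials, criteria, and weights below are illustrative, and the DEMATEL-ANP weighting and GRA components are not shown.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS: vector-normalize the decision matrix,
    weight it, then score by closeness to the ideal and anti-ideal
    solutions. benefit[j] is True when criterion j is to be maximized."""
    norm = matrix / np.linalg.norm(matrix, axis=0)
    v = norm * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)     # higher closeness = better alternative

# Three candidate materials; criteria: strength (max), cost (min), CO2 (min).
m = np.array([[250.0, 3.2, 1.8], [310.0, 4.1, 2.4], [280.0, 2.9, 2.1]])
print(topsis(m, weights=np.array([0.5, 0.3, 0.2]),
             benefit=np.array([True, False, False])))
```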
Pythagorean fuzzy analytic hierarchy process to multi-criteria decision making
NASA Astrophysics Data System (ADS)
Mohd, Wan Rosanisah Wan; Abdullah, Lazim
2017-11-01
Numerous approaches have been proposed in the literature to determine criteria weights. The weights of criteria are very significant in the process of decision making. One of the outstanding approaches used to determine criteria weights is the analytic hierarchy process (AHP). This method involves decision makers (DMs) in evaluating the decision by forming pairwise comparisons between criteria and alternatives. In classical AHP, the linguistic variables of the pairwise comparison are presented as crisp values. However, crisp values are not appropriate for representing real problems, because linguistic judgments involve uncertainty. For this reason, AHP has been extended by incorporating Pythagorean fuzzy sets. In addition, no proposal has been found in the literature for determining criteria weights using AHP under Pythagorean fuzzy sets. In order to solve the MCDM problem, a Pythagorean fuzzy analytic hierarchy process is proposed to determine the weights of the evaluation criteria. Using linguistic variables, pairwise comparisons of the evaluation criteria are made and the criteria weights are derived using Pythagorean fuzzy numbers (PFNs). The proposed method is implemented in an evaluation problem in order to demonstrate its applicability. This study shows that the proposed method provides a useful way and a new direction in solving MCDM problems in a Pythagorean fuzzy context.
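For reference, classical crisp AHP derives the criteria weights as the principal eigenvector of the pairwise-comparison matrix, as sketched below; the Pythagorean fuzzy extension replaces the crisp entries with PFN judgments. The example matrix is illustrative.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criteria weights from a (crisp) pairwise-comparison matrix via
    the principal-eigenvector method of classical AHP; returns the weights
    and the consistency ratio."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    weights = principal / principal.sum()
    # Consistency index against Saaty's random index (RI = 0.58 for n = 3).
    n = pairwise.shape[0]
    ci = (eigvals.real.max() - n) / (n - 1)
    return weights, ci / 0.58

a = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 3.0],
              [1/5., 1/3., 1.0]])
weights, cr = ahp_weights(a)
print(np.round(weights, 3), round(cr, 3))  # ~[0.637 0.258 0.105], CR < 0.1
```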
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, S; Shin, E H; Kim, J
2015-06-15
Purpose: To evaluate the shielding wall design to protect patients, staff and members of the general public from secondary neutrons, using a simple analytic solution and the multi-Monte Carlo codes MCNPX, ANISN and FLUKA. Methods: Analytical and multi-Monte Carlo calculations were performed for the proton facility (Sumitomo Heavy Industry Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation methods, which produce conservative estimates of the dose equivalent values for the shielding, were used for the analytical evaluations. Then, the radiation transport was simulated with the multi-Monte Carlo codes. The neutron dose at each evaluation point is obtained as the product of the simulated value and the neutron dose coefficient introduced in ICRP-74. Results: The evaluation points of the accelerator control room and the control room entrance are mainly influenced by the point of the proton beam loss. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912, 0.943 mSv/yr and at the entrance of the cyclotron room is 0.465, 0.790, 0.522, 0.453 mSv/yr, as calculated by the NCRP-144 formalism, ANISN, FLUKA and MCNP, respectively. Most results of MCNPX and FLUKA, which use the complicated geometry, were smaller than the ANISN results. Conclusion: The neutron shielding for a proton therapy facility has been evaluated by the analytic model and multi-Monte Carlo methods. We confirmed that the shielding design was adequate for areas readily accessible to people when the proton facility is operated.
Huang, Ming-Xiong; Anderson, Bill; Huang, Charles W.; Kunde, Gerd J.; Vreeland, Erika C.; Huang, Jeffrey W.; Matlashov, Andrei N.; Karaulanov, Todor; Nettles, Christopher P.; Gomez, Andrew; Minser, Kayla; Weldon, Caroline; Paciotti, Giulio; Harsh, Michael; Lee, Roland R.; Flynn, Edward R.
2017-01-01
Superparamagnetic Relaxometry (SPMR) is a highly sensitive technique for the in vivo detection of tumor cells and may improve early-stage detection of cancers. SPMR employs superparamagnetic iron oxide nanoparticles (SPION). After a brief magnetizing pulse is used to align the SPION, SPMR measures the time decay of the SPION fields using Superconducting Quantum Interference Device (SQUID) sensors. Substantial research has been carried out in developing the SQUID hardware and in improving the properties of the SPION. However, little research has been done on the pre-processing of sensor signals and the post-processing source modeling in SPMR. In the present study, we illustrate new pre-processing tools that were developed to: 1) remove trials contaminated with artifacts, 2) evaluate and ensure that a single decay process associated with bound SPION exists in the data, 3) automatically detect and correct flux jumps, and 4) accurately fit the sensor signals with different decay models. Furthermore, we developed an automated approach based on a multi-start dipole imaging technique to obtain the locations and magnitudes of multiple magnetic sources, without initial guesses from the users. A regularization process was implemented to solve the ambiguity issue related to the SPMR source variables. A procedure based on a reduced chi-square cost function was introduced to objectively obtain an adequate number of dipoles to describe the data. The new pre-processing tools and the multi-start source imaging approach have been successfully evaluated using phantom data. In conclusion, these tools and the multi-start source modeling approach substantially enhance the accuracy and sensitivity in detecting and localizing sources from SPMR signals. Furthermore, the multi-start approach with regularization provided robust and accurate solutions for a poor SNR condition corresponding to an SPMR detection sensitivity on the order of 1000 cells. We believe such algorithms will help establish industrial standards for SPMR when applying the technique in pre-clinical and clinical settings.
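The decay-model fitting stage can be pictured with a least-squares fit of a single exponential to a synthetic sensor trace, as below. The amplitude, time constant, and noise level are invented for illustration; the study fits and compares several decay models for bound SPION.

```python
import numpy as np
from scipy.optimize import curve_fit

def single_decay(t, amplitude, tau, offset):
    """Single exponential decay, one candidate model for the field from
    bound SPION after the magnetizing pulse (the model choice is
    illustrative; the study compares several decay models)."""
    return amplitude * np.exp(-t / tau) + offset

rng = np.random.default_rng(4)
t = np.linspace(0.0, 2.0, 400)                      # seconds after the pulse
signal = 5.0 * np.exp(-t / 0.6) + 0.2 + rng.normal(0, 0.05, t.size)
params, _ = curve_fit(single_decay, t, signal, p0=(1.0, 1.0, 0.0))
print(np.round(params, 3))                          # ~ [5.0, 0.6, 0.2]
```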
Scare Tactics: Evaluating Problem Decompositions Using Failure Scenarios
NASA Technical Reports Server (NTRS)
Helm, B. Robert; Fickas, Stephen
1992-01-01
Our interest is in the design of multi-agent problem-solving systems, which we refer to as composite systems. We have proposed an approach to composite system design by decomposition of problem statements. An automated assistant called Critter provides a library of reusable design transformations which allow a human analyst to search the space of decompositions for a problem. In this paper we describe a method for evaluating and critiquing problem decompositions generated by this search process. The method uses knowledge stored in the form of failure decompositions attached to design transformations. We suggest the benefits of our critiquing method by showing how it could re-derive steps of a published development example. We then identify several open issues for the method.
Evaluation of Graphite Fiber/Polyimide PMCs from Hot Melt vs Solution Prepreg
NASA Technical Reports Server (NTRS)
Shin, E. Eugene; Sutter, James K.; Eakin, Howard; Inghram, Linda; McCorkle, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Thesken, John; Fink, Jeffrey E.
2002-01-01
Carbon fiber reinforced high temperature polymer matrix composites (PMC) have been extensively investigated as potential weight reduction replacements of various metallic components in next generation high performance propulsion rocket engines. The initial phase involves development of comprehensive composite material-process-structure-design-property-in-service performance correlations and database, especially for a high stiffness facesheet of various sandwich structures. Overview of the program plan, technical approaches and current multi-team efforts will be presented. During composite fabrication, it was found that the two large volume commercial prepregging methods (hot-melt vs. solution) resulted in considerably different composite cure behavior. Details of the process-induced physical and chemical modifications in the prepregs, their effects on composite processing, and systematic cure cycle optimization studies will be discussed. The combined effects of prepregging method and cure cycle modification on composite properties and isothermal aging performance were also evaluated.
Multi-energy Coordinated Evaluation for Energy Internet
NASA Astrophysics Data System (ADS)
Jia, Dongqiang; Sun, Jian; Wang, Cunping; Hong, Xiao; Ma, Xiufan; Xiong, Wenting; Shen, Yaqi
2017-05-01
This paper reviews the current research status of multi-energy coordinated evaluation for the energy Internet. Taking into consideration the coordinated optimization of wind energy, solar energy and other energy sources, 17 evaluation indexes, such as the substitution coefficient of cooling, heat and power, the ratio of wind and solar energy, and the energy storage ratio, were designed from five aspects: the acceptance of renewable energy, the benefits of complementary energy substitution, peak-valley difference, the degree of equipment utilization, and user needs. At the same time, this article attaches importance to the economic and social benefits of the coordination of multiple energy sources. Ultimately, a comprehensive multi-energy coordination evaluation index system for a regional energy Internet was put forward from four aspects: safe operation, coordination and optimization, economic benefits and social benefits; a comprehensive evaluation model was then established. This model uses the optimal combination weighting method based on moment estimation together with the TOPSIS evaluation method, so that both the subjective and objective weights of the indexes are considered and the coordinated evaluation of multiple energy sources is realized. Finally, the soundness of the index system and the validity of the evaluation method are verified by a case analysis.
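The closeness-coefficient computation at the heart of a TOPSIS evaluation can be sketched in a few lines of numpy. The decision matrix and combined weights below are invented for illustration, and all indexes are treated as benefit-type, which simplifies away the paper's moment-estimation weighting step.

```python
import numpy as np

# Decision matrix: rows = regional energy-Internet schemes, columns =
# evaluation indexes (illustrative values; all treated as benefit-type).
X = np.array([
    [0.82, 0.30, 0.65, 0.70],
    [0.75, 0.45, 0.70, 0.60],
    [0.90, 0.25, 0.55, 0.80],
])
w = np.array([0.35, 0.20, 0.25, 0.20])  # assumed combined weights

# 1) vector-normalize, 2) weight, 3) distances to ideal/anti-ideal,
# 4) closeness coefficient (higher = better).
V = w * (X / np.linalg.norm(X, axis=0))
best, worst = V.max(axis=0), V.min(axis=0)
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)
closeness = d_worst / (d_best + d_worst)
print("closeness:", closeness.round(3), "ranking:", np.argsort(-closeness) + 1)
```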
A Multi-Site Cognitive Task Analysis for Biomedical Query Mediation
Hruby, Gregory W.; Rasmussen, Luke V.; Hanauer, David; Patel, Vimla; Cimino, James J.; Weng, Chunhua
2016-01-01
Objective To apply cognitive task analyses of the biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. Materials and Methods We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. Results The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded that the model is representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: "Identify potential index phenotype," "If needed, request EHR database access rights," and "Perform query and present output to medical researcher", and 8 are invalid. Discussion We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the BQM process and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. Conclusions We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. PMID:27435950
Stability analysis for a multi-camera photogrammetric system.
Habib, Ayman; Detchev, Ivan; Kwak, Eunju
2014-08-18
Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of this issue of interior orientation parameter variation over time, explains the common ways used for coping with the issue, and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of changes in the interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the system calibration stability, the proposed methods are simulation-based. Experimental results are shown, where a multi-camera photogrammetric system was calibrated three times, and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data-based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction.
Multiple network alignment via multiMAGNA+.
Vijayan, Vipin; Milenkovic, Tijana
2017-08-21
Network alignment (NA) aims to find a node mapping that identifies topologically or functionally similar network regions between molecular networks of different species. Analogous to genomic sequence alignment, NA can be used to transfer biological knowledge from well- to poorly-studied species between aligned network regions. Pairwise NA (PNA) finds similar regions between two networks while multiple NA (MNA) can align more than two networks. We focus on MNA. Existing MNA methods aim to maximize total similarity over all aligned nodes (node conservation). Then, they evaluate alignment quality by measuring the number of conserved edges, but only after the alignment is constructed. Directly optimizing edge conservation during alignment construction, in addition to node conservation, may result in superior alignments. Thus, we present a novel MNA method called multiMAGNA++ that can achieve this. Indeed, multiMAGNA++ outperforms or is on par with existing MNA methods, while often completing faster than existing methods. That is, multiMAGNA++ scales well to larger network data and can be parallelized effectively. During method evaluation, we also introduce new MNA quality measures to allow for fairer MNA method comparison than the existing alignment quality measures. MultiMAGNA++ code is available on the method's web page at http://nd.edu/~cone/multiMAGNA++/.
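To make the node- versus edge-conservation distinction concrete, here is a toy sketch that counts conserved edges across an alignment's node clusters. It is a simplified conserved-edge count on invented data, not the new MNA quality measures introduced in the paper.

```python
import itertools

# Toy multiple network alignment: each network is a set of edges, and the
# alignment is a list of node clusters (at most one node per network).
networks = [
    {("a1", "a2"), ("a2", "a3")},   # species A
    {("b1", "b2"), ("b2", "b3")},   # species B
    {("c1", "c2")},                 # species C
]
clusters = [["a1", "b1", "c1"], ["a2", "b2", "c2"], ["a3", "b3", None]]

def has_edge(net, u, v):
    return (u, v) in net or (v, u) in net

# Count, over all cluster pairs, how many networks conserve the edge
# between the two clusters.
conserved = 0
for ci, cj in itertools.combinations(range(len(clusters)), 2):
    for k, net in enumerate(networks):
        u, v = clusters[ci][k], clusters[cj][k]
        if u is not None and v is not None and has_edge(net, u, v):
            conserved += 1
print("conserved edge count:", conserved)
```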
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szoka de Valladares, M.R.; Mack, S.
The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation and selection. The H Scan component of this process provides a framework in which a project proposer can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives and will assist in the ranking of individual projects based on the extent to which each contributes to those objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria, the H Scan, and the Analytic Hierarchy Process (AHP).
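The pair-wise comparison step of AHP reduces to extracting the principal eigenvector of a reciprocal judgement matrix. A minimal sketch follows, with an invented 3x3 matrix on Saaty's 1-9 scale and the standard consistency-ratio check (random index RI = 0.58 for n = 3).

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical selection criteria
# (Saaty 1-9 scale, reciprocal); values are illustrative only.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Criterion weights = normalized principal eigenvector of A.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# The consistency ratio guards against incoherent judgements.
n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)
cr = ci / 0.58
print("weights:", w.round(3), "CR:", round(cr, 3))
```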
Fuzzy Logic Approaches to Multi-Objective Decision-Making in Aerospace Applications
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Fuzzy logic allows for the quantitative representation of multi-objective decision-making problems which have vague or fuzzy objectives and parameters. As such, fuzzy logic approaches are well-suited to situations where alternatives must be assessed by using criteria that are subjective and of unequal importance. This paper presents an overview of fuzzy logic and provides sample applications from the aerospace industry. Applications include an evaluation of vendor proposals, an analysis of future space vehicle options, and the selection of a future space propulsion system. On the basis of the results provided in this study, fuzzy logic provides a unique perspective on the decision-making process, allowing the evaluator to assess the degree to which each option meets the evaluation criteria. Future decision-making should take full advantage of fuzzy logic methods to complement existing approaches in the selection of alternatives.
Liang, Yicheng; Peng, Hao
2015-02-07
Depth-of-interaction (DOI) poses a major challenge for a PET system to achieve uniform spatial resolution across the field-of-view, particularly for small animal and organ-dedicated PET systems. In this work, we implemented an analytical method to model system matrix for resolution recovery, which was then incorporated in PET image reconstruction on a graphical processing unit platform, due to its parallel processing capacity. The method utilizes the concepts of virtual DOI layers and multi-ray tracing to calculate the coincidence detection response function for a given line-of-response. The accuracy of the proposed method was validated for a small-bore PET insert to be used for simultaneous PET/MR breast imaging. In addition, the performance comparisons were studied among the following three cases: 1) no physical DOI and no resolution modeling; 2) two physical DOI layers and no resolution modeling; and 3) no physical DOI design but with a different number of virtual DOI layers. The image quality was quantitatively evaluated in terms of spatial resolution (full-width-half-maximum and position offset), contrast recovery coefficient and noise. The results indicate that the proposed method has the potential to be used as an alternative to other physical DOI designs and achieve comparable imaging performances, while reducing detector/system design cost and complexity.
Detecting Water Bodies in LANDSAT8 Oli Image Using Deep Learning
NASA Astrophysics Data System (ADS)
Jiang, W.; He, G.; Long, T.; Ni, Y.
2018-04-01
Identifying water bodies is critical for studies of climate change, water resources, ecosystem services and the hydrological cycle. The multi-layer perceptron (MLP) is a popular and classic method under the deep learning framework for target detection and image classification. Therefore, this study adopts this method to identify water bodies in Landsat 8 imagery. To compare classification performance, maximum likelihood classification and a water index are employed for each study area. The classification results are evaluated using accuracy indices and local comparison. The evaluation results show that the multi-layer perceptron achieves better performance than the other two methods; moreover, thin water bodies can also be clearly identified by the multi-layer perceptron. The proposed method has potential application in mapping global-scale surface water with multi-source medium-to-high resolution satellite data.
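A minimal sklearn sketch of the MLP classification idea: train a small multi-layer perceptron on per-pixel band reflectances and label pixels as water or land. The synthetic reflectances stand in for Landsat 8 OLI training data, which the abstract does not provide; water is simulated with low NIR/SWIR reflectance.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Stand-in for labeled pixels: columns are band reflectances
# (e.g. green, NIR, SWIR1); values are synthetic, not real imagery.
n = 2000
water = rng.normal([0.08, 0.03, 0.02], 0.01, (n // 2, 3))
land = rng.normal([0.10, 0.25, 0.20], 0.04, (n // 2, 3))
X = np.vstack([water, land])
y = np.array([1] * (n // 2) + [0] * (n // 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("pixel accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```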
Here we report results from a multi-laboratory (n=11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium used to detect fecal contamination from birds in coastal environments. The methods included conventional end-point PCR, a SYBR...
Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling
2018-04-01
Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involved an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weights based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with a comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the rankings of the top five and lowest five drugs remain unchanged, suggesting that the model is generally robust. An evidence-based drug evaluation model based on AHP was successfully developed. The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
Prieto, N; Rodriguez-Méndez, M L; Leardi, R; Oliveri, P; Hernando-Esquisabel, D; Iñiguez-Crespo, M; de Saja, J A
2012-03-16
In this study, a multi-way method (Tucker3) was applied to evaluate the performance of an electronic nose for following the ageing of red wines. The odour evaluation carried out with the electronic nose was combined with the quantitative analysis of volatile composition performed by GC-MS, and colour characterisation by UV-visible spectroscopy. Thanks to Tucker3, it was possible to understand connections among data obtained from these three different systems and to estimate the effect of different sources of variability on wine evaluation. In particular, the application of Tucker3 supplied a global visualisation of data structure, which was very informative to understand relationships between sensors responses and chemical composition of wines. The results obtained indicate that the analytical methods employed are useful tools to follow the wine ageing process, to differentiate wine samples according to ageing type (either in barrel or in stainless steel tanks with the addition of small oak wood pieces) and to the origin (French or American) of the oak wood. Finally, it was possible to designate the volatile compounds which play a major role in such a characterisation. Copyright © 2012 Elsevier B.V. All rights reserved.
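Assuming the tensorly package is available, a Tucker3 decomposition of a three-way (samples x variables x ageing times) array takes only a few lines. The synthetic tensor and the chosen ranks below are illustrative, not the study's electronic-nose/GC-MS/UV-visible data.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(5)

# Synthetic three-way array: wines x measured variables x ageing times.
X = tl.tensor(rng.normal(size=(12, 8, 6)))

# Tucker3 model: a small core tensor plus one loading matrix per mode.
core, factors = tucker(X, rank=[2, 3, 2])

# Fraction of variation captured by the low-rank reconstruction.
X_hat = tl.tucker_to_tensor((core, factors))
explained = 1 - (tl.norm(X - X_hat) / tl.norm(X)) ** 2
print("core shape:", core.shape, "variance explained:", float(explained))
```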
Application of multi-grid method on the simulation of incremental forging processes
NASA Astrophysics Data System (ADS)
Ramadan, Mohamad; Khaled, Mahmoud; Fourment, Lionel
2016-10-01
Numerical simulation has become essential in manufacturing large parts by incremental forging processes. It is a powerful tool for revealing physical phenomena, but behind the scenes an expensive bill must be paid: computational time. That is why many techniques have been developed to decrease the computational time of numerical simulation. The multi-grid method is a numerical procedure that reduces the computational time of numerical calculations by solving the system of equations on several meshes of decreasing size, which smooths both the low and high frequencies of the solution faster. In this paper a multi-grid method is applied to the cogging process in the software Forge 3. The study is carried out using an increasing number of degrees of freedom. The results show that calculation time is halved for a mesh of 39,000 nodes. The method is promising, especially if coupled with the multi-mesh method.
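The multi-grid idea sketched in the abstract — smoothing on a fine mesh and correcting the low-frequency error on a coarser one — can be shown on a 1D Poisson problem. This two-grid toy (damped Jacobi smoothing, full-weighting restriction, linear interpolation, exact coarse solve) is only a sketch of the principle, not the 3D finite-element implementation in Forge 3.

```python
import numpy as np

def poisson_matrix(n, h):
    # Standard 1D finite-difference Laplacian with Dirichlet boundaries.
    return (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

def jacobi(A, b, x, sweeps=3, omega=2/3):
    d = np.diag(A)
    for _ in range(sweeps):
        x = x + omega * (b - A @ x) / d
    return x

nf = 31                                   # fine-grid interior points
hf = 1.0 / (nf + 1)
Af, b, x = poisson_matrix(nf, hf), np.ones(nf), np.zeros(nf)

for _ in range(10):
    x = jacobi(Af, b, x)                                       # pre-smooth
    r = b - Af @ x                                             # residual
    rc = 0.25 * r[0:-1:2] + 0.5 * r[1::2] + 0.25 * r[2::2]     # restrict
    ec = np.linalg.solve(poisson_matrix(rc.size, 2 * hf), rc)  # coarse solve
    e = np.zeros(nf)                                           # prolongate
    e[1::2] = ec
    e[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])
    e[0], e[-1] = 0.5 * ec[0], 0.5 * ec[-1]
    x = jacobi(Af, b, x + e)                        # correct, post-smooth

print("final residual norm:", np.linalg.norm(b - Af @ x))
```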
Vego, Goran; Kucar-Dragicević, Savka; Koprivanac, Natalija
2008-11-01
The efficiency of providing a waste management system for the coastal part of Croatia, consisting of four Dalmatian counties, has been modelled. Two multi-criteria decision-making (MCDM) methods, PROMETHEE and GAIA, were applied to assist with the systematic analysis and evaluation of the alternatives. The analysis covered two levels: first, the potential number of waste management centres resulting from possible inter-county cooperation; and second, the relative merits of siting waste management centres in the coastal or hinterland zone. The problem was analysed according to several criteria, and ecological, economic, social and functional criteria sets were identified as relevant to the decision-making process. The PROMETHEE and GAIA methods were shown to be efficient tools for analysing the problem considered. Such an approach provided new insights into waste management planning at the strategic level, and gave a reason for rethinking some of the existing strategic waste management documents in Croatia.
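A PROMETHEE II ranking reduces to pair-wise preference aggregation followed by leaving, entering, and net flows. The sketch below uses the simple 'usual' preference function and invented alternative scores; the study's actual criteria sets and preference functions are richer.

```python
import numpy as np

# Illustrative waste management alternatives scored on four benefit
# criteria (higher = better); weights are invented.
X = np.array([
    [0.6, 0.4, 0.7, 0.5],   # alternative 1
    [0.8, 0.3, 0.5, 0.6],   # alternative 2
    [0.5, 0.7, 0.6, 0.4],   # alternative 3
])
w = np.array([0.3, 0.3, 0.2, 0.2])

n = X.shape[0]
pi = np.zeros((n, n))                # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        if a != b:
            # 'usual' criterion: preference 1 whenever a strictly beats b
            pi[a, b] = np.sum(w * (X[a] > X[b]))

phi_plus = pi.sum(axis=1) / (n - 1)   # leaving flow
phi_minus = pi.sum(axis=0) / (n - 1)  # entering flow
phi = phi_plus - phi_minus            # net outranking flow
print("net flows:", phi.round(3), "ranking:", np.argsort(-phi) + 1)
```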
Evaluation of four inch diameter VGF-Ge substrates used for manufacturing multi-junction solar cell
NASA Astrophysics Data System (ADS)
Kewei, Cao; Tong, Liu; Jingming, Liu; Hui, Xie; Dongyan, Tao; Youwen, Zhao; Zhiyuan, Dong; Feng, Hui
2016-06-01
Low dislocation density Ge wafers grown by a vertical gradient freeze (VGF) method used for the fabrication of multi-junction photovoltaic cells (MJC) have been studied by a whole wafer scale measurement of the lattice parameter, X-ray rocking curves, etch pit density (EPD), impurities concentration, minority carrier lifetime and residual stress. Impurity content in the VGF-Ge wafers, including that of B, is quite low although B2O3 encapsulation is used in the growth process. An obvious difference exists across the whole wafer regarding the distribution of etch pit density, lattice parameter, full width at half maximum (FWHM) of the X-ray rocking curve and residual stress measured by Raman spectra. These are in contrast to a reference Ge substrate wafer grown by the Cz method. The influence of the VGF-Ge substrate on the performance of the MJC is analyzed and evaluated by a comparison of the statistical results of cell parameters. Project supported by the National Natural Science Foundation of China (No. 61474104).
Multi-task Gaussian process for imputing missing data in multi-trait and multi-environment trials.
Hori, Tomoaki; Montcho, David; Agbangla, Clement; Ebana, Kaworu; Futakuchi, Koichi; Iwata, Hiroyoshi
2016-11-01
A method based on a multi-task Gaussian process using self-measuring similarity gave increased accuracy for imputing missing phenotypic data in multi-trait and multi-environment trials. Multi-environmental trial (MET) data often encounter the problem of missing data. Accurate imputation of missing data makes subsequent analysis more effective and the results easier to understand. Moreover, accurate imputation may help to reduce the cost of phenotyping for thinned-out lines tested in METs. METs are generally performed for multiple traits that are correlated to each other. Correlation among traits can be useful information for imputation, but single-trait-based methods cannot utilize information shared by traits that are correlated. In this paper, we propose imputation methods based on a multi-task Gaussian process (MTGP) using self-measuring similarity kernels reflecting relationships among traits, genotypes, and environments. This framework allows us to use genetic correlation among multi-trait multi-environment data and also to combine MET data and marker genotype data. We compared the accuracy of three MTGP methods and iterative regularized PCA using rice MET data. Two scenarios for the generation of missing data at various missing rates were considered. The MTGP methods achieved better imputation accuracy than regularized PCA, especially at high missing rates. Under the 'uniform' scenario, in which missing data arise randomly, inclusion of marker genotype data in the imputation increased the imputation accuracy at high missing rates. Under the 'fiber' scenario, in which missing data arise in all traits for some combinations of genotypes and environments, the inclusion of marker genotype data decreased the imputation accuracy for most traits while increasing the accuracy in a few traits remarkably. The proposed methods will be useful for solving the missing data problem in MET data.
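As a hedged stand-in for the paper's multi-task GP (the self-measuring similarity kernels are not reproduced here), the sketch below imputes one trait from a correlated trait with an ordinary single-task Gaussian process in scikit-learn and reports the imputation RMSE on synthetic data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Toy MET table: rows = genotype x environment combinations; a correlated
# secondary trait serves as the input feature. Single-task stand-in, not
# a reimplementation of the paper's MTGP.
n = 120
trait_a = rng.normal(0, 1, n)
trait_b = 0.8 * trait_a + rng.normal(0, 0.3, n)   # trait to be imputed

missing = rng.random(n) < 0.3                     # 30% missing at random
gp = GaussianProcessRegressor(RBF() + WhiteKernel(), normalize_y=True)
gp.fit(trait_a[~missing, None], trait_b[~missing])
imputed = gp.predict(trait_a[missing, None])

rmse = np.sqrt(np.mean((imputed - trait_b[missing]) ** 2))
print("imputation RMSE:", round(float(rmse), 3))
```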
Evaluation of accelerometer based multi-sensor versus single-sensor activity recognition systems.
Gao, Lei; Bourke, A K; Nelson, John
2014-06-01
Physical activity has a positive impact on people's well-being and has been shown to decrease the occurrence of chronic diseases in the older adult population. To date, a substantial number of research studies exist that focus on activity recognition using inertial sensors. Many of these studies adopt a single-sensor approach and focus on proposing novel features combined with complex classifiers to improve the overall recognition accuracy. In addition, the implementation of advanced feature extraction algorithms and complex classifiers exceeds the computing ability of most current wearable sensor platforms. This paper proposes a method that adopts multiple sensors at distributed body locations to overcome this problem. The objective of the proposed system is to achieve higher recognition accuracy with "light-weight" signal processing algorithms, which run on a distributed-computing-based sensor system comprised of computationally efficient nodes. For analysing and evaluating the multi-sensor system, eight subjects were recruited to perform eight normal scripted activities in different life scenarios, each repeated three times. Thus a total of 192 activities were recorded, resulting in 864 separate annotated activity states. The design of such a multi-sensor system required consideration of the following: signal pre-processing algorithms, sampling rate, feature selection and classifier selection. Each was investigated and the most appropriate approach selected to achieve a trade-off between recognition accuracy and computing execution time. A comparison of six different systems, which employ single or multiple sensors, is presented. The experimental results illustrate that the proposed multi-sensor system can achieve an overall recognition accuracy of 96.4% by adopting the mean and variance features, using the Decision Tree classifier. The results demonstrate that elaborate classifiers and feature sets are not required to achieve high recognition accuracies on a multi-sensor system. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
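The paper's light-weight feature set (per-window mean and variance) combined with a Decision Tree can be sketched directly in scikit-learn. The synthetic windows below stand in for the annotated accelerometer recordings, which are not available here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Synthetic stand-in for windowed tri-axial accelerometer data from two
# body-worn nodes (6 axes): keep only per-window mean and variance,
# mirroring the light-weight feature set described in the abstract.
def windows(activity_level, n_win=200, win_len=50, n_axes=6):
    raw = rng.normal(0, activity_level, (n_win, win_len, n_axes))
    return np.concatenate([raw.mean(axis=1), raw.var(axis=1)], axis=1)

X = np.vstack([windows(0.2), windows(1.0), windows(2.5)])  # 3 activities
y = np.repeat([0, 1, 2], 200)

clf = DecisionTreeClassifier(random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```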
Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models
2002-03-01
...such as the weighted sum method, the weighted product method, and the Analytic Hierarchy Process (AHP). This research focuses on only weighted sum... different groups. They can be termed deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the... weighted product model (WPM), and analytic hierarchy process (AHP). His method attempts to identify the most important criteria weight and the most...
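For reference, the weighted sum model the fragment mentions is only a normalization, weighting, and summation. The sketch below also shows the simplest form of the sensitivity analysis the report's title refers to — perturbing the weights and re-ranking. All data and weights are invented.

```python
import numpy as np

# Weighted sum model (WSM): normalize each criterion column, multiply by
# the criteria weights, and sum; illustrative numbers only.
X = np.array([
    [250.0, 7.0, 0.90],
    [200.0, 9.0, 0.80],
    [300.0, 6.0, 0.85],
])
w = np.array([0.5, 0.3, 0.2])

scores = (X / X.sum(axis=0)) @ w
print("WSM scores:", scores.round(3), "best:", scores.argmax() + 1)

# Simple sensitivity check: perturb the weights and watch the ranking.
w2 = np.array([0.3, 0.5, 0.2])
scores2 = (X / X.sum(axis=0)) @ w2
print("perturbed: ", scores2.round(3), "best:", scores2.argmax() + 1)
```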
Fan, Feiyi; Yan, Yuepeng; Tang, Yongzhong; Zhang, Hao
2017-12-01
Monitoring pulse oxygen saturation (SpO2) and heart rate (HR) using a photoplethysmography (PPG) signal contaminated by motion artifacts (MA) remains a difficult problem, especially when the oximeter is not equipped with a 3-axis accelerometer for adaptive noise cancellation. In this paper, we report a pioneering investigation of the impact of altering the frame length of Molgedey and Schuster independent component analysis (ICAMS) on performance, design a multi-classifier fusion strategy for selecting the PPG-correlated signal component, and propose a novel approach to extract SpO2 and HR readings from PPG signals contaminated by strong MA interference. The algorithm comprises multiple stages, including dual-frame-length ICAMS, a multi-classifier-based PPG-correlated component selector, line spectral analysis, tree-based HR monitoring, and post-processing. Our approach is evaluated by multi-subject tests. The root mean square error (RMSE) is calculated for each trial. Three statistical metrics are selected as performance evaluation criteria: mean RMSE, median RMSE and the standard deviation (SD) of RMSE. The experimental results demonstrate that a shorter ICAMS analysis window probably results in better performance in SpO2 estimation. Notably, the designed multi-classifier signal component selector achieved satisfactory performance. The subject tests indicate that our algorithm outperforms other baseline methods regarding accuracy under most criteria. The proposed work can contribute to improving the performance of current pulse oximetry and personal wearable monitoring devices. Copyright © 2017 Elsevier Ltd. All rights reserved.
Yao, Bao-Guo; Peng, Yun-Liang; Zhang, De-Pin
2017-01-01
Porous polymeric materials, such as textile fabrics, are elastic and widely used in our daily life for garment and household products. The mechanical and dynamic heat transfer properties of porous polymeric materials, which describe the sensations during the contact process between porous polymeric materials and parts of the human body, such as the hand, primarily influence comfort sensations and aesthetic qualities of clothing. A multi-sensory measurement system and a new method were proposed to simultaneously sense the contact and characterize the mechanical and dynamic heat transfer properties of porous polymeric materials, such as textile fabrics in one instrument, with consideration of the interactions between different aspects of contact feels. The multi-sensory measurement system was developed for simulating the dynamic contact and psychological judgment processes during human hand contact with porous polymeric materials, and measuring the surface smoothness, compression resilience, bending and twisting, and dynamic heat transfer signals simultaneously. The contact sensing principle and the evaluation methods were presented. Twelve typical sample materials with different structural parameters were measured. The results of the experiments and the interpretation of the test results were described. An analysis of the variance and a capacity study were investigated to determine the significance of differences among the test materials and to assess the gage repeatability and reproducibility. A correlation analysis was conducted by comparing the test results of this measurement system with the results of Kawabata Evaluation System (KES) in separate instruments. This multi-sensory measurement system provides a new method for simultaneous contact sensing and characterizing of mechanical and dynamic heat transfer properties of porous polymeric materials. PMID:29084152
Systematic cloning of human minisatellites from ordered array charomid libraries.
Armour, J A; Povey, S; Jeremiah, S; Jeffreys, A J
1990-11-01
We present a rapid and efficient method for the isolation of minisatellite loci from human DNA. The method combines cloning a size-selected fraction of human MboI DNA fragments in a charomid vector with hybridization screening of the library in ordered array. Size-selection of large MboI fragments enriches for the longer, more variable minisatellites and reduces the size of the library required. The library was screened with a series of multi-locus probes known to detect a large number of hypervariable loci in human DNA. The gridded library allowed both the rapid processing of positive clones and the comparative evaluation of the different multi-locus probes used, in terms of both the relative success in detecting hypervariable loci and the degree of overlap between the sets of loci detected. We report 23 new human minisatellite loci isolated by this method, which map to 14 autosomes and the sex chromosomes.
NASA Astrophysics Data System (ADS)
Pan, Minqiang; Zeng, Dehuai; Tang, Yong
A novel multi-cutter milling process for multiple parallel microchannels with manifolds is proposed to address the challenge of mass manufacturing as required for cost-effective commercial applications. Several slotting cutters are stacked together to form a composite tool for machining microchannels simultaneously. The feasibility of this new fabrication process is experimentally investigated under different machining conditions, along with the reaction characteristics of methanol steam reforming for hydrogen production. The influences of the cutting parameters and the composite tool on microchannel quality and burr formation are analyzed. Experimental results indicate that a larger cutting speed together with a smaller feed rate and cutting depth favors relatively good microchannel quality and small burrs. Of all the cutting parameters considered in these experiments, a cutting speed of 94.2 m min⁻¹, a feed rate of 23.5 mm min⁻¹ and a cutting depth of 0.5 mm are found to be the optimum values. Comparing the experimental results of the multi-cutter milling process with estimates for other alternative methods shows that multi-cutter milling achieves much shorter machining times and a higher work removal rate. The reaction characteristics of methanol steam reforming in the microchannels also indicate that the multi-cutter milling process is probably suitable for commercial application.
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2000-01-01
Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely, and engines for new aircraft, are progressively required to operate under more demanding technological and environmental requirements. Designs to effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to openness towards the various existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as the Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission and coupled structural/thermal, various composite property simulators and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of this paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine- and aircraft-type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and improves reliability by an order of magnitude. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.
Project evaluation and selection using fuzzy Delphi method and zero - one goal programming
NASA Astrophysics Data System (ADS)
Alias, Suriana; Adna, Nofarziah; Arsad, Roslah; Soid, Siti Khuzaimah; Ali, Zaileha Md
2014-12-01
Project evaluation and selection is an important concern for a board of directors trying to maximize all possible goals. Assessment of the problems occurring in an organization's plan is the first phase of the decision-making process. The company needs a group of experts to evaluate the problems. The Fuzzy Delphi Method (FDM) is a systematic procedure for eliciting the group's opinion in order to best evaluate project performance. This paper proposes evaluation and selection of the best alternative project based on a combination of FDM and a Zero-One Goal Programming (ZOGP) formulation. ZOGP is used to solve the multi-criteria decision-making problem in the final decision stage, using the optimization software LINDO 6.1. An empirical example of an ongoing decision-making project in Johor, Malaysia is implemented as a case study.
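A zero-one goal program of this size can be illustrated by brute-force enumeration rather than a solver like LINDO. The deviation variables below (over-budget d+ and under-benefit d-) are weighted equally, and all numbers are invented.

```python
import itertools
import numpy as np

# Zero-one goal programming by enumeration (fine for a handful of
# projects; a dedicated solver is needed at scale). Illustrative data.
cost = np.array([40, 60, 30, 70])        # project costs
benefit = np.array([50, 80, 35, 90])     # project benefits
budget_goal, benefit_goal = 100, 150

best, best_dev = None, np.inf
for bits in itertools.product([0, 1], repeat=4):
    x = np.array(bits)
    over_budget = max(0, cost @ x - budget_goal)        # d+ on budget
    under_benefit = max(0, benefit_goal - benefit @ x)  # d- on benefit
    dev = 1.0 * over_budget + 1.0 * under_benefit       # equal priorities
    if dev < best_dev:
        best, best_dev = x, dev
print("selection:", best, "total weighted deviation:", best_dev)
```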
Tučník, Petr; Bureš, Vladimír
2016-01-01
Multi-criteria decision-making (MCDM) can be formally implemented by various methods. This study compares the suitability of four selected MCDM methods, namely WPM, TOPSIS, VIKOR, and PROMETHEE, for future applications in agent-based computational economic (ACE) models of larger scale (i.e., over 10,000 agents in one geographical region). These four MCDM methods were selected according to their appropriateness for computational processing in ACE applications. Tests of the selected methods were conducted on four hardware configurations. For each method, 100 tests were performed, which represented one testing iteration. With four testing iterations conducted on each hardware setting and separate testing of all configurations with the -server parameter activated or deactivated, altogether 12,800 data points were collected and subsequently analyzed. An illustrative decision-making scenario was used that allows mutual comparison of all the selected decision-making methods. Our test results suggest that although all methods are convenient and can be used in practice, the VIKOR method completed the tests with the best results and thus can be recommended as the most suitable for simulations of large-scale agent-based models.
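A compact VIKOR sketch, since that method came out on top in the tests: compute the group utility S, the individual regret R, and the compromise index Q (here with majority weight v = 0.5). The decision matrix and weights are illustrative and all criteria are treated as benefit-type.

```python
import numpy as np

# Illustrative alternatives (rows) scored on three benefit criteria.
X = np.array([
    [0.70, 0.50, 0.80],
    [0.60, 0.90, 0.60],
    [0.85, 0.40, 0.70],
])
w = np.array([0.4, 0.35, 0.25])
v = 0.5                                  # weight of the majority strategy

f_best, f_worst = X.max(axis=0), X.min(axis=0)
D = w * (f_best - X) / (f_best - f_worst)   # weighted normalized distances
S, R = D.sum(axis=1), D.max(axis=1)         # group utility, max regret
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))
print("Q:", Q.round(3), "ranking (best first):", np.argsort(Q) + 1)
```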
Using practice development methodology to develop children's centre teams: ideas for the future.
Hemingway, Ann; Cowdell, Fiona
2009-09-01
The Children's Centre Programme is a recent development in the UK and brings together multi-agency teams to work with disadvantaged families. Practice development methods enable teams to work together in new ways. Although the term practice development remains relatively poorly defined, its key properties suggest that it embraces engagement, empowerment, evaluation and evolution. This paper introduces the Children's Centre Programme and practice development methods and aims to discuss the relevance of using this method to develop teams in children's centres through considering the findings from an evaluation of a two-year project to develop inter-agency public health teams. The evaluation showed that practice development methods can enable successful team development and showed that through effective facilitation, teams can change their practice to focus on areas of local need. The team came up with their own process to develop a strategy for their locality.
NASA Astrophysics Data System (ADS)
Huang, Xiangsheng; Zhong, Mingqiu; Li, Ying; Yang, Hongyuan
2018-05-01
High-power offshore wind turbines are in an early stage of development, so establishing a scientific and impartial performance evaluation system for offshore wind turbines is key to the healthy development of the industry. This paper adopts the methods of multi-level analysis and site testing, which can reduce the impact of human factors on the evaluation to the greatest extent. A more reasonable judging criterion for the relative importance of different factors at the same criterion level is also put forward, leading to a more scientific and fair evaluation system for high-power offshore wind turbines.
NASA Astrophysics Data System (ADS)
Xuan, Li; He, Bin; Hu, Li-Fa; Li, Da-Yu; Xu, Huan-Yu; Zhang, Xing-Yun; Wang, Shao-Xin; Wang, Yu-Kun; Yang, Cheng-Liang; Cao, Zhao-Liang; Mu, Quan-Quan; Lu, Xing-Hai
2016-09-01
Multi-conjugate adaptive optics (MCAO) systems have been investigated and used in large-aperture optical telescopes for high-resolution imaging with a large field of view (FOV). The atmospheric tomographic phase reconstruction and the projection of the three-dimensional turbulence volume onto wavefront correctors, such as deformable mirrors (DMs) or liquid crystal wavefront correctors (LCWCs), is a very important step in the data processing of an MCAO controller. In this paper, a method based on the wavefront reconstruction performance of MCAO is presented to evaluate the optimized configuration of multiple laser guide stars (LGSs) and the reasonable conjugation heights of LCWCs. Analytical formulations are derived for the different configurations and are used to generate optimized parameters for MCAO. Several examples are given to demonstrate our LGS configuration optimization method. Compared with traditional methods, our method has minimum wavefront tomographic error, which will be helpful for achieving higher imaging resolution over a large FOV in MCAO. Project supported by the National Natural Science Foundation of China (Grant Nos. 11174274, 11174279, 61205021, 11204299, 61475152, and 61405194) and the State Key Laboratory of Applied Optics, Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences.
Kahl, Johannes; Bodroza-Solarov, Marija; Busscher, Nicolaas; Hajslova, Jana; Kneifel, Wolfgang; Kokornaczyk, Maria Olga; van Ruth, Saskia; Schulzova, Vera; Stolz, Peter
2014-10-01
Organic food quality determination needs multi-dimensional evaluation tools. The main focus is on authentication as an analytical verification of the certification process. New fingerprinting approaches such as ultra-performance liquid chromatography-mass spectrometry, gas chromatography-mass spectrometry, direct analysis in real time-high-resolution mass spectrometry, as well as crystallization with and without the presence of additives, seem to be promising methods in terms of analysis time and the detection of organic system-related parameters. For further methodological development, a systems approach is recommended, which also takes into account food structure aspects. Furthermore, the authentication of processed organic samples needs more attention, since most organic food is complex and processed. © 2013 Society of Chemical Industry.
Schaarup, Clara; Hartvigsen, Gunnar; Larsen, Lars Bo; Tan, Zheng-Hua; Årsand, Eirik; Hejlesen, Ole Kristian
2015-01-01
The Online Diabetes Exercise System was developed to motivate people with Type 2 diabetes to do a 25-minute low-volume high-intensity interval training program. In a previous multi-method evaluation of the system, several usability issues were identified and corrected. Despite the thorough testing, it was unclear whether all usability problems had been identified using the multi-method evaluation. Our hypothesis was that adding eye-tracking triangulation to the multi-method evaluation would increase the accuracy and completeness of the usability testing of the system. The study design was an eye-tracking triangulation: conventional eye-tracking with predefined tasks followed by the Post-Experience Eye-Tracked Protocol (PEEP). Six Areas of Interest were the basis for the PEEP session. The eye-tracking triangulation gave objective and subjective results, which are believed to be highly relevant for designing, implementing, evaluating and optimizing systems in the field of health informatics. Future work should include testing the method on a larger and more representative group of users and applying the method to different system types.
Multi-Model Ensemble Wake Vortex Prediction
NASA Technical Reports Server (NTRS)
Koerner, Stephan; Holzaepfel, Frank; Ahmad, Nash'at N.
2015-01-01
Several multi-model ensemble methods are investigated for predicting wake vortex transport and decay. This study is a joint effort between the National Aeronautics and Space Administration and Deutsches Zentrum fuer Luft- und Raumfahrt to develop a multi-model ensemble capability using their wake models. An overview of different multi-model ensemble methods and their feasibility for wake applications is presented. The methods include Reliability Ensemble Averaging, Bayesian Model Averaging, and Monte Carlo Simulations. The methodologies are evaluated using data from wake vortex field experiments.
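One simple member of the ensemble family discussed — reliability-style weighting, where each model's forecast is weighted by its inverse historical error — can be sketched as follows. The error statistics and forecasts are invented, not outputs of the cited wake models.

```python
import numpy as np

# Weight each wake model by the inverse of its historical prediction
# error, then combine the current forecasts (illustrative numbers).
hist_rmse = np.array([0.30, 0.15, 0.45])   # past error per model
forecast = np.array([42.0, 45.0, 38.0])    # e.g. vortex descent in m

w = (1.0 / hist_rmse) / np.sum(1.0 / hist_rmse)
print("weights:", w.round(3))
print("ensemble forecast:", round(float(w @ forecast), 2))
```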
Car-to-pedestrian collision reconstruction with injury as an evaluation index.
Weng, Yiliu; Jin, Xianlong; Zhao, Zhijie; Zhang, Xiaoyun
2010-07-01
Reconstruction of accidents is currently considered a useful means of accident analysis. Using multi-body dynamics and numerical methods, and adopting vehicle and pedestrian models, the scenario of a crash can often be simulated. When reconstructing collisions, questions often arise regarding the criteria for evaluating simulation results. This paper proposes a reconstruction method for car-to-pedestrian collisions based on the injuries of the pedestrians. In this method, pedestrian injury becomes a critical index for judging the correctness of the reconstruction result and guiding the simulation process. Application of this method to a real accident case is also presented in this paper. The study showed good agreement between injuries obtained by numerical simulation and those established by forensic identification. Copyright 2010 Elsevier Ltd. All rights reserved.
Karagiannidis, A; Perkoulidis, G
2009-04-01
This paper describes a conceptual framework and methodological tool developed for the evaluation of different anaerobic digestion technologies suitable for treating the organic fraction of municipal solid waste, by introducing the multi-criteria decision support method Electre III and demonstrating its applicability via a test application. Several anaerobic digestion technologies have been proposed over the last years; when compared to biogas recovery from landfills, their advantages are stability in biogas production and stabilization of waste prior to final disposal. Anaerobic digestion technologies also show great adaptability to a broad spectrum of input materials besides the organic fraction of municipal solid waste (e.g. agricultural and animal wastes, sewage sludge) and can also be used in remote and isolated communities, either stand-alone or in conjunction with other renewable energy sources. The main driver for this work was the preliminary screening of such methods for potential application on Hellenic islands in the municipal solid waste management sector. Anaerobic digestion technologies follow different approaches to the anaerobic digestion process and can also include the production of compost. In the presented multi-criteria analysis exercise, Electre III is implemented for comparing and ranking five selected alternative anaerobic digestion technologies. The results of a sensitivity analysis are then discussed. In conclusion, the performed multi-criteria approach was found to be a practical and feasible method for the integrated assessment and ranking of anaerobic digestion technologies, one which also considers different viewpoints and other uncertainties of the decision-making process.
Fuzzy MCDM Technique for Planning the Environment Watershed
NASA Astrophysics Data System (ADS)
Chen, Yi-Chun; Lien, Hui-Pang; Tzeng, Gwo-Hshiung; Yang, Lung-Shih; Yen, Leon
In the real world, decision-making problems are often vague and uncertain in a number of ways. Most criteria have interdependent and interactive features, so they cannot be evaluated by conventional measurement methods. Thus, to approximate the human subjective evaluation process, it is more suitable to apply a fuzzy method to environment-watershed planning. This paper describes the design of a fuzzy decision support system using a multi-criteria analysis approach for selecting the best plan alternatives or strategies for an environment watershed. The Fuzzy Analytic Hierarchy Process (FAHP) method is used to determine the preference weightings of criteria for decision makers by subjective perception. A questionnaire was used to gather opinions from three related groups comprising fifteen experts. Subjectivity and vagueness in the criteria and alternatives are dealt with in the selection process and simulation results by using fuzzy numbers with linguistic terms. Incorporating the decision makers' attitudes towards preference, an overall performance value for each alternative can be obtained based on the concept of Fuzzy Multiple Criteria Decision Making (FMCDM). An example consisting of five alternatives, solicited from environment-watershed planning work in Taiwan, is illustrated to demonstrate the effectiveness and usefulness of the proposed approach.
Li, Dongming; Sun, Changming; Yang, Jinhua; Liu, Huan; Peng, Jiaqi; Zhang, Lijuan
2017-04-06
An adaptive optics (AO) system provides real-time compensation for atmospheric turbulence. However, an AO image is usually of poor contrast because of the nature of the imaging process, meaning that the image contains information coming from both out-of-focus and in-focus planes of the object, which also brings about a loss in quality. In this paper, we present a robust multi-frame adaptive optics image restoration algorithm via maximum likelihood estimation. Our proposed algorithm uses a maximum likelihood method with image regularization as the basic principle, and constructs the joint log likelihood function for multi-frame AO images based on a Poisson distribution model. To begin with, a frame selection method based on image variance is applied to the observed multi-frame AO images to select images with better quality to improve the convergence of a blind deconvolution algorithm. Then, by combining the imaging conditions and the AO system properties, a point spread function estimation model is built. Finally, we develop our iterative solutions for AO image restoration addressing the joint deconvolution issue. We conduct a number of experiments to evaluate the performances of our proposed algorithm. Experimental results show that our algorithm produces accurate AO image restoration results and outperforms the current state-of-the-art blind deconvolution methods.
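The Poisson maximum-likelihood core of such restorations is the classic Richardson-Lucy iteration. The sketch below shows it for a single frame with a known PSF on synthetic data, whereas the paper's algorithm is a multi-frame blind variant with frame selection, regularization, and PSF estimation.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(3)

# Synthetic scene, toy blur kernel, and Poisson-noisy observation.
truth = np.zeros((32, 32))
truth[10:14, 18:22] = 5.0
psf = np.ones((5, 5)) / 25.0
blurred = convolve2d(truth, psf, mode="same")
observed = rng.poisson(blurred * 50) / 50.0

# Richardson-Lucy fixed point of the Poisson log-likelihood:
# est <- est * H^T( observed / (H est) ).
est = np.full_like(observed, observed.mean())
psf_flip = psf[::-1, ::-1]
for _ in range(30):
    denom = convolve2d(est, psf, mode="same") + 1e-12
    est *= convolve2d(observed / denom, psf_flip, mode="same")

print("restoration RMSE:", round(float(np.sqrt(np.mean((est - truth) ** 2))), 4))
```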
Luxton, David D; Thomas, Elissa K; Chipps, Joan; Relova, Rona M; Brown, Daphne; McLay, Robert; Lee, Tina T; Nakama, Helenna; Smolenski, Derek J
2014-03-01
Caring letters is a suicide prevention intervention that entails sending brief messages espousing caring concern to patients following discharge from treatment. First tested more than four decades ago, this intervention is one of the only interventions shown in a randomized controlled trial to reduce suicide mortality rates. Given the elevated suicide risk among patients following psychiatric hospitalization and the steady increase in suicide rates among U.S. military personnel, it is imperative to test interventions that may help prevent suicide among high-risk military personnel and veterans. This paper describes the design, methods, study protocol, and regulatory implementation processes for a multi-site randomized controlled trial that aims to evaluate the effectiveness of a caring emails intervention for suicide prevention in the military and VA healthcare systems. The primary outcome is the suicide mortality rate determined 24 months post-discharge from the index hospital stay. Healthcare re-utilization rates will also be evaluated and comprehensive data will be collected regarding suicide risk factors. Recommendations for navigating the military and VA research regulatory processes and implementing a multi-site clinical trial at military and VA hospitals are discussed. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Vajedian, S.; Motagh, M.; Nilfouroushan, F.
2013-09-01
InSAR's capacity to detect slow deformation over terrain areas is limited by temporal and geometric decorrelation. Multi-temporal InSAR techniques such as Persistent Scatterer InSAR (PS-InSAR) and Small Baseline (SBAS) have recently been developed to compensate for these decorrelation problems. Geometric decorrelation in mountainous areas, especially for Envisat images, makes the phase unwrapping process difficult. To address this, we first modified the phase filtering to make the wrapped phase image as smooth as possible. In addition, to improve the unwrapping results, a modified unwrapping method was developed that includes removing possible orbital and tropospheric effects: topographic correction is done within three-dimensional unwrapping, while orbital and tropospheric corrections are applied after the unwrapping process. To evaluate the effectiveness of our improved method, we tested the proposed algorithm on Envisat and ALOS datasets and compared our results with a recently developed PS software package (StaMPS). In addition, we used GPS observations to evaluate the modified method. The results indicate that our method improves the estimated deformation significantly.
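As a minimal illustration of the operation being improved here (not the authors' three-dimensional method), one-dimensional phase unwrapping on a clean synthetic ramp looks like the sketch below; noise and decorrelation are exactly what break the jump-detection rule it uses, which is why the filtering step above matters.

```python
import numpy as np

# Synthetic interferometric phase ramp: the true phase grows past 2*pi,
# but an interferogram only records it wrapped into (-pi, pi].
true_phase = np.linspace(0.0, 6.0 * np.pi, 200)
wrapped = np.angle(np.exp(1j * true_phase))

# np.unwrap adds +/-2*pi steps wherever adjacent samples jump by more than pi.
unwrapped = np.unwrap(wrapped)

print(np.allclose(unwrapped, true_phase))  # True for this noise-free ramp
```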
Using Multi-Objective Genetic Programming to Synthesize Stochastic Processes
NASA Astrophysics Data System (ADS)
Ross, Brian; Imada, Janine
Genetic programming is used to automatically construct stochastic processes written in the stochastic π-calculus. Grammar-guided genetic programming constrains the search to useful process algebra structures. The time-series behaviour of a target process is denoted with a suitable selection of statistical feature tests. Feature tests permit complex process behaviours to be evaluated effectively; however, they must be selected with care in order to accurately characterize the desired process behaviour. Multi-objective evaluation is shown to be appropriate for this application, since it permits heterogeneous statistical feature tests to reside as independent objectives. Multiple non-dominated solutions can be saved and evaluated after a run, to determine those that are most appropriate. Since there can be a vast number of candidate solutions, however, strategies for filtering and analyzing this set are required.
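The multi-objective selection step described above amounts to keeping the non-dominated set of candidates. A minimal sketch of that filter, assuming each candidate is scored by a vector of feature-test errors to be minimized (scores below are illustrative):

```python
def dominates(a, b):
    """a dominates b if a is no worse on every objective and strictly better on one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scores):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in scores if not any(dominates(t, s) for t in scores if t is not s)]

# Candidate processes scored on two statistical feature tests (lower = better fit).
candidates = [(0.2, 0.9), (0.4, 0.4), (0.3, 0.95), (0.9, 0.1), (0.5, 0.5)]
print(pareto_front(candidates))  # [(0.2, 0.9), (0.4, 0.4), (0.9, 0.1)]
```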
Multi-time scale energy management of wind farms based on comprehensive evaluation technology
NASA Astrophysics Data System (ADS)
Xu, Y. P.; Huang, Y. H.; Liu, Z. J.; Wang, Y. F.; Li, Z. Y.; Guo, L.
2017-11-01
A novel energy management scheme for wind farms is proposed in this paper. Firstly, a comprehensive evaluation system is proposed to quantify the economic properties of each wind farm, making the energy management more economical and reasonable. Then, a combination of multi-time-scale scheduling methods is used to develop the energy management scheme. The day-ahead schedule optimizes the unit commitment of thermal power generators. The intraday schedule optimizes the power generation plans of all thermal power generating units, hydroelectric generating sets, and wind power plants. Finally, the power generation plan can be revised in a timely manner during on-line scheduling. The paper concludes with simulations conducted on a real provincial integrated energy system in northeast China. Simulation results validate the proposed model and the corresponding solution algorithms.
Ravesteijn, Wim; Liu, Yi; Yan, Ping
2015-01-01
The paper outlines and specifies 'responsible port innovation', introducing the development of a methodological and procedural step-by-step plan for the implementation and evaluation of (responsible) innovations. Subsequently, it uses this as a guideline for the analysis and evaluation of two case-studies. The construction of the Rotterdam Maasvlakte 2 Port meets most of the formulated requirements, though making values more explicit and treating it as a process right from the start could have benefitted the project. The Dalian Dayao Port could improve its decision-making procedures in several respects, including the introduction of new methods to handle value tensions. Both projects show that public support is crucial in responsible port innovation and that it should be not only a multi-faceted but also a multi-level strategy.
ERIC Educational Resources Information Center
Babu, Rakesh; Singh, Rahul
2013-01-01
This paper presents a novel task-oriented, user-centered, multi-method evaluation (TUME) technique and shows how it is useful in providing a more complete, practical and solution-oriented assessment of the accessibility and usability of Learning Management Systems (LMS) for blind and visually impaired (BVI) students. Novel components of TUME…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josimović, Boško, E-mail: bosko@iaus.ac.rs; Marić, Igor; Milijić, Saša
2015-02-15
Highlights: • The paper deals with the specific method of multi-criteria evaluation applied in drafting the SEA for the Belgrade WMP. • MCE of the planning solutions, assessed according to 37 objectives of the SEA and four sets of criteria, was presented in matrix form. • The results are presented in the form of graphs so as to be easily comprehensible to all the participants in the decision-making process. • The results represent a concrete contribution proven in practice. - Abstract: Strategic Environmental Assessment (SEA) is one of the key instruments for implementing sustainable development strategies in planning in general; in addition to being used in sectoral planning, it can also be used in other areas such as waste management planning. SEA in waste management planning has become a tool for considering the benefits and consequences of the proposed changes in space, also taking into account the capacity of space to sustain the implementation of the planned activities. In order to envisage both the positive and negative implications of a waste management plan for the elements of sustainable development, an adequate methodological approach to evaluating the potential impacts must be adopted and the evaluation results presented in a simple and clear way, so as to allow planners to make relevant decisions as a precondition for the sustainability of the activities planned in the waste management sector. This paper examines the multi-criteria evaluation method for carrying out an SEA for the Waste Management Plan for the city of Belgrade (BWMP). The method was applied to the evaluation of the impacts of the activities planned in the waste management sector on the basis of the environmental and socioeconomic indicators of sustainability, taking into consideration the intensity, spatial extent, probability and frequency of impact, by means of a specific planning approach and a simple and clear presentation of the obtained results.
2012-01-01
Nanochannel arrays were fabricated by the self-organized multi-electrolyte-step anodic aluminum oxide [AAO] method in this study. The anodization conditions used in the multi-electrolyte-step AAO method initially included a phosphoric acid solution as the electrolyte and a high applied voltage; the phosphoric acid was then replaced by an oxalic acid solution as the electrolyte, with a low applied voltage. This method produced self-organized nanochannel arrays with good regularity and circularity, with less power loss and shorter processing time than the multi-step AAO method. PMID:22333268
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D-numbers-based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and an existing method to show the effectiveness of the proposed model.
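For contrast, the traditional RPN approach criticized above is simply the product of crisp severity (S), occurrence (O), and detection (D) ratings, each conventionally on a 1-10 scale. A minimal sketch with illustrative failure modes (the D-numbers fusion itself is beyond an abstract-level example):

```python
# Traditional FMEA risk priority number: RPN = S * O * D, each rated 1-10.
# (The paper replaces these crisp ratings with fuzzy, possibly non-exclusive
# evaluations fused via D numbers; this sketch shows only the baseline method.)
failure_modes = {
    "seal leak":      {"S": 7, "O": 4, "D": 3},   # illustrative ratings
    "sensor drift":   {"S": 5, "O": 6, "D": 5},
    "motor overheat": {"S": 9, "O": 2, "D": 4},
}

rpn = {name: r["S"] * r["O"] * r["D"] for name, r in failure_modes.items()}
for name in sorted(rpn, key=rpn.get, reverse=True):
    print(f"{name}: RPN = {rpn[name]}")  # higher RPN = higher risk priority
```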
Deng, Xinyang
2017-01-01
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D-numbers-based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and an existing method to show the effectiveness of the proposed model. PMID:28895905
2016 International Land Model Benchmarking (ILAMB) Workshop Report
NASA Technical Reports Server (NTRS)
Hoffman, Forrest M.; Koven, Charles D.; Keppel-Aleks, Gretchen; Lawrence, David M.; Riley, William J.; Randerson, James T.; Ahlstrom, Anders; Abramowitz, Gabriel; Baldocchi, Dennis D.; Best, Martin J.;
2016-01-01
As earth system models (ESMs) become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of terrestrial biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in these models are essential for reducing the acknowledged substantial uncertainties in 21st century climate change projections.
2016 International Land Model Benchmarking (ILAMB) Workshop Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, Forrest M.; Koven, Charles D.; Keppel-Aleks, Gretchen
As Earth system models become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in these models are essential for reducing uncertainties associated with projections of climate change during the remainder of the 21st century.
NASA Astrophysics Data System (ADS)
Shahzad, Muhammad Wakil; Ng, Kim Choon; Thu, Kyaw
2016-06-01
Power and desalination cogeneration plants are common in many water-scarce countries. Designers and planners of cogeneration face a tough choice between options: is it better to operate a power plant (PP) with reverse osmosis (PP+RO) or with thermally-driven multi-effect distillation / multi-stage flash (PP+MED/MSF) methods? From the literature, the RO methods are known to be energy efficient, whilst MED/MSF are known to have excellent thermodynamic synergies, as only low-pressure, low-temperature steam is used. Notwithstanding the challenges of the severe feed seawater of the Gulf, such as frequent harmful algae blooms (HABs) and high silt content, this presentation gives a quantitative analysis using exergy and energetic approaches to evaluate the performance of a real cogeneration plant recently proposed in the eastern part of Saudi Arabia. We demonstrate that the process choice of PP+RO versus PP+MED depends on the inherent efficiencies of the individual process methods, which are closely related to innovative process design. In this connection, a method of primary fuel cost apportionment for a cogeneration plant with MED desalination is presented. We show that an exergy approach, which captures the quality of the expanding steam, is a better method than the conventional work-output (energetic) approach; the energetic method seems to over-penalize a thermally-driven MED by as much as 22% in the operating cost of water.
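The exergy argument turns on the specific flow exergy of the bled steam, psi = (h - h0) - T0*(s - s0), which is far smaller than its enthalpy content. A minimal numeric sketch with illustrative steam-table values (assumed states, not the plant's data):

```python
# Specific flow exergy of steam: psi = (h - h0) - T0 * (s - s0),
# i.e. the maximum work extractable relative to the dead state (T0, p0).
# Property values below are illustrative steam-table numbers, not plant data.
T0 = 298.15               # dead-state temperature, K
h0, s0 = 104.9, 0.3672    # saturated liquid water near 25 C: kJ/kg, kJ/(kg K)

# Low-pressure steam of the kind bled to an MED plant (assumed state).
h, s = 2676.1, 7.3549     # roughly saturated vapor at 100 C: kJ/kg, kJ/(kg K)

psi = (h - h0) - T0 * (s - s0)
print(f"specific exergy ~ {psi:.0f} kJ/kg vs enthalpy drop {h - h0:.0f} kJ/kg")
```

The gap between the two printed numbers is the sense in which an enthalpy-based (energetic) accounting over-charges a thermally-driven MED for its steam.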
Imaging quality analysis of multi-channel scanning radiometer
NASA Astrophysics Data System (ADS)
Fan, Hong; Xu, Wujun; Wang, Chengliang
2008-03-01
The multi-channel scanning radiometer on board the FY-2 geostationary meteorological satellite plays a key role in remote sensing because of its wide field of view and continuous acquisition of multi-spectral images. It is significant to evaluate image quality after the performance parameters of the imaging system are validated. Several methods of evaluating imaging quality are discussed; of these, the most fundamental is the MTF. The MTF of a photoelectric scanning remote-sensing instrument in the scanning direction is the product of the optics transfer function (OTF), the detector transfer function (DTF), and the electronics transfer function (ETF). For image motion compensation, the moving speed of the scanning mirror should be considered. The optical MTF measurement is performed in both the EAST/WEST and NORTH/SOUTH directions; the values are used for alignment purposes and to determine the general health of the instrument during integration and testing. Imaging systems cannot perfectly reproduce what they see and end up "blurring" the image. Many parts of the imaging system can cause blurring, among them the optical elements, the sampling of the detector itself, post-processing, or the earth's atmosphere for systems that image through it. Through theoretical calculation and actual measurement, it is shown that the DTF and ETF are the main factors in the system MTF and that the imaging quality satisfies the requirements of the instrument design.
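A minimal sketch of the MTF cascade just described, with illustrative Gaussian, sinc, and first-order roll-off models standing in for the radiometer's measured OTF, DTF, and ETF (all shapes and units here are assumptions):

```python
import numpy as np

f = np.linspace(0.0, 50.0, 256)           # spatial frequency (illustrative units)

mtf_optics = np.exp(-(f / 30.0) ** 2)                    # illustrative Gaussian OTF
mtf_detector = np.abs(np.sinc(f / 60.0))                 # detector sampling aperture (DTF)
mtf_electronics = 1.0 / np.sqrt(1.0 + (f / 40.0) ** 2)   # first-order electronics roll-off (ETF)

mtf_system = mtf_optics * mtf_detector * mtf_electronics  # cascade in the scan direction

# Frequency at which the system response first falls below a 0.1 contrast floor:
below = mtf_system < 0.1
print(f[np.argmax(below)] if below.any() else None)
```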
A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance
NASA Astrophysics Data System (ADS)
Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying
2013-07-01
The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and XML is applied to improve the clarity of the index definitions. The accuracy and efficiency of the program were evaluated by comparing its results with manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed; relative to the index criteria, the differences between the two methods are minimal. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.
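The dosimetry parameters such a program computes are typically dose-volume indices. A minimal sketch of two common ones (Vx, the fractional volume receiving at least x Gy, and Dx, the minimum dose to the hottest x% of the volume), assuming the structure's per-voxel doses have already been extracted; the MIMVista/XML plumbing is not shown:

```python
import numpy as np

def v_x(dose, threshold_gy):
    """Fraction of structure volume receiving at least threshold_gy (e.g. V20)."""
    return float(np.mean(np.asarray(dose, dtype=float) >= threshold_gy))

def d_x(dose, percent_volume):
    """Minimum dose received by the hottest percent_volume% of the structure (e.g. D95)."""
    return float(np.percentile(np.asarray(dose, dtype=float), 100.0 - percent_volume))

# Synthetic per-voxel doses (Gy) standing in for a structure extracted from a plan.
doses = np.random.default_rng(0).normal(60.0, 5.0, 10_000)
print(f"V20 = {v_x(doses, 20.0):.3f}, D95 = {d_x(doses, 95.0):.1f} Gy")
```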
Wang, Qianfeng; Wu, Jianjun; Li, Xiaohan; Zhou, Hongkui; Yang, Jianhua; Geng, Guangpo; An, Xueli; Liu, Leizhen; Tang, Zhenghong
2017-04-01
The quantitative evaluation of the impact of drought on crop yield is one of the most important aspects of agricultural water resource management. To assess the impact of drought on wheat yield, the Environmental Policy Integrated Climate (EPIC) crop growth model and the daily Standardized Precipitation Evapotranspiration Index (SPEI), which is based on daily meteorological data, are adopted in the Huang Huai Hai Plain. Winter wheat yields were estimated at 28 stations from 1981 to 2010, after calibrating the cultivar coefficients with experimental site data, and the SPEI was computed at 11 time scales across the growth season. The relationship between estimated yield and multi-scale SPEI was analyzed, and the optimum time-scale SPEI for monitoring drought during the crop growth period was determined. The reference yield was determined by averaging the yields of numerous non-drought years. From this we propose a comprehensive quantitative method for predicting the impact of drought on wheat yield that combines the daily multi-scale SPEI and a crop growth process model, and we tested it in the Huang Huai Hai Plain. The results suggested that the calibrated EPIC model was a good predictor of crop yield in the Huang Huai Hai Plain, with a low RMSE (15.4%) between estimated and observed yields at six agrometeorological stations. The soil moisture at planting time was affected by precipitation and evapotranspiration during the previous 90 days (about 3 months) in the Huang Huai Hai Plain. SPEI G90 was adopted as the optimum time-scale SPEI to identify drought and non-drought years, and it identified 2000 as a drought year. The water deficit in 2000 was significant, and the rate of crop yield reduction did not completely correspond to the volume of the water deficit. Our proposed comprehensive method for quantitatively evaluating the impact of drought on crop yield is reliable. The results of this study further our understanding of why the adoption of countermeasures against drought is important and direct farmers to choose drought-resistant crops.
Clarifying values: an updated review
2013-01-01
Background Consensus guidelines have recommended that decision aids include a process for helping patients clarify their values. We sought to examine the theoretical and empirical evidence related to the use of values clarification methods in patient decision aids. Methods Building on the International Patient Decision Aid Standards (IPDAS) Collaboration’s 2005 review of values clarification methods in decision aids, we convened a multi-disciplinary expert group to examine key definitions, decision-making process theories, and empirical evidence about the effects of values clarification methods in decision aids. To summarize the current state of theory and evidence about the role of values clarification methods in decision aids, we undertook a process of evidence review and summary. Results Values clarification methods (VCMs) are best defined as methods to help patients think about the desirability of options or attributes of options within a specific decision context, in order to identify which option he/she prefers. Several decision making process theories were identified that can inform the design of values clarification methods, but no single “best” practice for how such methods should be constructed was determined. Our evidence review found that existing VCMs were used for a variety of different decisions, rarely referenced underlying theory for their design, but generally were well described in regard to their development process. Listing the pros and cons of a decision was the most common method used. The 13 trials that compared decision support with or without VCMs reached mixed results: some found that VCMs improved some decision-making processes, while others found no effect. Conclusions Values clarification methods may improve decision-making processes and potentially more distal outcomes. However, the small number of evaluations of VCMs and, where evaluations exist, the heterogeneity in outcome measures makes it difficult to determine their overall effectiveness or the specific characteristics that increase effectiveness. PMID:24625261
Multi-Role Project (MRP): A New Project-Based Learning Method for STEM
ERIC Educational Resources Information Center
Warin, Bruno; Talbi, Omar; Kolski, Christophe; Hoogstoel, Frédéric
2016-01-01
This paper presents the "Multi-Role Project" method (MRP), a broadly applicable project-based learning method, and describes its implementation and evaluation in the context of a Science, Technology, Engineering, and Mathematics (STEM) course. The MRP method is designed around a meta-principle that considers the project learning activity…
Manda, Prashanti; McCarthy, Fiona; Bridges, Susan M
2013-10-01
The Gene Ontology (GO), a set of three sub-ontologies, is one of the most popular bio-ontologies used for describing gene product characteristics. GO annotation data containing terms from multiple sub-ontologies and at different levels in the ontologies is an important source of implicit relationships between terms from the three sub-ontologies. Data mining techniques such as association rule mining that are tailored to mine from multiple ontologies at multiple levels of abstraction are required for effective knowledge discovery from GO annotation data. We present a data mining approach, Multi-ontology data mining at All Levels (MOAL) that uses the structure and relationships of the GO to mine multi-ontology multi-level association rules. We introduce two interestingness measures: Multi-ontology Support (MOSupport) and Multi-ontology Confidence (MOConfidence) customized to evaluate multi-ontology multi-level association rules. We also describe a variety of post-processing strategies for pruning uninteresting rules. We use publicly available GO annotation data to demonstrate our methods with respect to two applications (1) the discovery of co-annotation suggestions and (2) the discovery of new cross-ontology relationships. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Orthorectification by Using Gpgpu Method
NASA Astrophysics Data System (ADS)
Sahin, H.; Kulur, S.
2012-07-01
Thanks to the nature of graphics processing, newly released products offer highly parallel processing units with high memory bandwidth and computational power of more than a teraflop. Modern GPUs are not only powerful graphics engines but also highly parallel programmable processors with very fast computing capabilities and much higher memory bandwidth than central processing units (CPUs). Data-parallel computation can be briefly described as mapping data elements to parallel processing threads. The rapid development of GPU programmability and capability has attracted the attention of researchers dealing with complex problems that require heavy computation, an interest captured by the concepts of "General Purpose Computation on Graphics Processing Units (GPGPU)" and "stream processing". Graphics processors are powerful yet cheap and affordable hardware, and have therefore become an alternative to conventional processors. Graphics chips that were once fixed-function hardware have been transformed into modern, powerful, and programmable processors. The biggest problem is that graphics processing units use programming models unlike current programming methods: efficient GPU programming requires re-coding the existing algorithm to account for the limitations and structure of the graphics hardware, and these many-core devices cannot be programmed with traditional, event-procedure-based programming methods. GPUs are especially effective at repeating computing steps over many data elements when high accuracy is needed, providing the computation quickly and accurately; by comparison, CPUs, which perform one computation at a time according to flow control, are slower. This study covers how the general-purpose parallel programming and computational power of GPUs can be used in photogrammetric applications, especially direct georeferencing. The direct georeferencing algorithm was coded using the GPGPU method and the CUDA (Compute Unified Device Architecture) programming language, and the results were compared with a traditional CPU implementation. In a second application, projective rectification was coded using the GPGPU method and CUDA, and sample images of various sizes were processed and the results evaluated. The GPGPU method is particularly useful for repeating the same computations on highly dense data, finding the solution quickly.
Age and gender estimation using Region-SIFT and multi-layered SVM
NASA Astrophysics Data System (ADS)
Kim, Hyunduk; Lee, Sang-Heon; Sohn, Myoung-Kyu; Hwang, Byunghun
2018-04-01
In this paper, we propose an age and gender estimation framework using the region-SIFT feature and a multi-layered SVM classifier. The suggested framework entails three processes. The first step is landmark-based face alignment. The second step is feature extraction, in which we introduce the region-SIFT feature extraction method based on facial landmarks: we first define sub-regions of the face and then extract SIFT features from each sub-region. To reduce the dimensionality of the features we employ Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). Finally, we classify age and gender using multi-layered Support Vector Machines (SVMs) for efficient classification. Rather than performing gender estimation and age estimation independently, the multi-layered SVM can improve the classification rate by constructing a classifier that estimates age according to gender. Moreover, we collect a dataset of face images, called DGIST_C, from the internet. A performance evaluation of the proposed method was performed with the FERET, CACD, and DGIST_C databases. The experimental results demonstrate that the proposed approach estimates age and gender very efficiently and accurately.
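A minimal sketch of the layered classification idea (predict gender first, then dispatch to a gender-specific age classifier) using scikit-learn SVCs. The region-SIFT/PCA/LDA pipeline is assumed to have already produced the feature matrix X; the class name, data, and layer configuration are all illustrative, not the paper's:

```python
import numpy as np
from sklearn.svm import SVC

class TwoLayerAgeGenderSVM:
    """Layer 1 predicts gender; layer 2 holds one age-group classifier per gender."""

    def fit(self, X, gender, age_group):
        self.gender_clf = SVC(kernel="rbf").fit(X, gender)
        self.age_clf = {
            g: SVC(kernel="rbf").fit(X[gender == g], age_group[gender == g])
            for g in np.unique(gender)
        }
        return self

    def predict(self, X):
        g_pred = self.gender_clf.predict(X)          # layer 1: gender
        a_pred = np.array(                           # layer 2: age, given gender
            [self.age_clf[g].predict(x[None, :])[0] for g, x in zip(g_pred, X)]
        )
        return g_pred, a_pred

# Synthetic stand-in for PCA/LDA-reduced region-SIFT features.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 16))
gender = rng.integers(0, 2, 200)
age_group = rng.integers(0, 4, 200)
model = TwoLayerAgeGenderSVM().fit(X, gender, age_group)
print(model.predict(X[:3]))
```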
Xu, Gongxian; Liu, Ying; Gao, Qunwang
2016-02-10
This paper deals with multi-objective optimization of the continuous bio-dissimilation process of glycerol to 1,3-propanediol. In order to maximize the production rate of 1,3-propanediol, maximize the conversion rate of glycerol to 1,3-propanediol, maximize the conversion rate of glycerol, and minimize the concentration of the by-product ethanol, we first propose six new multi-objective optimization models that can simultaneously optimize any two of the four objectives above. These multi-objective optimization problems are then solved using the weighted-sum and normal-boundary intersection methods, respectively. Both the Pareto filter algorithm and removal criteria are used to remove the non-Pareto-optimal points obtained by the normal-boundary intersection method. The results show that the normal-boundary intersection method can successfully obtain the approximate Pareto optimal sets of all the proposed multi-objective optimization problems, while the weighted-sum approach cannot achieve the overall Pareto optimal solutions of some multi-objective problems. Copyright © 2015 Elsevier B.V. All rights reserved.
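The weighted-sum limitation reported above is the classic failure on non-convex Pareto fronts: some non-dominated points are never the optimum for any weight vector. A minimal sketch with illustrative objective values (not the bioprocess model):

```python
import numpy as np

# Candidate operating points scored on two objectives to MAXIMIZE
# (production rate, conversion rate) -- illustrative values only.
candidates = np.array([
    [0.90, 0.20],
    [0.65, 0.50],   # Pareto-optimal, but lies in a non-convex dent of the front
    [0.60, 0.60],
    [0.20, 0.90],
])

selected = set()
for w in np.linspace(0.0, 1.0, 101):
    scores = candidates @ np.array([w, 1.0 - w])  # weighted-sum scalarization
    selected.add(int(np.argmax(scores)))

print(sorted(selected))  # [0, 2, 3] -- candidate 1 never wins for any weight
```

Methods such as normal-boundary intersection avoid this by searching along the front directly rather than through a single scalarized objective.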
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valued information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven industrial process parameter model from mechanical vibration and acoustic frequency spectra, based on the selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert's cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model for each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models, and the outside-layer SEN is then constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing selective information fusion based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
Research a Novel Integrated and Dynamic Multi-object Trade-Off Mechanism in Software Project
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yuhui
Aiming at the practical requirements of present-day software project management and control, this paper constructs an integrated multi-object trade-off model based on software project process management, so as to realize integrated and dynamic trade-off of the multi-object system of a project. Based on an analysis of the basic principles of dynamic control and of the integrated multi-object trade-off system process, the paper integrates methods from cybernetics and network technology. By monitoring critical reference points according to the control objects, it discusses the integrated and dynamic multi-object trade-off model and the corresponding rules and mechanism, in order to realize the integration of process management and the trade-off of the multi-object system.
Scandurra, I; Hägglund, M; Koch, S
2008-08-01
This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.
Evaluation of angiogram visualization methods for fast and reliable aneurysm diagnosis
NASA Astrophysics Data System (ADS)
Lesar, Žiga; Bohak, Ciril; Marolt, Matija
2015-03-01
In this paper we present the results of an evaluation of different visualization methods for angiogram volumetric data: ray casting, marching cubes, and multi-level partition of unity implicits. Several options are available with ray casting: isosurface extraction, maximum intensity projection, and alpha compositing, each producing fundamentally different results. Different visualization methods are suitable for different needs, so this choice is crucial in diagnosis and decision-making processes. We also evaluate visual effects such as ambient occlusion, screen-space ambient occlusion, and depth of field. Some visualization methods include transparency, so we address the question of the relevance of this additional visual information. We employ transfer functions to map data values to color and transparency, allowing us to view or hide particular tissues. All the methods presented in this paper were developed using OpenCL, striving for real-time rendering and quality interaction. An evaluation was conducted to assess the suitability of the visualization methods. Results show the superiority of isosurface extraction with ambient occlusion effects. Visual effects may positively or negatively affect the perception of depth, motion, and relative positions in space.
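The ray-casting options compared above differ only in how samples along a ray are accumulated. A minimal single-ray sketch, using the density itself as a trivial transfer function (a real renderer would map density to color and opacity):

```python
import numpy as np

samples = np.array([0.05, 0.10, 0.80, 0.30, 0.90, 0.20])  # densities along one ray, front to back

# Maximum intensity projection: keep the brightest sample on the ray.
mip = samples.max()

# Front-to-back alpha compositing: accumulate contributions until nearly opaque.
accumulated, transmittance = 0.0, 1.0
for s in samples:
    alpha = s                                  # opacity from the trivial transfer function
    accumulated += transmittance * alpha * s   # alpha-weighted contribution
    transmittance *= (1.0 - alpha)             # remaining transparency
    if transmittance < 1e-3:                   # early ray termination
        break

# Isosurface extraction along the ray: first sample crossing the iso-value.
iso_index = next((i for i, s in enumerate(samples) if s >= 0.5), None)

print(mip, round(accumulated, 3), iso_index)
```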
Multi-site field studies were conducted to evaluate the performance of sampling methods for measuring the coarse fraction of PM10 (PM10 2.5) in ambient air. The field studies involved the use of both time-integrated filter-based and direct continuous methods. Despite operationa...
Keyphrase based Evaluation of Automatic Text Summarization
NASA Astrophysics Data System (ADS)
Elghannam, Fatma; El-Shishtawy, Tarek
2015-05-01
The development of methods to deal with the informative content of text units in the matching process is a major challenge for automatic summary evaluation systems that use fixed n-gram matching. This limitation causes inaccurate matching between units in peer and reference summaries. The present study introduces a new keyphrase-based summary evaluator, KpEval, for evaluating automatic summaries. KpEval relies on keyphrases since they convey the most important concepts of a text. In the evaluation process, the keyphrases are used in their lemma form as the matching text unit. The system was applied to evaluate different summaries of the Arabic multi-document data set presented at TAC2011. The results showed that the new evaluation technique correlates well with the known evaluation systems Rouge1, Rouge2, RougeSU4, and AutoSummENG MeMoG. KpEval has the strongest correlation with AutoSummENG MeMoG, with Pearson and Spearman correlation coefficients of 0.8840 and 0.9667, respectively.
Zeeman, Jacqueline M; McLaughlin, Jacqueline E; Cox, Wendy C
2017-11-01
With increased emphasis placed on non-academic skills in the workplace, a need exists to identify an admissions process that evaluates these skills. This study assessed the validity and reliability of an application review process involving three dedicated application reviewers in a multi-stage admissions model. A multi-stage admissions model was utilized during the 2014-2015 admissions cycle. After advancing through the academic review, each application was independently reviewed by two dedicated application reviewers utilizing a six-construct rubric (written communication, extracurricular and community service activities, leadership experience, pharmacy career appreciation, research experience, and resiliency). Rubric scores were extrapolated to a three-tier ranking to select candidates for on-site interviews. Kappa statistics were used to assess interrater reliability. A three-facet Many-Facet Rasch Model (MFRM) determined reviewer severity, candidate suitability, and rubric construct difficulty. The kappa statistic for candidates' tier rank score (n = 388 candidates) was 0.692 with a perfect agreement frequency of 84.3%. There was substantial interrater reliability between reviewers for the tier ranking (kappa: 0.654-0.710). Highest construct agreement occurred in written communication (kappa: 0.924-0.984). A three-facet MFRM analysis explained 36.9% of variance in the ratings, with 0.06% reflecting application reviewer scoring patterns (i.e., severity or leniency), 22.8% reflecting candidate suitability, and 14.1% reflecting construct difficulty. Utilization of dedicated application reviewers and a defined tiered rubric provided a valid and reliable method to effectively evaluate candidates during the application review process. These analyses provide insight into opportunities for improving the application review process among schools and colleges of pharmacy. Copyright © 2017 Elsevier Inc. All rights reserved.
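The interrater reliability statistic used here, Cohen's kappa, corrects raw agreement for the agreement expected by chance. A minimal sketch with illustrative tier rankings from two reviewers (not the study's data):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    chance = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
    return (observed - chance) / (1.0 - chance)

# Illustrative three-tier rankings from two dedicated application reviewers.
rev1 = ["high", "high", "mid", "low", "mid", "high", "low", "mid", "high", "low"]
rev2 = ["high", "mid",  "mid", "low", "mid", "high", "low", "low", "high", "low"]
print(round(cohens_kappa(rev1, rev2), 3))  # 0.701: substantial agreement
```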
Segrott, Jeremy; Murphy, Simon; Rothwell, Heather; Scourfield, Jonathan; Foxcroft, David; Gillespie, David; Holliday, Jo; Hood, Kerenza; Hurlow, Claire; Morgan-Trimmer, Sarah; Phillips, Ceri; Reed, Hayley; Roberts, Zoe; Moore, Laurence
2017-12-01
Process evaluations generate important data on the extent to which interventions are delivered as intended. However, the tendency to focus only on assessment of pre-specified structural aspects of fidelity has been criticised for paying insufficient attention to implementation processes and how intervention-context interactions influence programme delivery. This paper reports findings from a process evaluation nested within a randomised controlled trial of the Strengthening Families Programme 10-14 (SFP 10-14) in Wales, UK. It uses Extended Normalisation Process Theory to theorise how interaction between SFP 10-14 and local delivery systems - particularly practitioner commitment/capability and organisational capacity - influenced delivery of intended programme activities: fidelity (adherence to SFP 10-14 content and implementation requirements); dose delivered; dose received (participant engagement); participant recruitment and reach (intervention attendance). A mixed methods design was utilised. Fidelity assessment sheets (completed by practitioners), structured observation by researchers, and routine data were used to assess: adherence to programme content; staffing numbers and consistency; recruitment/retention; and group size and composition. Interviews with practitioners explored implementation processes and context. Adherence to programme content was high - with some variation, linked to practitioner commitment to, and understanding of, the intervention's content and mechanisms. Variation in adherence rates was associated with the extent to which multi-agency delivery team planning meetings were held. Recruitment challenges meant that targets for group size/composition were not always met, but did not affect adherence levels or family engagement. Targets for staffing numbers and consistency were achieved, though capacity within multi-agency networks reduced over time. Extended Normalisation Process Theory provided a useful framework for assessing implementation and explaining variation by examining intervention-context interactions. Findings highlight the need for process evaluations to consider both the structural and process components of implementation to explain whether programme activities are delivered as intended and why.
Discriminative dictionary learning for abdominal multi-organ segmentation.
Tong, Tong; Wolz, Robin; Wang, Zehan; Gao, Qinquan; Misawa, Kazunari; Fujiwara, Michitaka; Mori, Kensaku; Hajnal, Joseph V; Rueckert, Daniel
2015-07-01
An automated segmentation method is presented for multi-organ segmentation in abdominal CT images. Dictionary learning and sparse coding techniques are used in the proposed method to generate target specific priors for segmentation. The method simultaneously learns dictionaries which have reconstructive power and classifiers which have discriminative ability from a set of selected atlases. Based on the learnt dictionaries and classifiers, probabilistic atlases are then generated to provide priors for the segmentation of unseen target images. The final segmentation is obtained by applying a post-processing step based on a graph-cuts method. In addition, this paper proposes a voxel-wise local atlas selection strategy to deal with high inter-subject variation in abdominal CT images. The segmentation performance of the proposed method with different atlas selection strategies are also compared. Our proposed method has been evaluated on a database of 150 abdominal CT images and achieves a promising segmentation performance with Dice overlap values of 94.9%, 93.6%, 71.1%, and 92.5% for liver, kidneys, pancreas, and spleen, respectively. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
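The Dice overlap reported above is a standard set-overlap measure between a predicted and a reference segmentation. A minimal sketch on toy binary masks:

```python
import numpy as np

def dice(pred, ref):
    """Dice overlap: 2|A intersect B| / (|A| + |B|) for binary masks A, B."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    denom = pred.sum() + ref.sum()
    return 2.0 * np.logical_and(pred, ref).sum() / denom if denom else 1.0

# Toy 2-D "organ" masks standing in for 3-D CT label volumes.
ref = np.zeros((8, 8), dtype=bool);  ref[2:6, 2:6] = True
pred = np.zeros((8, 8), dtype=bool); pred[3:7, 2:6] = True
print(f"Dice = {dice(pred, ref):.3f}")  # 0.750 for this partially shifted mask
```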
Wang, Peng; Fang, Weining; Guo, Beiyuan
2017-04-01
This paper proposes a colored Petri net based workload evaluation model. A formal interpretation of workload is first introduced, based on the mapping of Petri net components to tasks. A Petri net based description of Multiple Resources theory is given, viewing the theory from a new angle. A new application of the VACP rating scales, named the V/A-C-P unit, and a definition of colored transitions are proposed to build a model of the task process. The calculation of workload has the following four steps: determine the tokens' initial positions and values; calculate the weights of the directed arcs on the basis of the proposed rules; calculate the workload from the different transitions; and correct for the influence of repetitive behaviors. Verification experiments were carried out based on the Multi-Attribute Task Battery-II software. Our results show a strong correlation between the model values and NASA-Task Load Index scores (r=0.9513). In addition, the method can distinguish behavior characteristics between different people. Copyright © 2016 Elsevier Ltd. All rights reserved.
Post-processing of multi-hydrologic model simulations for improved streamflow projections
NASA Astrophysics Data System (ADS)
khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid
2016-04-01
Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as epistemic and aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate a recently developed multi-variate post-processing method for historical simulations, and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is addressed for the historical period 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16-degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that applying the post-processing technique leads to considerably higher accuracy in historical simulations and reduces model uncertainty in future streamflow projections.
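The copula-based transfer function itself is beyond an abstract-level sketch, but the simpler marginal bias-correction idea it generalizes (mapping each simulated value's quantile in the simulated climatology onto the observed one) can be illustrated as follows. Synthetic data throughout; the actual method conditions on the joint obs-sim distribution rather than the margins alone:

```python
import numpy as np

rng = np.random.default_rng(42)
obs = rng.gamma(shape=2.0, scale=50.0, size=5000)   # "observed" historical streamflow
sim = rng.gamma(shape=2.0, scale=65.0, size=5000)   # biased model simulation of the same period

def quantile_map(x, sim_ref, obs_ref):
    """Map each value's quantile in the simulated climatology onto the observed one."""
    q = np.searchsorted(np.sort(sim_ref), x) / len(sim_ref)   # empirical CDF of sim
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))         # inverse CDF of obs

future_sim = rng.gamma(shape=2.0, scale=65.0, size=10)        # raw projections
print(quantile_map(future_sim, sim, obs))                     # bias-corrected projections
```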
NASA Technical Reports Server (NTRS)
Brown, Christopher A.
1993-01-01
The approach of the project is to base the design of multi-function, reflective topographies on the theory that topographically dependent phenomena react with surfaces and interfaces at certain scales. The first phase of the project emphasizes the development of methods for understanding the sizes of topographic features which influence reflectivity. Subsequent phases, if necessary, will address the scales of interaction for adhesion and manufacturing processes. A simulation of the interaction of electromagnetic radiation, or light, with a reflective surface is performed using specialized software. Reflectivity of the surface as a function of scale is evaluated and the results from the simulation are compared with reflectivity measurements made on multi-function, reflective surfaces.
Junghöfer, Markus; Rehbein, Maimu Alissa; Maitzen, Julius; Schindler, Sebastian
2017-01-01
Humans have a remarkable capacity for rapid affective learning. For instance, using first-order US such as odors or electric shocks, magnetoencephalography (MEG) studies of multi-CS conditioning demonstrate enhanced early (<150 ms) and mid-latency (150–300 ms) visual evoked responses to affectively conditioned faces, together with changes in stimulus evaluation. However, particularly in social contexts, human affective learning is often mediated by language, a class of complex higher-order US. To elucidate mechanisms of this type of learning, we investigate how face processing changes following verbal evaluative multi-CS conditioning. Sixty neutral expression male faces were paired with phrases about aversive crimes (30) or neutral occupations (30). Post conditioning, aversively associated faces evoked stronger magnetic fields in a mid-latency interval between 220 and 320 ms, localized primarily in left visual cortex. Aversively paired faces were also rated as more arousing and more unpleasant, evaluative changes occurring both with and without contingency awareness. However, no early MEG effects were found, implying that verbal evaluative conditioning may require conceptual processing and does not engage rapid, possibly sub-cortical, pathways. Results demonstrate the efficacy of verbal evaluative multi-CS conditioning and indicate both common and distinct neural mechanisms of first- and higher-order multi-CS conditioning, thereby informing theories of associative learning. PMID:28008078
Junghöfer, Markus; Rehbein, Maimu Alissa; Maitzen, Julius; Schindler, Sebastian; Kissler, Johanna
2017-04-01
Humans have a remarkable capacity for rapid affective learning. For instance, using first-order US such as odors or electric shocks, magnetoencephalography (MEG) studies of multi-CS conditioning demonstrate enhanced early (<150 ms) and mid-latency (150-300 ms) visual evoked responses to affectively conditioned faces, together with changes in stimulus evaluation. However, particularly in social contexts, human affective learning is often mediated by language, a class of complex higher-order US. To elucidate mechanisms of this type of learning, we investigate how face processing changes following verbal evaluative multi-CS conditioning. Sixty neutral expression male faces were paired with phrases about aversive crimes (30) or neutral occupations (30). Post conditioning, aversively associated faces evoked stronger magnetic fields in a mid-latency interval between 220 and 320 ms, localized primarily in left visual cortex. Aversively paired faces were also rated as more arousing and more unpleasant, evaluative changes occurring both with and without contingency awareness. However, no early MEG effects were found, implying that verbal evaluative conditioning may require conceptual processing and does not engage rapid, possibly sub-cortical, pathways. Results demonstrate the efficacy of verbal evaluative multi-CS conditioning and indicate both common and distinct neural mechanisms of first- and higher-order multi-CS conditioning, thereby informing theories of associative learning. © The Author (2016). Published by Oxford University Press.
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and validate its feasibility by the simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the external standard method and QAMS. No significant differences were found in the quantitative results of the five alkaloids in the 21 batches of S. flavescens determined by the external standard method and QAMS. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
NASA Technical Reports Server (NTRS)
Cleary, T.; Grosshandler, W.
1999-01-01
As part of a National Aeronautics and Space Administration (NASA) initiated program on global civil aviation, NIST is assisting the Federal Aviation Administration in its research to improve fire detection in aircraft cargo compartments. Aircraft cargo compartment detection certification methods have been reviewed. The Fire Emulator-Detector Evaluator (FE/DE) has been designed to evaluate fire detection technologies such as new sensors, multi-element detectors, and detectors that employ complex algorithms. The FE/DE is a flow tunnel that can reproduce the velocity, temperature, smoke, and combustion gas levels to which a detector might be exposed during a fire. A scientific literature survey and patent search have been conducted relating to existing and emerging fire detection technologies, and the potential use of new fire detection strategies in cargo compartment areas has been assessed. In the near term, improved detector signal processing and multi-sensor detectors based on combinations of smoke measurements, combustion gases, and temperature are envisioned as significantly improving detector system performance.
Upper Mantle Shear Wave Structure Beneath North America From Multi-mode Surface Wave Tomography
NASA Astrophysics Data System (ADS)
Yoshizawa, K.; Ekström, G.
2008-12-01
The upper mantle structure beneath the North American continent has been investigated from measurements of multi-mode phase speeds of Love and Rayleigh waves. To estimate fundamental-mode and higher-mode phase speeds of surface waves from a single seismogram at regional distances, we have employed a method of nonlinear waveform fitting based on a direct model-parameter search using the neighbourhood algorithm (Yoshizawa & Kennett, 2002). The method of the waveform analysis has been fully automated by employing empirical quantitative measures for evaluating the accuracy/reliability of estimated multi-mode phase dispersion curves, and thus it is helpful in processing the dramatically increasing numbers of seismic data from the latest regional networks such as USArray. As a first step toward modeling the regional anisotropic shear-wave velocity structure of the North American upper mantle with extended vertical resolution, we have applied the method to long-period three-component records of seismic stations in North America, which mostly comprise the GSN and US regional networks as well as the permanent and transportable USArray stations distributed by the IRIS DMC. Preliminary multi-mode phase-speed models show large-scale patterns of isotropic heterogeneity, such as a strong velocity contrast between the western and central/eastern United States, which are consistent with the recent global and regional models (e.g., Marone, et al. 2007; Nettles & Dziewonski, 2008). We will also discuss radial anisotropy of shear wave speed beneath North America from multi-mode dispersion measurements of Love and Rayleigh waves.
Multi-Modality Cascaded Convolutional Neural Networks for Alzheimer's Disease Diagnosis.
Liu, Manhua; Cheng, Danni; Wang, Kundong; Wang, Yaping
2018-03-23
Accurate and early diagnosis of Alzheimer's disease (AD) plays an important role in patient care and the development of future treatments. Structural and functional neuroimages, such as magnetic resonance imaging (MRI) and positron emission tomography (PET), provide powerful imaging modalities to help understand the anatomical and functional neural changes related to AD. In recent years, machine learning methods have been widely studied for the analysis of multi-modality neuroimages for quantitative evaluation and computer-aided diagnosis (CAD) of AD. Most existing methods extract hand-crafted imaging features after image preprocessing such as registration and segmentation, and then train a classifier to distinguish AD subjects from other groups. This paper proposes to construct cascaded convolutional neural networks (CNNs) to learn the multi-level and multimodal features of MRI and PET brain images for AD classification. First, multiple deep 3D-CNNs are constructed on different local image patches to transform the local brain image into more compact high-level features. Then, an upper high-level 2D-CNN followed by a softmax layer is cascaded to ensemble the high-level features learned from the multiple modalities and generate the latent multimodal correlation features of the corresponding image patches for the classification task. Finally, these learned features are combined by a fully connected layer followed by a softmax layer for AD classification. The proposed method can automatically learn generic multi-level and multimodal features from multiple imaging modalities for classification, and these features are robust to scale and rotation variations to some extent. No image segmentation or rigid registration is required in pre-processing the brain images. Our method is evaluated on the baseline MRI and PET images of 397 subjects, including 93 AD patients, 204 with mild cognitive impairment (MCI; 76 pMCI + 128 sMCI), and 100 normal controls (NC), from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Experimental results show that the proposed method achieves an accuracy of 93.26% for classification of AD vs. NC and 82.95% for classification of pMCI vs. NC, demonstrating promising classification performance.
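A minimal sketch of one per-patch 3D-CNN of the kind described, written with PyTorch. The layer configuration, patch size, and feature width are illustrative assumptions, not the paper's architecture, and the upper 2D-CNN ensemble stage is omitted:

```python
import torch
import torch.nn as nn

class PatchCNN3D(nn.Module):
    """Tiny 3-D CNN mapping a local brain-image patch to a compact feature vector,
    in the spirit of the per-patch 3D-CNNs described above (illustrative only)."""
    def __init__(self, n_features=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Flatten(),
            nn.Linear(16 * 4 * 4 * 4, n_features),  # assumes 16x16x16 input patches
        )

    def forward(self, x):
        return self.net(x)

# A batch of two random 16^3 patches standing in for MRI/PET inputs.
patches = torch.randn(2, 1, 16, 16, 16)
features = PatchCNN3D()(patches)
print(features.shape)  # torch.Size([2, 32]): per-patch features for the upper network
```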
NASA Astrophysics Data System (ADS)
Gaël, Dumont; Tanguy, Robert; Nicolas, Marck; Frédéric, Nguyen
2017-10-01
In this study, we tested the ability of geophysical methods to characterize a large technical landfill installed in a former sand quarry. The geophysical surveys specifically aimed at delimiting the deposit site's horizontal extension, estimating its thickness, and characterizing the waste material composition (the moisture content in the present case). The site delimitation was conducted with electromagnetic (in-phase and out-of-phase) and magnetic (vertical gradient and total field) methods, which clearly showed the transition between the waste deposit and the host formation. Regarding waste deposit thickness evaluation, electrical resistivity tomography (ERT) proved ineffective on this particularly thick deposit. We therefore propose a combination of horizontal-to-vertical noise spectral ratio (HVNSR) and multichannel analysis of surface waves (MASW), which successfully determined the approximate waste deposit thickness in our test landfill. ERT did, however, prove an appropriate tool for characterizing the moisture content of the waste, which is key prior information for the organic waste biodegradation process. The global multi-scale and multi-method geophysical survey offers precious information for site rehabilitation studies, water content mitigation processes for enhanced biodegradation, and landfill mining operation planning.
Ding, Jing-Yi; Zhao, Wen-Wu
2014-09-01
The 5th World Conference on Ecological Restoration was held in Madison, Wisconsin, USA on October 6-11, 2013. About 1200 delegates from more than 50 countries attended the conference and discussed the latest developments in different thematic areas of ecological restoration. Discussions on the evaluation of ecological restoration addressed three main aspects: the construction of evaluation indicator systems for ecological restoration; evaluation methods for ecological restoration; and the monitoring and dynamic evaluation of ecological restoration. The meeting stressed the importance of evaluation in the process of ecological restoration and considered the challenges involved. The conference offered the following insights for China's research on the evaluation of ecological restoration: 1) strengthen the construction of comprehensive evaluation indicator systems and focus on multi-participation in the evaluation process; 2) pay more attention to scale effects and scale transformation in the evaluation of ecological restoration; 3) expand the application of 3S technology in assessing the success of ecological restoration and promote dynamic monitoring of ecological restoration; and 4) carry out international exchanges and cooperation actively, and promote China's international influence in ecological restoration research.
Braithwaite, Jeffrey; Westbrook, Johanna; Pawsey, Marjorie; Greenfield, David; Naylor, Justine; Iedema, Rick; Runciman, Bill; Redman, Sally; Jorm, Christine; Robinson, Maureen; Nathan, Sally; Gibberd, Robert
2006-01-01
Background Accreditation has become ubiquitous across the international health care landscape. Award of full accreditation status in health care is viewed, as it is in other sectors, as a valid indicator of high quality organisational performance. However, few studies have empirically demonstrated this assertion. The value of accreditation, therefore, remains uncertain, and this persists as a central legitimacy problem for accreditation providers, policymakers and researchers. The question arises as to how best to research the validity, impact and value of accreditation processes in health care. Most health care organisations participate in some sort of accreditation process and thus it is not possible to study its merits using a randomised controlled strategy. Further, tools and processes for accreditation and organisational performance are multifaceted. Methods/design To understand the relationship between them, a multi-method research approach is required which incorporates both quantitative and qualitative data. The generic nature of accreditation standard development and inspection within different sectors enhances the extent to which the findings of an in-depth study of the accreditation process in one industry can be generalised to other industries. This paper presents a research design which comprises a prospective, multi-method, multi-level, multi-disciplinary approach to assess the validity, impact and value of accreditation. Discussion The accreditation program which assesses over 1,000 health services in Australia is used as an exemplar for testing this design. The paper proposes this design as a framework suitable for application to future international research into accreditation. Our aim is to stimulate debate on the role of accreditation and how to research it. PMID:16968552
NASA Astrophysics Data System (ADS)
Ribes, S.; Voicu, I.; Girault, J. M.; Fournier, M.; Perrotin, F.; Tranquart, F.; Kouamé, D.
2011-03-01
Electronic fetal monitoring may be required throughout pregnancy to closely monitor specific fetal and maternal disorders. Currently used methods suffer from many limitations and are not sufficient to evaluate fetal asphyxia. Fetal activity parameters such as movements, heart rate and associated parameters are essential indicators of fetal well-being, yet no current device gives a simultaneous and sufficient estimation of all these parameters. For this purpose, we built a multi-transducer, multi-gate Doppler system and developed dedicated signal processing techniques for fetal activity parameter extraction, in order to investigate fetal asphyxia or well-being through fetal activity parameters. Towards this goal, this paper shows the preliminary feasibility of separating normal and compromised fetuses using our system. A data set consisting of two groups of fetal signals (normal and compromised) was established and provided by physicians. From the estimated parameters, an instantaneous Manning-like score, referred to as the ultrasonic score, was introduced and used together with movements, heart rate and associated parameters in a classification process based on the Support Vector Machine (SVM) method. The influence of the fetal activity parameters and the performance of the SVM were evaluated by computing the sensitivity, specificity, percentage of support vectors and total classification accuracy. We showed our ability to separate the data into two sets, normal fetuses and compromised fetuses, and obtained an excellent match with the clinical classification performed by physicians.
Moore, Priscilla A; Kery, Vladimir
2009-01-01
High-throughput protein purification is a complex, multi-step process. There are several technical challenges in the course of this process that are not experienced when purifying a single protein. Among the most challenging are the high-throughput protein concentration and buffer exchange, which are not only labor-intensive but can also result in significant losses of purified proteins. We describe two methods of high-throughput protein concentration and buffer exchange: one using ammonium sulfate precipitation and one using micro-concentrating devices based on membrane ultrafiltration. We evaluated the efficiency of both methods on a set of 18 randomly selected purified proteins from Shewanella oneidensis. While both methods provide similar yield and efficiency, the ammonium sulfate precipitation is much less labor intensive and time consuming than the ultrafiltration.
Sajn, Luka; Kukar, Matjaž
2011-12-01
The paper presents results of our long-term study on using image processing and data mining methods in medical imaging. Since the evaluation of modern medical images is becoming increasingly complex, advanced analytical and decision support tools are involved in the integration of partial diagnostic results. Such partial results, frequently obtained from tests with substantial imperfections, are integrated into an ultimate diagnostic conclusion about the probability of disease for a given patient. We study various topics such as improving the predictive power of clinical tests by utilizing pre-test and post-test probabilities, texture representation, multi-resolution feature extraction, feature construction and data mining algorithms that significantly outperform medical practice. Our long-term study reveals three significant milestones. The first improvement was achieved by significantly increasing post-test diagnostic probabilities with respect to expert physicians. The second, even more significant improvement utilizes multi-resolution image parametrization. Machine learning methods in conjunction with feature subset selection on these parameters significantly improve diagnostic performance. However, further feature construction with principal component analysis on these features elevates results to an even higher accuracy level, which represents the third milestone. With the proposed approach, clinical results are significantly improved throughout the study. The most significant result of our study is the improvement in the diagnostic power of the whole diagnostic process. Our compound approach aids, but does not replace, the physician's judgment and may assist in decisions on the cost effectiveness of tests. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
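The described pipeline (feature subset selection on multi-resolution parameters, then PCA-based feature construction, then a classifier) can be sketched with scikit-learn; the feature matrix, selector, and classifier below are illustrative assumptions, not the study's actual data or models:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 120))   # 120 multi-resolution texture features (synthetic)
y = rng.integers(0, 2, size=200)  # disease present / absent

pipe = Pipeline([
    ("select", SelectKBest(f_classif, k=40)),  # feature subset selection
    ("pca", PCA(n_components=10)),             # feature construction
    ("clf", LogisticRegression(max_iter=1000)),
])
print(cross_val_score(pipe, X, y, cv=5).mean())
```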
NASA Astrophysics Data System (ADS)
Şahin, Rıdvan; Liu, Peide
2017-07-01
The simplified neutrosophic set (SNS) is an appropriate tool for expressing the incompleteness, indeterminacy and uncertainty of evaluation objects in a decision-making process. In this study, we define the concept of a possibility SNS, which includes two types of information: the neutrosophic performance provided by the evaluation objects and its possibility degree, expressed as a value ranging from zero to one. Then, by extending the existing neutrosophic information aggregation models for SNSs, which cannot effectively fuse the two different types of information described above, we propose two novel neutrosophic aggregation operators considering possibility, named the possibility-induced simplified neutrosophic weighted arithmetic averaging operator and the possibility-induced simplified neutrosophic weighted geometric averaging operator, and discuss their properties. Moreover, we develop a useful method based on the proposed aggregation operators for solving multi-criteria group decision-making problems with possibility simplified neutrosophic information, in which the weights of decision-makers and decision criteria are calculated based on an entropy measure. Finally, a practical example is utilised to show the practicality and effectiveness of the proposed method.
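For orientation, the classical simplified neutrosophic weighted arithmetic averaging (SNWAA) operator can be sketched as follows; folding the possibility degrees into the weights by renormalization is an assumption made here for illustration, not the paper's exact possibility-induced operator:

```python
import numpy as np

def snwaa(values, weights):
    """Classical SNWAA: values is a list of (t, i, f) triples, weights sum to 1."""
    t, i, f = np.asarray(values).T
    w = np.asarray(weights)
    T = 1.0 - np.prod((1.0 - t) ** w)   # truth aggregation
    I = np.prod(i ** w)                 # indeterminacy aggregation
    F = np.prod(f ** w)                 # falsity aggregation
    return T, I, F

vals = [(0.7, 0.2, 0.1), (0.5, 0.3, 0.3), (0.8, 0.1, 0.2)]
poss = np.array([0.9, 0.6, 1.0])        # possibility degrees in [0, 1]
base = np.array([0.4, 0.3, 0.3])        # criterion weights
w = base * poss / np.sum(base * poss)   # assumed possibility weighting
print(snwaa(vals, w))
```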
Multi-scale Modeling in Clinical Oncology: Opportunities and Barriers to Success.
Yankeelov, Thomas E; An, Gary; Saut, Oliver; Luebeck, E Georg; Popel, Aleksander S; Ribba, Benjamin; Vicini, Paolo; Zhou, Xiaobo; Weis, Jared A; Ye, Kaiming; Genin, Guy M
2016-09-01
Hierarchical processes spanning several orders of magnitude of both space and time underlie nearly all cancers. Multi-scale statistical, mathematical, and computational modeling methods are central to designing, implementing and assessing treatment strategies that account for these hierarchies. The basic science underlying these modeling efforts is maturing into a new discipline that is close to influencing and facilitating clinical successes. The purpose of this review is to capture the state-of-the-art as well as the key barriers to success for multi-scale modeling in clinical oncology. We begin with a summary of the long-envisioned promise of multi-scale modeling in clinical oncology, including the synthesis of disparate data types into models that reveal underlying mechanisms and allow for experimental testing of hypotheses. We then evaluate the mathematical techniques employed most widely and present several examples illustrating their application as well as the current gap between pre-clinical and clinical applications. We conclude with a discussion of what we view to be the key challenges and opportunities for multi-scale modeling in clinical oncology.
Usage of air jigging for multi-component separation of construction and demolition waste.
Ambrós, Weslei Monteiro; Sampaio, Carlos Hoffmann; Cazacliu, Bogdan Grigore; Miltzarek, Gerson Luis; Miranda, Leonardo R
2017-02-01
The use of air jigging for performing multi-component separation in the treatment of mixed construction and demolition waste was studied. Sorting tests were carried out with mixtures of equal bulk volume of concrete and brick to which fixed quantities of unwanted materials - gypsum, wood and paper - were added. Experimental results demonstrated the possibility of using air jigging to carry out both the removal of low-density contaminants and the concentration of concrete in a single process step. In relation to the removal of contaminants alone, the overall performance of the jigging process is comparable with that of commercial air classifiers and automatic sorting systems. Also, the initial content of contaminants does not seem to have a significant effect on the extent of separation. These results are of particular importance for recycling plants, as they represent an alternative for optimizing the use of air jigs. Further investigation is needed in order to evaluate the practical feasibility of the method. Copyright © 2016 Elsevier Ltd. All rights reserved.
A new web-based framework development for fuzzy multi-criteria group decision-making.
Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik
2016-01-01
The fuzzy multi-criteria group decision making (FMCGDM) process is usually used when a group of decision-makers faces imprecise data or linguistic variables in solving a problem. However, this process involves many methods that require time-consuming calculations, depending on the number of criteria, alternatives and decision-makers, in order to reach the optimal solution. In this study, a web-based FMCGDM framework that offers decision-makers a fast and reliable response service is proposed. The proposed framework includes commonly used tools for multi-criteria decision-making problems such as the fuzzy Delphi, fuzzy AHP and fuzzy TOPSIS methods. The integration of these methods makes it possible to exploit the strengths of each method while compensating for its weaknesses. Finally, a case study of location selection for landfill waste in Morocco is performed to demonstrate how this framework can facilitate the decision-making process. The results demonstrate that the proposed framework can successfully accomplish the goal of this study.
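The core TOPSIS ranking step can be sketched in crisp form as below; the fuzzy variant used in the framework operates analogously on triangular fuzzy numbers, and the decision matrix and weights here are invented:

```python
import numpy as np

def topsis(X, w, benefit):
    """X: alternatives x criteria; w: weights; benefit: True if higher is better."""
    R = X / np.linalg.norm(X, axis=0)          # vector normalization
    V = R * w                                  # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)             # closeness coefficient

X = np.array([[7.0, 9.0, 9.0], [8.0, 7.0, 8.0], [9.0, 6.0, 8.0]])
w = np.array([0.4, 0.35, 0.25])
benefit = np.array([True, True, False])        # third criterion is a cost
print(topsis(X, w, benefit))                   # higher = better alternative
```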
NASA Astrophysics Data System (ADS)
Lee, Suk-Jun; Yu, Seung-Man
2017-08-01
The purpose of this study was to evaluate the usefulness and clinical applicability of MultiVaneXD, a T2-weighted imaging technique applying an iterative motion-correction reconstruction algorithm, in comparison with MultiVane images acquired on a 3T MRI system. A total of 20 patients with suspected pathologies of the liver and pancreatic-biliary system based on clinical and laboratory findings underwent upper abdominal MRI, acquired using the MultiVane and MultiVaneXD techniques. Two reviewers analyzed the MultiVane and MultiVaneXD T2-weighted images qualitatively and quantitatively. Each reviewer evaluated vessel conspicuity by observing motion artifacts and the sharpness of the portal vein, hepatic vein, and upper abdominal organs. The signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated by one reviewer for quantitative analysis. The interclass correlation coefficient was evaluated to measure inter-observer reliability. There were significant differences between MultiVane and MultiVaneXD in motion artifact evaluation. Furthermore, MultiVane was given a better score than MultiVaneXD for abdominal organ sharpness and vessel conspicuity, but the difference was insignificant. The reliability coefficient values were over 0.8 in every evaluation. MultiVaneXD (2.12) showed a higher value than MultiVane (1.98), but the difference was insignificant (p = 0.135). MultiVaneXD is a motion correction method that is more advanced than MultiVane, and it produced an increased SNR, resulting in a greater ability to detect focal abdominal lesions.
NASA Astrophysics Data System (ADS)
Liao, S.; Chen, L.; Li, J.; Xiong, W.; Wu, Q.
2015-07-01
Existing spatiotemporal databases support spatiotemporal aggregation queries over massive moving-object datasets. Due to the large amount of data and single-threaded processing methods, the query speed cannot meet application requirements. On the other hand, query efficiency is more sensitive to spatial variation than to temporal variation. In this paper, we propose a spatiotemporal aggregation query method using a multi-thread parallel technique based on regional division and implement it on the server. Concretely, we divide the spatiotemporal domain into several spatiotemporal cubes, compute the spatiotemporal aggregation on all cubes using multi-thread parallel processing, and then integrate the query results. Tests and analysis on real datasets show that this method improves the query speed significantly.
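The regional-division idea can be sketched as follows; the data layout, cube boundaries, and worker count are assumptions (and in CPython, true CPU parallelism for such a workload would require processes or native code rather than threads):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def aggregate_cube(points):
    """Aggregate one spatiotemporal cube (here: count observations)."""
    return len(points)

def parallel_aggregate(points, x_edges, t_edges):
    # Assign each (x, t) record to a cube by regional division.
    cubes = {}
    for x, t in points:
        key = (int(np.searchsorted(x_edges, x)), int(np.searchsorted(t_edges, t)))
        cubes.setdefault(key, []).append((x, t))
    # Aggregate cubes concurrently, then integrate the partial results.
    with ThreadPoolExecutor(max_workers=4) as pool:
        counts = list(pool.map(aggregate_cube, cubes.values()))
    return dict(zip(cubes.keys(), counts))

rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(10000, 2))   # synthetic (x, t) records
result = parallel_aggregate(pts, np.arange(0, 100, 25), np.arange(0, 100, 25))
print(len(result), sum(result.values()))
```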
Health state evaluation of shield tunnel SHM using fuzzy cluster method
NASA Astrophysics Data System (ADS)
Zhou, Fa; Zhang, Wei; Sun, Ke; Shi, Bin
2015-04-01
Shield tunnel SHM is currently developing rapidly, but massive monitoring data processing and quantitative health grading remain a real challenge, since multiple sensors of different types are employed in SHM systems. This paper addresses a fuzzy cluster method based on the fuzzy equivalence relationship for the health evaluation of shield tunnel SHM. The method was optimized by exporting the FSV map to automatically generate the threshold value. A new holistic health score (HHS) was proposed and its effectiveness was validated by conducting a pilot test. A case study on the Nanjing Yangtze River Tunnel was presented to apply this method. Three types of indicators, namely soil pressure, pore pressure and steel strain, were used to develop the evaluation set U. The clustering results were verified by analyzing the engineering geological conditions, and the applicability and validity of the proposed method were demonstrated. Besides, the advantage of multi-factor evaluation over a single-factor model was discussed using the proposed HHS. This investigation indicates that the fuzzy cluster method and HHS are capable of characterizing the fuzziness of tunnel health, and are beneficial for clarifying the uncertainties in tunnel health evaluation.
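A minimal sketch of clustering by a fuzzy equivalence relation: build a fuzzy similarity matrix, take its max-min transitive closure, and apply a lambda-cut; the similarity values below are invented, and the paper's FSV-based automatic threshold selection is not reproduced:

```python
import numpy as np

def transitive_closure(R):
    """Max-min transitive closure of a fuzzy similarity matrix."""
    while True:
        # max-min composition (R o R)[i, k] = max_j min(R[i, j], R[j, k])
        R2 = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
        R2 = np.maximum(R, R2)
        if np.allclose(R2, R):
            return R2
        R = R2

def lambda_cut_clusters(R, lam):
    """Group indices whose closure value is >= lambda into clusters."""
    n, seen, clusters = len(R), set(), []
    for i in range(n):
        if i not in seen:
            members = [j for j in range(n) if R[i, j] >= lam]
            seen.update(members)
            clusters.append(members)
    return clusters

# Toy fuzzy similarity between 4 monitoring cross-sections (invented values).
R = np.array([[1.0, 0.8, 0.3, 0.2],
              [0.8, 1.0, 0.4, 0.2],
              [0.3, 0.4, 1.0, 0.7],
              [0.2, 0.2, 0.7, 1.0]])
print(lambda_cut_clusters(transitive_closure(R), lam=0.7))  # [[0, 1], [2, 3]]
```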
Multi-Objective Hybrid Optimal Control for Interplanetary Mission Planning
NASA Technical Reports Server (NTRS)
Englander, Jacob; Vavrina, Matthew; Ghosh, Alexander
2015-01-01
Preliminary design of low-thrust interplanetary missions is a highly complex process. The mission designer must choose discrete parameters such as the number of flybys, the bodies at which those flybys are performed and, in some cases, the final destination. In addition, a time-history of control variables must be chosen which defines the trajectory. There are often many thousands, if not millions, of possible trajectories to be evaluated. The customer who commissions a trajectory design is not usually interested in a point solution, but rather in the exploration of the trade space of trajectories between several different objective functions. This can be a very expensive process in terms of the number of human analyst hours required. An automated approach is therefore very desirable. This work presents such an approach by posing the mission design problem as a multi-objective hybrid optimal control problem. The method is demonstrated on a hypothetical mission to the main asteroid belt.
Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik
2016-01-01
A wide range of ETL (Extract, Transform and Load) software is currently available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes the task of evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. As there are many factors impacting the selection of ETL software, the selection process is considered a complex multi-criteria decision making (MCDM) problem. In this study, an application of a decision-making methodology that employs two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this respect, AHP is used to analyze the structure of the ETL software selection problem and obtain weights for the selected criteria. Then, the TOPSIS technique is used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
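The AHP weighting step can be sketched as the principal eigenvector of a pairwise comparison matrix together with the consistency ratio; the 3x3 judgments below are invented:

```python
import numpy as np

def ahp_weights(A):
    """Criterion weights from a pairwise comparison matrix, plus consistency ratio."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                            # principal eigenvector, normalized
    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index
    return w, ci / ri                       # CR < 0.1 means acceptable judgments

A = np.array([[1.0, 3.0, 5.0],              # e.g. cost vs. usability vs. performance
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
print(w, cr)                                # w would then feed the TOPSIS ranking
```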
NASA Astrophysics Data System (ADS)
Yuan, Y.; Meng, Y.; Chen, Y. X.; Jiang, C.; Yue, A. Z.
2018-04-01
In this study, we propose a method to map urban encroachment onto farmland using satellite image time series (SITS) based on the hierarchical hidden Markov model (HHMM). In this method, the farmland change process is decomposed into three hierarchical levels, i.e., the land cover level, the vegetation phenology level, and the SITS level. A three-level HHMM is then constructed to model the multi-level semantic structure of the farmland change process. Once the HHMM is established, a change from farmland to built-up land can be detected by inferring the underlying state sequence that is most likely to have generated the input time series. The performance of the method is evaluated on MODIS time series in Beijing. Results on both simulated and real datasets demonstrate that our method improves change detection accuracy compared with the HMM-based method.
2016-01-01
Multi-criteria decision-making (MCDM) can be formally implemented by various methods. This study compares the suitability of four selected MCDM methods, namely WPM, TOPSIS, VIKOR, and PROMETHEE, for future applications in agent-based computational economic (ACE) models of larger scale (i.e., over 10,000 agents in one geographical region). These four MCDM methods were selected according to their appropriateness for computational processing in ACE applications. Tests of the selected methods were conducted on four hardware configurations. For each method, 100 tests were performed, which represented one testing iteration. With four testing iterations conducted on each hardware setting and separate testing of all configurations with the -server parameter activated and deactivated, altogether 12,800 data points were collected and subsequently analyzed. An illustrative decision-making scenario was used which allowed the mutual comparison of all of the selected decision-making methods. Our test results suggest that although all methods are convenient and can be used in practice, the VIKOR method completed the tests with the best results and thus can be recommended as the most suitable for simulations of large-scale agent-based models. PMID:27806061
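For reference, the core VIKOR computation (group utility S, individual regret R, and compromise index Q) can be sketched as follows; the decision matrix and weights are illustrative only:

```python
import numpy as np

def vikor(X, w, v=0.5):
    """X: alternatives x criteria (benefit form); returns Q (lower = better)."""
    f_best, f_worst = X.max(axis=0), X.min(axis=0)
    norm = np.where(f_best == f_worst, 1.0, f_best - f_worst)
    D = w * (f_best - X) / norm
    S, R = D.sum(axis=1), D.max(axis=1)   # group utility, individual regret
    S_rng = (S.max() - S.min()) or 1.0
    R_rng = (R.max() - R.min()) or 1.0
    return v * (S - S.min()) / S_rng + (1 - v) * (R - R.min()) / R_rng

X = np.array([[8.0, 7.0, 9.0], [7.0, 8.0, 8.0], [9.0, 6.0, 7.0]])
w = np.array([0.5, 0.3, 0.2])
print(vikor(X, w))   # rank alternatives by ascending Q
```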
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-03
...-Exclusive Licenses: Multi-Focal Structured Illumination Microscopy Systems and Methods AGENCY: National... pertains to a system and method for digital confocal microscopy that rapidly processes enhanced images. In particular, the invention is a method for digital confocal microscopy that includes a digital mirror device...
Suner, Aslı; Oruc, Ozlem Ege; Buke, Cagri; Ozkaya, Hacer Deniz; Kitapcioglu, Gul
2017-08-31
Hand hygiene is one of the most effective means of controlling nosocomial infections, and it is an important measure to avoid the transmission of pathogens. However, the compliance of healthcare workers (HCWs) with hand washing is still poor worldwide. Herein, we aimed to determine the best hand hygiene preference of infectious diseases and clinical microbiology (IDCM) specialists to prevent transmission of microorganisms from one patient to another. Expert opinions regarding the criteria that influence the best hand hygiene preference were collected through a questionnaire via face-to-face interviews. Afterwards, these opinions were examined with two widely used multi-criteria decision analysis (MCDA) methods, the Multi-Attribute Utility Theory (MAUT) and the Analytic Hierarchy Process (AHP). A total of 15 IDCM specialist opinions were collected from diverse private and public hospitals located in İzmir, Turkey. The mean age of the participants was 49.73 ± 8.46 years, and their mean experience in their fields was 17.67 ± 11.98 years. The findings obtained through the two distinct decision-making methods, MAUT and AHP, suggest that alcohol-based antiseptic solution (ABAS) has the highest utility (0.86) and priority (0.69) among the experts' choices. In conclusion, the MAUT and AHP decision models developed here indicate that rubbing the hands with ABAS is the most favorable choice for IDCM specialists to prevent nosocomial infection.
Multi-parametric centrality method for graph network models
NASA Astrophysics Data System (ADS)
Ivanov, Sergei Evgenievich; Gorlushkina, Natalia Nikolaevna; Ivanova, Lubov Nikolaevna
2018-04-01
Graph network models are investigated to determine the centrality, weights and significance of vertices. Centrality analysis typically applies a method based on any one property of the graph vertices. In graph theory, centrality is commonly analyzed in terms of degree, closeness, betweenness, radiality, eccentricity, PageRank, status, Katz centrality or eigenvector centrality. We propose a new multi-parametric centrality method, which simultaneously incorporates a number of basic properties of a network member. A mathematical model of the multi-parametric centrality method is developed, and its results are compared with those of the standard centrality methods. To evaluate the multi-parametric centrality method, a graph model with hundreds of vertices is analyzed. The comparative analysis confirms the accuracy of the presented method, which simultaneously includes a number of basic vertex properties.
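One plausible reading of such a multi-parametric centrality, sketched with NetworkX, is a weighted combination of normalized standard centrality measures; the equal mixing weights are an assumption here, as the paper defines its own combination:

```python
import networkx as nx

def multi_parametric_centrality(G, weights=None):
    """Combine several standard centrality measures into one score per vertex."""
    measures = [nx.degree_centrality(G),
                nx.closeness_centrality(G),
                nx.betweenness_centrality(G),
                nx.eigenvector_centrality(G, max_iter=1000)]
    weights = weights or [1.0 / len(measures)] * len(measures)
    score = {n: 0.0 for n in G}
    for m, w in zip(measures, weights):
        top = max(m.values()) or 1.0
        for node, val in m.items():
            score[node] += w * val / top   # normalize each measure to [0, 1]
    return score

G = nx.karate_club_graph()
scores = multi_parametric_centrality(G)
print(max(scores, key=scores.get))         # most central vertex
```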
Application of multi response optimization with grey relational analysis and fuzzy logic method
NASA Astrophysics Data System (ADS)
Winarni, Sri; Wahyu Indratno, Sapto
2018-01-01
Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is a conversion of the signal-to-noise ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR and SR was a gap voltage of 70 V, a peak current of 9 A and a duty factor of 0.8.
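The grey relational grade computation at the heart of this approach can be sketched as follows; the response values are invented, and the paper's fuzzy-logic conversion step is not reproduced here:

```python
import numpy as np

def grey_relational_grade(X, larger_better, zeta=0.5):
    """X: experiments x responses; returns one GRG per experiment."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    # normalize so that 1 is always the ideal value of each response
    N = np.where(larger_better, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
    delta = np.abs(1.0 - N)                        # deviation from the ideal
    coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coef.mean(axis=1)                       # equal response weights

# two responses: MRR (larger is better) and SR (smaller is better)
X = np.array([[12.1, 3.2], [14.8, 4.1], [13.5, 2.9], [15.2, 4.6]])
grg = grey_relational_grade(X, larger_better=np.array([True, False]))
print(np.argmax(grg))    # experiment giving the best compromise
```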
NASA Astrophysics Data System (ADS)
Jitsuhiro, Takatoshi; Toriyama, Tomoji; Kogure, Kiyoshi
We propose a noise suppression method based on multi-model compositions and multi-pass search. In real environments, input speech for speech recognition includes many kinds of noise signals. To obtain good recognition candidates, it is important to suppress many kinds of noise signals at once and to find the target speech. Before noise suppression, to find speech and noise label sequences, we introduce a multi-pass search with acoustic models that include many kinds of noise models and their compositions, their n-gram models, and their lexicon. Noise suppression is performed frame-synchronously using the multiple models selected by the recognized label sequences with time alignments. We evaluated this method using the E-Nightingale task, which contains voice memoranda spoken by nurses during actual work at hospitals. The proposed method obtained higher performance than the conventional method.
NASA Astrophysics Data System (ADS)
Zhang, Lijuan; Li, Yang; Wang, Junnan; Liu, Ying
2018-03-01
In this paper, we propose a point spread function (PSF) reconstruction method and a joint maximum a posteriori (JMAP) estimation method for adaptive optics (AO) image restoration. Using the JMAP method as the basic principle, we establish the joint log-likelihood function of multi-frame AO images based on Gaussian image noise models. To begin with, combining the observation conditions and AO system characteristics, a predicted PSF model for the wavefront phase effect is developed; then, we build iterative solution formulas for the AO image based on our proposed algorithm and describe the implementation of the multi-frame AO image joint deconvolution method. We conduct a series of experiments on simulated and real degraded AO images to evaluate our proposed algorithm. Compared with the Wiener iterative blind deconvolution (Wiener-IBD) algorithm and the Richardson-Lucy IBD algorithm, our algorithm achieves better restoration effects, including higher peak signal-to-noise ratio (PSNR) and Laplacian sum (LS) values. The research results have practical value for real AO image restoration.
USDA-ARS?s Scientific Manuscript database
A multi-class, multi-residue method for the analysis of 13 novel flame retardants, 18 representative pesticides, 14 polychlorinated biphenyl (PCB) congeners, 16 polycyclic aromatic hydrocarbons (PAHs), and 7 polybrominated diphenyl ether (PBDE) congeners in catfish muscle was developed and evaluated...
NASA Astrophysics Data System (ADS)
Hou, Zhenlong; Huang, Danian
2017-09-01
In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth-weighting matrix and other methods. To address the problems posed by big data in exploration, we present a parallel algorithm and its performance analysis, combining Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) based on Graphics Processing Unit (GPU) acceleration. In tests on a synthetic model and real data from Vinton Dome, we obtain improved results. The improved inversion algorithm is also shown to be effective and feasible. The performance of the parallel algorithm we designed is better than that of other CUDA implementations; the maximum speedup exceeds 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are used to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger-scale data, and the new analysis method is practical.
NASA Astrophysics Data System (ADS)
Zimoń, M. J.; Prosser, R.; Emerson, D. R.; Borg, M. K.; Bray, D. J.; Grinberg, L.; Reese, J. M.
2016-11-01
Filtering of particle-based simulation data can lead to reduced computational costs and enable more efficient information transfer in multi-scale modelling. This paper compares the effectiveness of various signal processing methods to reduce numerical noise and capture the structures of nano-flow systems. In addition, a novel combination of these algorithms is introduced, showing the potential of hybrid strategies to improve further the de-noising performance for time-dependent measurements. The methods were tested on velocity and density fields, obtained from simulations performed with molecular dynamics and dissipative particle dynamics. Comparisons between the algorithms are given in terms of performance, quality of the results and sensitivity to the choice of input parameters. The results provide useful insights on strategies for the analysis of particle-based data and the reduction of computational costs in obtaining ensemble solutions.
MS lesion segmentation using a multi-channel patch-based approach with spatial consistency
NASA Astrophysics Data System (ADS)
Mechrez, Roey; Goldberger, Jacob; Greenspan, Hayit
2015-03-01
This paper presents an automatic method for segmentation of Multiple Sclerosis (MS) lesions in Magnetic Resonance Images (MRI) of the brain. The approach is based on similarities between multi-channel patches (T1, T2 and FLAIR). An MS lesion patch database is built using training images for which the label maps are known. For each patch in the testing image, k similar patches are retrieved from the database. The matching labels for these k patches are then combined to produce an initial segmentation map for the test case. Finally, a novel iterative patch-based label refinement process based on the initial segmentation map is performed to ensure spatial consistency of the detected lesions. A leave-one-out evaluation is done for each testing image in the MS lesion segmentation challenge of MICCAI 2008. Results are shown to compete with the state-of-the-art methods on the MICCAI 2008 challenge.
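The patch retrieval and label fusion step can be sketched as a k-nearest-neighbour lookup over a multi-channel patch database; the patch dimensions and data below are synthetic placeholders:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
db_patches = rng.normal(size=(5000, 3 * 27))   # T1/T2/FLAIR 3x3x3 patches, flattened
db_labels = rng.integers(0, 2, size=5000)      # centre-voxel lesion labels

knn = NearestNeighbors(n_neighbors=9).fit(db_patches)

test_patches = rng.normal(size=(100, 3 * 27))
_, idx = knn.kneighbors(test_patches)          # k most similar database patches
prob = db_labels[idx].mean(axis=1)             # soft initial segmentation
initial_seg = (prob > 0.5).astype(np.uint8)    # before spatial refinement
```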
Jaki, Thomas; Allacher, Peter; Horling, Frank
2016-09-05
Detection and characterization of anti-drug antibodies (ADA) against a protein therapeutic are crucially important for monitoring the unwanted immune response. Usually, a multi-tiered approach that initially rapidly screens for positive samples, which are subsequently confirmed in a separate assay, is employed for testing patient samples for ADA activity. In this manuscript, we evaluate the ability of different methods to classify subjects with screening and competition-based confirmatory assays. We find that for the overall performance of the multi-stage process the method used for confirmation is most important, with a t-test being best when differences are moderate to large. Moreover, we find that when differences between positive and negative samples are not sufficiently large, using a competition-based confirmation step yields poor classification of positive samples. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Agustinus, E. T. S.
2018-02-01
Indonesia's position on the Pacific Ring of Fire makes it rich in mineral resources. Nevertheless, in the past, the exploitation of Indonesian mineral resources was uncontrolled, resulting in environmental degradation and marginal reserves. Excessive exploitation of mineral resources is very detrimental to the state. Reflecting on this, the management and utilization of Indonesia's mineral resources need to follow good mining practice. The problem is how to utilize marginal mineral reserves effectively and efficiently. Utilization of marginal reserves requires new technologies and processing methods because the old processing methods are inadequate. This paper presents the Multi Blending Technology (MBT) method. The underlying concept is not extraction or refinement but processing through the formulation of raw materials by adding an additive, producing a new material called a functional material. It is important that applications of this method are summarized in a scientific work in book form, so that the information can spread across multiple print media in a focused and optimized way. Such a book is expected to serve as a reference for stakeholders, providing added value to environmentally marginal reserves in Indonesia. The conclusions are that the Multi Blending Technology (MBT) method can be used as a strategy to add value effectively and efficiently to marginal mineral reserves, and that the MBT method has been applied to forsterite, Atapulgite Synthesis, Zeoceramic, GEM, MPMO, SMAC and Geomaterial.
NASA Astrophysics Data System (ADS)
Mitryaeva, N. S.; Myshlyavtsev, A. V.; Akimenko, S. S.
2017-08-01
The paper studies the effect of ultrasonic processing on the vulcanizing, physical, mechanical and electrophysical properties of elastomeric compositions based on synthetic isoprene rubber. Microscopic studies of multi-wall carbon nanotube samples before and after ultrasonic processing were carried out. The research shows that the applied ultrasonic processing method splits the bundles formed from multi-wall carbon nanotubes. This results in an elastomeric material with increased strength and high electrical conductivity at a low nanofiller concentration.
A MUSIC-based method for SSVEP signal processing.
Chen, Kun; Liu, Quan; Ai, Qingsong; Zhou, Zude; Xie, Sheng Quan; Meng, Wei
2016-03-01
Research on brain computer interfaces (BCIs) has become a hotspot in recent years because BCIs enable disabled people to communicate with the outside world. Steady state visual evoked potential (SSVEP)-based BCIs are widely used because of their higher signal-to-noise ratio and greater information transfer rate compared with other BCI techniques. In this paper, a multiple signal classification (MUSIC)-based method is proposed for multi-dimensional SSVEP feature extraction. Two-second data epochs from four electrodes achieved excellent accuracy rates, including idle state detection. In some asynchronous-mode experiments, the recognition accuracy reached up to 100%. The experimental results showed that the proposed method attains good frequency resolution. In most situations, the recognition accuracy was higher than that of canonical correlation analysis, which is a typical method for multi-channel SSVEP signal processing. Also, a virtual keyboard was successfully controlled by different subjects in an unshielded environment, which proves the feasibility of the proposed method for multi-dimensional SSVEP signal processing in practical applications.
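A minimal sketch of MUSIC-based frequency estimation on a single synthetic channel (time-delay embedding, covariance eigendecomposition, noise-subspace pseudospectrum); real SSVEP processing would use multiple electrodes and stimulus harmonics:

```python
import numpy as np

fs, f0 = 250.0, 12.0                        # sampling rate, stimulus frequency
t = np.arange(0, 2.0, 1 / fs)               # 2-second epoch
x = np.sin(2 * np.pi * f0 * t) \
    + 0.5 * np.random.default_rng(0).normal(size=t.size)

M, p = 40, 2                                # embedding dim, signal subspace dim
X = np.lib.stride_tricks.sliding_window_view(x, M)
Rxx = X.T @ X / X.shape[0]                  # sample covariance matrix
vals, vecs = np.linalg.eigh(Rxx)            # eigenvalues ascending
En = vecs[:, : M - p]                       # noise subspace

freqs = np.arange(5.0, 30.0, 0.1)
k = np.arange(M)
P = []
for f in freqs:
    a = np.exp(2j * np.pi * f * k / fs)     # steering vector at frequency f
    P.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
print(freqs[int(np.argmax(P))])             # pseudospectrum peaks near 12 Hz
```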
Combined multi-spectrum and orthogonal Laplacianfaces for fast CB-XLCT imaging with single-view data
NASA Astrophysics Data System (ADS)
Zhang, Haibo; Geng, Guohua; Chen, Yanrong; Qu, Xuan; Zhao, Fengjun; Hou, Yuqing; Yi, Huangjian; He, Xiaowei
2017-12-01
Cone-beam X-ray luminescence computed tomography (CB-XLCT) is an attractive hybrid imaging modality with the potential to monitor the metabolic processes of nanophosphor-based drugs in vivo. Single-view data reconstruction, a key issue in CB-XLCT imaging, promotes the effective study of dynamic XLCT imaging; however, it suffers from a seriously ill-posed inverse problem. In this paper, a multi-spectrum strategy is adopted to relieve the ill-posedness of the reconstruction. The strategy is based on the third-order simplified spherical harmonics approximation model. Then, an orthogonal Laplacianfaces-based method is proposed to reduce the large computational burden without degrading imaging quality. Both simulated data and in vivo experimental data were used to evaluate the efficiency and robustness of the proposed method. The results are satisfactory in terms of both localization and quantitative recovery, with good computational efficiency, indicating that the proposed method is practical and promising for single-view CB-XLCT imaging.
Testing and evaluation of tactical electro-optical sensors
NASA Astrophysics Data System (ADS)
Middlebrook, Christopher T.; Smith, John G.
2002-07-01
As integrated electro-optical sensor payloads (multi-sensors) comprised of infrared imagers, visible imagers, and lasers advance in performance, the tests and testing methods must also advance in order to fully evaluate them. Future operational requirements will require integrated sensor payloads to perform missions at longer ranges and with increased targeting accuracy. In order to meet these requirements, sensors will require advanced imaging algorithms, advanced tracking capability, high-powered lasers, and high-resolution imagers. To meet the U.S. Navy's testing requirements for such multi-sensors, the test and evaluation group in the Night Vision and Chemical Biological Warfare Department at NAVSEA Crane is developing automated testing methods and improved tests to evaluate imaging algorithms, and is procuring advanced testing hardware to measure high-resolution imagers and the line-of-sight stabilization of targeting systems. This paper addresses descriptions of the multi-sensor payloads tested, the testing methods used and under development, and the different types of testing hardware and specific payload tests that are being developed and used at NAVSEA Crane.
NASA Astrophysics Data System (ADS)
Sizova, Evgeniya; Zhutaeva, Evgeniya; Chugunov, Andrei
2018-03-01
The article highlights features of urban territory renovation processes from the perspective of a commercial entity participating in project implementation. The requirements that high-rise construction projects impose on the entities that carry them out are considered. The advantages of large enterprises as participants in renovation projects, which contribute to the most efficient implementation, are systematized. The factors that influence the success of renovation projects are presented. A method for selecting projects for implementation is suggested, based on criteria grouped by qualitative characteristics and contributing to the most complete and comprehensive evaluation of a project. Patterns for prioritizing and harmonizing renovation projects within the multi-project activity of an enterprise are considered.
Ouyang, Xiaoguang; Guo, Fen
2018-04-01
Municipal wastewater discharge is widespread and is one of the sources of coastal eutrophication; it is especially uncontrolled in developing and undeveloped coastal regions. Mangrove forests are natural filters of pollutants in wastewater. There are three paradigms of mangroves for municipal wastewater treatment, and the selection of the optimal one is a multi-criteria decision-making problem. Combining intuitionistic fuzzy theory, the Fuzzy Delphi Method and the fuzzy analytical hierarchy process (AHP), this study develops an intuitionistic fuzzy AHP (IFAHP) method. For the Fuzzy Delphi Method, the judgments of experts and representatives on criterion weights are expressed by linguistic variables and quantified by intuitionistic fuzzy theory, which is also used to weight the importance of the experts and representatives. This process generates the entropy weights of the criteria, which are combined with index values and weights to rank the alternatives by the fuzzy AHP method. The IFAHP method was used to select the optimal paradigm of mangroves for treating municipal wastewater. The entropy weights were obtained from the valid evaluations of 64 experts and representatives via an online survey. Natural mangroves were found to be the optimal paradigm for municipal wastewater treatment. By assigning different weights to the criteria, sensitivity analysis shows that natural mangroves remain the optimal paradigm under most scenarios. This study stresses the importance of mangroves for wastewater treatment. Decision-makers need to contemplate mangrove reforestation projects, especially where mangroves are highly deforested but wastewater discharge is uncontrolled. The IFAHP method is expected to be applicable to other multi-criteria decision-making cases. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cheng, Yuhua; Deng, Yiming; Cao, Jing; Xiong, Xin; Bai, Libing; Li, Zhaojun
2013-01-01
In this article, state-of-the-art multi-wave and hybrid imaging techniques in the fields of nondestructive evaluation and structural health monitoring are comprehensively reviewed. A new direction for the assessment and health monitoring of various structures, capitalizing on the advantages of these imaging methods, is discussed. Although they share similar system configurations, the imaging physics and principles of multi-wave phenomena and hybrid imaging methods are inherently different. After a brief introduction to nondestructive evaluation (NDE), structural health monitoring (SHM) and their related challenges, several recent advances that have significantly extended imaging methods from laboratory development into practical applications are summarized, followed by conclusions and a discussion of future directions. PMID:24287536
CNF Re-Inforced Polymer Composites
NASA Astrophysics Data System (ADS)
Lake, Max L.; Tibbetts, Gary G.; Glasgow, D. Gerald
2004-09-01
In terms of physical size, performance improvement, and production cost, carbon nanofiber (CNF) lies in a spectrum of materials bounded by carbon black, fullerenes, and single-wall to multi-wall carbon nanotubes on one end and continuous carbon fiber on the other. Results show promise for the use of CNF to modify the electrical conductivity of polymer composites. Current compounding efforts focus on techniques for nanofiber dispersion designed to retain nanofiber length, including de-bulking methods and low-shear melt processing. Heat treatment of CNF as a post-production process has also been evaluated for its influence on the electrical properties of CNF-reinforced polymer composites.
International Land Model Benchmarking (ILAMB) Workshop Report, Technical Report DOE/SC-0186
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, Forrest M.; Koven, Charles D.; Kappel-Aleks, Gretchen
2016-11-01
As Earth system models become increasingly complex, there is a growing need for comprehensive and multi-faceted evaluation of model projections. To advance understanding of biogeochemical processes and their interactions with hydrology and climate under conditions of increasing atmospheric carbon dioxide, new analysis methods are required that use observations to constrain model predictions, inform model development, and identify needed measurements and field experiments. Better representations of biogeochemistry–climate feedbacks and ecosystem processes in these models are essential for reducing uncertainties associated with projections of climate change during the remainder of the 21st century.
Hybrid optical acoustic seafloor mapping
NASA Astrophysics Data System (ADS)
Inglis, Gabrielle
The oceanographic research and industrial communities have a persistent demand for detailed three-dimensional seafloor maps which convey both shape and texture. Such data products are used for archeology, geology, ship inspection, biology, and habitat classification. There is a variety of sensing modalities and processing techniques available to produce these maps, and each has its own potential benefits and related challenges. Multibeam sonar and stereo vision are two such sensors with complementary strengths, making them ideally suited for data fusion. Data fusion approaches, however, have seen only limited application to underwater mapping, and there are no established methods for creating hybrid 3D reconstructions from two underwater sensing modalities. This thesis develops a processing pipeline to synthesize hybrid maps from multi-modal survey data. It is helpful to think of this processing pipeline as having two distinct phases: navigation refinement and map construction. This thesis extends existing work in underwater navigation refinement by incorporating methods which increase measurement consistency between the multibeam and the camera. The result is a self-consistent 3D point cloud comprised of camera and multibeam measurements. In the map construction phase, a subset of the multi-modal point cloud retaining the best characteristics of each sensor is selected to be part of the final map. To quantify the desired traits of a map, several characteristics of a useful map are distilled into specific criteria. The different ways that hybrid maps can address these criteria provide justification for producing them as an alternative to current methodologies. The processing pipeline implements multi-modal data fusion and outlier rejection with emphasis on different aspects of map fidelity. The resulting point cloud is evaluated in terms of how well it addresses the map criteria. The final hybrid maps retain the strengths of both sensors and show significant improvement over the single-modality maps and naively assembled multi-modal maps.
Global Ionospheric Modelling using Multi-GNSS: BeiDou, Galileo, GLONASS and GPS.
Ren, Xiaodong; Zhang, Xiaohong; Xie, Weiliang; Zhang, Keke; Yuan, Yongqiang; Li, Xingxing
2016-09-15
The emergence of China's Beidou, Europe's Galileo and Russia's GLONASS satellites has multiplied the number of ionospheric piercing points (IPP) offered by GPS alone. This provides great opportunities for deriving precise global ionospheric maps (GIMs) with high resolution to improve positioning accuracy and ionospheric monitoring capabilities. In this paper, the GIM is developed based on multi-GNSS (GPS, GLONASS, BeiDou and Galileo) observations in the current multi-constellation condition. The performance and contribution of multi-GNSS for ionospheric modelling are carefully analysed and evaluated. Multi-GNSS observations of over 300 stations from the Multi-GNSS Experiment (MGEX) and International GNSS Service (IGS) networks for two months are processed. The results show that the multi-GNSS GIM products are better than those of GIM products based on GPS-only. Differential code biases (DCB) are by-products of the multi-GNSS ionosphere modelling, the corresponding standard deviations (STDs) are 0.06 ns, 0.10 ns, 0.18 ns and 0.15 ns for GPS, GLONASS, BeiDou and Galileo, respectively in satellite, and the STDs for the receiver are approximately 0.2~0.4 ns. The single-frequency precise point positioning (SF-PPP) results indicate that the ionospheric modelling accuracy of the proposed method based on multi-GNSS observations is better than that of the current dual-system GIM in specific areas.
Gülci, Sercan; Akay, Abdullah Emin
2015-12-01
Major roads cause barrier effects and fragmentation of wildlife habitats, which are suitable places for feeding, mating, socializing, and hiding. Through wildlife collisions (Wc), human-wildlife conflicts result in lost lives and loss of biodiversity. Geographical information system (GIS)-based multi-criteria evaluation (MCE) methods have been successfully used in short-term planning of road networks with wild animals in mind. Recently, wildlife passages have been effectively utilized as road engineering structures that provide quick and reliable solutions for traffic safety and wildlife conservation problems. GIS-based MCE methods provide decision makers with optimal locations for ecological passages based on habitat suitability models (HSMs) that classify areas according to the ecological requirements of target species. In this study, ecological passages along Motorway 52 within forested areas of the Mediterranean city of Osmaniye in Turkey were evaluated. First, an HSM coupled with nine eco-geographic decision variables was developed based on the ecological requirements of roe deer (Capreolus capreolus), chosen as the target species. Then the specified decision variables were evaluated using the GIS-based weighted linear combination (WLC) method to estimate movement corridors and mitigation points along the motorway. In the solution process, two linkage nodes, determined from the least-cost movement corridor intersecting the motorway, were evaluated for eco-passages. One of the passages was identified as a natural wildlife overpass, while the other was suggested for underpass construction. The results indicate that computer-based models provide accurate and quick solutions for positioning ecological passages to reduce the environmental effects of road networks on wild animals.
Forkan, Abdur Rahim Mohammad; Khalil, Ibrahim
2017-02-01
In home-based context-aware monitoring, real-time data on a patient's multiple vital signs (e.g. heart rate, blood pressure) are continuously generated from wearable sensors. The changes in such vital parameters are highly correlated. They are also patient-specific and can either recur or fluctuate. The objective of this study is to develop an intelligent method for personalized monitoring and clinical decision support through early estimation of patient-specific vital sign values and prediction of anomalies using the interrelations among multiple vital signs. In this paper, multi-label classification algorithms are applied in the classifier design to forecast these values and related abnormalities. We propose a new approach to patient-specific vital sign prediction that uses these correlations. The developed technique can guide healthcare professionals in making accurate clinical decisions. Moreover, our model can support many patients with various clinical conditions concurrently by utilizing the power of cloud computing technology. The developed method also reduces the rate of false predictions in remote monitoring centres. In the experimental settings, the statistical features and correlations of six vital signs are formulated as a multi-label classification problem. Eight multi-label classification algorithms along with three fundamental machine learning algorithms are used and tested on a public dataset of 85 patients. Different multi-label classification evaluation measures, such as Hamming score, F1-micro average, and accuracy, are used to interpret the prediction performance of patient-specific situation classification. We achieved Hamming score values of 90-95% across 24 classifier combinations for the 85 patients used in our experiment. The results are compared with single-label classifiers and with classifiers that do not consider the correlations among the vitals. The comparisons show that the multi-label method is the best technique for this problem domain. The evaluation results reveal that multi-label classification techniques using the correlations among multiple vitals are an effective way to estimate future values of those vitals early. In context-aware remote monitoring, this process can greatly help doctors make quick diagnostic decisions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
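The multi-label formulation can be sketched with scikit-learn; the synthetic features, the three example labels, and the classifier-chain choice are assumptions for illustration, not the study's eight algorithms or its 85-patient dataset:

```python
import numpy as np
from sklearn.multioutput import ClassifierChain
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import hamming_loss

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))             # e.g. mean/trend features of 6 vitals
Y = (X[:, :3] + rng.normal(scale=0.5, size=(500, 3)) > 0).astype(int)
# three correlated labels, e.g. abnormal heart rate / blood pressure / SpO2

Xtr, Xte, Ytr, Yte = train_test_split(X, Y, random_state=0)
chain = ClassifierChain(RandomForestClassifier(random_state=0))
chain.fit(Xtr, Ytr)                        # chaining exploits label correlations
print(1 - hamming_loss(Yte, chain.predict(Xte)))   # Hamming score
```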
Analysis of multi lobe journal bearings with surface roughness using finite difference method
NASA Astrophysics Data System (ADS)
PhaniRaja Kumar, K.; Bhaskar, SUdaya; Manzoor Hussain, M.
2018-04-01
Multi-lobe journal bearings are used for high operating speeds and high loads in machines. In this paper, symmetrical multi-lobe journal bearings are analyzed to find the effect of surface roughness during nonlinear loading. Using the fourth-order Runge-Kutta method, a time-transient analysis was performed to calculate and plot the journal centre trajectories. The flow factor method is used to evaluate the roughness, and the finite difference method (FDM) is used to predict the pressure distribution over the bearing surface. The transient analysis is performed on the multi-lobe journal bearings for three different surface roughness orientations. Longitudinal surface roughness is more effective when compared with isotropic and transverse surface roughness.
An Integrated DEMATEL-VIKOR Method-Based Approach for Cotton Fibre Selection and Evaluation
NASA Astrophysics Data System (ADS)
Chakraborty, Shankar; Chatterjee, Prasenjit; Prasad, Kanika
2018-01-01
Selection of the most appropriate cotton fibre type for yarn manufacturing is often treated as a multi-criteria decision-making (MCDM) problem as the optimal selection decision needs to be taken in presence of several conflicting fibre properties. In this paper, two popular MCDM methods in the form of decision making trial and evaluation laboratory (DEMATEL) and VIse Kriterijumska Optimizacija kompromisno Resenje (VIKOR) are integrated to aid the cotton fibre selection decision. DEMATEL method addresses the interrelationships between various physical properties of cotton fibres while segregating them into cause and effect groups, whereas, VIKOR method helps in ranking all the considered 17 cotton fibres from the best to the worst. The derived ranking of cotton fibre alternatives closely matches with that obtained by the past researchers. This model can assist the spinning industry personnel in the blending process while making accurate fibre selection decision when cotton fibre properties are numerous and interrelated.
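The DEMATEL computation that separates criteria into cause and effect groups can be sketched as follows; the 4x4 direct-influence judgments are invented:

```python
import numpy as np

# Expert-rated direct influence between four fibre properties (0-4 scale, invented).
D = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)

N = D / max(D.sum(axis=1).max(), D.sum(axis=0).max())   # normalization
T = N @ np.linalg.inv(np.eye(len(D)) - N)               # total-relation matrix
r, c = T.sum(axis=1), T.sum(axis=0)
prominence, relation = r + c, r - c
for i, rel in enumerate(relation):
    group = "cause" if rel > 0 else "effect"
    print(f"criterion {i}: prominence={prominence[i]:.2f} ({group})")
```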
Stuit, Marco; Wortmann, Hans; Szirbik, Nick; Roodenburg, Jan
2011-12-01
In the healthcare domain, human collaboration processes (HCPs), which consist of interactions between healthcare workers from different (para)medical disciplines and departments, are of growing importance as healthcare delivery becomes increasingly integrated. Existing workflow-based process modelling tools for healthcare process management, which are the most commonly applied, are not suited for healthcare HCPs mainly due to their focus on the definition of task sequences instead of the graphical description of human interactions. This paper uses a case study of a healthcare HCP at a Dutch academic hospital to evaluate a novel interaction-centric process modelling method. The HCP under study is the care pathway performed by the head and neck oncology team. The evaluation results show that the method brings innovative, effective, and useful features. First, it collects and formalizes the tacit domain knowledge of the interviewed healthcare workers in individual interaction diagrams. Second, the method automatically integrates these local diagrams into a single global interaction diagram that reflects the consolidated domain knowledge. Third, the case study illustrates how the method utilizes a graphical modelling language for effective tree-based description of interactions, their composition and routing relations, and their roles. A process analysis of the global interaction diagram is shown to identify HCP improvement opportunities. The proposed interaction-centric method has wider applicability since interactions are the core of most multidisciplinary patient-care processes. A discussion argues that, although (multidisciplinary) collaboration is in many cases not optimal in the healthcare domain, it is increasingly considered a necessity to improve integration, continuity, and quality of care. The proposed method is helpful to describe, analyze, and improve the functioning of healthcare collaboration. Copyright © 2011 Elsevier Inc. All rights reserved.
Multi-template tensor-based morphometry: Application to analysis of Alzheimer's disease
Koikkalainen, Juha; Lötjönen, Jyrki; Thurfjell, Lennart; Rueckert, Daniel; Waldemar, Gunhild; Soininen, Hilkka
2012-01-01
In this paper, methods for using multiple templates in tensor-based morphometry (TBM) are presented and compared to the conventional single-template approach. TBM analysis requires non-rigid registrations, which are often subject to registration errors. When using multiple templates and, therefore, multiple registrations, it can be assumed that the registration errors are averaged and eventually compensated. Four different methods are proposed for multi-template TBM. The methods were evaluated using magnetic resonance (MR) images of healthy controls, patients with stable or progressive mild cognitive impairment (MCI), and patients with Alzheimer's disease (AD) from the ADNI database (N=772). The performance of TBM features in classifying images was evaluated both quantitatively and qualitatively. Classification results show that the multi-template methods are statistically significantly better than the single-template method. The overall classification accuracy was 86.0% for the classification of control and AD subjects, and 72.1% for the classification of stable and progressive MCI subjects. The statistical group-level difference maps produced using multi-template TBM were smoother, formed larger continuous regions, and had larger t-values than the maps obtained with single-template TBM. PMID:21419228
NASA Astrophysics Data System (ADS)
Pradhanang, S. M.; Hasan, M. A.; Booth, P.; Fallatah, O.
2016-12-01
The monsoon- and snow-driven regime of the Himalayan region has received increasing attention in the recent decade regarding the effects of climate change on hydrologic regimes. Modeling streamflow in such a spatially varied catchment requires proper calibration and validation of the hydrologic model. While calibration and validation are time consuming and computationally intensive, an effective regionalized approach with multi-site information is crucial for flow estimation, especially at the daily scale. In this study, we adopted a multi-site approach to calibration and validation of the Soil and Water Assessment Tool (SWAT) model for the Karnali river catchment, which is characterized as the catchment most vulnerable to climate change in the Himalayan region. APHRODITE's (Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation) daily gridded precipitation data, among the most accurate and reliable weather data for this region, were utilized in this study. The model was evaluated over the entire catchment, divided into four sub-catchments, utilizing discharge records from 1963 to 2010. In previous studies, multi-site calibration used only a single set of calibration parameters for all sub-catchments of a large watershed. In this study, we introduced a technique that can incorporate a different set of calibration parameters for each sub-basin, which ultimately improves the simulated flow for the whole watershed. Results show that the model calibrated with the new method captures an almost identical flow pattern over the region. The predicted daily streamflow matched the observed values, with a Nash-Sutcliffe coefficient of 0.73 during the calibration period and 0.71 during the validation period. The method performed better than existing multi-site calibration methods. To assess the influence of continued climate change on hydrologic processes, we modified the weather inputs for the model using precipitation and temperature changes for two Representative Concentration Pathways (RCPs) scenarios, RCP 4.5 and RCP 8.5. Climate simulations for the RCP scenarios covered 1981-2100, with 1981-2005 as the baseline and 2006-2100 as the future projection. The results show that the probability of flooding will increase in future years due to increased flow under both scenarios.
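For reference, the Nash-Sutcliffe coefficient quoted above is a standard goodness-of-fit score; a minimal implementation with invented values:

```python
# Nash-Sutcliffe efficiency, the calibration score used above; a standard formula.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; <= 0 means the
    simulation is no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

print(nse([10, 12, 15, 11], [9.5, 12.4, 14.2, 11.3]))  # ~0.92 (illustrative data)
```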
Multi-Criteria selection of technology for processing ore raw materials
NASA Astrophysics Data System (ADS)
Gorbatova, E. A.; Emelianenko, E. A.; Zaretckii, M. V.
2017-10-01
The development of Computer-Aided Process Planning (CAPP) for the ore beneficiation process is considered. The set of parameters defining the quality of the ore beneficiation process is identified, and the ontological model of CAPP for ore beneficiation is described. A hybrid method for choosing the most appropriate variant of the ore beneficiation process, based on logical conclusion rules and a fuzzy Multi-Criteria Decision Making (MCDM) approach, is proposed.
Determination of criteria weights in solving multi-criteria problems
NASA Astrophysics Data System (ADS)
Kasim, Maznah Mat
2014-12-01
A multi-criteria (MC) problem comprises units to be analyzed under a set of evaluation criteria. Solving an MC problem is basically the process of finding the overall performance or overall quality of the units of analysis using a certain aggregation method. Based on these overall measures, a decision can be made on whether to sort the units, to select the best, or to group them according to certain ranges. Prior to solving an MC problem, the weights of the related criteria have to be determined, with the assumption that the weights represent the degree of importance or the degree of contribution towards the overall performance of the units. This paper presents two main approaches, called the subjective and objective approaches, where the former involves evaluator(s) while the latter depends on the intrinsic information contained in each criterion. Subjective and objective weights are defined when the criteria are assumed to be independent of each other; if they are dependent, another type of weight, called monotone measure or compound weights, represents the degree of interaction among the criteria. The choice between individual and compound weights must be addressed in solving multi-criteria problems so that the solutions are more reliable, since in the real world evaluation criteria always carry different degrees of importance or depend on each other. As real MC problems are each unique, it is up to the decision maker(s) to decide which type of weights and which method are most applicable to the problem under study.
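As one concrete instance of the objective approach described above (weights derived from the intrinsic information in each criterion), the sketch below computes Shannon-entropy weights from an invented decision matrix:

```python
# Shannon-entropy objective weighting; the decision matrix is illustrative.
import numpy as np

X = np.array([[7., 9., 9.],   # units of analysis x criteria (all positive)
              [8., 7., 8.],
              [9., 6., 8.],
              [6., 7., 8.]])
P = X / X.sum(axis=0)                               # column-wise proportions
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))   # entropy per criterion, in [0, 1]
w = (1 - E) / (1 - E).sum()                         # low entropy -> high weight
print("objective criteria weights:", w.round(3))
```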
Analysis and Relative Evaluation of Connectivity of a Mobile Multi-Hop Network
NASA Astrophysics Data System (ADS)
Nakano, Keisuke; Miyakita, Kazuyuki; Sengoku, Masakazu; Shinoda, Shoji
In mobile multi-hop networks, a source node S and a destination node D sometimes encounter a situation where there is no multi-hop path between them when a message M, destined for D, arrives at S. In this situation, we cannot send M from S to D immediately; however, we can deliver M to D after waiting some time with the help of two capabilities of mobility. One of the capabilities is to construct a connected multi-hop path by changing the topology of the network during the waiting time (Capability 1), and the other is to move M closer to D during the waiting time (Capability 2). In this paper, we consider three methods to deliver M from S to D by using these capabilities in different ways. Method 1 uses Capability 1 and sends M from S to D after waiting until a connected multi-hop path appears between S and D. Method 2 uses Capability 2 and delivers M to D by allowing a mobile node to carry M from S to D. Method 3 is a combination of Methods 1 and 2 and minimizes the waiting time. We evaluate and compare these three methods in terms of the mean waiting time, from the time when M arrives at S to the time when D starts receiving M, as a new approach to connectivity evaluation. We consider a one-dimensional mobile multi-hop network consisting of mobile nodes flowing in opposite directions along a street. First, we derive some approximate equations and propose an estimation method to compute the mean waiting time of Method 1. Second, we theoretically analyze the mean waiting time of Method 2, and compute a lower bound of that of Method 3. By comparing the three methods under the same assumptions using results of the analyses and some simulation results, we show relations between the mean waiting times of these methods and show how Capabilities 1 and 2 differently affect the mean waiting time.
Magnetic manipulation device for the optimization of cell processing conditions.
Ito, Hiroshi; Kato, Ryuji; Ino, Kosuke; Honda, Hiroyuki
2010-02-01
Variability in human cell phenotypes makes advances in optimized cell processing necessary for personalized cell therapy. Here we propose a palm-top-sized device to assist in physically manipulating cells for optimizing cell preparations. In designing the device, we combined two conventional approaches: multi-well plate formatting and magnetic cell handling using magnetite cationic liposomes (MCLs). In our previous work, we showed labeling applications of MCLs on adhesive cells for various tissue engineering approaches. To transfer cells conveniently in a multi-well plate, we here evaluated the magnetic response of MCL-labeled suspension-type cells. The handling of Jurkat cells proved faster and more robust than with MACS (magnetic cell sorting) bead methods. To further confirm our strategy, a prototype palm-top-sized magnetic manipulation device (MMD) was designed. In the device, the cell transportation efficacy for Jurkat cells was satisfactory. Moreover, as a model of the most widespread clinical cell processing, primary peripheral blood mononuclear cells (PBMCs) from different volunteers were evaluated. Using the MMD, individual PBMC samples were shown to have different optimum interleukin-2 (IL-2) concentrations for expansion. Such large differences between individual cells indicate that the MMD, the efficient and self-contained support tool we propose, could assist the feasible and cost-effective optimization of cell processing in clinical facilities. Copyright (c) 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Multi-criteria analysis for municipal solid waste management in a Brazilian metropolitan area.
Santos, Simone Machado; Silva, Maisa Mendonça; Melo, Renata Maciel; Gavazza, Savia; Florencio, Lourdinha; Kato, Mario T
2017-10-15
The decision-making process involved in municipal solid waste management (MSWM) must consider more than just financial aspects, which makes it a difficult task in developing countries. The Recife Metropolitan Region (RMR) in the Northeast of Brazil faces a MSWM problem that has been ongoing since the 1970s, with no common solution. In order to direct short-term solutions, three MSWM alternatives were outlined for the RMR, considering the current and future situations, the time and cost involved and social/environmental criteria. A multi-criteria approach, based on the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was proposed to rank these alternatives. The alternative that included two private landfill sites and seven transfer, sorting and composting stations was confirmed as the most suitable and stable option for short-term MSWM, considering the two scenarios for the criteria weights. Sensitivity analysis was also performed to support the robustness of the results. The implementation of separate collections would minimize the amount of waste buried, while maximizing the useful life of landfill sites and increasing the timeframe of the alternative. Overall, the multi-criteria analysis was helpful and accurate during the alternative selection process, considering the similarities and restrictions of each option, which can lead to difficulties during the decision-making process.
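To make the PROMETHEE step concrete, the sketch below computes a PROMETHEE II net-flow ranking (one common variant) using the simple "usual" preference function; the alternatives, criteria and weights are invented, not the RMR data, and the published study also used richer preference settings and sensitivity analysis.

```python
# Bare-bones PROMETHEE II net-flow ranking with the "usual" preference function.
import numpy as np

X = np.array([[3., 7., 5.],    # MSWM alternatives x criteria (benefit form)
              [5., 4., 6.],
              [4., 6., 4.]])
w = np.array([0.5, 0.3, 0.2])  # criteria weights (illustrative)

n = len(X)
pi = np.zeros((n, n))          # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        if a != b:
            pi[a, b] = (w * (X[a] > X[b])).sum()   # usual criterion: 0/1 preference

phi = pi.sum(1) / (n - 1) - pi.sum(0) / (n - 1)    # net outranking flow
print("PROMETHEE II ranking (best first):", np.argsort(-phi))
```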
Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J
2014-01-01
The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
Validation of two innovative methods to measure contaminant mass flux in groundwater
NASA Astrophysics Data System (ADS)
Goltz, Mark N.; Close, Murray E.; Yoon, Hyouk; Huang, Junqi; Flintoft, Mark J.; Kim, Sehjong; Enfield, Carl
2009-04-01
The ability to quantify the mass flux of a groundwater contaminant that is leaching from a source area is critical to enable us to: (1) evaluate the risk posed by the contamination source and prioritize cleanup, (2) evaluate the effectiveness of source remediation technologies or natural attenuation processes, and (3) quantify a source term for use in models that may be applied to predict maximum contaminant concentrations in downstream wells. Recently, a number of new methods have been developed and subsequently applied to measure contaminant mass flux in groundwater in the field. However, none of these methods has been validated at larger than the laboratory-scale through a comparison of measured mass flux and a known flux that has been introduced into flowing groundwater. A couple of innovative flux measurement methods, the tandem circulation well (TCW) and modified integral pumping test (MIPT) methods, have recently been proposed. The TCW method can measure mass flux integrated over a large subsurface volume without extracting water. The TCW method may be implemented using two different techniques. One technique, the multi-dipole technique, is relatively simple and inexpensive, only requiring measurement of heads, while the second technique requires conducting a tracer test. The MIPT method is an easily implemented method of obtaining volume-integrated flux measurements. In the current study, flux measurements obtained using these two methods are compared with known mass fluxes in a three-dimensional, artificial aquifer. Experiments in the artificial aquifer show that the TCW multi-dipole and tracer test techniques accurately estimated flux, within 2% and 16%, respectively; although the good results obtained using the multi-dipole technique may be fortuitous. The MIPT method was not as accurate as the TCW method, underestimating flux by as much as 70%. MIPT method inaccuracies may be due to the fact that the method assumptions (two-dimensional steady groundwater flow to fully-screened wells) were not well-approximated. While fluxes measured using the MIPT method were consistently underestimated, the method's simplicity and applicability to the field may compensate for the inaccuracies that were observed in this artificial aquifer test.
NASA Astrophysics Data System (ADS)
Lu, Hongwei; Ren, Lixia; Chen, Yizhong; Tian, Peipei; Liu, Jia
2017-12-01
Because uncertainty (i.e., fuzziness, stochasticity and imprecision) exists simultaneously throughout the groundwater remediation process, the accuracy of ranking results obtained by traditional methods has been limited. This paper proposes a cloud model based multi-attribute decision-making framework (CM-MADM) with Monte Carlo simulation for selecting contaminated-groundwater remediation strategies. The cloud model is used to handle imprecise numerical quantities, describing the fuzziness and stochasticity of the information fully and precisely. In the proposed approach, the contaminant concentrations are aggregated via the backward cloud generator and the attribute weights are calculated by the weight cloud module. A case study on remedial alternative selection for a contaminated site suffering from a 1,1,1-trichloroethylene leakage problem in Shanghai, China is conducted to illustrate the efficiency and applicability of the developed approach. In total, an attribute system consisting of ten attributes, including daily total pumping rate, total cost and cloud-model-based health risk, was used to evaluate each alternative under uncertainty. Results indicated that A14 was the most preferred alternative for the 5-year remediation period, A5 for the 10-year, A4 for the 15-year and A6 for the 20-year period.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision made using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into the AHP to handle the imprecise and vague information and to simplify data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated with a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
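A sketch of the COPRAS ranking stage in crisp (non-fuzzy) form, since the fuzzy extension is lengthy; the machine-tool data, weights and cost/benefit split are invented, and the fuzzy-AHP weighting stage is omitted.

```python
# Crisp COPRAS ranking sketch; data, weights and cost flags are illustrative.
import numpy as np

X = np.array([[36000., 4.5, 0.92],     # machine tools x criteria
              [42000., 4.0, 0.95],
              [39000., 4.2, 0.90]])
w = np.array([0.4, 0.3, 0.3])          # attribute weights (e.g. from fuzzy AHP)
cost = np.array([True, False, False])  # criterion 0 (price) is a cost

D = w * X / X.sum(axis=0)              # weighted normalized matrix
S_plus = D[:, ~cost].sum(axis=1)       # benefit contribution per alternative
S_minus = D[:, cost].sum(axis=1)       # cost contribution per alternative
Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
print("COPRAS ranking (best first):", np.argsort(-Q))
```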
NASA Astrophysics Data System (ADS)
Peng, Hong-Gang; Wang, Jian-Qiang
2017-11-01
In recent years, sustainable energy crops have become an important energy development strategy topic in many countries. Selecting the most sustainable energy crop is a significant problem that must be addressed during any biofuel production process. The focus of this study is the development of an innovative multi-criteria decision-making (MCDM) method to handle sustainable energy crop selection problems. Given that various uncertain data are encountered in the evaluation of sustainable energy crops, linguistic intuitionistic fuzzy numbers (LIFNs) are introduced to represent the information necessary to the evaluation process. Processing qualitative concepts requires the effective support of reliable tools, and a cloud model can be used to deal with linguistic intuitionistic information. First, LIFNs are converted and a novel concept of the linguistic intuitionistic cloud (LIC) is proposed. The operations, score function and similarity measurement of LICs are defined. Subsequently, the linguistic intuitionistic cloud density-prioritised weighted Heronian mean operator is developed, which serves as the basis for the construction of an applicable MCDM model for sustainable energy crop selection. Finally, an illustrative example is provided to demonstrate the proposed method, and its feasibility and validity are further verified by comparison with other existing methods.
Multi-Scale Scattering Transform in Music Similarity Measuring
NASA Astrophysics Data System (ADS)
Wang, Ruobai
The scattering transform is a Mel-frequency-spectrum-based method, stable under time deformation, that can be used to evaluate music similarity. Compared with dynamic time warping, it performs better at detecting similar audio signals under local time-frequency deformation. Multi-scale scattering combines scattering transforms with different window lengths. This paper argues that the multi-scale scattering transform is a good alternative to dynamic time warping in music similarity measurement. We tested the performance of the multi-scale scattering transform against other popular methods, with data designed to represent different conditions.
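A minimal first-order scattering computation is sketched below to convey the idea: complex band-pass filtering, a modulus nonlinearity, then low-pass averaging gives descriptors that change little under small time warps. Real scattering transforms, including the multi-scale combination discussed here, cascade wavelet filter banks; the Gaussian filters and bandwidths below are illustrative choices.

```python
# First-order scattering-style descriptor: band-pass, modulus, low-pass average.
import numpy as np

def scatter1(x, n_bands=8):
    freqs = np.fft.fftfreq(x.size)                 # cycles/sample
    X = np.fft.fft(x)
    coeffs = []
    for k in range(1, n_bands + 1):
        f0 = k / (2.0 * (n_bands + 1))             # band centre (assumed spacing)
        psi = np.exp(-((freqs - f0) ** 2) / (2 * 0.02**2))  # Gaussian band-pass
        u = np.abs(np.fft.ifft(X * psi))           # modulus (envelope)
        coeffs.append(u.mean())                    # low-pass: global average
    return np.array(coeffs)

a = np.sin(2 * np.pi * 0.05 * np.arange(1024))
b = np.sin(2 * np.pi * 0.05 * (np.arange(1024) * 1.02))  # slightly time-warped copy
print("descriptor distance:", np.linalg.norm(scatter1(a) - scatter1(b)).round(4))
```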
Outcome Evaluation of a Community Center-Based Program for Mothers at High Psychosocial Risk
ERIC Educational Resources Information Center
Rodrigo, Maria Jose; Maiquez, Maria Luisa; Correa, Ana Delia; Martin, Juan Carlos; Rodriguez, Guacimara
2006-01-01
Objective: This study reported the outcome evaluation of the "Apoyo Personal y Familiar" (APF) program for poorly-educated mothers from multi-problem families, showing inadequate behavior with their children. APF is a community-based multi-site program delivered through weekly group meetings in municipal resource centers. Method: A total…
Ren, Jingzheng
2018-01-01
The objective of this study is to develop a generic multi-attribute decision analysis framework for ranking technologies for ballast water treatment and determining their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies. The Best-Worst method, a subjective weighting method, and the CRiteria Importance Through Inter-criteria Correlation (CRITIC) method, an objective weighting method, were combined to determine the weights of the evaluation criteria. Extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case comprising four technologies for ballast water treatment, i.e. Alfa Laval (T1), Hyde (T2), Unitor (T3), and NaOH (T4), was studied with the proposed method, and Hyde (T2) was recognized as the best technology. Sensitivity analysis was also carried out to investigate the effects of the combination coefficients and the weights of the evaluation criteria on the final priority order of the four technologies. The weighted sum method and TOPSIS were also employed to rank the four technologies, and the results of these two methods are consistent with those of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
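For reference, a brief sketch of the CRITIC objective weighting named above; the normalized decision matrix is invented, and the combination with Best-Worst weights is omitted.

```python
# CRITIC weighting: contrast (std) times conflict (1 - correlation) per criterion.
import numpy as np

X = np.array([[0.8, 0.6, 0.9, 0.5],   # treatment technologies x criteria,
              [0.9, 0.7, 0.6, 0.7],   # already normalized to benefit form
              [0.6, 0.9, 0.7, 0.6],
              [0.7, 0.5, 0.8, 0.9]])

std = X.std(axis=0, ddof=1)
R = np.corrcoef(X, rowvar=False)       # criterion-criterion correlations
C = std * (1 - R).sum(axis=0)          # information content per criterion
w = C / C.sum()
print("CRITIC weights:", w.round(3))
```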
MRI Post-processing in Pre-surgical Evaluation
Wang, Z. Irene; Alexopoulos, Andreas V.
2016-01-01
Purpose of Review Advanced MRI post-processing techniques are increasingly used to complement visual analysis and elucidate structural epileptogenic lesions. This review summarizes recent developments in MRI post-processing in the context of epilepsy pre-surgical evaluation, with a focus on patients whose MRI is unremarkable on visual analysis (i.e., "nonlesional" MRI). Recent Findings Various methods of MRI post-processing have been reported to add clinical value in the following areas: (1) lesion detection at the individual level; (2) lesion confirmation, reducing the risk of over-reading the MRI; (3) detection of sulcal/gyral morphologic changes that are particularly difficult for visual analysis; and (4) delineation of cortical abnormalities extending beyond the visible lesion. Future directions to improve the performance of MRI post-processing include using higher magnetic field strength for better signal- and contrast-to-noise ratios, adopting a multi-contrast framework, and integration with other noninvasive modalities. Summary MRI post-processing can provide essential value to increase the yield of structural MRI and should be included as part of the pre-surgical evaluation of nonlesional epilepsies. MRI post-processing allows for more accurate identification/delineation of cortical abnormalities, which can then be more confidently targeted and mapped. PMID:26900745
Témoin-Fardini, S; Servant, J; Sellam, S
2017-10-01
The aim of this study was to develop a test method to evaluate preservation efficacy for a specific product, a very high-alkaline liquid soap (pH around 10) made by a saponification process. Several manufacturers have experienced contamination issues with these high-pH soaps despite passing a classic preservative efficacy challenge test or even a multi-inoculation challenge test. Bacteria were isolated from contaminated soaps and identified using 16S rRNA gene sequencing. High-alkaline-pH unpreserved soaps were tested using the Thor Personal Care internal multichallenge test method (TM206) with classical microorganisms and then with the bacterial strains isolated from various contaminated soaps (TM768). Preservatives were added to these soaps and assessed for their efficacy using the newly developed test. Four different species of bacteria (Nesterenkonia lacusekhoensis, Dermacoccus sp., Halomonas sp. and Roseomonas sp.) were identified by sequencing among the contaminants of the various soaps tested. Among these, only one bacterial species, Nesterenkonia lacusekhoensis, appeared to be responsible for the specific contamination of these high-alkaline soaps. Thus, one specific wild-type strain of Nesterenkonia lacusekhoensis, named strain 768, was used in a new multi-inoculation test (TM768). Unlike the single-inoculation challenge test, the multi-inoculation test using Nesterenkonia strain 768 was able to predict the sensitivity of a product towards this bacterium. Among the 27 different preservatives tested, 10 were able to protect the formula against contamination with this bacterium. This study enabled the development of a test method to evaluate preservation efficacy using a specific bacterium, Nesterenkonia lacusekhoensis, responsible for the contamination of very alkaline soaps made by saponification, and the identification of an appropriate preservative system. © 2017 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
Brooks, Robin; Thorpe, Richard; Wilson, John
2004-11-11
A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms, and hence in alarm annunciation rates, in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework, so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral to the method is a new-format process operator display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for investments about to be made, or already made, in process historian systems. Field trials of the new geometric process control (GPC) method have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, to improve the quality of both process operations and product by providing process alarms and alerts of much higher quality than before. The paper describes the methods used, including a simple visual method for alarm rationalisation that quickly delivers large sets of consistent alarm limits, and the extension to full alert management, with highlights from the field trials indicating the overall effectiveness of the method in practice.
ERIC Educational Resources Information Center
Lin, Teng-Chiao; Ho, Hui-Ping; Chang, Ching-Ter
2014-01-01
With the widespread use of the Internet, adopting e-learning systems in courses has gradually become more and more important in universities in Taiwan. However, because of limitations of teachers' time, selecting suitable online IT tools has become very important. This study proposes an analytic hierarchy process (AHP)-multi-choice goal…
Consistent linguistic fuzzy preference relations method with ranking fuzzy numbers
NASA Astrophysics Data System (ADS)
Ridzuan, Siti Amnah Mohd; Mohamad, Daud; Kamis, Nor Hanimah
2014-12-01
Multi-Criteria Decision Making (MCDM) methods have been developed to help decision makers select the best criteria or alternatives from the options given. One of the well-known methods in MCDM is the Consistent Fuzzy Preference Relation (CFPR) method, which essentially utilizes a pairwise comparison approach. This method was later improved to cater for subjectivity in the data by using fuzzy sets, and became known as the Consistent Linguistic Fuzzy Preference Relations (CLFPR) method. The CLFPR method uses the additive transitivity property in the evaluation of pairwise comparison matrices. However, the calculation involved is lengthy and cumbersome. To overcome this problem, a method of defuzzification was introduced by researchers. Nevertheless, defuzzification has a major setback: some information may be lost due to the simplification. In this paper, we propose a CLFPR method that preserves the fuzzy-number form throughout the process. To obtain the desired ordering result, a method of ranking fuzzy numbers is utilized in the procedure. This improved CLFPR procedure is applied to a case study to verify its effectiveness. The method is useful for solving decision-making problems and can be applied in many areas.
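As a simple illustration of the kind of fuzzy-number ranking used in the final step, the sketch below ranks triangular fuzzy scores by their centroids; the paper's actual ranking method may differ, and the scores are invented.

```python
# Centroid ranking of triangular fuzzy numbers (a, b, c); values are illustrative.
def centroid(tfn):
    a, b, c = tfn
    return (a + b + c) / 3.0   # x-coordinate of the triangle's centroid

scores = {"A1": (0.2, 0.5, 0.8), "A2": (0.4, 0.6, 0.7), "A3": (0.1, 0.3, 0.9)}
ranked = sorted(scores, key=lambda k: centroid(scores[k]), reverse=True)
print("ordering (best first):", ranked)   # -> ['A2', 'A1', 'A3']
```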
NASA Astrophysics Data System (ADS)
Sorokin, V. A.; Volkov, Yu V.; Sherstneva, A. I.; Botygin, I. A.
2016-11-01
This paper overviews a method of generating climate regions based on analytic signal theory. When applied to atmospheric surface layer temperature data sets, the method forms climatic structures from the corresponding temperature changes, supports conclusions on the uniformity of climate in an area, and traces climate changes over time by analyzing shifts between type groups. The algorithm rests on the fact that the frequency spectrum of the thermal oscillation process is narrow-band and has only one mode for most weather stations. This permits the use of analytic signal theory and causality conditions, and the introduction of an oscillation phase. The annual component of the phase, being a linear function, was removed by the least squares method. The remaining phase fluctuations can then be studied for coordinated behavior and timing, using the Pearson correlation coefficient to evaluate dependence. This study includes program experiments to evaluate the calculation efficiency of the phase grouping task. The paper also overviews some single-threaded and multi-threaded computing models. It is shown that the phase grouping algorithm for meteorological data can be parallelized and that a multi-threaded implementation leads to a 25-30% increase in performance.
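The analytic-signal steps described here can be sketched with SciPy: build the analytic signal of a temperature series via the Hilbert transform, unwrap its phase, remove the linear annual component by least squares, and correlate the residual fluctuations between stations. The synthetic series below is illustrative.

```python
# Analytic-signal phase extraction and detrending; synthetic temperature data.
import numpy as np
from scipy.signal import hilbert

t = np.arange(3650)                                  # daily samples, ~10 years
temp = 10 * np.sin(2 * np.pi * t / 365.25) + np.random.default_rng(1).normal(0, 1, t.size)

phase = np.unwrap(np.angle(hilbert(temp - temp.mean())))
trend = np.polyval(np.polyfit(t, phase, 1), t)       # linear (annual) component
residual = phase - trend                             # phase fluctuations

# Station pairs would then be grouped by the Pearson correlation of residuals,
# e.g. np.corrcoef(residual_a, residual_b)[0, 1].
print("residual phase std (rad):", residual.std().round(3))
```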
Multi-core processing and scheduling performance in CMS
NASA Astrophysics Data System (ADS)
Hernández, J. M.; Evans, D.; Foulkes, S.
2012-12-01
Commodity hardware is going many-core. We might soon not be able to satisfy the per-core job memory needs of the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to utilize the multi-core architecture effectively. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation per job. The experiment job management system needs control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues compared to standard single-core processing workflows.
A Multi Agent Based Approach for Prehospital Emergency Management
Safdari, Reza; Shoshtarian Malak, Jaleh; Mohammadzadeh, Niloofar; Danesh Shahraki, Azimeh
2017-01-01
Objective: To demonstrate an architecture that automates the prehospital emergency process and categorizes specialized care according to the situation at the right time, reducing patient mortality and morbidity. Methods: The prehospital emergency process was analyzed using existing prehospital management systems and frameworks, and the extracted process was modeled using a sequence diagram in Rational Rose software. The main system agents were identified and modeled via a component diagram, considering the main system actors and logically dividing business functionalities; finally, a conceptual architecture for prehospital emergency management was proposed. The proposed architecture was simulated using AnyLogic simulation software, employing its agent model, state chart and process model facilities. Results: Multi-agent systems (MAS) have had great success in distributed, complex and dynamic problem-solving environments, and utilizing autonomous agents provides intelligent decision-making capabilities. The proposed architecture presents prehospital management operations. The main identified agents are: EMS center, ambulance, traffic station, healthcare provider, patient, consultation center, national medical record system and a quality-of-service monitoring agent. Conclusion: In a critical situation like a prehospital emergency, we are coping with sophisticated processes such as ambulance navigation, healthcare provider and service assignment, consultation, recalling patients' past medical history through a centralized EHR system, and monitoring healthcare quality in real time. The main advantage of our work has been the utilization of a multi-agent system. Our future work will include implementing the proposed architecture and evaluating its impact on improving patient care quality. PMID:28795061
NASA Astrophysics Data System (ADS)
Akinlalu, A. A.; Adegbuyiro, A.; Adiat, K. A. N.; Akeredolu, B. E.; Lateef, W. Y.
2017-06-01
The groundwater potential of the Oke-Ana area, southwestern Nigeria, has been evaluated by integrating the electrical resistivity method, remote sensing and geographic information systems. The effect of five hydrogeological indices, namely lineament density, drainage density, lithology, overburden thickness and aquifer layer resistivity, on groundwater occurrence was established. A multi-criteria decision analysis technique was employed to assign a weight to each index using the concept of the analytic hierarchy process. The assigned weights were normalized and a consistency ratio was established. To evaluate the groundwater potential of Oke-Ana, sixty-seven (67) vertical electrical sounding points were occupied. Ten curve types were delineated in the study area, varying from simple three-layer A- and H-type curves to the more complex four-, five- and six-layer AA, HA, KH, QH, AKH, HKH, KHA and KHKH curves. Four subsurface geo-electric sequences of topsoil, weathered layer, partially weathered/fractured basement and fresh basement were delineated in the area. The analytical process assisted in classifying Oke-Ana into low, medium and high groundwater potential zones. Validation of the model against well information and two aborted boreholes suggests 70% agreement.
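The AHP weighting-with-consistency step mentioned above, in brief; the 3x3 pairwise comparison matrix is illustrative (the study weighted five indices), using Saaty's standard eigenvector weights and consistency ratio.

```python
# AHP priority weights from a pairwise comparison matrix, with consistency check.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # pairwise importance of three indices (illustrative)
              [1/3., 1.0, 3.0],
              [1/5., 1/3., 1.0]])
vals, vecs = np.linalg.eig(A)
k = vals.real.argmax()             # principal eigenvalue
w = vecs[:, k].real
w = w / w.sum()                    # normalized priority weights

n = len(A)
CI = (vals.real[k] - n) / (n - 1)  # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
print("weights:", w.round(3), "CR:", round(CI / RI, 3))  # CR < 0.1 is acceptable
```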
NASA Astrophysics Data System (ADS)
Noordmans, Herke J.; Rutten, G. J. M.; Willems, Peter W. A.; Viergever, Max A.
2000-04-01
The visualization of brain vessels on the cortex helps the neurosurgeon in two ways: to avoid blood vessels when specifying the trepanation entry, and to overcome errors in the surgical navigation system due to brain shift. We compared 3D T1 MR, 3D T1 MR with gadolinium contrast, and MR venography as scanning techniques, mutual information as the registration technique, and thresholding and multi-vessel enhancement as image processing techniques. We evaluated the volume-rendered results based on their quality and correspondence with photos taken during surgery. It appears that with 3D T1 MR scans, gadolinium is required to show cortical veins. The visibility of small cortical veins is strongly enhanced by subtracting a 3D T1 MR baseline scan, which should be registered to the scan with gadolinium contrast even when the scans are made during the same session. Multi-vessel enhancement helps to clarify the view of small vessels by reducing the noise level, but strikingly does not reveal more vessels. MR venography does show intracerebral veins in high detail, but is, as is, unsuited to showing cortical veins due to the low contrast with CSF.
Seghezzo, Lucas; Venencia, Cristian; Buliubasich, E Catalina; Iribarnegaray, Martín A; Volante, José N
2017-02-01
Conflicts over land use and ownership are common in South America and generate frequent confrontations among indigenous peoples, small-scale farmers, and large-scale agricultural producers. We argue in this paper that an accurate identification of these conflicts, together with a participatory evaluation of their importance, will increase the social legitimacy of land use planning processes, rendering decision-making more sustainable in the long term. We describe here a participatory, multi-criteria conflict assessment model developed to identify, locate, and categorize land tenure and use conflicts. The model was applied to the case of the "Chaco" region of the province of Salta, in northwestern Argentina. Basic geographic, cadastral, and social information needed to apply the model was made spatially explicit on a Geographic Information System. Results illustrate the contrasting perceptions of different stakeholders (government officials, social and environmental non-governmental organizations, large-scale agricultural producers, and scholars) on the intensity of land use conflicts in the study area. These results can help better understand and address land tenure conflicts in areas with different cultures and conflicting social and environmental interests.
NASA Astrophysics Data System (ADS)
Long, Nicholas James
This thesis develops a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk, which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is first simplified and decomposed into its essential components. Three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. A set of methods is then developed to evaluate each dimension separately. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating in cybernetic theory, is suggested for interpreting complexity in variety. Finally, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity, using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis, together with the aggregation technique, is demonstrated using notional data for four lunar oxygen production processes.
Huynh, Alexis K; Hamilton, Alison B; Farmer, Melissa M; Bean-Mayberry, Bevanne; Stirman, Shannon Wiltsey; Moin, Tannaz; Finley, Erin P
2018-01-01
Greater specification of implementation strategies is a challenge for implementation science, but there is little guidance for delineating the use of multiple strategies involved in complex interventions. The Cardiovascular (CV) Toolkit project entails implementation of a toolkit designed to reduce CV risk by increasing women's engagement in appropriate services. The CV Toolkit project follows an enhanced version of Replicating Effective Programs (REP), an evidence-based implementation strategy, to implement the CV Toolkit across four phases: pre-conditions, pre-implementation, implementation, and maintenance and evolution. Our current objective is to describe a method for mapping implementation strategies used in real time as part of the CV Toolkit project. This method supports description of the timing and content of bundled strategies and provides a structured process for developing a plan for implementation evaluation. We conducted a process of strategy mapping to apply Proctor and colleagues' rubric for specification of implementation strategies, constructing a matrix in which we identified each implementation strategy, its conceptual group, and the corresponding REP phase(s) in which it occurs. For each strategy, we also specified the actors involved, actions undertaken, action targets, dose of the implementation strategy, and anticipated outcome addressed. We iteratively refined the matrix with the implementation team, including use of simulation to provide initial validation. Mapping revealed patterns in the timing of implementation strategies within REP phases. Most implementation strategies involving the development of stakeholder interrelationships and training and educating stakeholders were introduced during the pre-conditions or pre-implementation phases. Strategies introduced in the maintenance and evolution phase emphasized communication, re-examination, and audit and feedback. In addition to its value for producing valid and reliable process evaluation data, mapping implementation strategies has informed development of a pragmatic blueprint for implementation and longitudinal analyses and evaluation activities. We update recent recommendations on specification of implementation strategies by considering the implications for multi-strategy frameworks and propose an approach for mapping the use of implementation strategies within complex, multi-level interventions, in support of rigorous evaluation. Developing pragmatic tools to aid in operationalizing the conduct of implementation and evaluation activities is essential to enacting sound implementation research.
Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer's Disease.
Cheng, Bo; Liu, Mingxia; Shen, Dinggang; Li, Zuoyong; Zhang, Daoqiang
2017-04-01
Recently, transfer learning has been successfully applied in the early diagnosis of Alzheimer's Disease (AD) based on multi-domain data. However, most existing methods use data from only a single auxiliary domain, and thus cannot exploit the useful correlation information intrinsic to multiple domains. Accordingly, in this paper, we consider the joint learning of tasks in multiple auxiliary domains and the target domain, and propose a novel Multi-Domain Transfer Learning (MDTL) framework for early diagnosis of AD. Specifically, the proposed MDTL framework consists of two key components: 1) a multi-domain transfer feature selection (MDTFS) model that selects the most informative feature subset from multi-domain data, and 2) a multi-domain transfer classification (MDTC) model that can identify disease status for early AD detection. We evaluate our method on 807 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline magnetic resonance imaging (MRI) data. The experimental results show that the proposed MDTL method can effectively utilize multi-auxiliary-domain data to improve learning performance in the target domain, compared with several state-of-the-art methods.
Le Troter, Arnaud; Fouré, Alexandre; Guye, Maxime; Confort-Gouny, Sylviane; Mattei, Jean-Pierre; Gondin, Julien; Salort-Campana, Emmanuelle; Bendahan, David
2016-04-01
Atlas-based segmentation is a powerful method for automatic structural segmentation of several sub-structures in many organs. However, such an approach has very rarely been used in the context of muscle segmentation, and so far no study has assessed such a method for the automatic delineation of the individual muscles of the quadriceps femoris (QF). In the present study, we evaluated a fully automated multi-atlas method and a semi-automated single-atlas method for the segmentation and volume quantification of the four muscles of the QF and of the QF as a whole. The study was conducted in 32 young healthy males, using high-resolution magnetic resonance images (MRI) of the thigh. The multi-atlas-based segmentation method was conducted in 25 subjects. Different non-linear registration approaches based on free-form deformation (FFD) and symmetric diffeomorphic normalization (SyN) algorithms were assessed. Optimal parameters of two fusion methods, i.e., STAPLE and STEPS, were determined on the basis of the highest Dice similarity index (DSI), considering manual segmentation (MSeg) as the ground truth. Validation and reproducibility of this pipeline were determined using another MRI dataset recorded in seven healthy male subjects, on the basis of additional metrics such as muscle volume similarity values, the intraclass correlation coefficient, and the coefficient of variation. Both non-linear registration methods (FFD and SyN) were also evaluated as part of a single-atlas strategy, in order to assess longitudinal muscle volume measurements. The multi- and single-atlas approaches were compared for the segmentation and volume quantification of the four muscles of the QF and of the QF as a whole. Considering each muscle of the QF, the DSI of the multi-atlas-based approach was high (0.87 ± 0.11), and the best results were obtained with the combination of two deformation fields resulting from the SyN registration method and the STEPS fusion algorithm. The optimal variables for the FFD and SyN registration methods were four templates and a kernel standard deviation ranging between 5 and 8. The segmentation process using the single-atlas-based method was more robust, with DSI values higher than 0.9. In terms of muscle volume measurements, the multi-atlas-based strategy provided acceptable results for the QF muscle as a whole but highly variable results for individual muscles. In contrast, the performance of the single-atlas-based pipeline for individual muscles was highly comparable to MSeg, indicating that this method would be adequate for longitudinal tracking of muscle volume changes in healthy subjects. In the present study, we demonstrated that both the multi-atlas and single-atlas approaches are relevant for the segmentation of the individual muscles of the QF in healthy subjects. Considering muscle volume measurements, the single-atlas method offers promising perspectives for longitudinal quantification of individual muscle volumes.
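For reference, the Dice similarity index (DSI) used throughout this evaluation is a standard overlap measure between two binary masks:

```python
# Dice similarity between an automatic and a manual mask; masks are synthetic.
import numpy as np

def dice(a, b):
    """Dice similarity between two binary masks (1 = perfect overlap)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

auto = np.zeros((64, 64), bool); auto[10:40, 12:42] = True
manual = np.zeros((64, 64), bool); manual[12:42, 12:40] = True
print("DSI:", round(dice(auto, manual), 3))   # ~0.9 for these toy masks
```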
Multi-energy x-ray detectors to improve air-cargo security
NASA Astrophysics Data System (ADS)
Paulus, Caroline; Moulin, Vincent; Perion, Didier; Radisson, Patrick; Verger, Loïck
2017-05-01
X-ray based systems have been used for decades to screen luggage or cargo to detect illicit materials. The advent of energy-sensitive photon-counting x-ray detectors, mainly based on Cd(Zn)Te semiconductor technology, improves discrimination between materials compared to single- or dual-energy technology. The presented work is part of the EUROSKY European project to develop a Single European Secure Air-Cargo Space. The cargo context implies relatively heavy objects with potentially high atomic numbers. The entire study is conducted on simulations of three different detectors: a typical dual-energy sandwich detector, a realistic model of the commercial ME100 multi-energy detector marketed by MULTIX, and an ME100 "Cargo", a not-yet-existing modified multi-energy version of the ME100 better suited to air-freight cargo inspection. First, a comparison of simulated measurements shows the performance improvement of the new multi-energy detectors over the current dual-energy one. The relative performances are evaluated according to different criteria of separability or contrast-to-noise ratio, and the impact of different parameters is studied (influence of channel number, type of materials and tube voltage). Second, the performance of multi-energy detectors for overlap processing in a dual-view system is assessed: the case of orthogonal projections has been studied, one view giving dimensional values, the other providing spectral data to assess the effective atomic number. A method of overlap correction has been proposed and extended to the multi-layer object case; calibration and processing based on bi-material decomposition have been adapted for this purpose.
Rehabilitation centers in change: participatory methods for managing redesign and renovation.
Lahtinen, Marjaana; Nenonen, Suvi; Rasila, Heidi; Lehtelä, Jouni; Ruohomäki, Virpi; Reijula, Kari
2014-01-01
The aim of this article is to describe a set of participatory methods that we have either developed or modified for developing future work and service environments to better suit renewable rehabilitation processes. We discuss the methods in the larger framework of a change process model and participatory design. Rehabilitation organizations are currently in transition; customer groups, financing, services, and the processes of rehabilitation centers are changing. The pressure for change challenges the centers to develop both their processes and facilities. There is a need for methods that support change management. Four participatory methods were developed: a future workshop, a change survey, a multi-method assessment tool, and participatory design generator cards. They were tested and evaluated in three rehabilitation centers at different phases of their change processes. The developed methods were considered useful in creating a mutual understanding of the change goals between different stakeholders, providing a good picture of the work community's attitudes toward the change, forming an integrated overview of the built and perceived environment, inspiring new solutions, and supporting the management in steering the change process. The change process model described in this article serves as a practical framework that combines the viewpoints of organizational and facility development. However, participatory design continues to face challenges concerning communication between different stakeholders, and further development of the methods and processes is still needed. Intervention studies could provide data on the success factors that enhance transformations in the rehabilitation sector. Keywords: design process, methodology, organizational transformation, planning, renovation.
Automatic digital surface model (DSM) generation from aerial imagery data
NASA Astrophysics Data System (ADS)
Zhou, Nan; Cao, Shixiang; He, Hongyan; Xing, Kun; Yue, Chunyu
2018-04-01
Aerial sensors are widely used to acquire imagery for photogrammetric and remote sensing applications. In general, the images have large overlapping regions, which provide abundant redundant geometric and radiometric information for matching. This paper presents a POS-supported dense matching procedure for automatic DSM generation from aerial imagery data. The method uses a coarse-to-fine hierarchical strategy with an effective combination of several image matching algorithms: image radiometric pre-processing, image pyramid generation, feature point extraction and grid point generation, multi-image geometrically constrained cross-correlation (MIG3C), global relaxation optimization, multi-image geometrically constrained least squares matching (MIGCLSM), TIN generation and point cloud filtering. The image radiometric pre-processing is used to reduce the effects of inherent radiometric problems and optimize the images. The presented approach essentially consists of three components: a feature point extraction and matching procedure, a grid point matching procedure and a relational matching procedure. The MIGCLSM method is used to achieve potentially sub-pixel accuracy matches and to identify inaccurate and possibly false matches. The feasibility of the method has been tested on aerial images of different scales with different land-cover types. The accuracy evaluation is based on a comparison between the automatically extracted DSMs derived from the precise exterior orientation parameters (EOPs) and those derived from the POS.
Kratochvíla, Jiří; Jiřík, Radovan; Bartoš, Michal; Standara, Michal; Starčuk, Zenon; Taxt, Torfinn
2016-03-01
One of the main challenges in quantitative dynamic contrast-enhanced (DCE) MRI is estimation of the arterial input function (AIF). Usually, the signal from a single artery (ignoring contrast dispersion, partial volume effects and flow artifacts) or a population average of such signals (which also ignores variability between patients) is used. Multi-channel blind deconvolution is an alternative approach avoiding most of these problems: the AIF is estimated directly from the measured tracer concentration curves in several tissues. This contribution extends the published methods of multi-channel blind deconvolution by applying a more realistic model of the impulse residue function, the distributed capillary adiabatic tissue homogeneity (DCATH) model. In addition, an alternative AIF model is used and several AIF-scaling methods are tested. The proposed method is evaluated on synthetic data with respect to the number of tissue regions and the signal-to-noise ratio. An initial evaluation on clinical data (renal cell carcinoma patients before and after the start of treatment) gave consistent results and indicates more reliable and less noise-sensitive perfusion parameter estimates. Blind multi-channel deconvolution using the DCATH model might be a method of choice for AIF estimation in a clinical setup. © 2015 Wiley Periodicals, Inc.
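For orientation, the sketch below shows the non-blind building block underlying such methods: with a known AIF, the tissue curve is modeled as C = Δt·(AIF ⊛ h) and the impulse residue function h is recovered by Tikhonov-regularized least squares. The blind multi-channel estimation and the DCATH parameterization of the paper are not reproduced; the regularization weight and discretization are illustrative assumptions.

```python
import numpy as np

def tikhonov_deconvolve(aif, tissue, dt, lam=0.1):
    """Estimate the impulse residue function h from C(t) = dt * (AIF * h)(t)
    via Tikhonov-regularized least squares (single-channel, known AIF)."""
    n = len(aif)
    # Lower-triangular Toeplitz matrix implementing discrete convolution
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(n)] for i in range(n)])
    # Solve (A^T A + lam^2 I) h = A^T C
    h = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ tissue)
    return h
```

Blind methods iterate a step like this across several tissue channels while also updating a parametric AIF model.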
Weighted Statistical Binning: Enabling Statistically Consistent Genome-Scale Phylogenetic Analyses
Bayzid, Md Shamsuzzoha; Mirarab, Siavash; Boussau, Bastien; Warnow, Tandy
2015-01-01
Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent under the multi-species coalescent model. New data used in this study are available at DOI: http://dx.doi.org/10.6084/m9.figshare.1411146, and the software is available at https://github.com/smirarab/binning. PMID:26086579
Zeng, Jinle; Chang, Baohua; Du, Dong; Wang, Li; Chang, Shuhe; Peng, Guodong; Wang, Wenzhu
2018-01-05
Multi-layer/multi-pass welding (MLMPW) technology is widely used in the energy industry to join thick components. During automatic welding using robots or other actuators, it is very important to recognize the actual weld pass position by visual methods; this information can then be used not only to perform reasonable path planning for the actuators, but also to correct any deviations between the welding torch and the weld pass position in real time. However, due to the small geometrical differences between adjacent weld passes, existing weld position recognition technologies such as structured light methods are not suitable for weld position detection in MLMPW. This paper proposes a novel method for weld position detection that fuses several kinds of visual information in MLMPW. First, a synchronous acquisition method is developed to obtain the different kinds of visual information while directional light and structured light sources are switched on, respectively. Then, interferences are eliminated by fusing adjacent images. Finally, the information from the directional and structured light images is fused to obtain the 3D positions of the weld passes. Experimental results show that each process can be completed in 30 ms and that the deviation is less than 0.6 mm. The proposed method can be used for automatic path planning and seam tracking in robotic MLMPW as well as in the electron beam freeform fabrication process.
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the devices and systems of an automatic rotor line, there is always a real probability that defective (incomplete) products enter the output process stream. Therefore, continuous sampling control of product completeness, based on statistical methods, remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A particular feature of continuous sampling control of multi-element product completeness during assembly is its destructive character: component parts inspected during sampling control cannot be returned to the process stream, which decreases the actual productivity of the assembly equipment. Therefore, applying statistical procedures for continuous sampling control of multi-element product completeness in assembly on automatic rotor lines requires sampling plans that ensure a minimum control sample size. A comparison of the limits of the average outgoing defect level for the continuous sampling plan (CSP-1) and for the automated continuous sampling plan (ACSP-1) shows that the ACSP-1 provides lower limit values for the average outgoing defect level. The average sample size when using the ACSP-1 plan is also smaller than when using the CSP-1 plan. Thus, applying statistical methods in the assembly quality management of multi-element products on automatic rotor lines, using the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and ensure the required quality level of assembled products while minimizing the sample size.
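The classical baseline behind such plans is Dodge's CSP-1, whose mechanics the following sketch simulates. The ACSP-1 modification evaluated in the paper is not publicly specified, so only standard CSP-1 logic is shown, with the clearance number i and sampling fraction f as illustrative parameters.

```python
import random

def simulate_csp1(stream, i=30, f=0.1, seed=0):
    """Dodge's CSP-1: 100% inspection until i consecutive conforming units,
    then inspect a random fraction f; any defect found reverts to 100%.
    `stream` is an iterable of booleans (True = defective).
    Returns (number inspected, defects passed uninspected)."""
    rng = random.Random(seed)
    consecutive, sampling = 0, False
    inspected = passed_defects = 0
    for defective in stream:
        inspect = True if not sampling else (rng.random() < f)
        if inspect:
            inspected += 1
            if defective:
                sampling, consecutive = False, 0  # back to 100% inspection
            else:
                consecutive += 1
                if not sampling and consecutive >= i:
                    sampling = True               # switch to fraction f
        elif defective:
            passed_defects += 1                   # slipped through uninspected
    return inspected, passed_defects

# e.g. a 2%-defective stream of 10000 units:
print(simulate_csp1(random.random() < 0.02 for _ in range(10000)))
```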
Fabrication of tungsten wire reinforced nickel-base alloy composites
NASA Technical Reports Server (NTRS)
Brentnall, W. D.; Toth, I. J.
1974-01-01
Fabrication methods for tungsten fiber reinforced nickel-base superalloy composites were investigated. Three matrix alloys in pre-alloyed powder or rolled sheet form were evaluated in terms of fabricability into composite monotape and multi-ply forms. The utility of monotapes for fabricating more complex shapes was demonstrated. Preliminary 1093 C (2000 F) stress rupture tests indicated that efficient utilization of fiber strength was achieved in composites fabricated by diffusion bonding processes. The fabrication of thermal fatigue specimens is also described.
FVCOM one-way and two-way nesting using ESMF: Development and validation
NASA Astrophysics Data System (ADS)
Qi, Jianhua; Chen, Changsheng; Beardsley, Robert C.
2018-04-01
Built on the Earth System Modeling Framework (ESMF), one-way and two-way nesting methods were implemented in the unstructured-grid Finite-Volume Community Ocean Model (FVCOM). These methods support unstructured-grid multi-domain nesting in FVCOM with the aim of resolving multi-scale physical and ecosystem processes. The procedures for implementing FVCOM within ESMF are described in detail. Experiments were conducted to validate and evaluate the performance of the nested-grid FVCOM system. The first, a wave-current interaction case with two-domain nesting, emphasized quantifying the critical need for nesting to resolve high-resolution features near the coast and harbor with little loss in computational efficiency. The second, a set of pseudo river plume cases, examined the differences in model-simulated salinity between the one-way and two-way nesting approaches and evaluated the performance of the mass-conservative two-way nesting method. The third, a river plume case in the realistic geometric domain of Mass Bay, supported the importance of two-way nesting for integrated coastal-estuarine modeling. The nesting method described in this paper has been used in the Northeast Coastal Ocean Forecast System (NECOFS), a global-regional-coastal nesting FVCOM system that has been in end-to-end forecast and hindcast operations since 2007.
Parental Depressive Symptoms and Children's Sleep: The Role of Family Conflict
ERIC Educational Resources Information Center
El-Sheikh, Mona; Kelly, Ryan J.; Bagley, Erika J.; Wetter, Emily K.
2012-01-01
Background: We used a multi-method and multi-informant design to identify developmental pathways through which parental depressive symptoms contribute to children's sleep problems. Environmental factors including adult inter-partner conflict and parent-child conflict were considered as process variables of this relation. Methods: An ethnically and…
NASA Astrophysics Data System (ADS)
Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.
2015-12-01
Accurate and reliable streamflow prediction is essential to mitigate the social and economic damage caused by water-related disasters such as floods and droughts. Sequential data assimilation (DA) may facilitate improved streamflow prediction by using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored, mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. However, if parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of the model ensemble may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate the impacts of streamflow data assimilation over European river basins. In particular, a multi-parametric ensemble approach is tested to account for the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach can be more stable with limited ensemble members and has potential for operational use. To account for the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will focus on the gains and limitations of streamflow data assimilation and the multi-parametric ensemble method over large-scale basins.
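The core of such a multi-parametric ensemble can be illustrated with a toy bootstrap particle filter in which each particle carries both a state and a parameter value, so parametric uncertainty survives resampling. The linear-reservoir model, forcing value, observation error and all names below are illustrative assumptions, not the mHM/MPR setup, and the lagged formulation is omitted for brevity.

```python
import numpy as np

def bootstrap_pf_step(states, params, obs, rng, obs_std=0.5):
    """One assimilation step of a multi-parametric bootstrap particle filter
    on a toy linear-reservoir model: S' = S + P - k*S, Q = k*S.
    `params` holds one recession coefficient k per particle."""
    precip = 1.0                          # forcing for this step (toy value)
    states = states + precip - params * states
    sim_q = params * states               # simulated streamflow per particle
    # Gaussian likelihood of the observed streamflow
    w = np.exp(-0.5 * ((obs - sim_q) / obs_std) ** 2)
    w /= w.sum()
    idx = rng.choice(len(states), size=len(states), p=w)  # resample
    return states[idx], params[idx]       # parameters resampled with states

rng = np.random.default_rng(42)
n = 500
states = rng.uniform(0.0, 5.0, n)
params = rng.uniform(0.1, 0.9, n)        # parametric uncertainty in k
states, params = bootstrap_pf_step(states, params, obs=1.2, rng=rng)
```

Resampling parameters jointly with states is what keeps the ensemble stable without augmenting the parameter vector inside the assimilation window.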
Mentorship in Practice: A Multi-Method Approach.
ERIC Educational Resources Information Center
Schreck, Timothy J.; And Others
This study was conducted to evaluate a field-based mentorship program using a multi-method approach. It explored the use of mentorship as practiced in the Florida Compact, a business education partnership established in Florida in 1987. The study was designed to identify differences between mentors and mentorees, as well as differences within…
Li, Lian-Hui; Mo, Rong
2015-01-01
The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Manual, qualitative queue optimization performs poorly and is difficult to apply. A production task queue optimization method is therefore proposed based on multi-attribute evaluation. According to the task attributes, a hierarchical multi-attribute model is established and indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judges' importance degrees, and a trapezoid fuzzy scale-rough AHP incorporating these importance degrees is put forward. The balanced weight, which integrates the objective and subjective weights, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing Euclidean distance with relative entropy distance, is used to sequence the tasks and optimize the queue by the weighted indicator values. A case study illustrates the method's correctness and feasibility. PMID:26414758
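A minimal sketch of the ranking step is given below: standard TOPSIS with the Euclidean distance swapped for a relative-entropy distance, as the abstract describes. The symmetrized KL form, the benefit-criteria assumption and the pre-computed weight vector w are our assumptions; the paper's exact entropy expression and its CRITIC/fuzzy-AHP weighting are not reproduced.

```python
import numpy as np

def topsis_relative_entropy(X, w, eps=1e-12):
    """Rank alternatives (rows of X, benefit criteria) by TOPSIS, replacing
    Euclidean distance with a relative-entropy (KL-type) distance."""
    V = X / X.sum(axis=0) * w             # normalized, weighted matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)

    def rel_entropy(a, b):
        a, b = a + eps, b + eps           # avoid log(0)
        return np.sum(a * np.log(a / b) + b * np.log(b / a))  # symmetrized

    d_pos = np.array([rel_entropy(ideal, v) for v in V])
    d_neg = np.array([rel_entropy(anti, v) for v in V])
    closeness = d_neg / (d_pos + d_neg)   # higher = closer to the ideal
    return np.argsort(-closeness), closeness
```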
NASA Astrophysics Data System (ADS)
Walther, Christian; Frei, Michaela
2017-04-01
Mining of so-called "conflict minerals" is often related with small-scale mining activities. The here discussed activities are located in forested areas in the eastern DRC, which are often remote, difficult to access and insecure for traditional geological field inspection. In order to accelerate their CTC (Certified Trading Chain)-certification process, remote sensing data are used for detection and monitoring of these small-scale mining operations. This requires a high image acquisition frequency due to mining site relocations and for compensation of year-round high cloud coverage, especially for optical data evaluation. Freely available medium resolution optical data of Sentinel-2 and Landsat-8 as well as SAR data of Sentinel-1 are used for detecting small mining targets with a minimum size of approximately 0.5 km2. The developed method enables a robust multi-temporal detection of mining sites, monitoring of mining site spatio-temporal relocations and environmental changes. Since qualitative and quantitative comparable results are generated, the followed change detection approach is objective and transparent and may push the certification process forward.
Multi-analyte profiling of inflammatory mediators in COPD sputum--the effects of processing.
Pedersen, Frauke; Holz, Olaf; Lauer, Gereon; Quintini, Gianluca; Kiwull-Schöne, Heidrun; Kirsten, Anne-Marie; Magnussen, Helgo; Rabe, Klaus F; Goldmann, Torsten; Watz, Henrik
2015-02-01
Prior to using a new multi-analyte platform for the detection of markers in sputum, it is advisable to assess whether sputum processing, especially mucus homogenization by dithiothreitol (DTT), affects the analysis. In this study we tested a novel Human Inflammation Multi Analyte Profiling® Kit (v1.0 Luminex platform; xMAP®). Induced sputum samples from 20 patients with stable COPD (mean FEV1, 59.2% pred.) were processed in parallel using standard processing (with DTT) and a more time-consuming sputum dispersion method with phosphate-buffered saline (PBS) only. A panel of 47 markers was analyzed in these sputum supernatants by the xMAP®. Twenty-five of the 47 analytes were detected in COPD sputum. Interestingly, seven markers were detected only in sputum processed with DTT, or showed significantly higher levels following DTT treatment (VDBP, α-2-macroglobulin, haptoglobin, α-1-antitrypsin, VCAM-1, and fibrinogen). However, standard DTT processing resulted in lower detectable concentrations of ferritin, TIMP-1, MCP-1, MIP-1β, ICAM-1, and complement C3. The correlation between processing methods for the different markers indicates that DTT processing does not introduce a bias by affecting individual sputum samples differently. In conclusion, our data demonstrate that the Luminex-based xMAP® panel can be used for multi-analyte profiling of COPD sputum using the routinely applied method of sputum processing with DTT. However, researchers need to be aware that the absolute concentrations of selected inflammatory markers can be affected by DTT. Copyright © 2014 Elsevier Ltd. All rights reserved.
Multi-scale image segmentation method with visual saliency constraints and its application
NASA Astrophysics Data System (ADS)
Chen, Yan; Yu, Jie; Sun, Kaimin
2018-03-01
Object-based image analysis methods have many advantages over pixel-based methods, making them a current research hotspot. Obtaining image objects by multi-scale image segmentation is an essential prerequisite for object-based image analysis. The currently popular image segmentation methods mainly share the bottom-up segmentation principle, which is simple to realize and yields accurate object boundaries. However, the macro statistical characteristics of image areas are difficult to take into account, and fragmented (over-segmented) results are difficult to avoid. In addition, in applications such as information extraction and target recognition, image targets are not equally important: some specific targets or target groups with particular features deserve more attention than others. To avoid over-segmentation and highlight the targets of interest, this paper proposes a multi-scale image segmentation method with visual saliency constraints. Visual saliency theory and a typical feature extraction method are adopted to obtain the visual saliency information, especially the macroscopic information to be analyzed. The visual saliency information is used as a distribution map of homogeneity weights, where each pixel is given a weight. This weight acts as one of the merging constraints in the multi-scale image segmentation. As a result, pixels that macroscopically belong to the same object but are locally different are more likely to be assigned to the same object. In addition, due to the constraint of the visual saliency model, the balance between local and macroscopic characteristics can be well controlled during the segmentation process for different objects. These controls improve the completeness of visually salient areas in the segmentation results while diluting the effect for non-salient background areas. Experiments show that this method works better for texture image segmentation than traditional multi-scale image segmentation methods, and gives priority control to the salient objects of interest. The method has been used in image quality evaluation, scattered residential area extraction, sparse forest extraction and other applications to verify its validity. All applications showed good results.
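One plausible reading of the saliency constraint is sketched below: a region-merging cost equal to the increase in heterogeneity, scaled down where mean saliency is high so that salient pixels merge into complete objects more readily. The paper's exact weighting scheme is not public, so the cost formula and the 0.5 scaling factor are assumptions for illustration.

```python
import numpy as np

def merge_cost(vals_a, vals_b, sal_a, sal_b):
    """Heterogeneity increase caused by merging two regions, scaled by the
    regions' mean visual saliency so salient areas resist fragmentation less.
    vals_* are pixel intensities, sal_* per-pixel saliency weights in [0, 1]."""
    merged = np.concatenate([vals_a, vals_b])
    dh = (len(merged) * merged.std()
          - len(vals_a) * vals_a.std()
          - len(vals_b) * vals_b.std())      # multiresolution-style criterion
    w = (np.mean(sal_a) + np.mean(sal_b)) / 2.0
    return dh * (1.0 - 0.5 * w)              # high saliency lowers the barrier
```

In a bottom-up segmentation loop, the pair with the lowest cost is merged first, so salient regions grow toward complete objects before the background does.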
How to Measure Coach Burnout: An Evaluation of Three Burnout Measures
ERIC Educational Resources Information Center
Lundkvist, Erik; Stenling, Andreas; Gustafsson, Henrik; Hassmén, Peter
2014-01-01
Although coach burnout has been studied for 30 years, what measure to use in this context has not yet been problematized. This study focuses on evaluating convergent and discriminant validity of three coach burnout measures by using multi-trait/multi-method analysis (CT-C[M-1]) model. We choose Maslach Burnout Inventory (MBI), the two dimensional…
A Sociotechnical Approach to Evaluating the Impact of ICT on Clinical Care Environments
Li, Julie
2010-01-01
Introduction: Process-supporting information technology holds the potential to increase efficiency, reduce errors, and alter professional roles and responsibilities in a manner that allows improvement in the delivery of patient care. However, clashes between the model of health care work inscribed in these tools and the actual nature of work have resulted in staff resistance and decreased organisational uptake of ICT, as well as unexpected and negative effects on efficiency and patient safety. Sociotechnical theory provides a paradigm against which the workflow and diffusion of ICT in healthcare can be better explored and understood. Design: This paper will conceptualise a formative, multi-method longitudinal evaluation process to explore the impact of ICT with an appreciation of the relationship between the social and technical systems within a clinical department. Method: Departmental culture, including clinical work processes and communication patterns, will be thoroughly explored before system implementation using both quantitative and qualitative research methods. Findings will be compared with post-implementation data, which will incorporate measurement of safety and workflow efficiency indicators. Discussion: Sociotechnical and multi-method approaches to evaluation do not exist without criticism. Inherent in the protocol are limitations of sociotechnical theory and criticisms of the multi-method approach; testing the methodology in real clinical settings will serve to verify efficacy and refine the process. PMID:21594005
Simulation analysis of impulse characteristics of space debris irradiated by multi-pulse laser
NASA Astrophysics Data System (ADS)
Lin, Zhengguo; Jin, Xing; Chang, Hao; You, Xiangyu
2018-02-01
Cleaning space debris with lasers is a hot topic in the field of space security research. Impulse characteristics are the basis of laser debris removal. In order to study the impulse characteristics of rotating irregular space debris irradiated by a multi-pulse laser, an impulse calculation method for rotating space debris under multi-pulse irradiation is established based on the area matrix method. The calculation of impulse and impulsive moment under multi-pulse irradiation is given, and the calculation process for the total impulse is analyzed. Taking a typical non-planar debris object (a cube) as an example, the impulse characteristics of space debris irradiated by a multi-pulse laser are simulated and analyzed. The effects of initial angular velocity, spot size and pulse frequency on the impulse characteristics are investigated.
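The per-pulse bookkeeping in such a simulation can be sketched as follows: every facet facing the beam receives an ablation impulse proportional to a coupling coefficient and its projected area, and the contributions are summed into a net impulse and impulsive moment. Self-shadowing between facets and the paper's area matrix construction are omitted; the coupling coefficient cm, the facet arrays and the fluence value are illustrative assumptions.

```python
import numpy as np

def pulse_impulse(normals, areas, centers, beam_dir, fluence, cm=30e-6):
    """Impulse vector and impulsive moment from one laser pulse on a faceted
    body. A facet is illuminated if it faces the beam; each contributes
    J = cm * fluence * projected_area along its inward normal.
    cm is an assumed impulse coupling coefficient in N*s/J."""
    J = np.zeros(3)
    M = np.zeros(3)
    for n, a, c in zip(normals, areas, centers):
        cos_t = -np.dot(n, beam_dir)              # > 0 if facing the beam
        if cos_t > 0:
            dj = cm * fluence * a * cos_t * (-n)  # recoil along inward normal
            J += dj
            M += np.cross(c, dj)                  # moment about body origin
    return J, M
```

Summing these per-pulse contributions while advancing the body's rotation between pulses yields the total impulse over a pulse train.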
NASA Astrophysics Data System (ADS)
Liu, Shuai; Chen, Ge; Yao, Shifeng; Tian, Fenglin; Liu, Wei
2017-07-01
This paper presents a novel integrated marine visualization framework focused on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. A global multi-resolution virtual terrain environment is also needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the processed data, several GPU-based visualization methods are explored to demonstrate marine data interactively. GPU-tessellated global terrain rendering using ETOPO1 data is realized, with video memory usage controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on the designed framework, an integrated visualization system is realized, and its effectiveness and efficiency are demonstrated. This system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.
Ocean Color Measurements from Landsat-8 OLI using SeaDAS
NASA Technical Reports Server (NTRS)
Franz, Bryan Alden; Bailey, Sean W.; Kuring, Norman; Werdell, P. Jeremy
2014-01-01
The Operational Land Imager (OLI) is a multi-spectral radiometer hosted on the recently launched Landsat-8 satellite. OLI includes a suite of relatively narrow spectral bands at 30-meter spatial resolution in the visible to shortwave infrared that make it a potential tool for ocean color radiometry: measurement of the reflected spectral radiance upwelling from beneath the ocean surface that carries information on the biogeochemical constituents of the upper ocean euphotic zone. To evaluate the potential of OLI to measure ocean color, processing support was implemented in SeaDAS, which is an open-source software package distributed by NASA for processing, analysis, and display of ocean remote sensing measurements from a variety of satellite-based multi-spectral radiometers. Here we describe the implementation of OLI processing capabilities within SeaDAS, including support for various methods of atmospheric correction to remove the effects of atmospheric scattering and absorption and retrieve the spectral remote-sensing reflectance (Rrs; sr^-1). The quality of the retrieved Rrs imagery will be assessed, as will the derived water column constituents such as the concentration of the phytoplankton pigment chlorophyll a.
Fast data reconstructed method of Fourier transform imaging spectrometer based on multi-core CPU
NASA Astrophysics Data System (ADS)
Yu, Chunchao; Du, Debiao; Xia, Zongze; Song, Li; Zheng, Weijian; Yan, Min; Lei, Zhenggang
2017-10-01
An imaging spectrometer acquires a two-dimensional spatial image and a one-dimensional spectrum at the same time, which makes it highly useful in color and spectral measurement, true-color image synthesis, military reconnaissance and other fields. In order to realize fast reconstruction of Fourier transform imaging spectrometer data, this paper presents an optimized reconstruction algorithm based on OpenMP parallel computing, which was further applied to the data of the HyperSpectral Imager on the Chinese 'HJ-1' satellite. The results show that the method based on multi-core parallel computing can make full use of the multi-core CPU hardware resources and significantly improve the efficiency of spectrum reconstruction processing. If the technique is applied to workstations with more cores, it should become possible to complete real-time processing of Fourier transform imaging spectrometer data on a single computer.
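The parallelization pattern is easy to sketch. The paper uses OpenMP (presumably in C/C++); the Python stand-in below expresses the same idea with a process pool, applying an apodization window and an FFT to every pixel's interferogram, one image row per worker. The cube dimensions, the Hanning window and the magnitude-spectrum choice are illustrative assumptions, not the paper's exact processing chain.

```python
import numpy as np
from multiprocessing import Pool

def reconstruct_row(interferogram_row):
    """Recover spectra for one image row: remove the mean, apodize, and take
    the magnitude FFT of each pixel's interferogram."""
    x = interferogram_row - interferogram_row.mean(axis=-1, keepdims=True)
    x *= np.hanning(x.shape[-1])             # apodization window
    return np.abs(np.fft.rfft(x, axis=-1))

if __name__ == "__main__":
    # cube: rows x cols x OPD samples (synthetic stand-in for real data)
    cube = np.random.rand(256, 256, 512).astype(np.float32)
    with Pool() as pool:                     # one worker per CPU core
        spectra = np.stack(pool.map(reconstruct_row, list(cube)))
    print(spectra.shape)                     # (256, 256, 257)
```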
Simultaneous multi-component seismic denoising and reconstruction via K-SVD
NASA Astrophysics Data System (ADS)
Hou, Sian; Zhang, Feng; Li, Xiangyang; Zhao, Qiang; Dai, Hengchang
2018-06-01
Data denoising and reconstruction play an increasingly significant role in seismic prospecting because of their value in enhancing effective signals, dealing with surface obstacles and reducing acquisition costs. In this paper, we propose a novel method to denoise and reconstruct multi-component seismic data simultaneously. The method lies within the framework of machine learning, and its key points are a suitable weight function and a modified inner product operator. The purpose of these two devices is to perform learning from data with missing traces when the random noise deviation is unknown, and to build a mathematical relationship between the components so as to incorporate all the information of the multi-component data. Two examples, using synthetic and real multi-component data, demonstrate that the new method is a feasible alternative for multi-component seismic data processing.
NASA Astrophysics Data System (ADS)
Zheng, Y.; Chen, J.
2017-09-01
A modified multi-objective particle swarm optimization method is proposed for obtaining Pareto-optimal solutions effectively. Unlike traditional multi-objective particle swarm optimization methods, it introduces Kriging meta-models and the trapezoid index and integrates them with the traditional algorithm. Kriging meta-models are built to approximate expensive or black-box functions. By applying Kriging meta-models, the number of function evaluations is decreased and the boundary Pareto-optimal solutions are identified rapidly. For bi-objective optimization problems, the trapezoid index is calculated as the sum of the areas of the trapezoids formed by the Pareto-optimal solutions and one objective axis. It can serve as a measure of whether the Pareto-optimal solutions have converged to the Pareto front. Illustrative examples indicate that the proposed method needs fewer function evaluations to obtain Pareto-optimal solutions than the traditional multi-objective particle swarm optimization method and the non-dominated sorting genetic algorithm II (NSGA-II), and that both accuracy and computational efficiency are improved. The proposed method is also applied to the design of a deepwater composite riser, in which the structural performance is calculated by numerical analysis. The design aim was to enhance the tensile strength and minimize the cost. Under the buckling constraint, the optimal trade-off between tensile strength and material volume is obtained. The results demonstrate that the proposed method can effectively deal with multi-objective optimization of black-box functions.
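The trapezoid index itself is just a trapezoidal-rule area, as the following sketch shows for a bi-objective Pareto set; monitoring its change between iterations gives the convergence measure the abstract mentions. The stopping tolerance is an assumption.

```python
import numpy as np

def trapezoid_index(pareto):
    """Sum of the trapezoid areas formed by consecutive bi-objective
    Pareto-optimal points and the f1 axis (points sorted by f1)."""
    p = pareto[np.argsort(pareto[:, 0])]
    df1 = np.diff(p[:, 0])                   # widths along the f1 axis
    mean_f2 = 0.5 * (p[:-1, 1] + p[1:, 1])   # mean heights of each trapezoid
    return np.sum(df1 * mean_f2)

# convergence check between iterations (tolerance assumed):
# if abs(ti_new - ti_old) / ti_old < 1e-3: the front is taken as converged
```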
Robotic tool positioning process using a multi-line off-axis laser triangulation sensor
NASA Astrophysics Data System (ADS)
Pinto, T. C.; Matos, G.
2018-03-01
Proper positioning of a friction stir welding head for pin insertion, driven by a closed chain robot, is important to ensure quality repair of cracks. A multi-line off-axis laser triangulation sensor was designed to be integrated to the robot, allowing relative measurements of the surface to be repaired. This work describes the sensor characteristics, its evaluation and the measurement process for tool positioning to a surface point of interest. The developed process uses a point of interest image and a measured point cloud to define the translation and rotation for tool positioning. Sensor evaluation and tests are described. Keywords: laser triangulation, 3D measurement, tool positioning, robotics.
Experimental evaluation of the impact of packet capturing tools for web services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee
Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the packet capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling are coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.
Morris, George K.; Steele, Carolyn D.; Wells, Joy G.
1972-01-01
We compared the relative advantages of using glass test tubes and plastic multi-well plates in the serological identification of Salmonella cultures by the Spicer-Edwards method, and we conclude that the advantages of multi-well plates outweigh those of test tubes. PMID:4640740
NASA Astrophysics Data System (ADS)
Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya
2017-10-01
Viscidity is an important physical indicator for assessing the fluidity of resin; good fluidity helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution during AFP has rarely been studied. In this paper, viscidities at different scales are analyzed based on a multi-scale analysis method. First, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for the microscopic analysis. Furthermore, the molecular structure of the micro-system is built by the molecular dynamics (MD) method, and viscosity curves are obtained by integrating the stress autocorrelation function (SACF) over time. Finally, the correlation of processing parameters to viscosity is revealed using the gray relational analysis method (GRAM). A group of processing parameters is identified that achieves stable viscosity and better resin fluidity.
Özkan, Aysun
2013-02-01
Healthcare waste should be managed carefully because of its infectious, pathological and similar content, especially in developing countries. The management systems applied must be the most appropriate solution from technical, environmental, economic and social points of view. The main objective of this study was to analyse the current status of healthcare waste management in Turkey and to investigate the most appropriate treatment/disposal option by using different decision-making techniques. For this purpose, five healthcare waste treatment/disposal alternatives - incineration, microwaving, on-site sterilization, off-site sterilization and landfill - were evaluated with two multi-criteria decision-making techniques: the analytic network process (ANP) and ELECTRE. Benefits, costs and risks of the alternatives were taken into consideration. Furthermore, the prioritization and ranking of the alternatives were determined and compared for both methods. According to the comparison, off-site sterilization was found to be the most appropriate solution in both cases.
NASA Astrophysics Data System (ADS)
Torres, V.; Quek, S.; Gaydecki, P.
2010-02-01
Aging and deterioration of the main functional parts of civil structures is one of the biggest problems faced nowadays by the private and governmental institutions dedicated to operating and maintaining such structures. In the case of relatively old suspension bridges, problems emerge due to corrosion and breakage of wires in the main cables. Decisive information and reliable monitoring and evaluation are highly relevant factors required to prevent significant or catastrophic damage to the structure and, more importantly, to people. The main challenge for NDE inspection methods arises in dealing with the steel wrapping barrier of the suspension cable, whose main function is to shield, shape and hold the bundles. The following work presents a study of a multi-magnetoresistive sensor system aimed at supporting the monitoring and evaluation of suspension cables at some of their stages. Modelling, signal acquisition, signal processing, experiments and the initial phases of implementation are presented and discussed in detail.
Generating description with multi-feature fusion and saliency maps of image
NASA Astrophysics Data System (ADS)
Liu, Lisha; Ding, Yuxuan; Tian, Chunna; Yuan, Bo
2018-04-01
Generating a description for an image can be regarded as visual understanding; it cuts across artificial intelligence, machine learning, natural language processing and many other areas. In this paper, we present a model that generates descriptions for images based on an RNN (recurrent neural network) with object attention and multiple image features. Deep recurrent neural networks have excellent performance in machine translation, so we use them to generate natural-sentence descriptions for images. The proposed method uses a single CNN (convolutional neural network) trained on ImageNet to extract image features. However, we argue that this cannot adequately capture the content of images, as it may focus only on the object areas. We therefore add scene information to the image features using a CNN trained on Places205. Experiments show that the model with multiple features extracted by the two CNNs performs better than the one with a single feature. In addition, we apply saliency weights to the images to emphasize their salient objects. We evaluate our model on MSCOCO using public metrics, and the results show that it performs better than several state-of-the-art methods.
Sobieranski, Antonio C; Inci, Fatih; Tekin, H Cumhur; Yuksekkaya, Mehmet; Comunello, Eros; Cobra, Daniel; von Wangenheim, Aldo; Demirci, Utkan
2017-01-01
In this paper, an irregular displacement-based lensless wide-field microscopy imaging platform is presented that combines digital in-line holography and computational pixel super-resolution using multi-frame processing. The samples are illuminated by a nearly coherent illumination system, and the hologram shadows are projected onto a complementary metal-oxide semiconductor based imaging sensor. To increase the resolution, a multi-frame pixel super-resolution approach is employed to produce a single holographic image from multiple frame observations of the scene with small planar displacements. Displacements are resolved by a hybrid approach: (i) alignment of the low-resolution images by a fast feature-based registration method, and (ii) fine adjustment of the sub-pixel information using a continuous optimization approach designed to find the globally optimal solution. A numerical phase-retrieval method is applied to decode the signal and reconstruct the morphological details of the analyzed sample. The presented approach was evaluated with various biological samples, including sperm and platelets, whose dimensions are on the order of a few microns. The obtained results demonstrate a spatial resolution of 1.55 µm over a field-of-view of ≈30 mm2. PMID:29657866
Endoscopic ultrasound guided fine needle aspiration and useful ancillary methods
Tadic, Mario; Stoos-Veic, Tajana; Kusec, Rajko
2014-01-01
The role of endoscopic ultrasound (EUS) in evaluating pancreatic pathology has been well documented from the beginning of its clinical use. High spatial resolution and the close proximity to the evaluated organs within the mediastinum and abdominal cavity allow detection of small focal lesions and precise tissue acquisition from suspected lesions within the reach of this method. Fine needle aspiration (FNA) is considered of additional value to EUS and is performed to obtain tissue diagnosis. Tissue acquisition from suspected lesions for cytological or histological analysis allows, not only the differentiation between malignant and non-malignant lesions, but, in most cases, also the accurate distinction between the various types of malignant lesions. It is well documented that the best results are achieved only if an adequate sample is obtained for further analysis, if the material is processed in an appropriate way, and if adequate ancillary methods are performed. This is a multi-step process and could be quite a challenge in some cases. In this article, we discuss the technical aspects of tissue acquisition by EUS-guided-FNA (EUS-FNA), as well as the role of an on-site cytopathologist, various means of specimen processing, and the selection of the appropriate ancillary method for providing an accurate tissue diagnosis and maximizing the yield of this method. The main goal of this review is to alert endosonographers, not only to the different possibilities of tissue acquisition, namely EUS-FNA, but also to bring to their attention the importance of proper sample processing in the evaluation of various lesions in the gastrointestinal tract and other accessible organs. All aspects of tissue acquisition (needles, suction, use of stylet, complications, etc.) have been well discussed lately. Adequate tissue samples enable comprehensive diagnoses, which answer the main clinical questions, thus enabling targeted therapy. PMID:25339816
NASA Astrophysics Data System (ADS)
Fuksa, Dariusz; Trzaskuś-Żak, Beata; Gałaś, Zdzisław; Utrata, Arkadiusz
2017-03-01
In practice, the vast majority of mining companies produce more than one product. Break-even analysis, referred to as CVP (Cost-Volume-Profit) analysis (Wilkinson, 2005; Czopek, 2003), is significantly complicated in their case by the need to include the multi-assortment structure of production; open-pit mines, for example, may offer more than 20 types of assortments depending on grain size. The article presents methods of evaluating the break-even point (in volume and in value) for both single-assortment and multi-assortment production. The complexity of break-even evaluation for multi-assortment production has given rise to many methods and approaches, differing especially in how fixed costs are handled: they may be fully allocated among particular assortments, treated as relating to the company as a whole, or partially allocated among assortments and partially treated at the company level. The evaluation of the chosen methods of break-even analysis, given the availability of data, was based on two examples of mining companies: an open-pit mine of rock materials and an underground hard coal mine. The selection of methods was determined by the data made available by the companies. The data for the analysis comes from internal documentation of the mines - financial statements, breakdowns and cost calculations.
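For the single-product case the break-even volume is simply fixed costs divided by the unit contribution margin; the multi-assortment extension below weights each assortment's contribution margin ratio by its assumed share of revenue. The figures are invented for illustration and do not come from the mines analyzed in the paper.

```python
def multi_assortment_break_even(prices, unit_costs, mix, fixed_costs):
    """Break-even sales value for a multi-assortment production mix.
    `mix` gives each assortment's share of total revenue (sums to 1);
    contribution margin ratios are weighted by that share."""
    cm_ratios = [(p - c) / p for p, c in zip(prices, unit_costs)]
    weighted_cm = sum(r * s for r, s in zip(cm_ratios, mix))
    return fixed_costs / weighted_cm   # break-even revenue

# e.g. three aggregate size fractions with assumed prices, unit costs,
# revenue shares, and fixed costs fully allocated at the company level:
bev = multi_assortment_break_even([40, 55, 30], [25, 30, 22],
                                  [0.5, 0.3, 0.2], 1_200_000)
print(round(bev))
```

Partial allocation of fixed costs, as discussed in the article, would replace the single `fixed_costs` figure with per-assortment and company-level components.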
Periodical capacity setting methods for make-to-order multi-machine production systems
Altendorfer, Klaus; Hübl, Alexander; Jodlbauer, Herbert
2014-01-01
The paper presents different periodical capacity setting methods for make-to-order, multi-machine production systems with stochastic customer-required lead times and stochastic processing times, aimed at improving service level and tardiness. These methods are developed as decision support for situations where capacity flexibility exists, such as a certain range of possible working hours per week. The methods differ in the amount of information used, but all are based on the cumulated capacity demand at each machine. In a simulation study, the methods' impact on service level and tardiness is compared to a constant provided capacity in single-machine and multi-machine settings. It is shown that the tested capacity setting methods can increase the service level and decrease average tardiness in comparison to a constant provided capacity. The methods using information on the processing time and customer-required lead time distributions perform best. The results found in this paper can help practitioners make efficient use of their flexible capacity. PMID:27226649
Multi-kw dc power distribution system study program
NASA Technical Reports Server (NTRS)
Berkery, E. A.; Krausz, A.
1974-01-01
The first phase of the Multi-kW dc Power Distribution Technology Program is reported; it involves the test and evaluation of a technology breadboard in a specifically designed test facility, according to design concepts developed in a previous study on space vehicle electrical power processing, distribution, and control. The static and dynamic performance, fault isolation, reliability, electromagnetic interference characteristics, and operability factors of high-voltage distribution systems were studied in order to build a technology base for the use of high-voltage dc systems in future aerospace vehicles. Detailed technical descriptions are presented, including data on: (1) dynamic interactions due to the operation of solid state and electromechanical switchgear; (2) multiplexed and computer-controlled supervision and checkout methods; (3) pulse width modulator design; and (4) cable design factors.
A New Method for Setting Calculation Sequence of Directional Relay Protection in Multi-Loop Networks
NASA Astrophysics Data System (ADS)
Haijun, Xiong; Qi, Zhang
2016-08-01
The workload of relay protection setting calculation in multi-loop networks can be reduced effectively by optimizing the setting calculation sequence. A new method for sequencing the setting calculations of directional distance relay protection in multi-loop networks, based on the minimum broken nodes cost vector (MBNCV), is proposed to overcome the problems of current methods. Existing methods based on the minimum breakpoint set (MBPS) break more edges when untying the loops in the dependency relationships of relays, potentially leading to larger iterative calculation workloads in setting calculation. A model-driven approach based on behavior trees (BT) is presented to improve adaptability to similar problems. After extending the BT model with real-time system characteristics, a timed BT is derived and the dependency relationships in the multi-loop network are modeled. The model is translated into communicating sequential processes (CSP) models, and an optimized setting calculation sequence for the multi-loop network is finally computed with tool support. A five-node multi-loop network is used as an example to demonstrate the effectiveness of the modeling and calculation method. Several further examples were calculated, with results indicating that the method effectively reduces the number of forced broken edges in protection setting calculation for multi-loop networks.
Multi-core processing and scheduling performance in CMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, J. M.; Evans, D.; Foulkes, S.
2012-01-01
Commodity hardware is going many-core. We might soon not be able to satisfy the per-core job memory needs in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to utilize the multi-core architecture effectively. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model of computing resource allocation, departing from the standard single-core allocation per job. The experiment job management system needs control over a larger quantum of resources, since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the allocation unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of data/workflow management (e.g. I/O caching, local merging), but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present an evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues compared to standard single-core processing workflows.
Multi-Objective Hybrid Optimal Control for Multiple-Flyby Low-Thrust Mission Design
NASA Technical Reports Server (NTRS)
Englander, Jacob A.; Vavrina, Matthew A.; Ghosh, Alexander R.
2015-01-01
Preliminary design of low-thrust interplanetary missions is a highly complex process. The mission designer must choose discrete parameters such as the number of flybys, the bodies at which those flybys are performed, and in some cases the final destination. In addition, a time-history of control variables must be chosen that defines the trajectory. There are often many thousands, if not millions, of possible trajectories to be evaluated. The customer who commissions a trajectory design is not usually interested in a point solution, but rather the exploration of the trade space of trajectories between several different objective functions. This can be a very expensive process in terms of the number of human analyst hours required. An automated approach is therefore very desirable. This work presents such an approach by posing the mission design problem as a multi-objective hybrid optimal control problem. The method is demonstrated on a hypothetical mission to the main asteroid belt.
A Patch-Based Approach for the Segmentation of Pathologies: Application to Glioma Labelling.
Cordier, Nicolas; Delingette, Herve; Ayache, Nicholas
2016-04-01
In this paper, we describe a novel and generic approach to fully automatic segmentation of brain tumors using multi-atlas patch-based voting techniques. In addition to avoiding the local search window assumption, the conventional patch-based framework is enhanced through several simple procedures: an improvement of the training dataset in terms of both label purity and intensity statistics, augmented features to implicitly guide the nearest-neighbor search, multi-scale patches, invariance to cube isometries, and stratification of the votes with respect to cases and labels. A probabilistic model automatically delineates regions of interest enclosing high-probability tumor volumes, which allows the algorithm to achieve highly competitive running times despite minimal processing power and resources. The method was evaluated on the Multimodal Brain Tumor Image Segmentation challenge datasets. State-of-the-art results are achieved with a limited learning stage, thus restricting the risk of overfitting. Moreover, segmentation smoothness does not involve any post-processing.
NASA Technical Reports Server (NTRS)
Tavana, Madjid
2005-01-01
"To understand and protect our home planet, to explore the universe and search for life, and to inspire the next generation of explorers" is NASA's mission. The Systems Management Office at Johnson Space Center (JSC) is searching for methods to effectively manage the Center's resources to meet NASA's mission. D-Side is a group multi-criteria decision support system (GMDSS) developed to support facility decisions at JSC. D-Side uses a series of sequential and structured processes to plot facilities in a three-dimensional (3-D) graph on the basis of each facility alignment with NASA's mission and goals, the extent to which other facilities are dependent on the facility, and the dollar value of capital investments that have been postponed at the facility relative to the facility replacement value. A similarity factor rank orders facilities based on their Euclidean distance from Ideal and Nadir points. These similarity factors are then used to allocate capital improvement resources across facilities. We also present a parallel model that can be used to support decisions concerning allocation of human resources investments across workforce units. Finally, we present results from a pilot study where 12 experienced facility managers from NASA used D-Side and the organization's current approach to rank order and allocate funds for capital improvement across 20 facilities. Users evaluated D-Side favorably in terms of ease of use, the quality of the decision-making process, decision quality, and overall value-added. Their evaluations of D-Side were significantly more favorable than their evaluations of the current approach. Keywords: NASA, Multi-Criteria Decision Making, Decision Support System, AHP, Euclidean Distance, 3-D Modeling, Facility Planning, Workforce Planning.
Multi-Domain Transfer Learning for Early Diagnosis of Alzheimer’s Disease
Cheng, Bo; Liu, Mingxia; Li, Zuoyong
2017-01-01
Recently, transfer learning has been successfully applied to the early diagnosis of Alzheimer's Disease (AD) based on multi-domain data. However, most existing methods use data from only a single auxiliary domain and thus cannot utilize the intrinsic useful correlation information among multiple domains. Accordingly, in this paper we consider the joint learning of tasks in multiple auxiliary domains and the target domain, and propose a novel Multi-Domain Transfer Learning (MDTL) framework for early diagnosis of AD. Specifically, the proposed MDTL framework consists of two key components: 1) a multi-domain transfer feature selection (MDTFS) model that selects the most informative feature subset from multi-domain data, and 2) a multi-domain transfer classification (MDTC) model that can identify disease status for early AD detection. We evaluate our method on 807 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database using baseline magnetic resonance imaging (MRI) data. The experimental results show that the proposed MDTL method can effectively utilize multi-auxiliary-domain data to improve learning performance in the target domain, compared with several state-of-the-art methods. PMID:27928657
NASA Astrophysics Data System (ADS)
Luo, Lin
2017-08-01
In the practical selection of Wushu athletes, the objective evaluation of athlete level lacks sufficient technical indicators and often relies on the coach's subjective judgment. Without a fully quantified indicator system, it is difficult to reflect the overall quality of the athletes accurately and objectively, which affects the improvement of the level of Wushu competition. The analytic hierarchy process (AHP) is a systematic analysis method combining quantitative and qualitative analysis. This paper establishes a structured, hierarchical and quantified decision-making process for evaluating broadsword, rod, sword and spear athletes using the AHP. Considering the characteristics of the athletes, the analysis covers three aspects, i.e., the athlete's body shape, physical function and sports quality, and 18 specific evaluation indicators are established. Then, combining expert advice and practical experience, the pairwise comparison matrices are determined, from which the indicator weights and a comprehensive evaluation coefficient are obtained to establish the evaluation model for the athletes, thus providing a scientific theoretical basis for the selection of Wushu athletes. The evaluation model proposed in this paper realizes the evaluation system for broadsword, rod, sword and spear athletes and has effectively improved the scientific level of Wushu athlete selection in practical application.
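The weight-derivation step of the AHP can be sketched as below: the principal eigenvector of a pairwise comparison matrix gives the weights, and the consistency ratio checks the judgments. The 3x3 matrix of example judgments over the three top-level aspects is invented; the paper's actual matrices cover 18 indicators.

```python
import numpy as np

RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}  # Saaty's random indices

def ahp_weights(A):
    """Indicator weights from a pairwise comparison matrix via the
    principal eigenvector, with a consistency ratio check."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)   # consistency index
    cr = ci / RI[n]
    return w, cr                        # CR < 0.1 is conventionally acceptable

# assumed judgments over body shape, physical function, sports quality:
A = np.array([[1, 3, 5],
              [1/3, 1, 2],
              [1/5, 1/2, 1]], float)
w, cr = ahp_weights(A)
print(w.round(3), round(cr, 3))
```

In the full model, weights computed this way at each level are multiplied down the hierarchy to score each athlete's comprehensive evaluation coefficient.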
Mesoscopic modeling of multi-physicochemical transport phenomena in porous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kang, Qinjin; Wang, Moran; Mukherjee, Partha P
2009-01-01
We present our recent progress on mesoscopic modeling of multi-physicochemical transport phenomena in porous media based on the lattice Boltzmann method. Simulation examples include injection of CO2-saturated brine into a limestone rock, two-phase behavior and flooding phenomena in polymer electrolyte fuel cells, and electroosmosis in homogeneously charged porous media. It is shown that the lattice Boltzmann method can account for multiple, coupled physicochemical processes in these systems and can shed some light on the underlying physics occurring at the fundamental scale. Therefore, it can be a potentially powerful numerical tool to analyze multi-physicochemical processes in various energy, earth, and environmental systems.
Combinatorial Optimization in Project Selection Using Genetic Algorithm
NASA Astrophysics Data System (ADS)
Dewi, Sari; Sawaluddin
2018-01-01
This paper discusses the problem of project selection in the presence of two objective functions, maximizing profit and minimizing cost, subject to limited resource availability and available time, so that resources must be allocated to each project. These resources are human resources, machine resources, and raw material resources. A predetermined budget must also not be exceeded. The problem can thus be formulated mathematically as a multi-objective function with constraints to be fulfilled. To assist the project selection process, a multi-objective combinatorial optimization approach is used to obtain an optimal solution for selecting the right projects. We then describe a genetic algorithm as a multi-objective combinatorial optimization method to simplify the project selection process in a large scope.
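As an illustration of the approach, here is a minimal genetic-algorithm sketch for binary project selection, using a weighted-sum scalarization of the two objectives with a constraint penalty (one simple stand-in for a full multi-objective treatment; the paper's exact operators are not published here). All project data, objective weights and GA settings are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 8 candidate projects with profit, cost and use of one
# shared resource per project.
profit   = np.array([12, 9, 15, 7, 10, 5, 11, 8.0])
cost     = np.array([ 6, 4,  9, 3,  5, 2,  7, 4.0])
resource = np.array([ 3, 2,  4, 1,  2, 1,  3, 2.0])
BUDGET, CAPACITY = 25.0, 12.0

def fitness(x):
    """Weighted-sum scalarization of the two objectives (maximize profit,
    minimize cost), with a large penalty for violated constraints."""
    over = max(0.0, cost @ x - BUDGET) + max(0.0, resource @ x - CAPACITY)
    return 0.7 * (profit @ x) - 0.3 * (cost @ x) - 100.0 * over

pop = rng.integers(0, 2, size=(40, 8))                 # binary chromosomes
for _ in range(200):
    f = np.array([fitness(x) for x in pop])
    idx = rng.integers(0, 40, size=(40, 2))            # binary tournaments
    winners = np.where((f[idx[:, 0]] >= f[idx[:, 1]])[:, None],
                       pop[idx[:, 0]], pop[idx[:, 1]])
    cut = rng.integers(1, 8, size=20)                  # one-point crossover
    mask = np.arange(8) < cut[:, None]
    pop = np.concatenate([np.where(mask, winners[:20], winners[20:]),
                          np.where(mask, winners[20:], winners[:20])])
    pop ^= (rng.random(pop.shape) < 0.02).astype(pop.dtype)  # bit-flip mutation

best = pop[np.argmax([fitness(x) for x in pop])]
print("selected projects:", np.flatnonzero(best), "fitness:", fitness(best))
```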
Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods
ERIC Educational Resources Information Center
Soroush, Masoud; Weinberger, Charles B.
2010-01-01
This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…
Multi-fuel reformers for fuel cells used in transportation. Phase 1: Multi-fuel reformers
NASA Astrophysics Data System (ADS)
1994-05-01
DOE has established the goal, through the Fuel Cells in Transportation Program, of fostering the rapid development and commercialization of fuel cells as economic competitors for the internal combustion engine. Central to this goal is a safe, feasible means of supplying hydrogen of the required purity to the vehicular fuel cell system. Two basic strategies are being considered: (1) on-board fuel processing, whereby alternative fuels such as methanol, ethanol or natural gas stored on the vehicle undergo reformation and subsequent processing to produce hydrogen, and (2) on-board storage of pure hydrogen provided by stationary fuel processing plants. This report analyzes fuel processor technologies, types of fuel and fuel cell options for on-board reformation. As Phase 1 of a multi-phase program to develop a prototype multi-fuel reformer system for a fuel cell powered vehicle, this program's objective was to evaluate the feasibility of a multi-fuel reformer concept and to select a reforming technology for further development in the Phase 2 program, with the ultimate goal of integration with a DOE-designated fuel cell and vehicle configuration. The basic reformer processes examined in this study included catalytic steam reforming (SR), non-catalytic partial oxidation (POX) and catalytic partial oxidation (also known as Autothermal Reforming, or ATR). Fuels under consideration in this study included methanol, ethanol, and natural gas. A systematic evaluation of reforming technologies, fuels, and transportation fuel cell applications was conducted for the purpose of selecting a suitable multi-fuel processor for further development and demonstration in a transportation application.
Research on Logistics Service Providers Selection Based on AHP and VIKOR
NASA Astrophysics Data System (ADS)
Shan, Lu
The logistics service providers supply a kind of service which is a service product; thus, there is plenty of uncertainty and fuzziness in selecting logistics service providers. AHP is first used to calculate the weights of the logistics service provider evaluation criteria, and then the VIKOR method, developed for multi-criteria optimization by determining a compromise solution, is applied to select the logistics service providers. The latter method provides a maximum "group utility" for the "majority" and a minimum individual regret for the "opponent". This decision-making process for logistics service provider selection is verified to be scientific and feasible through empirical research.
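The VIKOR step combines the group utility S ("majority") and the individual regret R ("opponent") into a compromise index Q. A minimal sketch under the standard formulation with v = 0.5; the decision matrix and criterion weights below are hypothetical, not the paper's data:

```python
import numpy as np

def vikor(F, w, v=0.5):
    """Rank alternatives (rows of F) under criterion weights w.
    All criteria are assumed benefit-type (larger is better)."""
    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    norm = (f_best - F) / (f_best - f_worst)   # normalized regret per criterion
    S = (w * norm).sum(axis=1)                 # group utility ("majority")
    R = (w * norm).max(axis=1)                 # individual regret ("opponent")
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return np.argsort(Q), S, R, Q              # lowest Q = best compromise

# Hypothetical scores of four providers on three criteria, with AHP weights.
F = np.array([[7, 8, 6], [9, 6, 7], [6, 9, 8], [8, 7, 9.0]])
order, S, R, Q = vikor(F, w=np.array([0.5, 0.3, 0.2]))
print(order, Q)
```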
Dolan, James G.
2010-01-01
Current models of healthcare quality recommend that patient management decisions be evidence-based and patient-centered. Evidence-based decisions require a thorough understanding of current information regarding the natural history of disease and the anticipated outcomes of different management options. Patient-centered decisions incorporate patient preferences, values, and unique personal circumstances into the decision making process and actively involve both patients and health care providers as much as possible. Fundamentally, therefore, evidence-based, patient-centered decisions are multi-dimensional and typically involve multiple decision makers. Advances in the decision sciences have led to the development of a number of multiple criteria decision making methods. These multi-criteria methods are designed to help people make better choices when faced with complex decisions involving several dimensions. They are especially helpful when there is a need to combine “hard data” with subjective preferences, to make trade-offs between desired outcomes, and to involve multiple decision makers. Evidence-based, patient-centered clinical decision making has all of these characteristics. This close match suggests that clinical decision support systems based on multi-criteria decision making techniques have the potential to enable patients and providers to carry out the tasks required to implement evidence-based, patient-centered care effectively and efficiently in clinical settings. The goal of this paper is to give readers a general introduction to the range of multi-criteria methods available and show how they could be used to support clinical decision-making. Methods discussed include the balance sheet, the even swap method, ordinal ranking methods, direct weighting methods, multi-attribute decision analysis, and the analytic hierarchy process (AHP). PMID:21394218
NASA Astrophysics Data System (ADS)
Tian, Yu; Rao, Changhui; Wei, Kai
2008-07-01
Adaptive optics can only partially compensate for image blur caused by atmospheric turbulence, due to observing conditions and hardware restrictions. A post-processing method based on frame selection and multi-frame blind deconvolution is proposed to improve images partially corrected by adaptive optics. Frames suitable for blind deconvolution are selected from the recorded AO closed-loop frame series by the frame selection technique, and multi-frame blind deconvolution is then performed. No prior knowledge is used in the blind deconvolution except for the positivity constraint. The use of multiple frames improves the stability and convergence of the blind deconvolution algorithm. The method was applied to the image restoration of celestial bodies observed by the 1.2 m telescope equipped with the 61-element adaptive optics system at Yunnan Observatory. The results show that the method can effectively improve images partially corrected by adaptive optics.
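The abstract does not name the deconvolution update; one standard positivity-preserving choice is the Richardson-Lucy iteration, shown below in a non-blind multi-frame form (a blind variant, as in the paper, would alternately update the PSF estimates as well). A minimal sketch, assuming `frames` and `psfs` are lists of equally sized non-negative 2-D arrays:

```python
import numpy as np
from scipy.signal import fftconvolve

def multiframe_rl(frames, psfs, n_iter=30):
    """Multi-frame Richardson-Lucy deconvolution: one multiplicative update
    per frame per sweep. The estimate stays non-negative by construction,
    which is the positivity constraint mentioned in the abstract."""
    est = np.mean(frames, axis=0).clip(min=1e-6)   # initial object estimate
    for _ in range(n_iter):
        for frame, psf in zip(frames, psfs):
            conv = fftconvolve(est, psf, mode="same")
            ratio = frame / np.maximum(conv, 1e-12)
            # correlate the ratio with the PSF (flip both axes)
            est *= fftconvolve(ratio, psf[::-1, ::-1], mode="same")
    return est
```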
Lessons Learned for Collaborative Clinical Content Development
Collins, S.A.; Bavuso, K.; Zuccotti, G.; Rocha, R.A.
2013-01-01
Background Site-specific content configuration of vendor-based Electronic Health Records (EHRs) is a vital step in the development of standardized and interoperable content that can be used for clinical decision-support, reporting, care coordination, and information exchange. The multi-site, multi-stakeholder Acute Care Documentation (ACD) project at Partners Healthcare Systems (PHS) aimed to develop highly structured clinical content with adequate breadth and depth to meet the needs of all types of acute care clinicians at two academic medical centers. The Knowledge Management (KM) team at PHS led the informatics and knowledge management effort for the project. Objectives We aimed to evaluate the role, governance, and project management processes and resources for the KM team’s effort as part of the standardized clinical content creation. Methods We employed the Centers for Disease Control’s six-step Program Evaluation Framework to guide our evaluation steps. We administered a forty-four-question, open-ended, semi-structured voluntary survey to gather focused, credible evidence from members of the KM team. Qualitative open-coding was performed to identify themes for lessons learned and concluding recommendations. Results Six surveys were completed. Qualitative data analysis informed five lessons learned and thirty specific recommendations associated with the lessons learned. The five lessons learned are: 1) Assess and meet knowledge needs and set expectations at the start of the project; 2) Define an accountable decision-making process; 3) Increase team meeting moderation skills; 4) Ensure adequate resources and competency training with online asynchronous collaboration tools; 5) Develop focused, goal-oriented teams and supportive, consultative service-based teams. Conclusions Knowledge management requirements for the development of standardized clinical content within a vendor-based EHR among multi-stakeholder teams and sites include: 1) assessing and meeting informatics knowledge needs, 2) setting expectations and standardizing the process for decision-making, and 3) ensuring the availability of adequate resources and competency training. PMID:23874366
Evaluation of complex community-based childhood obesity prevention interventions.
Karacabeyli, D; Allender, S; Pinkney, S; Amed, S
2018-05-16
Multi-setting, multi-component community-based interventions have shown promise in preventing childhood obesity; however, evaluation of these complex interventions remains a challenge. The objective of the study is to systematically review published methodological approaches to outcome evaluation for multi-setting community-based childhood obesity prevention interventions and synthesize a set of pragmatic recommendations. MEDLINE, CINAHL and PsycINFO were searched from inception to 6 July 2017. Papers were included if the intervention targeted children ≤18 years, engaged at least two community sectors and described their outcome evaluation methodology. A single reviewer conducted title and abstract scans, full article review and data abstraction. Directed content analysis was performed by three reviewers to identify prevailing themes. Thirty-three studies were included, and of these, 26 employed a quasi-experimental design; the remaining were randomized control trials. Body mass index was the most commonly measured outcome, followed by health behaviour change and psychosocial outcomes. Six themes emerged, highlighting advantages and disadvantages of active vs. passive consent, quasi-experimental vs. randomized control trials, longitudinal vs. repeat cross-sectional designs and the roles of process evaluation and methodological flexibility in evaluating complex interventions. Selection of study designs and outcome measures compatible with community infrastructure, accompanied by process evaluation, may facilitate successful outcome evaluation. © 2018 World Obesity Federation.
On the use of multi-agent systems for the monitoring of industrial systems
NASA Astrophysics Data System (ADS)
Rezki, Nafissa; Kazar, Okba; Mouss, Leila Hayet; Kahloul, Laid; Rezki, Djamil
2016-03-01
The objective of the current paper is to present an intelligent system for complex process monitoring, based on artificial intelligence technologies. This system aims to successfully perform all of the complex process monitoring tasks, namely detection, diagnosis, identification and reconfiguration. For this purpose, the development of a multi-agent system that combines multiple intelligences, such as multivariate control charts, neural networks, Bayesian networks and expert systems, has become necessary. The proposed system is evaluated on the monitoring of the Tennessee Eastman process, a complex industrial process.
Zhu, Hong; Tang, Xinming; Xie, Junfeng; Song, Weidong; Mo, Fan; Gao, Xiaoming
2018-01-01
There are many problems in existing reconstruction-based super-resolution algorithms, such as the lack of texture-feature representation and of high-frequency details. Multi-scale detail enhancement can produce more texture information and high-frequency information. Therefore, super-resolution reconstruction of remote-sensing images based on adaptive multi-scale detail enhancement (AMDE-SR) is proposed in this paper. First, the information entropy of each remote-sensing image is calculated, and the image with the maximum entropy value is regarded as the reference image. Subsequently, spatio-temporal remote-sensing images are processed using phase normalization, which reduces the time-phase difference of the image data and enhances the complementarity of information. The multi-scale image information is then decomposed using the L0 gradient minimization model, and the non-redundant information is processed by difference calculation and by expanding the non-redundant layers and the redundant layer with the iterative back-projection (IBP) technique. The different-scale non-redundant information is adaptively weighted and fused using cross-entropy. Finally, a nonlinear texture-detail-enhancement function is built to improve the scope of small details, and the peak signal-to-noise ratio (PSNR) is used as an iterative constraint. Ultimately, high-resolution remote-sensing images with abundant texture information are obtained by iterative optimization. Real results show an average entropy gain of up to 0.42 dB and a significant gain in the enhancement-measure evaluation for an up-scaling factor of 2. The experimental results show that the performance of the AMDE-SR method is better than that of existing super-resolution reconstruction methods in terms of visual and accuracy improvements. PMID:29414893
Topology Optimization - Engineering Contribution to Architectural Design
NASA Astrophysics Data System (ADS)
Tajs-Zielińska, Katarzyna; Bochenek, Bogdan
2017-10-01
The idea of topology optimization is to find, within a considered design domain, the distribution of material that is optimal in some sense. During the optimization process, material is redistributed, and parts that are unnecessary from the objective's point of view are removed. The result is a solid/void structure for which an objective function is minimized. This paper presents an application of topology optimization to multi-material structures. The design domain defined by the shape of a structure is divided into sub-regions, to which different materials are assigned. During the design process, material is relocated, but only within its selected region. The proposed idea has been inspired by architectural designs such as multi-material building facades. The effectiveness of topology optimization is determined by the proper choice of numerical optimization algorithm. This paper utilises a very efficient heuristic method called Cellular Automata. Cellular Automata are discrete mathematical idealizations of physical systems. The engineering implementation of Cellular Automata requires decomposition of the design domain into a uniform lattice of cells. It is assumed that interaction between cells takes place only among neighbouring cells. The interaction is governed by simple, local update rules based on heuristics or physical laws. The numerical studies show that this method can be an attractive alternative to traditional gradient-based algorithms. The proposed approach is evaluated on selected numerical examples of multi-material bridge structures, for which various material configurations are examined. The numerical studies demonstrated a significant influence of the material sub-region locations on the final topologies. The influence of the assumed volume fraction on the final topologies of multi-material structures is also observed and discussed. The results of the numerical calculations show that this approach produces results that differ from those of classical one-material problems.
A number of PCR-based methods for detecting human fecal material in environmental waters have been developed over the past decade, but these methods have rarely received independent comparative testing. Here, we evaluated ten of these methods (BacH, BacHum-UCD, B. thetaiotaomic...
Boutkhoum, Omar; Hanine, Mohamed; Agouti, Tarik; Tikniouine, Abdessadek
2015-01-01
In this paper, we examine the issue of strategic industrial location selection in uncertain decision-making environments for establishing a new industrial corporation. The industrial location issue is typically considered a crucial factor in business research, involving many calculations about natural resources, distributors, suppliers, customers, and many other factors. Based on the integration of environmental, economic and social decisive elements of sustainable development, this paper presents a hybrid decision-making model combining fuzzy multi-criteria analysis with the analytical capabilities that OLAP systems can provide for successful and optimal industrial location selection. The proposed model consists of three stages. In the first stage, a decision-making committee is established to identify the evaluation criteria impacting the location selection process. In the second stage, we develop fuzzy AHP software based on the extent analysis method to assign importance weights to the selected criteria, which allows us to model linguistic vagueness, ambiguity, and incomplete knowledge. In the last stage, OLAP analysis integrated with multi-criteria analysis employs these weighted criteria as inputs to evaluate, rank and select the strategic industrial location for establishing a new business corporation in the region of Casablanca, Morocco. Finally, a sensitivity analysis is performed to evaluate the impact of criteria weights and the preferences given by decision makers on the final rankings of strategic industrial locations.
NASA Astrophysics Data System (ADS)
Siegert, Stefan
2017-04-01
Initialised climate forecasts on seasonal time scales, run several months or even years ahead, are now an integral part of the battery of products offered by climate services world-wide. The availability of seasonal climate forecasts from various modeling centres gives rise to multi-model ensemble forecasts. Post-processing such seasonal-to-decadal multi-model forecasts is challenging 1) because the cross-correlation structure between multiple models and observations can be complicated, 2) because the amount of training data to fit the post-processing parameters is very limited, and 3) because the forecast skill of numerical models tends to be low on seasonal time scales. In this talk I will review new statistical post-processing frameworks for multi-model ensembles. I will focus particularly on Bayesian hierarchical modelling approaches, which are flexible enough to capture commonly made assumptions about collective and model-specific biases of multi-model ensembles. Despite the advances in statistical methodology, it turns out to be very difficult to out-perform the simplest post-processing method, which just recalibrates the multi-model ensemble mean by linear regression. I will discuss reasons for this, which are closely linked to the specific characteristics of seasonal multi-model forecasts. I explore possible directions for improvements, for example using informative priors on the post-processing parameters, and jointly modelling forecasts and observations.
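The benchmark mentioned at the end, recalibrating the multi-model ensemble mean by linear regression, takes only a few lines. A minimal sketch on synthetic hindcasts; the sizes and noise levels are hypothetical stand-ins for the short training records described in the talk:

```python
import numpy as np

# Hypothetical hindcasts: y[t] observations, x[t, m] forecasts from M models
# over T past seasons (training data is typically this small or smaller).
rng = np.random.default_rng(0)
T, M = 25, 4
y = rng.normal(size=T)
x = y[:, None] + rng.normal(scale=1.5, size=(T, M))  # noisy, biased forecasts

xbar = x.mean(axis=1)                 # multi-model ensemble mean
a, b = np.polyfit(xbar, y, deg=1)     # least-squares recalibration fit

x_new = rng.normal(size=M)            # a new multi-model forecast
print("recalibrated forecast:", a * x_new.mean() + b)
```

More elaborate Bayesian hierarchical post-processing replaces the two fitted constants with distributions over collective and model-specific biases, which is exactly where the limited training data makes gains hard to realize.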
Evaluation of Professional Role Competency during Psychiatry Residency
ERIC Educational Resources Information Center
Grujich, Nikola N.; Razmy, Ajmal; Zaretsky, Ari; Styra, Rima G.; Sockalingam, Sanjeev
2012-01-01
Objective: The authors sought to determine psychiatry residents' perceptions on the current method of evaluating professional role competency and the use of multi-source feedback (MSF) as an assessment tool. Method: Authors disseminated a structured, anonymous survey to 128 University of Toronto psychiatry residents, evaluating the current mode of…
NASA Astrophysics Data System (ADS)
Konishi, Toshifumi; Yamane, Daisuke; Matsushima, Takaaki; Masu, Kazuya; Machida, Katsuyuki; Toshiyoshi, Hiroshi
2014-01-01
This paper reports the design and evaluation results of a capacitive CMOS-MEMS sensor that consists of the proposed sensor circuit and a capacitive MEMS device implemented on the circuit. To design a capacitive CMOS-MEMS sensor, a multi-physics simulation of the electromechanical behavior of both the MEMS structure and the sensing LSI was carried out simultaneously. In order to verify the validity of the design, we applied the capacitive CMOS-MEMS sensor to a MEMS accelerometer implemented by the post-CMOS process onto a 0.35-µm CMOS circuit. The experimental results of the CMOS-MEMS accelerometer exhibited good agreement with the simulation results within the input acceleration range between 0.5 and 6 G (1 G = 9.8 m/s²), corresponding to output voltages between 908.6 and 915.4 mV, respectively. Therefore, we have confirmed that our capacitive CMOS-MEMS sensor and the multi-physics simulation will be beneficial methods for realizing integrated CMOS-MEMS technology.
ERIC Educational Resources Information Center
Torcasso, Gina; Hilt, Lori M.
2017-01-01
Background: Suicide is a leading cause of death among youth. Suicide screening programs aim to identify mental health issues and prevent death by suicide. Objective: The present study evaluated outcomes of a multi-stage screening program implemented over 3 school years in a moderately-sized Midwestern high school. Methods: One hundred ninety-three…
IMPACT OF LEAD ACID BATTERIES AND CADMIUM STABILIZERS ON INCINERATOR EMISSIONS
The Waste Analysis Sampling, Testing and Evaluation (WASTE) Program is a multi-year, multi-disciplinary program designed to elicit the source and fate of environmentally significant trace materials as a solid waste progresses through management processes. As part of the WASTE Prog...
A multi-criteria spatial deprivation index to support health inequality analyses.
Cabrera-Barona, Pablo; Murphy, Thomas; Kienberger, Stefan; Blaschke, Thomas
2015-03-20
Deprivation indices are useful measures to analyze health inequalities. There are several methods to construct these indices; however, few studies have used Geographic Information Systems (GIS) and Multi-Criteria methods to construct a deprivation index. Therefore, this study applies Multi-Criteria Evaluation to calculate weights for the indicators that make up the deprivation index, and a GIS-based fuzzy approach to create different scenarios of this index is also implemented. The Analytical Hierarchy Process (AHP) is used to obtain the weights for the indicators of the index. The Ordered Weighted Averaging (OWA) method using linguistic quantifiers is applied in order to create different deprivation scenarios. Geographically Weighted Regression (GWR) and a Moran's I analysis are employed to explore spatial relationships between the different deprivation measures and two health factors: the distance to health services and the percentage of people that have never had a live birth. This last indicator was considered as the dependent variable in the GWR. The case study is Quito City, in Ecuador. The AHP-based deprivation index shows medium and high levels of deprivation (0.511 to 1.000) in specific zones of the study area, even though most of the study area has low values of deprivation. OWA results show deprivation scenarios that can be evaluated considering the different attitudes of decision makers. GWR results indicate that the deprivation index and its OWA scenarios can be considered as local estimators for health-related phenomena. Moran's I calculations demonstrate that several deprivation scenarios, in combination with the 'distance to health services' factor, could be explanatory variables to predict the percentage of people that have never had a live birth. The AHP-based deprivation index and the OWA deprivation scenarios developed in this study are Multi-Criteria instruments that can support the identification of highly deprived zones and can support health inequality analyses in combination with different health factors. The methodology described in this study can be applied in other regions of the world to develop spatial deprivation indices based on Multi-Criteria analysis.
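The OWA scenarios mentioned above weight sorted indicator values through a linguistic quantifier. A minimal sketch assuming the common RIM quantifier Q(r) = r^alpha; the indicator values below are hypothetical, not the study's data:

```python
import numpy as np

def owa(values, alpha):
    """OWA aggregation with the RIM quantifier Q(r) = r**alpha.
    alpha < 1 emphasizes the largest values; alpha > 1 the smallest."""
    v = np.sort(values)[::-1]                 # descending order
    n = len(v)
    r = np.arange(n + 1) / n
    w = r[1:] ** alpha - r[:-1] ** alpha      # order weights, summing to 1
    return (w * v).sum()

# Hypothetical normalized deprivation indicators for one census zone.
indicators = np.array([0.8, 0.4, 0.6, 0.2])
for alpha in (0.5, 1.0, 2.0):                 # three decision attitudes
    print(alpha, owa(indicators, alpha))
```

Varying alpha is what produces the different "attitudes of decision makers" the abstract refers to; alpha = 1 recovers the plain average.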
Bromuri, Stefano; Zufferey, Damien; Hennebert, Jean; Schumacher, Michael
2014-10-01
This research is motivated by the issue of classifying illnesses of chronically ill patients for decision support in clinical settings. Our main objective is to propose multi-label classification of multivariate time series contained in medical records of chronically ill patients, by means of quantization methods, such as bag of words (BoW), and multi-label classification algorithms. Our second objective is to compare supervised dimensionality reduction techniques to state-of-the-art multi-label classification algorithms. The hypothesis is that kernel methods and locality preserving projections make such algorithms good candidates to study multi-label medical time series. We combine BoW and supervised dimensionality reduction algorithms to perform multi-label classification on health records of chronically ill patients. The considered algorithms are compared with state-of-the-art multi-label classifiers in two real world datasets. Portavita dataset contains 525 diabetes type 2 (DT2) patients, with co-morbidities of DT2 such as hypertension, dyslipidemia, and microvascular or macrovascular issues. MIMIC II dataset contains 2635 patients affected by thyroid disease, diabetes mellitus, lipoid metabolism disease, fluid electrolyte disease, hypertensive disease, thrombosis, hypotension, chronic obstructive pulmonary disease (COPD), liver disease and kidney disease. The algorithms are evaluated using multi-label evaluation metrics such as hamming loss, one error, coverage, ranking loss, and average precision. Non-linear dimensionality reduction approaches behave well on medical time series quantized using the BoW algorithm, with results comparable to state-of-the-art multi-label classification algorithms. Chaining the projected features has a positive impact on the performance of the algorithm with respect to pure binary relevance approaches. The evaluation highlights the feasibility of representing medical health records using the BoW for multi-label classification tasks. The study also highlights that dimensionality reduction algorithms based on kernel methods, locality preserving projections or both are good candidates to deal with multi-label classification tasks in medical time series with many missing values and high label density. Copyright © 2014 Elsevier Inc. All rights reserved.
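Two of the multi-label metrics used in this evaluation, hamming loss and one-error, are compact enough to define directly. A minimal sketch with hypothetical label matrices (3 patients, 4 co-morbidity labels):

```python
import numpy as np

def hamming_loss(Y_true, Y_pred):
    """Fraction of label slots predicted incorrectly (0 = perfect)."""
    return np.mean(Y_true != Y_pred)

def one_error(Y_true, scores):
    """Fraction of samples whose top-ranked label is not actually relevant."""
    top = np.argmax(scores, axis=1)
    return np.mean(Y_true[np.arange(len(top)), top] == 0)

# Hypothetical binary label matrices and ranking scores.
Y_true = np.array([[1, 0, 1, 0], [0, 1, 0, 0], [1, 1, 0, 1]])
Y_pred = np.array([[1, 0, 0, 0], [0, 1, 0, 1], [1, 1, 0, 1]])
scores = np.array([[0.9, 0.2, 0.4, 0.1],
                   [0.3, 0.8, 0.2, 0.6],
                   [0.7, 0.9, 0.1, 0.8]])
print(hamming_loss(Y_true, Y_pred), one_error(Y_true, scores))
```

Coverage, ranking loss and average precision are defined analogously over the score matrix rather than the hard predictions.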
Sadeghi-Tehran, Pouria; Virlet, Nicolas; Sabermanesh, Kasra; Hawkesford, Malcolm J
2017-01-01
Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when they are applied to images acquired in dynamic field environments. In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions and with the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. The method described is capable of coping with the environmental challenges faced in field conditions, with high levels of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate to process a time series of phenotypic information acquired in the field throughout crop growth. Moreover, the introduced method has the advantage that it is not limited to growth measurements but can be applied to other tasks such as identifying weeds, diseases, stress, etc.
Wibowo, Santoso; Deng, Hepu
2015-06-01
This paper presents a multi-criteria group decision making approach for effectively evaluating the performance of e-waste recycling programs under uncertainty in an organization. Intuitionistic fuzzy numbers are used for adequately representing the subjective and imprecise assessments of the decision makers in evaluating the relative importance of evaluation criteria and the performance of individual e-waste recycling programs with respect to individual criteria in a given situation. An interactive fuzzy multi-criteria decision making algorithm is developed for facilitating consensus building in a group decision making environment to ensure that all the interests of individual decision makers have been appropriately considered in evaluating alternative e-waste recycling programs with respect to their corporate sustainability performance. The developed algorithm is then incorporated into a multi-criteria decision support system for making the overall performance evaluation process effective and simple to use. Such a multi-criteria decision making system adequately provides organizations with a proactive mechanism for incorporating the concept of corporate sustainability into their regular planning decisions and business practices. An example is presented for demonstrating the applicability of the proposed approach in evaluating the performance of e-waste recycling programs in organizations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Positioning stability improvement with inter-system biases on multi-GNSS PPP
NASA Astrophysics Data System (ADS)
Choi, Byung-Kyu; Yoon, Hasu
2018-07-01
The availability of multiple signals from different Global Navigation Satellite System (GNSS) constellations provides opportunities for improving positioning accuracy and initial convergence time. With dual-frequency observations from the four constellations (GPS, GLONASS, Galileo, and BeiDou), it is possible to investigate combined GNSS precise point positioning (PPP) accuracy and stability. The differences between GNSS systems result in inter-system biases (ISBs). We consider several ISB values, such as GPS-GLONASS, GPS-Galileo, and GPS-BeiDou. These biases correspond to key parameters defined in multi-GNSS PPP processing. In this study, we present a unified PPP method that sets ISB values as fixed or constant. A comprehensive analysis that includes satellite visibility, position dilution of precision, and position accuracy is performed to evaluate the unified PPP method with constrained cut-off elevation angles. Compared to conventional PPP solutions, our approach shows more stable positioning at a constrained cut-off elevation angle of 50 degrees.
Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
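A minimal sketch of one classic LSH variant, random-hyperplane hashing for cosine similarity; this is meant only to illustrate the bucketing idea, not to reproduce the paper's four Hadoop variants, and all sizes are hypothetical:

```python
import numpy as np

class CosineLSH:
    """Random-hyperplane LSH: similar vectors land in the same bucket
    with probability that grows with their cosine similarity."""
    def __init__(self, dim, n_bits, seed=0):
        rng = np.random.default_rng(seed)
        self.planes = rng.normal(size=(n_bits, dim))

    def signature(self, v):
        bits = (self.planes @ v) > 0          # side of each hyperplane
        return bits.astype(np.uint8).tobytes()  # usable as a dict key

# Index hypothetical item vectors, then look up near neighbors by bucket.
rng = np.random.default_rng(1)
items = rng.normal(size=(1000, 64))
lsh = CosineLSH(dim=64, n_bits=16)
buckets = {}
for i, v in enumerate(items):
    buckets.setdefault(lsh.signature(v), []).append(i)

query = items[42] + 0.01 * rng.normal(size=64)  # near-duplicate of item 42
# Near-duplicates collide with high (not certain) probability; production
# systems use several hash tables to boost recall.
print(42 in buckets.get(lsh.signature(query), []))
```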
Multi-dimension feature fusion for action recognition
NASA Astrophysics Data System (ADS)
Dong, Pei; Li, Jie; Dong, Junyu; Qi, Lin
2018-04-01
Typical human actions last several seconds and exhibit characteristic spatio-temporal structure. The challenge for action recognition is to capture and fuse the multi-dimensional information in video data. In order to take these characteristics into account simultaneously, we present a novel method that fuses multi-dimensional features, such as chromatic images, depth and optical flow fields. We build our model on multi-stream deep convolutional networks with the help of temporal segment networks, and extract discriminative spatial and temporal features by fusing ConvNet towers across dimensions, in which different feature weights are assigned in order to take full advantage of this multi-dimensional information. Our architecture is trained and evaluated on the currently largest and most challenging benchmark, the NTU RGB-D dataset. The experiments demonstrate that our method outperforms the state-of-the-art methods.
Hayati, Elyas; Majnounian, Baris; Abdi, Ehsan; Sessions, John; Makhdoum, Majid
2013-02-01
Changes in forest landscapes resulting from road construction have increased remarkably in the last few years. On the other hand, the sustainable management of forest resources can only be achieved through a well-organized road network. In order to minimize the environmental impacts of forest roads, forest road managers must design the road network to be both efficient and environmentally sound. Efficient planning methodologies can assist forest road managers in considering the technical, economic, and environmental factors that affect forest road planning. This paper describes a three-stage methodology using the Delphi method for selecting the important criteria, the Analytic Hierarchy Process for obtaining the relative importance of the criteria, and finally, a spatial multi-criteria evaluation in a geographic information system (GIS) environment for identifying the lowest-impact road network alternative. Results of the Delphi method revealed that ground slope, lithology, distance from stream network, distance from faults, landslide susceptibility, erosion susceptibility, geology, and soil texture are the most important criteria for forest road planning in the study area. The suitability map for road planning was then obtained by combining the fuzzy map layers of these criteria with respect to their weights. Nine road network alternatives were designed using PEGGER, an ArcView GIS extension, and finally, their values were extracted from the suitability map. Results showed that the methodology was useful for identifying roads that meet environmental and cost considerations. Based on this work, we suggest that future forest road planning using multi-criteria evaluation and decision making be considered in other regions, where the road planning criteria identified in this study may be useful.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yin; Wang, Wen; Wysocki, Gerard, E-mail: gwysocki@princeton.edu
In this Letter, we present a method of performing broadband mid-infrared spectroscopy with conventional, free-running, continuous wave Fabry-Perot quantum cascade lasers (FP-QCLs). The measurement method is based on multi-heterodyne down-conversion of optical signals. The sample transmission spectrum probed by one multi-mode FP-QCL is down-converted to the radio-frequency domain through an optical multi-heterodyne process using a second FP-QCL as the local oscillator. Both a broadband multi-mode spectral measurement and high-resolution (∼15 MHz) spectroscopy of molecular absorption are demonstrated, showing great potential for the development of high-performance FP-laser-based spectrometers for chemical sensing.
Visual enhancement of unmixed multispectral imagery using adaptive smoothing
Lemeshewsky, G.P.; Rahman, Z.-U.; Schowengerdt, R.A.; Reichenbach, S.E.
2004-01-01
Adaptive smoothing (AS) has been previously proposed as a method to smooth uniform regions of an image, retain contrast edges, and enhance edge boundaries. The method is an implementation of the anisotropic diffusion process which results in a gray scale image. This paper discusses modifications to the AS method for application to multi-band data which results in a color segmented image. The process was used to visually enhance the three most distinct abundance fraction images produced by the Lagrange constraint neural network learning-based unmixing of Landsat 7 Enhanced Thematic Mapper Plus multispectral sensor data. A mutual information-based method was applied to select the three most distinct fraction images for subsequent visualization as a red, green, and blue composite. A reported image restoration technique (partial restoration) was applied to the multispectral data to reduce unmixing error, although evaluation of the performance of this technique was beyond the scope of this paper. The modified smoothing process resulted in a color segmented image with homogeneous regions separated by sharpened, coregistered multiband edges. There was improved class separation with the segmented image, which has importance to subsequent operations involving data classification.
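The anisotropic diffusion process that AS implements admits a compact statement. Below is a minimal sketch of one Perona-Malik diffusion step, a standard instance of that process (the paper's adaptive smoothing variant and its multi-band extension differ in detail); the kappa and lam values are illustrative:

```python
import numpy as np

def perona_malik_step(img, kappa=0.1, lam=0.2):
    """One Perona-Malik anisotropic-diffusion update: smooths where
    gradients are small, preserves edges where gradients are large."""
    dN = np.roll(img, -1, 0) - img   # differences to the 4 neighbors
    dS = np.roll(img, 1, 0) - img
    dE = np.roll(img, -1, 1) - img
    dW = np.roll(img, 1, 1) - img
    g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
    return img + lam * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)

# Iterating a few steps smooths uniform regions while retaining edges.
img = np.random.default_rng(0).random((64, 64))
for _ in range(10):
    img = perona_malik_step(img)
```

For multi-band data, such updates are applied per band (or with a shared edge map), which is how coregistered multi-band edges arise in the segmented composite.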
A comparison of representations for discrete multi-criteria decision problems
Gettinger, Johannes; Kiesling, Elmar; Stummer, Christian; Vetschera, Rudolf
2013-01-01
Discrete multi-criteria decision problems with numerous Pareto-efficient solution candidates place a significant cognitive burden on the decision maker. An interactive, aspiration-based search process that iteratively progresses toward the most preferred solution can alleviate this task. In this paper, we study three ways of representing such problems in a DSS, and compare them in a laboratory experiment using subjective and objective measures of the decision process as well as solution quality and problem understanding. In addition to an immediate user evaluation, we performed a re-evaluation several weeks later. Furthermore, we consider several levels of problem complexity and user characteristics. Results indicate that different problem representations have a considerable influence on search behavior, although long-term consistency appears to remain unaffected. We also found interesting discrepancies between subjective evaluations and objective measures. Conclusions from our experiments can help designers of DSS for large multi-criteria decision problems to fit problem representations to the goals of their system and the specific task at hand. PMID:24882912
Multispectral image fusion for target detection
NASA Astrophysics Data System (ADS)
Leviner, Marom; Maltz, Masha
2009-09-01
Various methods to perform multi-spectral image fusion have been suggested, mostly at the pixel level. However, the jury is still out on the benefits of a fused image compared to its source images. We present here a new multi-spectral image fusion method, multi-spectral segmentation fusion (MSSF), which uses a feature-level processing paradigm. To test our method, we compared human observer performance in an experiment using MSSF against two established methods, averaging and Principal Components Analysis (PCA), and against its two source bands, visible and infrared. The task we studied was target detection in a cluttered environment. MSSF proved superior to the other fusion methods. Based on these findings, current speculation about the circumstances in which multi-spectral image fusion in general, and specific fusion methods in particular, would be superior to using the original image sources can be further addressed.
Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith
2015-01-01
Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is presented through a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool that provides effective multi-attribute decision-making for evaluating machine tools in an uncertain environment. PMID:26368541
Multi criteria evaluation for universal soil loss equation based on geographic information system
NASA Astrophysics Data System (ADS)
Purwaamijaya, I. M.
2018-05-01
The purposes of this research were to produce (1) a conceptual and functional model design and implementation for the universal soil loss equation (USLE), (2) a standard operational procedure for multi-criteria evaluation of the USLE using a geographic information system, (3) an overlay of land cover, slope, soil and rainfall layers to obtain the USLE using multi-criteria evaluation, (4) a thematic map of the USLE in the watershed, and (5) an attribute table of the USLE in the watershed. Descriptive and formal correlation methods were used for this research. The Cikapundung Watershed, Bandung, West Java, Indonesia was the study location. The research was conducted from January 2016 to May 2016. A spatial analysis was used to superimpose the land cover, slope, soil and rainfall layers to obtain the USLE. Multi-criteria evaluation of the USLE using a geographic information system could be used for conservation programs.
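The USLE itself is the factor product A = R · K · LS · C · P. A minimal sketch, assuming per-cell factor values; the numbers below are hypothetical, not from the Cikapundung study:

```python
# Minimal USLE sketch: A = R * K * LS * C * P, the standard factor product.
def usle(R, K, LS, C, P):
    """Annual soil loss A (t/ha/yr) from rainfall erosivity R, soil
    erodibility K, slope length-steepness LS, cover C and practice P."""
    return R * K * LS * C * P

# In a GIS workflow each argument is a raster layer; with NumPy arrays the
# same expression evaluates cell by cell across the watershed.
print(usle(R=2000, K=0.3, LS=1.2, C=0.2, P=0.5))  # -> 72.0 t/ha/yr
```

The multi-criteria evaluation described in the abstract amounts to deriving each factor layer from its source data (rainfall, soil, slope, land cover) before taking this product.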
Efficient Credit Assignment through Evaluation Function Decomposition
NASA Technical Reports Server (NTRS)
Agogino, Adrian; Turner, Kagan; Mikkulainen, Risto
2005-01-01
Evolutionary methods are powerful tools in discovering solutions for difficult continuous tasks. When such a solution is encoded over multiple genes, a genetic algorithm faces the difficult credit assignment problem of evaluating how a single gene in a chromosome contributes to the full solution. Typically, a single evaluation function is used for the entire chromosome, implicitly giving each gene in the chromosome the same evaluation. This method is inefficient because a gene will get credit for the contribution of all the other genes as well. Accurately measuring the fitness of individual genes in such a large search space requires many trials. This paper instead proposes turning this single complex search problem into a multi-agent search problem, where each agent has the simpler task of discovering a suitable gene. Gene-specific evaluation functions can then be created that have better theoretical properties than a single evaluation function over all genes. This method is tested in the difficult double-pole balancing problem, showing that agents using gene-specific evaluation functions can create a successful control policy in 20 percent fewer trials than the best existing genetic algorithms. The method is extended to more distributed problems, achieving 95 percent performance gains over traditional methods in the multi-rover domain.
Young Kim, Eun; Johnson, Hans J
2013-01-01
A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale heterogeneous multi-site longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification, inspired by previous work. The primary contributions are robustness improvements from the incorporation of the following four elements: (1) use of multi-modal and repeated scans, (2) incorporation of highly deformable registration, (3) use of an extended set of tissue definitions, and (4) use of multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated through a series of experiments with a simulated brain data set (BrainWeb) and by applying the tool to highly heterogeneous data from a 32-site imaging study, with quality assessments through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, and provides a flexible interface. In this paper, we describe enhancements to joint registration, bias correction, and tissue classification that improve the generalizability and robustness for processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human-subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.
Hu, Peijun; Wu, Fa; Peng, Jialin; Bao, Yuanyuan; Chen, Feng; Kong, Dexing
2017-03-01
Multi-organ segmentation from CT images is an essential step for computer-aided diagnosis and surgery planning. However, manual delineation of the organs by radiologists is tedious, time-consuming and poorly reproducible. Therefore, we propose a fully automatic method for the segmentation of multiple organs from three-dimensional abdominal CT images. The proposed method employs deep fully convolutional neural networks (CNNs) for organ detection and segmentation, which is further refined by a time-implicit multi-phase evolution method. Firstly, a 3D CNN is trained to automatically localize and delineate the organs of interest with a probability prediction map. The learned probability map provides both subject-specific spatial priors and initialization for subsequent fine segmentation. Then, for the refinement of the multi-organ segmentation, image intensity models, probability priors as well as a disjoint region constraint are incorporated into a unified energy functional. Finally, a novel time-implicit multi-phase level-set algorithm is utilized to efficiently optimize the proposed energy functional model. Our method has been evaluated on 140 abdominal CT scans for the segmentation of four organs (liver, spleen and both kidneys). With respect to the ground truth, average Dice overlap ratios for the liver, spleen and both kidneys are 96.0, 94.2 and 95.4%, respectively, and average symmetric surface distance is less than 1.3 mm for all the segmented organs. The computation time for a CT volume is 125 s on average. The achieved accuracy compares well to state-of-the-art methods with much higher efficiency. A fully automatic method for multi-organ segmentation from abdominal CT images was developed and evaluated. The results demonstrated its potential in clinical usage with high effectiveness, robustness and efficiency.
Lajnef, Tarek; Chaibi, Sahbi; Ruby, Perrine; Aguera, Pierre-Emmanuel; Eichenlaub, Jean-Baptiste; Samet, Mounir; Kachouri, Abdennaceur; Jerbi, Karim
2015-07-30
Sleep staging is a critical step in a range of electrophysiological signal processing pipelines used in clinical routine as well as in sleep research. Although the results currently achievable with automatic sleep staging methods are promising, there is need for improvement, especially given the time-consuming and tedious nature of visual sleep scoring. Here we propose a sleep staging framework that consists of a multi-class support vector machine (SVM) classification based on a decision tree approach. The performance of the method was evaluated using polysomnographic data from 15 subjects (electroencephalogram (EEG), electrooculogram (EOG) and electromyogram (EMG) recordings). The decision tree, or dendrogram, was obtained using a hierarchical clustering technique and a wide range of time and frequency-domain features were extracted. Feature selection was carried out using forward sequential selection and classification was evaluated using k-fold cross-validation. The dendrogram-based SVM (DSVM) achieved mean specificity, sensitivity and overall accuracy of 0.92, 0.74 and 0.88 respectively, compared to expert visual scoring. Restricting DSVM classification to data where both experts' scoring was consistent (76.73% of the data) led to a mean specificity, sensitivity and overall accuracy of 0.94, 0.82 and 0.92 respectively. The DSVM framework outperforms classification with more standard multi-class "one-against-all" SVM and linear-discriminant analysis. The promising results of the proposed methodology suggest that it may be a valuable alternative to existing automatic methods and that it could accelerate visual scoring by providing a robust starting hypnogram that can be further fine-tuned by expert inspection. Copyright © 2015 Elsevier B.V. All rights reserved.
Liu, Yingchun; Liu, Zhongbo; Sun, Guoxiang; Wang, Yan; Ling, Junhong; Gao, Jiayue; Huang, Jiahao
2015-01-01
A combination method of multi-wavelength fingerprinting and multi-component quantification by high performance liquid chromatography (HPLC) coupled with diode array detector (DAD) was developed and validated to monitor and evaluate the quality consistency of herbal medicines (HM) in the classical preparation Compound Bismuth Aluminate tablets (CBAT). The validation results demonstrated that our method met the requirements of fingerprint analysis and quantification analysis with suitable linearity, precision, accuracy, limits of detection (LOD) and limits of quantification (LOQ). In the fingerprint assessments, rather than using conventional qualitative "Similarity" as a criterion, the simple quantified ratio fingerprint method (SQRFM) was recommended, which has an important quantified fingerprint advantage over the "Similarity" approach. SQRFM qualitatively and quantitatively offers the scientific criteria for traditional Chinese medicines (TCM)/HM quality pyramid and warning gate in terms of three parameters. In order to combine the comprehensive characterization of multi-wavelength fingerprints, an integrated fingerprint assessment strategy based on information entropy was set up involving a super-information characteristic digitized parameter of fingerprints, which reveals the total entropy value and absolute information amount about the fingerprints and, thus, offers an excellent method for fingerprint integration. The correlation results between quantified fingerprints and quantitative determination of 5 marker compounds, including glycyrrhizic acid (GLY), liquiritin (LQ), isoliquiritigenin (ILG), isoliquiritin (ILQ) and isoliquiritin apioside (ILA), indicated that multi-component quantification could be replaced by quantified fingerprints. The Fenton reaction was employed to determine the antioxidant activities of CBAT samples in vitro, and they were correlated with HPLC fingerprint components using the partial least squares regression (PLSR) method. In summary, the method of multi-wavelength fingerprints combined with antioxidant activities has been proved to be a feasible and scientific procedure for monitoring and evaluating the quality consistency of CBAT.
Enriching semantic knowledge bases for opinion mining in big data applications.
Weichselbraun, A; Gindl, S; Scharl, A
2014-10-01
This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process.
Waste management barriers in developing country hospitals: Case study and AHP analysis.
Delmonico, Diego V de Godoy; Santos, Hugo H Dos; Pinheiro, Marco Ap; de Castro, Rosani; de Souza, Regiane M
2018-01-01
Healthcare waste management is an essential field for both researchers and practitioners. Although it has been studied in several different contexts, few studies have evaluated it using statistical methods, and the precarious waste management practices known in developing countries raise questions about their potential barriers. This study aims to investigate the barriers in healthcare waste management and their relevance. For this purpose, this paper analyses waste management practices in two Brazilian hospitals using a case study approach and the Analytic Hierarchy Process method. The barriers were organized into three categories (human factors, management, and infrastructure), and the main findings suggest that cost and employee awareness were the most significant barriers. These results highlight the main barriers to more sustainable waste management and provide an empirical basis for multi-criteria evaluation in the literature.
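To make the ranking step concrete, here is a minimal sketch of the core AHP computation: priority weights from a pairwise comparison matrix via the principal eigenvector, with Saaty's consistency check. The matrix values are illustrative, not the study's elicited judgments.

```python
# Minimal AHP sketch: weights from a pairwise comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],    # cost vs. awareness vs. infrastructure
              [1/3, 1.0, 2.0],    # (illustrative judgments on Saaty's scale)
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # priority weights, sum to 1

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)    # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index
print("weights:", w, "CR:", CI / RI)    # CR < 0.1 => acceptable consistency
```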
Distributed Evaluation Functions for Fault Tolerant Multi-Rover Systems
NASA Technical Reports Server (NTRS)
Agogino, Adrian; Turner, Kagan
2005-01-01
The ability to evolve fault tolerant control strategies for large collections of agents is critical to the successful application of evolutionary strategies to domains where failures are common. Furthermore, while evolutionary algorithms have been highly successful in discovering single-agent control strategies, extending such algorithms to multiagent domains has proven to be difficult. In this paper we present a method for shaping evaluation functions for agents that provide control strategies that are tolerant to different types of failures and lead to coordinated behavior in a multi-agent setting. This method relies on neither a centralized strategy (susceptible to a single point of failure) nor a distributed strategy where each agent uses a system-wide evaluation function (a severe credit assignment problem). In a multi-rover problem, we show that agents using our agent-specific evaluation perform up to 500% better than agents using the system evaluation. In addition, we show that agents are still able to maintain a high level of performance when up to 60% of the agents fail due to actuator, communication or controller faults.
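A common way to realize such agent-specific evaluations is a difference reward, D_i = G(z) - G(z_-i), which scores each agent by the change in the global evaluation when its contribution is replaced by a null counterfactual. The toy sketch below illustrates the idea; the global function and observation encoding are hypothetical, not the paper's rover domain.

```python
# Toy sketch of an agent-specific "difference" evaluation.
def global_eval(observations):
    """System-wide score, e.g. summed best value observed per rover."""
    return sum(max(obs) for obs in observations if obs)

def difference_reward(observations, i):
    """D_i = G(z) - G(z_-i): credit assigned to rover i alone."""
    counterfactual = list(observations)
    counterfactual[i] = []          # remove rover i's contribution
    return global_eval(observations) - global_eval(counterfactual)

obs = [[0.9, 0.2], [0.4], [0.4, 0.8]]   # per-rover observation values
rewards = [difference_reward(obs, i) for i in range(len(obs))]
print(rewards)
```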
NASA Astrophysics Data System (ADS)
Richardson, C. M.; Swarzenski, P. W.; Johnson, C.
2013-12-01
Coastal lagoons are highly productive systems with a strong dependence on the physico-chemical regime of their surrounding environment. Groundwater interactions with the nearshore environment can drive ecosystem stability and productivity. Lagoons with restricted surface connectivity interact with coastal waters via subsurface flow paths that follow natural hydraulic gradients, producing a dynamic freshwater-saltwater mixing zone with submarine groundwater discharge (SGD) regions that are tidally influenced. Recent studies demonstrate the importance of SGD in maintaining nearshore ecology through a number of processes, including enhanced chemical loadings, focused biogeochemical transformations, and complex water mixing scenarios (Slomp and Van Cappellen, 2004; Taniguchi et al., 2002). Groundwater discharge to the coastal ocean is often slow, diffuse and site-specific. Traditional methods used to evaluate SGD fluxes operate at varying scales and typically result in over- or underestimates of SGD. Novel monitoring and evaluation methods are required in order to better understand how coastal aquifer systems influence multi-scalar water and nutrient budgets. Recently developed methods to determine fluid exchange rates include the use of select U- and Th-series radionuclides, multi-channel resistivity imaging, as well as the integration of temperature data and 1-D analytical modeling. Groundwater fluxes were examined in a coastal lagoon system to characterize the physics of subsurface fluid transport evidenced by visible seepage faces at low tide. Fluid exchange rates were quantified to determine the spatial and temporal variability of groundwater movement using thermal time series, water level data, and a coupled radiotracer-geophysical method. Our investigation of subsurface characteristics and groundwater fluxes using both traditional and newly developed methods indicated that seasonal water inputs and tidal controls on water table elevation significantly influence the magnitude and direction of seepage fluxes. Hydraulic gradients created focused discharge regions towards the seepage faces, with tidally influenced average flow rates of up to 0.67 m3/day. Thermally derived vertical groundwater flow rates ranged from -0.59 m3/day to -1.0 m3/day, showing no correlation with the tide. Radon-222 was used as a complementary tracer, and multi-channel resistivity surveys confirmed the presence of a freshwater conduit. Our time-series analyses of groundwater fluxes into and out of the lagoon demonstrate the importance of monitoring these dynamic systems for longer time periods with a multi-scale approach. Slomp, C. P., & Van Cappellen, P. (2004). Nutrient inputs to the coastal ocean through submarine groundwater discharge: controls and potential impact. Journal of Hydrology, 295(1), 64-86. Taniguchi, M., Burnett, W. C., Cable, J. E., & Turner, J. V. (2002). Investigation of submarine groundwater discharge. Hydrological Processes, 16(11), 2115-2129.
Han, Guanghui; Liu, Xiabi; Zheng, Guangyuan; Wang, Murong; Huang, Shan
2018-06-06
Ground-glass opacity (GGO) is a common CT imaging sign on high-resolution CT, which means the lesion is more likely to be malignant compared to common solid lung nodules. The automatic recognition of GGO CT imaging signs is of great importance for early diagnosis and possible cure of lung cancers. Present GGO recognition methods employ traditional low-level features, and system performance is improving only slowly. Considering the high performance of CNN models in the computer vision field, we propose an automatic recognition method for 3D GGO CT imaging signs through the fusion of hybrid resampling and layer-wise fine-tuning CNN models. Our hybrid resampling is performed on multi-views and multi-receptive fields, which reduces the risk of missing small or large GGOs by adopting representative sampling panels and processing GGOs at multiple scales simultaneously. The layer-wise fine-tuning strategy has the ability to obtain the optimal fine-tuned model, and the multi-CNN-model fusion strategy obtains better performance than any single trained model. We evaluated our method on the GGO nodule samples in the publicly available LIDC-IDRI dataset of chest CT scans. The experimental results show that our method yields excellent results with 96.64% sensitivity, 71.43% specificity, and 0.83 F1 score. Our method is a promising approach for applying deep learning to the computer-aided analysis of specific CT imaging signs with insufficient labeled images.
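As a hedged sketch of the layer-wise fine-tuning strategy (the backbone, stage grouping, learning rates and binary task below are assumptions, not the authors' configuration), one can progressively unfreeze a pretrained CNN from the classifier head downward:

```python
# Illustrative layer-wise fine-tuning of a pretrained CNN (PyTorch).
import torch
import torchvision

model = torchvision.models.resnet18(weights="IMAGENET1K_V1")
model.fc = torch.nn.Linear(model.fc.in_features, 2)   # GGO vs. non-GGO

# Candidate stages, unfrozen one at a time from the top down.
stages = [model.fc, model.layer4, model.layer3, model.layer2, model.layer1]
for p in model.parameters():
    p.requires_grad = False

for round_idx, stage in enumerate(stages):
    for p in stage.parameters():          # unfreeze one more stage per round
        p.requires_grad = True
    optim = torch.optim.SGD((p for p in model.parameters() if p.requires_grad),
                            lr=1e-3 * (0.5 ** round_idx))
    # ... train a few epochs here; keep the round with best validation F1 ...
```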
Sun, Guoxiang; Zhang, Jingxian
2009-05-01
The three-wavelength fusion high performance liquid chromatographic fingerprint (TWFFP) of Longdanxiegan pill (LDXGP) was established to assess the quality of LDXGP by the systematic quantified fingerprint method. The chromatographic fingerprints (CFPs) of 12 batches of LDXGP were determined by reversed-phase high performance liquid chromatography. The technique of multi-wavelength fusion fingerprinting was applied when processing the fingerprints. TWFFPs containing 63 common peaks were obtained when the baicalin peak was chosen as the reference peak. The 12 batches of LDXGP were classified by hierarchical clustering analysis using the macro qualitative similarity (S(m)) as the variable. According to the classification results, the referential fingerprint (RFP) was synthesized from 10 batches of LDXGP. Taking the RFP as the qualified model, all 12 batches of LDXGP were evaluated by the systematic quantified fingerprint method. Among the 12 batches, 9 were completely qualified, the contents of 1 batch were markedly higher, and the quantity and distributed proportion of chemical constituents in 2 batches were not qualified. The systematic quantified fingerprint method based on multi-wavelength fusion fingerprinting can effectively identify the authentic quality of traditional Chinese medicine.
Extended depth of field integral imaging using multi-focus fusion
NASA Astrophysics Data System (ADS)
Piao, Yongri; Zhang, Miao; Wang, Xiaohui; Li, Peihua
2018-03-01
In this paper, we propose a new method for depth-of-field extension in integral imaging by applying image fusion to multi-focus elemental images. In the proposed method, a camera is translated on a 2D grid to capture multi-focus elemental images by sweeping the focal plane across the scene. Simply applying an image fusion method to elemental images holding rich parallax information does not work effectively, because accurate image registration is a prerequisite for fusion. To solve this problem, an elemental image generalization method is proposed. The aim of this generalization process is to geometrically align the objects in all elemental images so that the correct regions of the multi-focus elemental images can be extracted. The all-in-focus elemental images are then generated by fusing the generalized elemental images using a block-based fusion method. The experimental results demonstrate that the depth of field of the synthetic aperture integral imaging system is extended by combining the generalization method with image fusion on multi-focus elemental images.
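For concreteness, here is a minimal sketch of block-based fusion on already generalized (aligned) elemental images, using local variance as the focus measure. The block size and focus measure are illustrative choices, not necessarily the authors'.

```python
# Minimal block-based multi-focus fusion sketch.
import numpy as np

def fuse_multifocus(images, block=16):
    """images: list of aligned 2-D arrays focused at different depths."""
    h, w = images[0].shape
    fused = np.zeros((h, w), dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            patches = [img[y:y+block, x:x+block] for img in images]
            best = int(np.argmax([p.var() for p in patches]))  # sharpest block
            fused[y:y+block, x:x+block] = patches[best]
    return fused
```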
A new multi-spectral feature level image fusion method for human interpretation
NASA Astrophysics Data System (ADS)
Leviner, Marom; Maltz, Masha
2009-03-01
Various different methods to perform multi-spectral image fusion have been suggested, mostly on the pixel level. However, the jury is still out on the benefits of a fused image compared to its source images. We present here a new multi-spectral image fusion method, multi-spectral segmentation fusion (MSSF), which uses a feature-level processing paradigm. To test our method, we compared human observer performance in a three-task experiment using MSSF against two established methods, averaging and principal component analysis (PCA), and against its two source bands, visible and infrared. The three tasks that we studied were: (1) simple target detection, (2) spatial orientation, and (3) camouflaged target detection. MSSF proved superior to the other fusion methods in all three tests; MSSF also outperformed the source images in the spatial orientation and camouflaged target detection tasks. Based on these findings, current speculation about the circumstances in which multi-spectral image fusion in general, and specific fusion methods in particular, would be superior to using the original image sources can be further addressed.
Single well surfactant test to evaluate surfactant floods using a multi-tracer method
Sheely, Clyde Q.
1979-01-01
Data useful for evaluating the effectiveness of, or designing, an enhanced recovery process, said process involving mobilizing and moving hydrocarbons through a hydrocarbon-bearing subterranean formation from an injection well to a production well by injecting a mobilizing fluid into the injection well, comprising (a) determining hydrocarbon saturation in a volume in the formation near a well bore penetrating the formation, (b) injecting sufficient mobilizing fluid to mobilize and move hydrocarbons from a volume in the formation near the well bore, and (c) determining the hydrocarbon saturation in a volume including at least a part of the volume of (b) by an improved single well surfactant method comprising injecting 2 or more slugs of water containing the primary tracer separated by water slugs containing no primary tracer. Alternatively, the plurality of ester tracers can be injected in a single slug, said tracers penetrating varying distances into the formation, wherein the esters have different partition coefficients and essentially equal reaction times. The single well tracer method employed is disclosed in U.S. Pat. No. 3,623,842. This method, designated the single well surfactant test (SWST), is useful for evaluating the effect of surfactant floods, polymer floods, carbon dioxide floods, micellar floods, caustic floods and the like in subterranean formations in much less time and at much reduced cost compared to conventional multiwell pilot tests.
Kwon, Min-Seok; Nam, Seungyoon; Lee, Sungyoung; Ahn, Young Zoo; Chang, Hae Ryung; Kim, Yon Hui; Park, Taesung
2017-01-01
The recent creation of enormous, cancer-related “Big Data” public depositories represents a powerful means for understanding tumorigenesis. However, a consistently accurate system for clinically evaluating single/multi-biomarkers remains lacking, and it has been asserted that oft-failed clinical advancement of biomarkers occurs within the very early stages of biomarker assessment. To address these challenges, we developed a clinically testable, web-based tool, CANcer-specific single/multi-biomarker Evaluation System (CANES), to evaluate biomarker effectiveness, across 2,134 whole transcriptome datasets, from 94,147 biological samples (from 18 tumor types). For user-provided single/multi-biomarkers, CANES evaluates the performance of single/multi-biomarker candidates, based on four classification methods, support vector machine, random forest, neural networks, and classification and regression trees. In addition, CANES offers several advantages over earlier analysis tools, including: 1) survival analysis; 2) evaluation of mature miRNAs as markers for user-defined diagnostic or prognostic purposes; and 3) provision of a “pan-cancer” summary view, based on each single marker. We believe that such “landscape” evaluation of single/multi-biomarkers, for diagnostic therapeutic/prognostic decision-making, will be highly valuable for the discovery and “repurposing” of existing biomarkers (and their specific targeted therapies), leading to improved patient therapeutic stratification, a key component of targeted therapy success for the avoidance of therapy resistance. PMID:29050243
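To make the evaluation step concrete, here is an illustrative scikit-learn sketch of cross-validating a marker panel with the four classifier families named above (SVM, random forest, neural network, CART); the synthetic data, settings and AUC metric are assumptions, not CANES internals.

```python
# Illustrative four-classifier evaluation of a hypothetical marker panel.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

np.random.seed(0)
X = np.random.rand(200, 3)           # expression of a 3-gene marker panel
y = np.random.randint(0, 2, 200)     # tumor vs. normal labels

models = {
    "SVM": SVC(),
    "Random forest": RandomForestClassifier(n_estimators=200),
    "Neural network": MLPClassifier(max_iter=1000),
    "CART": DecisionTreeClassifier(),
}
for name, clf in models.items():
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {auc.mean():.2f} +/- {auc.std():.2f}")
```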
A novel application of artificial neural network for wind speed estimation
NASA Astrophysics Data System (ADS)
Fang, Da; Wang, Jianzhou
2017-05-01
Providing accurate multi-step wind speed estimation models has increasing significance because of the important technical and economic impacts of wind speed on power grid security and environmental benefits. In this study, combined strategies for wind speed forecasting are proposed based on an intelligent data processing system using artificial neural networks (ANNs). A generalized regression neural network and an Elman neural network are employed to form two hybrid models. The approach employs one ANN to model the samples, achieving data denoising and assimilation, and applies the other to predict wind speed using the pre-processed samples. The proposed method is demonstrated in terms of the prediction improvements of the hybrid models compared with a single ANN and a typical forecasting method. To provide sufficient cases for the study, four observation sites with monthly average wind speeds over four given years in Western China were used to test the models. Multiple evaluation methods demonstrated that the proposed method provides a promising alternative technique for monthly average wind speed estimation.
Multi-Connection Pattern Analysis: Decoding the representational content of neural communication.
Li, Yuanning; Richardson, Robert Mark; Ghuman, Avniel Singh
2017-11-15
The lack of multivariate methods for decoding the representational content of interregional neural communication has made it difficult to know what information is represented in distributed brain circuit interactions. Here we present Multi-Connection Pattern Analysis (MCPA), which works by learning mappings between the activity patterns of the populations as a function of the information being processed. These maps are used to predict the activity of one neural population based on the activity of the other population. Successful MCPA-based decoding indicates the involvement of distributed computational processing and provides a framework for probing the representational structure of the interaction. Simulations demonstrate the efficacy of MCPA in realistic circumstances. In addition, we demonstrate that MCPA can be applied to different signal modalities to evaluate a variety of hypotheses associated with information coding in neural communications. We apply MCPA to fMRI and human intracranial electrophysiological data to provide a proof-of-concept of the utility of this method for decoding individual natural images and faces in functional connectivity data. We further use an MCPA-based representational similarity analysis to illustrate how MCPA may be used to test computational models of information transfer among regions of the visual processing stream. Thus, MCPA can be used to assess the information represented in the coupled activity of interacting neural circuits and to probe the underlying principles of information transformation between regions. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Noh, Seong Jin; Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis
2016-04-01
There have been tremendous improvements in distributed hydrologic modeling (DHM), which have made process-based simulation with a high spatiotemporal resolution applicable on a large spatial scale. Despite increasing information on the heterogeneous properties of a catchment, DHM is still subject to uncertainties inherent in model structure, parameters and input forcing. Sequential data assimilation (DA) may facilitate improved streamflow prediction via DHM by using real-time observations to correct internal model states. In conventional DA methods such as state updating, however, parametric uncertainty is often ignored, mainly due to practical limitations of methodology for specifying modeling uncertainty with limited ensemble members. If parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of DHM may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method, incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm), can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we present a global multi-parametric ensemble approach to incorporate the parametric uncertainty of DHM in DA to improve streamflow predictions. To effectively represent and control the uncertainty of high-dimensional parameters with a limited number of ensemble members, the MPR method is incorporated with DA. Lagged particle filtering is utilized to consider the response times and non-Gaussian characteristics of internal hydrologic processes. Hindcasting experiments are implemented to evaluate the impacts of the proposed DA method on streamflow predictions in multiple European river basins having different climate and catchment characteristics. Because augmentation of parameters is not required within an assimilation window, the approach remains stable with limited ensemble members and is viable for practical use.
Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems
NASA Astrophysics Data System (ADS)
Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.
2016-12-01
Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing associated uncertainties from these analyses. Spatial analysis, big data and otherwise, of subsurface natural and engineered systems are based on variable resolution, discontinuous, and often point-driven data to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.
Evaluation of a Multi-Case Participatory Action Research Project: The Case of SOLINSA
ERIC Educational Resources Information Center
Home, Robert; Rump, Niels
2015-01-01
Purpose: Scholars agree that evaluation of participatory action research is inherently valuable; however there have been few attempts at evaluating across methods and across interventions because the perceived success of a method is affected by context, researcher skills and the aims of the participants. This paper describes the systematic…
In Situ Monitoring of Particle Consolidation During Low Pressure Cold Spray by Ultrasonic Techniques
NASA Astrophysics Data System (ADS)
Maev, R. Gr.; Titov, S.; Leshchynsky, V.; Dzhurinskiy, D.; Lubrick, M.
2011-06-01
This study tests the viability of examining the cold spray process using acoustic methods, specifically in situ testing during the actual spray process itself. Multiple composites studied with flat and multi-channel transducers are presented, together with the results of actual online measurements. It is shown that the final thickness as well as the dynamics of buildup can be evaluated (including plotting rates of buildup). Cross sections of the coating thickness are also easy to obtain and show true profiles of the coating. The data can also be used to generate real estimates for nozzle speed and spray diameter. Finally, comparisons of real thickness and acoustically estimated thickness show a close linear relationship. The data clearly show that online acoustic measurement is a viable method for estimating thickness buildup.
Evaluating the iterative development of VR/AR human factors tools for manual work.
Liston, Paul M; Kay, Alison; Cromie, Sam; Leva, Chiara; D'Cruz, Mirabelle; Patel, Harshada; Langley, Alyson; Sharples, Sarah; Aromaa, Susanna
2012-01-01
This paper outlines the approach taken to iteratively evaluate a set of VR/AR (virtual reality / augmented reality) applications for five different manual-work use cases - terrestrial spacecraft assembly, assembly-line design, remote maintenance of trains, maintenance of nuclear reactors, and large-machine assembly process design - and examines the evaluation data for evidence of the effectiveness of the evaluation framework as well as the benefits to the development process of feedback from iterative evaluation. ManuVAR is an EU-funded research project working to develop an innovative technology platform and a framework to support high-value, high-knowledge manual work throughout the product lifecycle. The results of this study demonstrate the iterative improvements achieved throughout the design cycles, observable in the trends of the quantitative results from three successive trials of the applications and in the qualitative interview findings. The paper discusses the limitations of evaluation in complex, multi-disciplinary development projects and finds evidence for the effectiveness of the particular set of complementary evaluation methods, incorporating a common inquiry structure, used for the evaluation - particularly in facilitating triangulation of the data.
Beyond mind-reading: multi-voxel pattern analysis of fMRI data.
Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V
2006-09-01
A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.
Semi-Supervised Multi-View Learning for Gene Network Reconstruction
Ceci, Michelangelo; Pio, Gianvito; Kuzmanovski, Vladimir; Džeroski, Sašo
2015-01-01
The task of gene regulatory network reconstruction from high-throughput data has received increasing attention in recent years. As a consequence, many inference methods for solving this task have been proposed in the literature. It has been recently observed, however, that no single inference method performs optimally across all datasets. It has also been shown that the integration of predictions from multiple inference methods is more robust and shows high performance across diverse datasets. Inspired by this research, in this paper, we propose a machine learning solution which learns to combine predictions from multiple inference methods. While this approach adds additional complexity to the inference process, we expect it would also carry substantial benefits. These would come from the automatic adaptation to patterns in the outputs of individual inference methods, so that it is possible to identify regulatory interactions more reliably when these patterns occur. This article demonstrates the benefits (in terms of accuracy of the reconstructed networks) of the proposed method, which exploits an iterative, semi-supervised ensemble-based algorithm. The algorithm learns to combine the interactions predicted by many different inference methods in the multi-view learning setting. The empirical evaluation of the proposed algorithm on a prokaryotic model organism (E. coli) and on a eukaryotic model organism (S. cerevisiae) clearly shows improved performance over the state-of-the-art methods. The results indicate that gene regulatory network reconstruction for the real datasets is more difficult for S. cerevisiae than for E. coli. The software, all the datasets used in the experiments and all the results are available for download at the following link: http://figshare.com/articles/Semi_supervised_Multi_View_Learning_for_Gene_Network_Reconstruction/1604827. PMID:26641091
Kushniruk, Andre; Senathirajah, Yalini; Borycki, Elizabeth
2017-01-01
The usability and safety of health information systems have become major issues in the design and implementation of useful healthcare IT. In this paper we describe a multi-phased multi-method approach to integrating usability engineering methods into system testing to ensure both usability and safety of healthcare IT upon widespread deployment. The approach involves usability testing followed by clinical simulation (conducted in-situ) and "near-live" recording of user interactions with systems. At key stages in this process, usability problems are identified and rectified forming a usability and technology-induced error "safety net" that catches different types of usability and safety problems prior to releasing systems widely in healthcare settings.
Analysis and evaluation of the applicability of green energy technology
NASA Astrophysics Data System (ADS)
Xu, Z. J.; Song, Y. K.
2017-11-01
With the growing seriousness of environmental issues and the shortage of resources, the applicability of green energy technology has attracted more and more attention from scholars in different fields. However, current research often takes a single perspective and uses simple methods. Drawing on the Theory of Applicable Technology, this paper analyzes and defines green energy technology and its applicability from the combined perspectives of economy, society, environment and science & technology, and constructs a corresponding evaluation index system. The paper further applies Fuzzy Comprehensive Evaluation to the assessment of this applicability, discusses the evaluation models and methods in depth, and explains them in detail with an example. The author holds that the applicability of green energy technology involves many aspects of economy, society, environment and science & technology and can be evaluated comprehensively by an index system composed of a number of independent indexes. The evaluation is multi-object, multi-factor, multi-level and fuzzy-comprehensive, for which Fuzzy Comprehensive Evaluation is a correct, effective and feasible approach. Understanding and comprehensively evaluating the applicability of green energy technology is of vital theoretical and practical significance for the rational development and utilization of green energy technology and for the better promotion of sustainable development of humans and nature.
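A minimal numpy sketch of the core Fuzzy Comprehensive Evaluation computation follows: an index weight vector W combined with a membership matrix R yields a grade membership vector B. All weights, membership values and the three-grade set are illustrative assumptions, not the paper's data.

```python
# Fuzzy comprehensive evaluation: B = W o R (weighted-average operator).
import numpy as np

W = np.array([0.35, 0.25, 0.25, 0.15])   # economy, society, environment, S&T
R = np.array([[0.5, 0.3, 0.2],           # membership of each index in the
              [0.4, 0.4, 0.2],           # grades {high, medium, low}
              [0.3, 0.5, 0.2],
              [0.6, 0.3, 0.1]])

B = W @ R                                 # weighted-average composition M(.,+)
B /= B.sum()                              # normalized evaluation vector
grade = ["high", "medium", "low"][int(np.argmax(B))]
print(B, "=> applicability judged", grade)
```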
Digital Image Correlation of 2D X-ray Powder Diffraction Data for Lattice Strain Evaluation
Zhang, Hongjia; Sui, Tan; Daisenberger, Dominik; Fong, Kai Soon
2018-01-01
High energy 2D X-ray powder diffraction experiments are widely used for lattice strain measurement. The 2D to 1D conversion of diffraction patterns is a necessary step used to prepare the data for full pattern refinement, but is inefficient when only peak centre position information is required for lattice strain evaluation. The multi-step conversion process is likely to lead to increased errors associated with the ‘caking’ (radial binning) or fitting procedures. A new method is proposed here that relies on direct Digital Image Correlation analysis of 2D X-ray powder diffraction patterns (XRD-DIC, for short). As an example of using XRD-DIC, residual strain values along the central line in a Mg AZ31B alloy bar after 3-point bending are calculated by using both XRD-DIC and the conventional ‘caking’ with fitting procedures. Comparison of the results for strain values in different azimuthal angles demonstrates excellent agreement between the two methods. The principal strains and directions are calculated using multiple direction strain data, leading to full in-plane strain evaluation. It is therefore concluded that XRD-DIC provides a reliable and robust method for strain evaluation from 2D powder diffraction data. The XRD-DIC approach simplifies the analysis process by skipping 2D to 1D conversion, and opens new possibilities for robust 2D powder diffraction data analysis for full in-plane strain evaluation. PMID:29543728
USDA-ARS?s Scientific Manuscript database
Process evaluations of large-scale school based programs are necessary to aid in the interpretation of the outcome data. The Louisiana Health (LA Health) study is a multi-component childhood obesity prevention study for middle school children. The Physical Education (PEQ), Intervention (IQ), and F...
USDA-ARS?s Scientific Manuscript database
The process evaluation of HEALTHY, a large multi-center trial to decrease type 2 diabetes mellitus in middle school children, monitored the implementation of the intervention to ascertain the extent that components were delivered and received as intended. The purpose of this article is to report the...
Botti, Mari; Redley, Bernice; Nguyen, Lemai; Coleman, Kimberley; Wickramasinghe, Nilmini
2015-01-01
This research focuses on a major health priority for Australia by addressing existing gaps in the implementation of nursing informatics solutions in healthcare. It serves to inform the successful deployment of IT solutions designed to support patient-centered, frontline acute healthcare delivery by multidisciplinary care teams. The outcomes can guide future evaluations of the contribution of IT solutions to the efficiency, safety and quality of care delivery in acute hospital settings.
Application of advanced structure to multi-tone mask for FPD process
NASA Astrophysics Data System (ADS)
Song, Jin-Han; Jeong, Jin-Woong; Kim, Kyu-Sik; Jeong, Woo-Gun; Yun, Sang-Pil; Lee, Dong-Heok; Choi, Sang-Soo
2017-07-01
In accordance with improvements in FPD technology, masks for particular purposes, such as the phase shift mask (PSM) and the multi-tone mask (MTM), have also been developed. Above all, an MTM with three or more transmittance levels has the substantial advantage of reducing the number of masks required in the FPD fabrication process, in contrast to a normal two-tone mask [1,2]. A chromium (Cr)-based MTM (typically the top type) is widely employed because its all-Cr structure, consisting of a Cr absorber layer and a Cr half-tone layer, makes the etch process convenient. However, the top type of Cr-based MTM demands two Cr sputtering processes after each layer etching process and writing process. For this reason, a material different from that of the Cr-based MTM is required to reduce mask fabrication time and cost. In this study, we evaluate an MTM with a structure combining Cr and molybdenum silicide (MoSi) to resolve the issues mentioned above. MoSi, which is proven in integrated circuit (IC) processes, is a suitable material for MTM evaluation. This structure can realize multiple transmittance levels in common with the Cr-based MTM. Moreover, it reduces the number of sputtering processes. We investigate an optimized structure, considering productivity along with performance metrics such as critical dimension (CD) variation and the transmittance range of each structure. The transmittance is targeted at the h-line wavelength (405 nm) in the evaluation. The performance of all Cr-/MoSi-based MTMs is compared with that of the Cr-based MTM.
Chen, Zhe; Zhang, Fumin; Qu, Xinghua; Liang, Baoqiu
2015-01-01
In this paper, we propose a new approach for the measurement and reconstruction of large workpieces with freeform surfaces. The system consists of a handheld laser scanning sensor and a position sensor. The laser scanning sensor is used to acquire the surface and geometry information, and the position sensor is utilized to unify the scanning sensors into a global coordinate system. The measurement process includes data collection, multi-sensor data fusion and surface reconstruction. With the multi-sensor data fusion, errors accumulated during the image alignment and registration process are minimized, and the measuring precision is significantly improved. After the dense, accurate acquisition of the three-dimensional (3-D) coordinates, the surface is reconstructed using commercial software based on Non-Uniform Rational B-Spline (NURBS) surfaces. The system has been evaluated, both qualitatively and quantitatively, using reference measurements provided by a commercial laser scanning sensor. The method has been applied to the reconstruction of a large gear rim, with an accuracy of up to 0.0963 mm. The results prove that this new combined method is promising for measuring and reconstructing large-scale objects with complex surface geometry. Compared with reported methods of large-scale shape measurement, it offers high freedom of motion, high precision and high measurement speed over a wide measurement range. PMID:26091396
A Low Power, Parallel Wearable Multi-Sensor System for Human Activity Evaluation.
Li, Yuecheng; Jia, Wenyan; Yu, Tianjian; Luan, Bo; Mao, Zhi-Hong; Zhang, Hong; Sun, Mingui
2015-04-01
In this paper, the design of a low power heterogeneous wearable multi-sensor system, built with a Zynq System-on-Chip (SoC), for human activity evaluation is presented. The powerful data processing capability and flexibility of this SoC represent significant improvements over our previous ARM-based system designs. The new system captures and compresses multiple color images and sensor data simultaneously. Several strategies are adopted to minimize power consumption. Our wearable system provides a new tool for the evaluation of human activity, including diet, physical activity and lifestyle.
Design of multi-energy fields coupling testing system of vertical axis wind power system
NASA Astrophysics Data System (ADS)
Chen, Q.; Yang, Z. X.; Li, G. S.; Song, L.; Ma, C.
2016-08-01
As one form of renewable energy, wind power's conversion efficiency is a focus of research and concern. Present methods of enhancing the conversion efficiency mostly involve improving the wind rotor structure and optimizing the generator parameters and the energy storage controller. Because the conversion process involves energy conversion across multiple energy fields, such as wind energy, mechanical energy and electrical energy, the coupling effects between them influence the overall conversion efficiency. In this paper, using system integration analysis technology, a testing system based on multi-energy field coupling (MEFC) for a vertical axis wind power system is proposed. Once the wind rotor reaches maximum efficiency, the generator parameters can be matched to the rotor's output performance. The voltage controller transfers the unstable electric power to the battery while optimizing parameters such as charging time and charging voltage. Through the communication connection and regulation of the upper computer system (UCS), the coupling parameters can be configured to an optimal state, improving the overall conversion efficiency. This method can test whole wind turbine (WT) performance systematically and evaluate the design parameters effectively. It not only provides a testing method for the system structure design and parameter optimization of the wind rotor, generator and voltage controller, but also provides a new testing method for optimizing the overall performance of a vertical axis wind energy conversion system (WECS).
A GIS-based method for multi-objective evaluation of park vegetation. (R824766)
In this paper we describe a method for evaluating the concordance between a set of mapped landscape attributes and a set of quantitatively expressed management priorities. The method has proved to be useful in planning urban green areas, allowing objectively d...
Diabetes Bingo: Research Prioritization with the Filipino Community
Oculto, Tessie; Ramones, Emilyn; Caagbay, Cedric R
2010-01-01
This community-based participatory research, conducted in partnership between a European-American academic researcher and a professional group of Filipino nurses, aimed to determine the diabetes research priority for the Filipino community on the island of O‘ahu in Hawai‘i, and to evaluate the multi-voting technique to seek input from the community. The study design was a qualitative, cross-sectional interactive process consisting of an educational presentation followed by data collection from the audience. Ten community presentations about the impact of diabetes on the Filipino community were conducted by a Filipino nurse with participants (N = 265). Following the educational session, the participants selected priorities for research using a multi-vote technique developed as a Diabetes Bingo card. Community voting results identified prevention and a focus on adults as important priorities for research. Based on the results of the multi-voting, the research partners were able to come to consensus on a research priority area of prevention of type 2 diabetes in adults. Multi-voting using a Diabetes Bingo card, preceded by an educational presentation by a Filipino nurse, was a culturally competent community-based participatory research method that gave voice to the participants and direction to the research partners for future projects. The multi-voting technique was readily accepted and enjoyed by participants. PMID:21229487
Research on intelligent machine self-perception method based on LSTM
NASA Astrophysics Data System (ADS)
Wang, Qiang; Cheng, Tao
2018-05-01
In this paper, we exploit the advantages of LSTM in feature extraction and in processing high-dimensional, complex nonlinear data, and apply it to the autonomous perception of intelligent machines. Compared with a traditional multi-layer neural network, this model has memory and can handle time-series information of any length. Since the multi-physical-domain signals of processing machines follow a temporal order, and successive states are contextually related, this deep learning method offers strong versatility and adaptability for realizing the self-perception of intelligent processing machines. The experimental results show that the proposed method markedly improves sensing accuracy under the various working conditions of the intelligent machine, and that the algorithm can effectively support self-perception in intelligent processing machines.
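A minimal PyTorch sketch of the kind of recurrent classifier this describes follows; the channel count, hidden size and four machine states are assumptions for illustration, not the paper's architecture.

```python
# Illustrative LSTM classifier over multi-physical-domain sensor sequences.
import torch
import torch.nn as nn

class StateLSTM(nn.Module):
    def __init__(self, n_channels=6, hidden=64, n_states=4):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_states)

    def forward(self, x):                  # x: (batch, time, channels)
        _, (h, _) = self.lstm(x)           # final hidden state summarizes the
        return self.head(h[-1])            # whole (variable-length) sequence

model = StateLSTM()
logits = model(torch.randn(8, 200, 6))    # 8 sequences, 200 time steps
```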
Liu, Yang; Luo, Zhi-Qiang; Lv, Bei-Ran; Zhao, Hai-Yu; Dong, Ling
2016-04-01
The multiple components in Chinese herbal medicines (CHMs) undergo complex absorption and metabolism before entering the blood system. Previous studies often lay emphasis on the components in blood; however, the dynamic and sequential absorption and metabolism process following multi-component oral administration has not been studied. In this study, the in situ closed-loop method combined with LC-MS techniques was employed to study the sequential process of Chuanxiong Rhizoma decoction (RCD). A total of 14 major components were identified in RCD. Among them, ferulic acid, senkyunolide J, senkyunolide I, senkyunolide F, senkyunolide G, and butylidenephthalide were detected in all of the samples, indicating that these six components could be absorbed into blood in prototype. Butylphthalide, E-ligustilide, Z-ligustilide, cnidilide, senkyunolide A and senkyunolide Q were not detected in any of the samples, suggesting that these six components may not be absorbed, or may be metabolized before entering the hepatic portal vein. Senkyunolide H could be metabolized by the liver, while senkyunolide M could be metabolized by both the liver and the intestinal flora. This study clearly demonstrated the changes in the absorption and metabolism process following multi-component oral administration of RCD, converting the static view of multi-component absorption into a comprehensive, dynamic and continuous picture of absorption and metabolism. Copyright© by the Chinese Pharmaceutical Association.
Stainbrook, Kristin; Penney, Darby; Elwyn, Laura
2015-06-01
Multi-site evaluations, particularly of federally funded service programs, pose a special set of challenges for program evaluation. Not only are there contextual differences related to project location, there are often relatively few programmatic requirements, which results in variations in program models, target populations and services. The Jail Diversion and Trauma Recovery-Priority to Veterans (JDTR) National Cross-Site Evaluation was tasked with conducting a multi-site evaluation of thirteen grantee programs that varied along multiple domains. This article describes the use of a mixed methods evaluation design to understand the jail diversion programs and client outcomes for veterans with trauma, mental health and/or substance use problems. We discuss the challenges encountered in evaluating diverse programs, the benefits of the evaluation in the face of these challenges, and offer lessons learned for other evaluators undertaking this type of evaluation. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Xiaohui; Couwenhoven, Mary E.; Foos, David H.; Doran, James; Yankelevitz, David F.; Henschke, Claudia I.
2008-03-01
An image-processing method has been developed to improve the visibility of tube and catheter features in portable chest x-ray (CXR) images captured in the intensive care unit (ICU). The image-processing method is based on a multi-frequency approach, wherein the input image is decomposed into different spatial frequency bands, and those bands that contain the tube and catheter signals are individually enhanced by nonlinear boosting functions. Using a random sampling strategy, 50 cases were retrospectively selected for the study from a large database of portable CXR images that had been collected from multiple institutions over a two-year period. All images used in the study were captured using photo-stimulable, storage phosphor computed radiography (CR) systems. Each image was processed in two ways: with default image processing parameters such as those used in clinical settings (control), and separately with the new tube and catheter enhancement algorithm (test). Three board-certified radiologists participated in a reader study to assess differences in both detection-confidence performance and diagnostic efficiency between the control and test images. Images were evaluated on a diagnostic-quality, 3-megapixel monochrome monitor. Two scenarios were studied: the baseline scenario, representative of today's workflow (a single control image presented with the window/level adjustments enabled), vs. the test scenario (a control/test image pair presented with a toggle enabled and the window/level settings disabled). The radiologists were asked to read the images in each scenario as they normally would for clinical diagnosis. Trend analysis indicates that the test scenario offers improved reading efficiency while providing as good or better detection capability compared to the baseline scenario.
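The band-split-and-boost idea lends itself to a compact sketch. Below is a hedged illustration using differences of Gaussian-blurred images as frequency bands with a soft nonlinear gain on the mid-frequency bands; the band count, sigmas and gains are invented for illustration and are not the product algorithm.

```python
# Illustrative multi-frequency enhancement of thin tube/catheter edges.
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance(img, sigmas=(1, 2, 4, 8), gains=(1.0, 1.8, 1.6, 1.0)):
    img = img.astype(float)
    levels = [img] + [gaussian_filter(img, s) for s in sigmas]
    out = levels[-1]                                    # coarsest base layer
    for k in range(len(sigmas)):
        band = levels[k] - levels[k + 1]                # band-pass detail layer
        out = out + gains[k] * 20.0 * np.tanh(band / 20.0)  # soft nonlinear boost
    return out
```

With all gains set to 1.0 and the soft clipping removed, the bands sum back to the original image, so the gain profile alone controls which spatial frequencies are emphasized.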
Development of Chemical Process Design and Control for ...
This contribution describes a novel process systems engineering framework that couples advanced control with sustainability evaluation and decision making for the optimization of process operations to minimize environmental impacts associated with products, materials, and energy. The implemented control strategy combines a biologically inspired method with optimal control concepts for finding more sustainable operating trajectories. The sustainability assessment of process operating points is carried out by using the U.S. E.P.A.’s Gauging Reaction Effectiveness for the ENvironmental Sustainability of Chemistries with a multi-Objective Process Evaluator (GREENSCOPE) tool that provides scores for the selected indicators in the economic, material efficiency, environmental and energy areas. The indicator scores describe process performance on a sustainability measurement scale, effectively determining which operating point is more sustainable if there are more than several steady states for one specific product manufacturing. Through comparisons between a representative benchmark and the optimal steady-states obtained through implementation of the proposed controller, a systematic decision can be made in terms of whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous fermentation process for fuel production, whose materi
High-frequency stock linkage and multi-dimensional stationary processes
NASA Astrophysics Data System (ADS)
Wang, Xi; Bao, Si; Chen, Jingchao
2017-02-01
In recent years, China's stock market has experienced dramatic fluctuations; in particular, in the second half of 2014 and in 2015, the market rose sharply and then fell quickly. Many classical financial phenomena, such as stock plate (sector) linkage, appeared repeatedly during this period. These phenomena have usually been studied using daily-level or minute-level data. Our paper focuses on the linkage phenomenon in Chinese stock 5-second-level data during this extremely volatile period. The method used to select the linkage points and the arbitrage strategy are both based on multi-dimensional stationary processes. A new program method for testing the multi-dimensional stationary process is proposed in our paper, and the detailed program is presented in the paper's appendix. Given the existence of the stationary process, the strategy's logarithmic cumulative average return converges by the strong ergodic theorem, which ensures the effectiveness of the stocks' linkage points and a more stable statistical arbitrage strategy.
Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M
2017-02-01
Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions and defined in the context of data quality assessment: a global probabilistic deviation and a source probabilistic outlyingness metric. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the source PDFs. The metrics have been evaluated and have demonstrated their correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. Biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
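A rough sketch of the main ingredients is given below: the pairwise Jensen-Shannon distance matrix among source PDFs and a naive per-source outlyingness relative to the average distribution. The simplex construction and the exact estimators of the paper are not reproduced; the PDFs are toy values.

```python
# Toy Jensen-Shannon stability ingredients for three data sources.
import numpy as np
from scipy.spatial.distance import jensenshannon

sources = np.array([[0.20, 0.50, 0.30],   # one discrete PDF per data source
                    [0.25, 0.45, 0.30],
                    [0.60, 0.20, 0.20]])

D = np.array([[jensenshannon(p, q) for q in sources] for p in sources])
central = sources.mean(axis=0)             # stand-in for the latent central PDF
outlyingness = [jensenshannon(p, central) for p in sources]
print(D.round(3))                          # pairwise JS distance matrix
print(np.round(outlyingness, 3))           # source 3 stands out as atypical
```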
NASA Astrophysics Data System (ADS)
Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng
2017-08-01
Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
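The decomposition can be made explicit. Under a standard Debye-type single-relaxation model (a common form in this literature; the paper's exact notation is not reproduced here), the frequency-dependent effective specific heat is a high-frequency part plus a sum of N single-relaxation contributions, so measuring absorption and sound speed at 2N frequencies supplies enough constraints to solve for the N relaxation times and N strengths:

```latex
% Debye-type superposition assumed for illustration:
%   \tau_i - relaxation time of interior single-relaxation process i
%   C_i    - relaxation strength (specific-heat contribution) of process i
C_v^{\mathrm{eff}}(\omega) \;=\; C_v^{\infty} \;+\; \sum_{i=1}^{N} \frac{C_i}{1 + j\,\omega\,\tau_i}
```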
Zhao, Ming; Li, Yu; Peng, Leilei
2014-01-01
We report a fast non-iterative lifetime data analysis method for the Fourier multiplexed frequency-sweeping confocal FLIM (Fm-FLIM) system [Opt. Express 22, 10221 (2014)]. The new method, named R-method, allows fast multi-channel lifetime image analysis in the system’s FPGA data processing board. Experimental tests proved that the performance of the R-method is equivalent to that of single-exponential iterative fitting, and its sensitivity is well suited for time-lapse FLIM-FRET imaging of live cells, for example cyclic adenosine monophosphate (cAMP) level imaging with GFP-Epac-mCherry sensors. With the R-method and its FPGA implementation, multi-channel lifetime images can now be generated in real time on the multi-channel frequency-sweeping FLIM system, and live readout of FRET sensors can be performed during time-lapse imaging. PMID:25321778
Miller, Julie M; Dewey, Marc; Vavere, Andrea L; Rochitte, Carlos E; Niinuma, Hiroyuki; Arbab-Zadeh, Armin; Paul, Narinder; Hoe, John; de Roos, Albert; Yoshioka, Kunihiro; Lemos, Pedro A; Bush, David E; Lardo, Albert C; Texter, John; Brinker, Jeffery; Cox, Christopher; Clouse, Melvin E; Lima, João A C
2009-04-01
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and high sensitivity and negative predictive value as found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios suggestive of small study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography in nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental trial CORE-64 has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
NASA Astrophysics Data System (ADS)
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has changed from the traditional single plant to a multi-site supply chain where multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
2014-01-01
Background Systematic planning could improve the generally moderate effectiveness of interventions to enhance adherence to clinical practice guidelines. The aim of our study was to demonstrate how the process of Intervention Mapping was used to develop an intervention to address the lack of adherence to the national CPG for low back pain by Dutch physical therapists. Methods We systematically developed a program to improve adherence to the Dutch physical therapy guidelines for low back pain. Based on multi-method formative research, we formulated program and change objectives. Selected theory-based methods of change and practical applications were combined into an intervention program. Implementation and evaluation plans were developed. Results Formative research revealed influential determinants for physical therapists and practice quality managers. Self-regulation was appropriate because both the physical therapists and the practice managers needed to monitor current practice and make and implement plans for change. The program stimulated interaction between practice levels by emphasizing collective goal setting. It combined practical applications, such as knowledge transfer and discussion-and-feedback, based on theory-based methods, such as consciousness raising and active learning. The implementation plan incorporated the wider environment. The evaluation plan included an effect and process evaluation. Conclusions Intervention Mapping is a useful framework for using formative data in program planning in the field of clinical guideline implementation. However, a decision aid for selecting, among the determinants of guideline adherence identified in the formative research, those to address in the problem analysis may increase the efficiency of applying the Intervention Mapping process. PMID:24428945
Liu, Jinjun; Leng, Yonggang; Lai, Zhihui; Fan, Shengbo
2018-04-25
Mechanical fault diagnosis usually requires not only identification of the fault characteristic frequency, but also detection of its second and/or higher harmonics. However, it is difficult to detect a multi-frequency fault signal through the existing Stochastic Resonance (SR) methods, because the characteristic frequency of the fault signal, as well as its second and higher harmonic frequencies, tend to be large parameters. To solve the problem, this paper proposes a multi-frequency signal detection method based on Frequency Exchange and Re-scaling Stochastic Resonance (FERSR). In the method, frequency exchange is implemented using a filtering technique and Single SideBand (SSB) modulation. This new method can overcome the limitation of the "sampling ratio", which is the ratio of the sampling frequency to the frequency of the target signal. It also ensures that the multi-frequency target signals can be processed to meet the small-parameter conditions. Simulation results demonstrate that the method shows good performance for detecting a multi-frequency signal with a low sampling ratio. Two practical cases are employed to further validate the effectiveness and applicability of this method.
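The frequency-exchange step can be illustrated with a short sketch. Below, a hedged Python example shifts a large characteristic frequency down to a small one via single-sideband modulation built from the analytic (Hilbert-transformed) signal; all signal parameters are invented, and the authors' filtering and re-scaling SR stages are not reproduced.

```python
# Sketch of the frequency-exchange idea: shift a high-frequency
# component down by f_shift using SSB modulation built from the
# analytic signal. Parameter values are illustrative only.
import numpy as np
from scipy.signal import hilbert

fs = 20_000.0                      # sampling frequency [Hz]
t = np.arange(0, 1.0, 1 / fs)
f_char = 1_200.0                   # fault characteristic frequency
x = np.sin(2 * np.pi * f_char * t) + 0.5 * np.sin(2 * np.pi * 2 * f_char * t)

f_shift = 1_150.0                  # move the 1200 Hz line down to 50 Hz
analytic = hilbert(x)              # x + j * Hilbert{x}: positive freqs only
shifted = np.real(analytic * np.exp(-2j * np.pi * f_shift * t))

# The 1200 Hz line now sits at 50 Hz (and 2400 Hz at 1250 Hz), small
# relative to fs, i.e. closer to small-parameter SR conditions.
spec = np.abs(np.fft.rfft(shifted))
peak = np.fft.rfftfreq(len(shifted), 1 / fs)[np.argmax(spec)]
print(f"dominant frequency after shift: {peak:.1f} Hz")
```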
Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu
2016-01-01
Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127
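As an illustration of the block-matching idea, here is a minimal exhaustive-search sketch in Python using the sum of absolute differences (SAD); the block and search sizes are arbitrary choices, and the paper's CS acquisition and recovery pipeline is not reproduced.

```python
# Minimal exhaustive block-matching sketch (SAD criterion); the
# actual CS-video pipeline in the paper is more involved.
import numpy as np

def block_match(ref, cur, block=8, search=4):
    """Return per-block (dy, dx) motion vectors from cur back to ref."""
    H, W = ref.shape
    vectors = np.zeros((H // block, W // block, 2), dtype=int)
    for by in range(0, H - block + 1, block):
        for bx in range(0, W - block + 1, block):
            patch = cur[by:by + block, bx:bx + block].astype(int)
            best, best_v = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= H - block and 0 <= x <= W - block:
                        cand = ref[y:y + block, x:x + block].astype(int)
                        sad = np.abs(patch - cand).sum()
                        if sad < best:
                            best, best_v = sad, (dy, dx)
            vectors[by // block, bx // block] = best_v
    return vectors

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (32, 32), dtype=np.uint8)
cur = np.roll(ref, shift=(2, -1), axis=(0, 1))    # known global motion
print(block_match(ref, cur)[1, 1])                # interior block: [-2  1]
```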
Abrahamse, Mariëlle E; Jonkman, Caroline S; Harting, Janneke
2018-04-10
The large number of children who grow up in poverty is concerning, especially given the negative developmental outcomes that can persist into adulthood. Poverty has been found to be a risk factor that negatively affects academic achievement and health outcomes in children. Interdisciplinary interventions can be an effective way to promote health and academic achievement. The present study aims to evaluate a school-based interdisciplinary approach on child health, poverty, and academic achievement using a mixed-methods design. Taken together, the outcomes of this study will increase knowledge about effective ways to give disadvantaged children equal chances early in their lives. An observational study with a mixed-methods design, including both quantitative and qualitative data collection methods, will be used to evaluate the interdisciplinary approach. The overall research project consists of three parts: a longitudinal study, a cross-sectional study, and a process evaluation. Using a multi-source approach, we will assess child health as the primary outcome. Child poverty and child academic achievement will be assessed as secondary outcomes. The process evaluation will observe the program's effects on the school environment and the program's implementation in order to obtain more knowledge on how to disseminate the interdisciplinary approach to other schools and neighborhoods. The implementation of a school-based interdisciplinary approach via primary schools, combining the cross-sectoral domains of health, poverty, and academic achievement, is innovative and a step forward in reaching an ethnic minority population. However, the large variety of interventions and activities within the approach can limit the validity of the study. Including a process evaluation will therefore help to improve the interpretation of our findings. In order to contribute to policy and practice focused on decreasing the unequal chances of children growing up in deprived neighborhoods, it is important to study whether the intervention leads to positive developmental outcomes in children. (NTR 6571; retrospectively registered on August 4, 2017).
Thermodynamic Modelling of Phase Transformation in a Multi-Component System
NASA Astrophysics Data System (ADS)
Vala, J.
2007-09-01
Diffusion in multi-component alloys can be characterized by the vacancy mechanism for substitutional components, by the existence of sources and sinks for vacancies and by the motion of atoms of interstitial components. The description of the diffusive and massive phase transformation of a multi-component system is based on the thermodynamic extremal principle by Onsager; the finite thickness of the interface between both phases is respected. The resulting system of partial differential equations of evolution with integral terms for unknown mole fractions (and additional variables in case of non-ideal sources and sinks for vacancies) can be analyzed using the method of lines and the finite difference technique (or, alternatively, the finite element one), together with semi-analytic and numerical integration formulae and with a certain iteration procedure, making use of the spectral properties of linear operators. The original software code for the numerical evaluation of solutions of such systems, written in MATLAB, offers a chance to simulate various real processes of diffusional phase transformation. Some results for (nearly) steady-state real processes in substitutional alloys have already been published. The aim of this paper is to demonstrate that the same approach can handle both substitutional and interstitial components, even in the case of a general system of evolution.
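The method of lines named in the abstract can be illustrated on a toy problem: discretize space by finite differences and hand the resulting ODE system to a time integrator. The Python sketch below does this for 1-D diffusion of a single component; it only illustrates the numerical approach, not the multi-component transformation model itself.

```python
# Method-of-lines sketch: finite differences in space, ODE solver in
# time, on a toy 1-D diffusion problem (not the multi-component model).
import numpy as np
from scipy.integrate import solve_ivp

L, nx, D = 1.0, 51, 1e-3                    # length, grid points, diffusivity
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
c0 = np.exp(-((x - 0.5 * L) ** 2) / 0.01)   # initial mole-fraction bump

def rhs(t, c):
    """Semi-discrete diffusion with zero-flux boundaries."""
    dcdt = np.empty_like(c)
    dcdt[1:-1] = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    dcdt[0] = 2 * D * (c[1] - c[0]) / dx**2      # mirrored ghost node
    dcdt[-1] = 2 * D * (c[-2] - c[-1]) / dx**2
    return dcdt

sol = solve_ivp(rhs, (0.0, 5.0), c0, t_eval=[5.0])
print("peak mole fraction after t = 5:", float(sol.y[:, -1].max()))
```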
NASA Astrophysics Data System (ADS)
Li, Y. Chao; Ding, Q.; Gao, Y.; Ran, L. Ling; Yang, J. Ru; Liu, C. Yu; Wang, C. Hui; Sun, J. Feng
2014-07-01
This paper proposes a novel method of multi-beam laser heterodyne measurement of the Young modulus. Based on the Doppler effect and heterodyne technology, the method loads the length-variation information onto the frequency difference of the multi-beam laser heterodyne signal through frequency modulation of an oscillating mirror; demodulating this signal yields many simultaneous values of the length variation caused by the mass variation. Processing these values by weighted averaging gives the length variation accurately and, finally, the value of the Young modulus of the sample. This novel method was used to simulate the measurement of the Young modulus of a wire under different masses in MATLAB; the obtained results show that the relative measurement error of this method is just 0.3%.
Effect of liquid-to-solid ratio on semi-solid Fenton process in hazardous solid waste detoxication.
Hu, Li-Fang; Feng, Hua-Jun; Long, Yu-Yang; Zheng, Yuan-Ge; Fang, Cheng-Ran; Shen, Dong-Sheng
2011-01-01
The liquid-to-solid ratio (L/S) of the semi-solid Fenton process (SSFP) designated for hazardous solid waste detoxication was investigated. The removal and minimization of o-nitroaniline (ONA) in simulated solid waste residue (SSWR) from the organic arsenic industry were evaluated by total organic carbon (TOC) and ONA removal efficiencies, respectively. Initially, a Box-Behnken design (BBD) and response surface methodology (RSM) were used to optimize the key factors of the SSFP. Results showed that the removal rates of TOC and ONA decreased as L/S increased. Subsequently, four target initial ONA concentrations, 100 mg kg(-1), 1 g kg(-1), 10 g kg(-1), and 100 g kg(-1) on a dry basis, were evaluated for the effect of L/S. A significant cubic empirical model between the initial ONA concentration and L/S was successfully developed to predict the optimal L/S for a given initial ONA concentration in the SSFP. Moreover, an optimized operation strategy of multi-SSFP for different cases was determined based on the residual target pollutant concentration and the corresponding environmental conditions. The total L/S of the multi-SSFP in all tested scenarios was no greater than 3.8, which is lower than that of conventional slurry systems (L/S ≥ 5). The multi-SSFP is environment-friendly when used for the detoxication of hazardous solid waste contaminated by ONA and provides a potential method for the detoxication of hazardous solid waste contaminated by organics. Copyright © 2010 Elsevier Ltd. All rights reserved.
Abdulqader Hussein, Ahmed; Rahman, Tharek A.; Leow, Chee Yen
2015-01-01
Localization is an apparent aspect of a wireless sensor network, which is the focus of much interesting research. One of the severe conditions that needs to be taken into consideration is localizing a mobile target through a dispersed sensor network in the presence of physical barrier attacks. These attacks confuse the localization process and cause location estimation errors. Range-based methods, like the received signal strength indication (RSSI), face the major influence of this kind of attack. This paper proposes a solution based on a combination of multi-frequency multi-power localization (C-MFMPL) and step function multi-frequency multi-power localization (SF-MFMPL), including the fingerprint matching technique and lateration, to provide a robust and accurate localization technique. In addition, this paper proposes a grid coloring algorithm to detect the signal hole map in the network, which refers to the attack-prone regions, in order to carry out corrective actions. The simulation results show the enhancement and robustness of RSS localization performance in the face of log normal shadow fading effects, besides the presence of physical barrier attacks, through detecting, filtering and eliminating the effect of these attacks. PMID:26690159
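For readers unfamiliar with the lateration step mentioned above, a minimal Python sketch follows: it inverts a log-distance path-loss model to turn RSSI readings into ranges and then solves a linearized least-squares position fix. The model constants and anchor layout are invented, and the C-MFMPL/SF-MFMPL machinery is not reproduced.

```python
# Sketch of RSS-based lateration: path-loss inversion, then a
# linearized least-squares fix. Constants are illustrative.
import numpy as np

P0, N_EXP = -40.0, 2.7            # RSSI at 1 m [dBm], path-loss exponent

def rssi_to_range(rssi):
    return 10 ** ((P0 - rssi) / (10 * N_EXP))

def laterate(anchors, ranges):
    """Least-squares 2-D position from >= 3 anchors."""
    (x0, y0), r0 = anchors[0], ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([3.0, 6.0])
dists = np.linalg.norm(anchors - target, axis=1)
rssi = P0 - 10 * N_EXP * np.log10(dists)       # noise-free measurements
print(laterate(anchors, rssi_to_range(rssi)))  # ~ [3. 6.]
```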
Su, Min; Ge, Lei; Kong, Qingkun; Zheng, Xiaoxiao; Ge, Shenguang; Li, Nianqiang; Yu, Jinghua; Yan, Mei
2015-01-15
A novel electrochemical lab-on-paper cyto-device (ELPCD) was fabricated to demonstrate sensitive and specific cancer cell detection as well as in-situ monitoring of multi-glycans on living cancer cells. In this ELPCD, an aptamer-modified three-dimensional macroporous Au-paper electrode (Au-PE) was employed as the working electrode for specific and efficient cancer cell capture. Using a sandwich format, sensitive and reproducible cell detection was achieved in this ELPCD on the basis of the electrochemical signal amplification of the Au-PE and the horseradish peroxidase-lectin electrochemical probe. The ELPCD displayed excellent analytical performance for the detection of K562 cells, with a wide linear calibration range from 550 to 2.0×10(7) cells mL(-1). This ELPCD was then successfully applied to determine cell-surface multi-glycans in parallel and to in-situ monitor multi-glycan expression on living cells in response to drug treatment through in-electrode 3D cell culture. The proposed method holds promise for deciphering glycomic codes as well as for clinical diagnosis and treatment in the early stages of cancer. Copyright © 2014 Elsevier B.V. All rights reserved.
Ren, Yuanqiang; Qiu, Lei; Yuan, Shenfang; Bao, Qiao
2017-05-11
Structural health monitoring (SHM) of aircraft composite structure is helpful to increase reliability and reduce maintenance costs. Due to the great effectiveness in distinguishing particular guided wave modes and identifying the propagation direction, the spatial-wavenumber filter technique has emerged as an interesting SHM topic. In this paper, a new scanning spatial-wavenumber filter (SSWF) based imaging method for multiple damages is proposed to conduct on-line monitoring of aircraft composite structures. Firstly, an on-line multi-damage SSWF is established, including the fundamental principle of SSWF for multiple damages based on a linear piezoelectric (PZT) sensor array, and a corresponding wavenumber-time imaging mechanism by using the multi-damage scattering signal. Secondly, through combining the on-line multi-damage SSWF and a PZT 2D cross-shaped array, an image-mapping method is proposed to conduct wavenumber synthesis and convert the two wavenumber-time images obtained by the PZT 2D cross-shaped array to an angle-distance image, from which the multiple damages can be directly recognized and located. In the experimental validation, both simulated multi-damage and real multi-damage introduced by repeated impacts are performed on a composite plate structure. The maximum localization error is less than 2 cm, which shows good performance of the multi-damage imaging method. Compared with the existing spatial-wavenumber filter based damage evaluation methods, the proposed method requires no more than the multi-damage scattering signal and can be performed without depending on any wavenumber modeling or measuring. Besides, this method locates multiple damages by imaging instead of the geometric method, which helps to improve the signal-to-noise ratio. Thus, it can be easily applied to on-line multi-damage monitoring of aircraft composite structures.
Application of Fuzzy TOPSIS for evaluating machining techniques using sustainability metrics
NASA Astrophysics Data System (ADS)
Digalwar, Abhijeet K.
2018-04-01
Sustainable processes and techniques have received increased attention over the last few decades due to rising concerns over the environment, an improved focus on productivity, and stringency in environmental as well as occupational health and safety norms. The present work analyzes the research on sustainable machining techniques and identifies the techniques and parameters on which the sustainability of a process is evaluated. Based on the analysis, these parameters are then adopted as criteria to evaluate different sustainable machining techniques, such as Cryogenic Machining, Dry Machining, Minimum Quantity Lubrication (MQL) and High Pressure Jet Assisted Machining (HPJAM), using a fuzzy TOPSIS framework. In order to facilitate easy arithmetic, the linguistic variables represented by fuzzy numbers are transformed into crisp numbers based on the graded mean representation. Cryogenic machining was found to be the best alternative sustainable technique as per the fuzzy TOPSIS framework adopted. The paper provides a method to deal with multi-criteria decision-making problems in a complex and linguistic environment.
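The defuzzification-plus-TOPSIS pipeline can be sketched compactly. The Python example below applies the graded mean representation P(a, b, c) = (a + 4b + c)/6 to triangular fuzzy ratings and then ranks alternatives with classical TOPSIS; all ratings, weights and criterion types are invented for illustration.

```python
# Hedged sketch: defuzzify triangular fuzzy ratings by the graded
# mean representation, then rank alternatives with classical TOPSIS.
import numpy as np

def graded_mean(tfn):
    a, b, c = tfn
    return (a + 4 * b + c) / 6.0

# rows: alternatives (e.g. cryogenic, dry, MQL, HPJAM); columns: criteria
fuzzy_ratings = [
    [(7, 8, 9), (5, 6, 7), (8, 9, 9)],
    [(5, 6, 7), (7, 8, 9), (4, 5, 6)],
    [(6, 7, 8), (6, 7, 8), (6, 7, 8)],
    [(4, 5, 6), (8, 9, 9), (5, 6, 7)],
]
X = np.array([[graded_mean(r) for r in row] for row in fuzzy_ratings])
w = np.array([0.5, 0.3, 0.2])               # criterion weights (invented)
benefit = np.array([True, True, True])      # all benefit-type here

V = w * X / np.linalg.norm(X, axis=0)       # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)
print("ranking (best first):", np.argsort(-closeness))
```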
NASA Astrophysics Data System (ADS)
Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille
2017-04-01
In mountain areas, natural phenomena such as snow avalanches, debris flows and rock falls put people and objects at risk, with sometimes dramatic consequences. Risk is classically considered as a combination of hazard, the combination of the intensity and frequency of the phenomenon, and vulnerability, which corresponds to the consequences of the phenomenon on exposed people and material assets. Risk management consists in identifying the risk level as well as choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information. This information comes from more or less reliable sources ranging from historical data and expert assessments to numerical simulations. Finally, risk management decisions are the result of complex knowledge management and reasoning processes. Tracing the information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on two main developments. The first relates to uncertainty and imprecision propagation in numerical modeling, using both a classical Monte-Carlo probabilistic approach and a so-called hybrid approach based on possibility theory. The second deals with new multi-criteria decision-making methods which consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods consider information imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions in certain but also risky or uncertain contexts (see the new COWA-ER and FOWA-ER: Cautious and Fuzzy Ordered Weighted Averaging-Evidential Reasoning). For example, the ER-MCDA methodology considers expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it mixes AHP, fuzzy set theory, possibility theory and belief function theory using the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
Hospital site selection using fuzzy AHP and its derivatives.
Vahidnia, Mohammad H; Alesheikh, Ali A; Alimohammadi, Abbas
2009-07-01
Environmental managers are commonly faced with sophisticated decisions, such as choosing the location of a new facility subject to multiple conflicting criteria. This paper considers the specific problem of creating a well-distributed network of hospitals that delivers its services to the target population with minimal time, pollution and cost. We develop a Multi-Criteria Decision Analysis process that combines Geographical Information System (GIS) analysis with the Fuzzy Analytical Hierarchy Process (FAHP), and use this process to determine the optimum site for a new hospital in the Tehran urban area. The GIS was used to calculate and classify governing criteria, while FAHP was used to evaluate the decision factors and their impacts on alternative sites. Three methods were used to estimate the total weights and priorities of the candidate sites: fuzzy extent analysis, center-of-area defuzzification, and the alpha-cut method. The three methods yield identical priorities for the five alternatives considered. Fuzzy extent analysis provides less discriminating power, but is simpler to implement and compute than the other two methods. The alpha-cut method is more complicated, but integrates the uncertainty and overall attitude of the decision-maker. The usefulness of the new hospital site is evaluated by computing an accessibility index for each pixel in the GIS, defined as the ratio of population density to travel time. With the addition of a new hospital at the optimum site, this index improved over about 6.5 percent of the geographical area.
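The accessibility index used to evaluate the new site is simple to state in code. The toy Python sketch below computes the ratio of population density to travel time per raster cell, with invented values; it only illustrates the definition, not the study's GIS workflow.

```python
# Toy version of the accessibility index: population density divided
# by travel time, evaluated per raster cell. All values are invented.
import numpy as np

rng = np.random.default_rng(42)
pop_density = rng.uniform(100, 5000, size=(4, 4))   # people / km^2
travel_min = rng.uniform(5, 60, size=(4, 4))        # minutes to nearest hospital

accessibility = pop_density / travel_min
print("mean accessibility before:", float(accessibility.mean()))

# A new hospital caps travel times in nearby cells, raising the index.
travel_after = np.minimum(travel_min, 15.0)
print("mean accessibility after :", float((pop_density / travel_after).mean()))
```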
A Preliminary Rubric Design to Evaluate Mixed Methods Research
ERIC Educational Resources Information Center
Burrows, Timothy J.
2013-01-01
With the increase in frequency of the use of mixed methods, both in research publications and in externally funded grants there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…
Relaxation channels of multi-photon excited xenon clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Serdobintsev, P. Yu.; Melnikov, A. S.; Department of Physics, St. Petersburg State University, Saint Petersburg 198904
2015-09-21
The relaxation processes of the xenon clusters subjected to multi-photon excitation by laser radiation with quantum energies significantly lower than the thresholds of excitation of atoms and ionization of clusters were studied. Results obtained by means of the photoelectron spectroscopy method showed that desorption processes of excited atoms play a significant role in the decay of two-photon excited xenon clusters. A number of excited states of xenon atoms formed during this process were discovered and identified.
Cheng, Xiaoyin; Li, Zhoulei; Liu, Zhen; Navab, Nassir; Huang, Sung-Cheng; Keller, Ulrich; Ziegler, Sibylle; Shi, Kuangyu
2015-02-12
The separation of multiple PET tracers within an overlapping scan based on intrinsic differences of tracer pharmacokinetics is challenging, due to the limited signal-to-noise ratio (SNR) of PET measurements and the high complexity of fitting models. In this study, we developed a direct parametric image reconstruction (DPIR) method for estimating kinetic parameters and recovering single-tracer information from rapid multi-tracer PET measurements. This is achieved by integrating a multi-tracer model in a reduced parameter space (RPS) into dynamic image reconstruction. This new RPS model is reformulated from an existing multi-tracer model and contains fewer parameters for kinetic fitting. Ordered-subsets expectation-maximization (OSEM) was employed to approximate the log-likelihood function with respect to kinetic parameters. To incorporate the multi-tracer model, an iterative weighted nonlinear least squares (WNLS) method was employed. The proposed multi-tracer DPIR (MT-DPIR) algorithm was evaluated on dual-tracer PET simulations ([18F]FDG and [11C]MET) as well as on preclinical PET measurements ([18F]FLT and [18F]FDG). The performance of the proposed algorithm was compared to the indirect parameter estimation method with the original dual-tracer model. The respective contributions of the RPS technique and the DPIR method to the performance of the new algorithm were analyzed in detail. For the preclinical evaluation, the tracer separation results were compared with single [18F]FDG scans of the same subjects measured 2 days before the dual-tracer scan. The results of the simulation and preclinical studies demonstrate that the proposed MT-DPIR method can improve the separation of multiple tracers for PET image quantification and kinetic parameter estimation.
Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan
2018-01-31
The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performance of the winding products. In this article, two different objective values of winding products, a mechanical performance (tensile strength) and a physical property (void content), were respectively calculated. Thereafter, the paper presents an integrated methodology combining multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding process. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. A verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products.
Experimental application of OMA solutions on the model of industrial structure
NASA Astrophysics Data System (ADS)
Mironov, A.; Mironovs, D.
2017-10-01
It is very important, and sometimes even vital, to maintain the reliability of industrial structures. High-quality control during production and structural health monitoring (SHM) in exploitation provide reliable functioning of large, massive and remote structures, like wind generators, pipelines, power line posts, etc. This paper introduces a complex of technological and methodical solutions for SHM and diagnostics of industrial structures, including those that are actuated by periodic forces. The solutions were verified on a scaled wind generator model with an integrated system of piezo-film deformation sensors. Simultaneous and multi-patch Operational Modal Analysis (OMA) approaches were implemented as methodical means for structural diagnostics and monitoring. Specially designed data processing algorithms provide objective evaluation of structural state modification.
A hybrid fault diagnosis approach based on mixed-domain state features for rotating machinery.
Xue, Xiaoming; Zhou, Jianzhong
2017-01-01
To make further improvements in diagnosis accuracy and efficiency, a hybrid fault diagnosis approach based on mixed-domain state features, which systematically blends the statistical analysis approach and artificial intelligence technology, is proposed in this work for rolling element bearings. To simplify the fault diagnosis problem, the execution of the proposed method is divided into three steps, i.e., preliminary fault detection, fault type recognition and fault degree identification. In the first step, a preliminary judgment about the health status of the equipment can be made by a statistical analysis method based on permutation entropy theory. If a fault exists, the following two processes based on the artificial intelligence approach are performed to further recognize the fault type and then identify the fault degree. For these two subsequent steps, mixed-domain state features containing time-domain, frequency-domain and multi-scale features are extracted to represent the fault peculiarity under different working conditions. As a powerful time-frequency analysis method, the fast EEMD method was employed to obtain the multi-scale features. Furthermore, due to information redundancy and the submergence of the original feature space, a novel manifold learning method (modified LGPCA) is introduced to realize low-dimensional representations of the high-dimensional feature space. Finally, two cases with 12 working conditions each were employed to evaluate the performance of the proposed method, where vibration signals were measured from an experimental bench of rolling element bearings. The analysis results showed the effectiveness and superiority of the proposed method, whose diagnostic reasoning is more suitable for practical application. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
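The permutation entropy statistic used in the first detection step is straightforward to implement. A minimal Python sketch follows; the order and delay are typical choices rather than necessarily the paper's settings.

```python
# Minimal permutation entropy sketch: count ordinal patterns of
# short windows and normalize the Shannon entropy of their
# distribution. Regular signals score low; noisy signals score high.
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy in [0, 1]."""
    patterns = Counter()
    for i in range(len(x) - delay * (order - 1)):
        window = x[i:i + delay * order:delay]
        patterns[tuple(np.argsort(window))] += 1
    total = sum(patterns.values())
    probs = [c / total for c in patterns.values()]
    H = -sum(p * math.log(p) for p in probs)
    return H / math.log(math.factorial(order))

t = np.linspace(0, 1, 2000)
healthy = np.sin(2 * np.pi * 50 * t)                      # regular signal
faulty = healthy + 0.8 * np.random.default_rng(3).standard_normal(t.size)
print("healthy:", round(permutation_entropy(healthy), 3))
print("faulty :", round(permutation_entropy(faulty), 3))
```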
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users’ queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with “vanilla” LSH, even when using the same amount of space. PMID:29346410
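One common LSH family can illustrate the idea: random-hyperplane signatures approximate cosine similarity, so similar vectors agree in most signature bits. The single-machine Python sketch below is only illustrative; it does not reproduce the paper's Hadoop variants or optimizations.

```python
# Random-hyperplane LSH sketch: each bit records the side of a random
# hyperplane a vector falls on; similar vectors share most bits.
import numpy as np

class HyperplaneLSH:
    def __init__(self, dim, n_bits=16, seed=0):
        self.planes = np.random.default_rng(seed).standard_normal((n_bits, dim))

    def signature(self, v):
        return tuple((self.planes @ v > 0).astype(int))

rng = np.random.default_rng(1)
base = rng.standard_normal(64)
near = base + 0.05 * rng.standard_normal(64)     # slightly perturbed vector
far = rng.standard_normal(64)                    # unrelated vector

lsh = HyperplaneLSH(dim=64)
sig = lsh.signature(base)
print("near matches:", sum(a == b for a, b in zip(sig, lsh.signature(near))), "/ 16")
print("far matches :", sum(a == b for a, b in zip(sig, lsh.signature(far))), "/ 16")
```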
Gao, Wen; Wang, Rui; Li, Dan; Liu, Ke; Chen, Jun; Li, Hui-Jun; Xu, Xiaojun; Li, Ping; Yang, Hua
2016-01-05
The flowers of Lonicera japonica Thunb. are extensively used to treat many diseases. As the demand for L. japonica has increased, some related Lonicera plants have often been confused or misused with it. Caffeoylquinic acids have always been regarded as the chemical markers in the quality control of L. japonica, but they can be found in all Lonicera species. Thus, it is necessary to establish a simple and reliable method for the evaluation of different Lonicera flowers. In this work, a method based on a single standard to determine multi-components (SSDMC), combined with principal component analysis (PCA), was developed to control and distinguish Lonicera species flowers. Six components, including three caffeoylquinic acids and three iridoid glycosides, were assayed simultaneously using chlorogenic acid as the reference standard. The credibility and feasibility of the SSDMC method were carefully validated, and the results demonstrated no remarkable differences compared with the external standard method. Finally, a total of fifty-one batches covering five Lonicera species were analyzed, and PCA was successfully applied to distinguish the Lonicera species. This strategy simplifies the quality control of multi-component herbal medicines and is effectively adapted to improving the quality control of herbs belonging to closely related species. Copyright © 2015 Elsevier B.V. All rights reserved.
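The single-standard idea can be sketched briefly: each analyte's concentration is derived from the chlorogenic acid calibration curve via a predetermined relative correction factor, and PCA then separates the species. In the Python sketch below, all correction factors, slopes and peak areas are invented.

```python
# Hedged sketch of the SSDMC idea: quantify several analytes from one
# reference standard via relative correction factors (RCFs), then
# separate species by PCA. All numbers are invented.
import numpy as np

rcf = np.array([1.00, 0.84, 1.12, 0.95, 1.30, 0.78])   # per-analyte RCFs
slope_std = 2.5e4     # calibration slope of chlorogenic acid (area per mg/L)

def quantify(peak_areas):
    """Concentration of each analyte from the single-standard curve."""
    return peak_areas / (slope_std * rcf)

rng = np.random.default_rng(7)
areas = rng.uniform(1e5, 1e6, size=(51, 6))   # 51 batches x 6 analytes
conc = quantify(areas)

# PCA via SVD on the standardized concentration matrix
Z = (conc - conc.mean(axis=0)) / conc.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U[:, :2] * S[:2]                     # first two principal components
print("explained variance:", (S[:2] ** 2 / (S ** 2).sum()).round(2))
```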
A Multi-touch Tool for Co-creation
NASA Astrophysics Data System (ADS)
Ludden, Geke D. S.; Broens, Tom
Multi-touch technology provides an attractive way for knowledge workers to collaborate. Co-creation is an important collaboration process in which collecting resources, creating results and distributing these results is essential. We propose a wall-based multi-touch system (called CoCreate) in which these steps are made easy due to the notion of connected private spaces and a shared co-create space. We present our ongoing work, expert evaluation of interaction scenarios and future plans.
NASA Astrophysics Data System (ADS)
Leskens, Johannes
2015-04-01
In modern water management, transdisciplinary work sessions are often organized in which various stakeholders participate to jointly define problems, choose measures and divide responsibilities for taking action. The stakeholders involved are, for example, policy analysts or decision-makers from municipalities, water boards or provinces, representatives of pressure groups, and researchers from knowledge institutes. Parallel to this increasing attention for transdisciplinary work sessions, we see a growing availability of interactive IT tools that can be applied during these sessions. For example, dynamic flood risk maps have recently become available that allow users during a work session to instantaneously assess the impact of storm surges or dam breaches, displayed on digital maps. Other examples are serious games, realistic visualizations and participatory simulations. However, the question is if and how these interactive IT tools contribute to better decision-making. To assess this, we take the process of knowledge construction during a work session as a measure for the quality of decision-making. Knowledge construction can be defined as the process in which ideas, perspectives and opinions of different stakeholders, all having their own expertise and experience, are confronted with each other and new shared meanings towards water management issues are created. We present an assessment method to monitor the process of knowledge construction during work sessions in water management in which interactive IT tools are being used. The assessment method is based on a literature review, focusing on studies in which knowledge construction was monitored in contexts other than water management. To test the applicability of the assessment method, we applied it during a multi-stakeholder work session in Westland, located in the southwest of the Netherlands. The discussions during the work session were recorded on camera. All statements expressed by the various members of a stakeholder session were classified according to our assessment method. We can draw the following preliminary conclusions. First, the case study showed that the method was useful for showing the knowledge construction process over time, in terms of the content and cognitive level of statements and the interaction, attention and response between stakeholders. It was observed that the various aspects of knowledge construction were all influenced by the use of the 3Di model. The model focused discussions on technical issues of flood risk management, non-flood specialists were able to participate in discussions and in suggesting solutions, and more topics could be evaluated compared with non-interactive flood maps. Second, the method is considered useful as a benchmark for different interactive IT tools. The method is also considered useful for gaining insight into how to optimally set up multi-stakeholder meetings in which interactive IT tools are being used. Further, the method can provide model developers insight into how to better meet the technical requirements of interactive IT tools to support the knowledge construction process during multi-stakeholder meetings.
The Childhood Obesity Declines Project: Implications for Research and Evaluation Approaches.
Young-Hyman, Deborah; Morris, Kathryn; Kettel Khan, Laura; Dawkins-Lyn, Nicola; Dooyema, Carrie; Harris, Carole; Jernigan, Jan; Ottley, Phyllis; Kauh, Tina
2018-03-01
Childhood obesity remains prevalent and is increasing in some disadvantaged populations. Numerous research, policy and community initiatives have been undertaken to impact this pandemic; natural experiments among them remain understudied. The need to learn from these efforts is paramount, yet the resulting evidence may not be readily available to inform future research, community initiatives, and policy development and implementation. We discuss the implications of using an adaptation of the Systematic Screening and Assessment (SSA) method to evaluate the Childhood Obesity Declines (COBD) project. The project examined successful initiatives, programs and policies in four diverse communities that were concurrent with significant declines in child obesity. In the context of other research designs and evaluation schemas, the rationale for the use of SSA is presented. Evidence generated by this method is highlighted, and guidance is suggested for the evaluation of future studies of community-based childhood obesity prevention initiatives. Support for the role of stakeholder collaboratives, in particular the National Collaborative on Childhood Obesity Research, as a synergistic vehicle to accelerate research on childhood obesity is discussed. SSA mapped active processes and provided contextual understanding of multi-level, multi-component simultaneous efforts to reduce rates of childhood obesity in community settings. Initiatives, programs and policies were not necessarily coordinated, and although direct attribution to intervention/initiative/policy components could not be made, the what, by whom, how, and to whom was temporally associated with statistically significant reductions in childhood obesity. SSA provides evidence on context and processes that are not often evaluated by other data analytic methods, offering an additional tool to layer with other evaluation approaches.
Wu, Wenchuan; Fang, Sheng; Guo, Hua
2014-06-01
Aiming at motion artifacts and off-resonance artifacts in multi-shot diffusion magnetic resonance imaging (MRI), we propose a joint correction method in this paper to correct the two kinds of artifacts simultaneously, without additional acquisition of navigation data or a field map. We acquired MRI data using a multi-shot variable-density spiral sequence and used an auto-focusing technique for image deblurring. We also used a direct or an iterative method to correct motion-induced phase errors in the process of deblurring. In vivo MRI experiments demonstrated that the proposed method can effectively suppress motion artifacts and off-resonance artifacts and achieve images with fine structures. In addition, the scan time is not increased when applying the proposed method.
GROUND WATER MONITORING AND SAMPLING: MULTI-LEVEL VERSUS TRADITIONAL METHODS – WHAT’S WHAT?
Recent studies have been conducted to evaluate different sampling techniques for determining VOC concentrations in groundwater. Samples were obtained using multi-level and traditional sampling techniques in three monitoring wells at the Raymark Superfund site in Stratford, CT. Ve...
Schmidt, Katharina; Aumann, Ines; Hollander, Ines; Damm, Kathrin; von der Schulenburg, J-Matthias Graf
2015-12-24
The Analytic Hierarchy Process (AHP), developed by Saaty in the late 1970s, is one of the methods for multi-criteria decision making. The AHP disaggregates a complex decision problem into different hierarchical levels. The weights for each criterion and alternative are judged in pairwise comparisons, and priorities are calculated by the Eigenvector method. The slowly increasing application of the AHP was the motivation for this study to explore the current state of its methodology in the healthcare context. A systematic literature review was conducted by searching the Pubmed and Web of Science databases for articles with the following keywords in their titles or abstracts: "Analytic Hierarchy Process," "Analytical Hierarchy Process," "multi-criteria decision analysis," "multiple criteria decision," "stated preference," and "pairwise comparison." In addition, we developed reporting criteria to indicate whether the authors reported important aspects and evaluated the resulting studies' reporting. The systematic review resulted in 121 articles. The number of studies applying the AHP has increased since 2005. Most studies were from Asia (almost 30%), followed by the US (25.6%). On average, the studies used 19.64 criteria throughout their hierarchical levels. Furthermore, we restricted a detailed analysis to those articles published within the last 5 years (n = 69). The mean number of participants in these studies was 109, and we identified major differences in how the surveys were conducted. The evaluation of reporting showed that the mean number of reported elements was about 6.75 out of 10; 12 of the 69 studies reported less than half of the criteria. The AHP has been applied inconsistently in healthcare research, and only a minority of studies described all the relevant aspects. Thus, the statements in this review may be biased, as they are restricted to the information available in the papers. Hence, further research is required to discover who should be interviewed and how, how inconsistent answers should be dealt with, and how the outcome and stability of the results should be presented.
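The Eigenvector method mentioned above reduces to a small linear-algebra computation. The Python sketch below derives priority weights from an example pairwise-comparison matrix and checks consistency; the matrix entries are invented, and the random index values come from Saaty's published tables.

```python
# AHP priority sketch: principal eigenvector of a pairwise-comparison
# matrix gives the weights; the consistency ratio flags sloppy judgments.
import numpy as np

A = np.array([            # pairwise comparisons of 3 criteria (invented)
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority weights

n = A.shape[0]
lam = eigvals.real[k]
ci = (lam - n) / (n - 1)                  # consistency index
ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))   # CR < 0.1 is OK
```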
NASA Astrophysics Data System (ADS)
Jin, Juliang; Li, Lei; Wang, Wensheng; Zhang, Ming
2006-10-01
The optimal selection of schemes of water transportation projects is a process of choosing a relatively optimal scheme from a number of schemes of water transportation programming and management projects, which is of importance in both theory and practice in water resource systems engineering. In order to achieve consistency and eliminate the dimensions of fuzzy qualitative and fuzzy quantitative evaluation indexes, to determine the weights of the indexes objectively, and to increase the differences among the comprehensive evaluation index values of water transportation project schemes, a projection pursuit method, named FPRM-PP for short, was developed in this work for selecting the optimal water transportation project scheme based on the fuzzy preference relation matrix. The research results show that FPRM-PP is intuitive and practical, the correction range of the fuzzy preference relation matrix
Revisions to the JDL data fusion model
NASA Astrophysics Data System (ADS)
Steinberg, Alan N.; Bowman, Christopher L.; White, Franklin E.
1999-03-01
The Data Fusion Model maintained by the Joint Directors of Laboratories (JDL) Data Fusion Group is the most widely used method for categorizing data fusion-related functions. This paper discusses the current effort to revise and expand this model to facilitate the cost-effective development, acquisition, integration and operation of multi-sensor/multi-source systems. Data fusion involves combining information - in the broadest sense - to estimate or predict the state of some aspect of the universe. These states may be represented in terms of attributive and relational states. If the job is to estimate the state of people, it can be useful to include consideration of informational and perceptual states in addition to the physical state. Developing cost-effective multi-source information systems requires a method for specifying data fusion processing and control functions, interfaces, and associated databases. The lack of common engineering standards for data fusion systems has been a major impediment to the integration and re-use of available technology: current developments do not lend themselves to objective evaluation, comparison or re-use. This paper reports on proposed revisions and expansions of the JDL Data Fusion model to remedy some of these deficiencies. This involves broadening the functional model and related taxonomy beyond the original military focus, and integrating the Data Fusion Tree Architecture model for system description, design and development.
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; van Westen, Cees; Bakker, Wim H.; Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri
2014-05-01
Natural hazard risk management requires decision making in several stages. Decision making on alternatives for risk reduction planning starts with an intelligence phase for recognizing the decision problems and identifying the objectives. Developing the alternatives and assigning the variables to each alternative by decision makers make up the design phase. The final phase evaluates the optimal choice by comparing the alternatives, defining indicators, assigning a weight to each and ranking them. This process is referred to as Multi-Criteria Decision Making analysis (MCDM), Multi-Criteria Evaluation (MCE) or Multi-Criteria Analysis (MCA). In the framework of the ongoing 7th Framework Program "CHANGES" (2011-2014, Grant Agreement No. 263953) of the European Commission, a Spatial Decision Support System is under development with the aim of analysing changes in hydro-meteorological risk and providing support for selecting the best risk reduction alternative. This paper describes the module for Multi-Criteria Decision Making analysis (MCDM) that incorporates monetary and non-monetary criteria in the analysis of the optimal alternative. The MCDM module consists of several components. The first step is to define criteria (or indicators), which are subdivided into disadvantages (criteria that indicate the difficulty of implementing the risk reduction strategy, also referred to as costs) and advantages (criteria that indicate the favorability, also referred to as benefits). In the next step the stakeholders can use the developed web-based tool for prioritizing criteria and the decision matrix. Public participation plays a role in decision making, and this is also planned through the use of a mobile web version where the general local public can indicate their agreement with the proposed alternatives. The application is being tested through a case study related to risk reduction in a mountainous valley in the Alps affected by flooding. Four alternatives are evaluated in this case study, namely: construction of defense structures, relocation, implementation of an early warning system, and spatial planning regulations. Some of the criteria are determined partly in other modules of the CHANGES SDSS, such as the costs of implementation, the risk reduction in monetary values, and societal risk. Other criteria, which could be environmental, economic, cultural or perceptual in nature, are defined by different stakeholders such as local authorities, expert organizations, the private sector, and the local public. In the next step, the stakeholders weight the importance of the criteria by pairwise comparison and visualize the decision matrix, which is a matrix of criteria versus alternative values. Finally, alternatives are ranked by the Analytic Hierarchy Process (AHP) method. We expect that this approach will help decision makers ease their work and reduce costs, because the process is more transparent, more accurate and involves a group decision. In that way there will be more confidence in the overall decision-making process. Keywords: MCDM, Analytic Hierarchy Process (AHP), SDSS, Natural Hazard Risk Management
Solving multi-objective optimization problems in conservation with the reference point method
Dujardin, Yann; Chadès, Iadine
2018-01-01
Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
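The reference point idea can be made concrete with a toy linear problem. In the Python sketch below, a decision-maker's aspiration levels steer a two-objective allocation via a linearized achievement scalarizing function solved as a single LP; the parcel scores, budget and reference point are all invented.

```python
# Hedged sketch of a reference point approach on a toy two-objective
# allocation: maximize habitat and connectivity scores under a budget,
# steering toward the decision-maker's reference point via a
# linearized achievement scalarizing function.
import numpy as np
from scipy.optimize import linprog

h = np.array([4.0, 7.0, 5.0])      # habitat score per fully funded parcel
g = np.array([6.0, 3.0, 5.0])      # connectivity score per parcel
budget = 2.0                       # can fully fund two parcels
ref = np.array([11.0, 10.0])       # reference point (aspiration levels)
wts = np.array([1.0, 1.0])         # objective scaling weights
rho = 1e-3                         # small augmentation term

C = np.vstack([h, g])              # objective matrix: f(x) = C @ x
# variables z = [x1, x2, x3, t]; minimize t + rho * sum_k (ref_k - C_k x)
c = np.concatenate([-rho * C.sum(axis=0), [1.0]])
# w_k * (ref_k - C_k x) <= t   ->   -w_k * C_k x - t <= -w_k * ref_k
A_ub = np.hstack([-(wts[:, None] * C), -np.ones((2, 1))])
b_ub = -wts * ref
A_ub = np.vstack([A_ub, [1.0, 1.0, 1.0, 0.0]])   # budget row
b_ub = np.append(b_ub, budget)

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, 1), (0, 1), (0, 1), (None, None)])
x = res.x[:3]
print("allocation:", x.round(2), "objectives:", (C @ x).round(2))
```

Moving the reference point and re-solving gives the interactive loop the method is built around: the decision-maker inspects the achieved objective values and adjusts aspirations until satisfied.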
Experiences of Peer Evaluation of the Leicester Teenage Pregnancy Prevention Strategy
ERIC Educational Resources Information Center
Fleming, Jennie; Chong, Hannah Goodman; Skinner, Alison
2009-01-01
The Centre for Social Action was commissioned by the Leicester City Council to evaluate its Teenage Pregnancy Prevention Strategy. This was a multi-stage project with a central element of consulting with young people. This article outlines the process that was followed in order to recruit, train and support young people through the process of…
A Process Evaluation of Student Participation in a Whole School Food Programme
ERIC Educational Resources Information Center
Orme, Judy; Jones, Matthew; Salmon, Debra; Weitkamp, Emma; Kimberlee, Richard
2013-01-01
Purpose: Health promotion programmes are widely held to be more effective when the subjects of them actively participate in the process of change. The purpose of this paper is to report on an evaluation of the Food for Life Partnership programme, a multi-level initiative in England promoting healthier nutrition and food sustainability awareness…
Strategic planning decision making using fuzzy SWOT-TOPSIS with reliability factor
NASA Astrophysics Data System (ADS)
Mohamad, Daud; Afandi, Nur Syamimi; Kamis, Nor Hanimah
2015-10-01
Strategic planning is a process of decision making and action for long-term activities in an organization. The Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis has commonly been used to help organizations strategize their future direction by analyzing the internal and external environment. However, SWOT analysis has some limitations, as it is unable to appropriately prioritize multiple alternative strategic decisions. Some efforts have been made to solve this problem by incorporating Multi Criteria Decision Making (MCDM) methods. Nevertheless, another important aspect of obtaining the decision has raised concerns: the reliability of the information. Decision makers evaluate differently depending on their level of confidence or sureness in the evaluation. This study proposes a decision making procedure for strategic planning using the SWOT-TOPSIS method, incorporating the reliability factor of the evaluation based on Z-numbers. An example using a local authority on the east coast of Malaysia is illustrated to determine the ranking of strategic options and to prioritize factors in each SWOT category.
NASA Astrophysics Data System (ADS)
Nath, Surajit; Sarkar, Bijan
2017-08-01
Advanced Manufacturing Technologies (AMTs) offer manufacturing organizations opportunities to improve their competitiveness and, in turn, their manufacturing effectiveness. Proper selection and evaluation of AMTs is therefore a significant task, but it involves considerable uncertainty and vagueness, since many conflicting criteria must be considered and evaluators are often unable to provide crisp data for them. Fuzzy Multi-Criteria Decision Making (MCDM) methods help greatly with this problem. This paper focuses on the application of two promising fuzzy MCDM methods, COPRAS-G and EVAMIX, and on a comparative study between them over some rarely considered criteria. Each method is a powerful evaluation tool in its own right; although the two perform at almost the same level, their approaches are quite distinct. This distinction is revealed through a numerical example of AMT selection.
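For orientation, here is a sketch of the crisp COPRAS ranking (the COPRAS-G variant named in the abstract additionally operates on grey interval data, which is not reproduced here); the AMT scores, criteria and weights are invented for illustration.

```python
import numpy as np

def copras(matrix, weights, benefit):
    """Rank alternatives with crisp COPRAS: reward weighted beneficial
    criteria sums (S+) and penalize non-beneficial ones (S-)."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    d = m / m.sum(axis=0) * w                # weighted, sum-normalized matrix
    s_plus = d[:, benefit].sum(axis=1)       # beneficial criteria
    s_minus = d[:, ~benefit].sum(axis=1)     # non-beneficial criteria
    # relative significance Q_i = S+_i + sum(S-) / (S-_i * sum(1 / S-))
    q = s_plus + s_minus.sum() / (s_minus * (1.0 / s_minus).sum())
    return 100.0 * q / q.max()               # utility degree, best = 100

# Hypothetical AMT alternatives scored on cost (non-beneficial) and on
# flexibility, quality and throughput (beneficial).
amt = [[350, 7, 8, 60], [420, 9, 7, 75], [300, 6, 6, 55]]
benefit = np.array([False, True, True, True])
print(copras(amt, weights=[0.35, 0.25, 0.2, 0.2], benefit=benefit))
```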
Aerodynamics Characteristics of Multi-Element Airfoils at -90 Degrees Incidence
NASA Technical Reports Server (NTRS)
Stremel, Paul M.; Schmitz, Fredric H. (Technical Monitor)
1994-01-01
A previously developed method has been applied to accurately calculate the viscous flow about airfoils normal to the free-stream flow. This method has particular application to the analysis of tilt-rotor aircraft in the evaluation of download. Specifically, the flow about an XV-15 airfoil, with and without deflected leading- and trailing-edge flaps, at -90 degrees incidence is evaluated. The multi-element aspect of the method provides for the evaluation of slotted flap configurations, which may lead to decreased drag. The method solves for turbulent flow at flight Reynolds numbers. The flow about the XV-15 airfoil with and without flap deflections has been calculated and compared with experimental data at a Reynolds number of one million. The comparisons between the calculated and measured pressure distributions are very good, thereby verifying the method. The aerodynamic evaluation of multi-element airfoils will be conducted to determine airfoil/flap configurations for reduced airfoil drag. Comparisons between the calculated lift, drag and pitching moment on the airfoil and the airfoil surface pressure will also be presented.
Parallelization strategies for continuum-generalized method of moments on the multi-thread systems
NASA Astrophysics Data System (ADS)
Bustamam, A.; Handhika, T.; Ernastuti; Kerami, D.
2017-07-01
The Continuum-Generalized Method of Moments (C-GMM) addresses the shortfall of the Generalized Method of Moments (GMM), which is not as efficient as the maximum likelihood estimator, by using a continuum of moment conditions in a GMM framework. However, this computation takes a very long time because the regularization parameter must be optimized. Unfortunately, these calculations are processed sequentially, whereas all modern computers are now supported by hierarchical memory systems and hyperthreading technology, which allow for parallel computing. This paper aims to speed up the calculation of C-GMM by designing a parallel algorithm for C-GMM on multi-thread systems. First, parallel regions are detected in the original C-GMM algorithm. Two parallel regions contribute significantly to the reduction of computational time: the outer loop and the inner loop. The parallel algorithm is then implemented with a standard shared-memory application programming interface, i.e. Open Multi-Processing (OpenMP). The experiment shows that outer-loop parallelization is the best strategy for any number of observations.
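The paper's implementation uses OpenMP in a compiled language; as an illustration of the same outer-loop strategy, here is a Python analogue in which each candidate regularization parameter is evaluated on its own worker process. The objective function is a stand-in, not the actual C-GMM criterion.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def cgmm_objective(alpha, data):
    # Placeholder: pretend to evaluate the C-GMM criterion for one
    # candidate regularization parameter alpha.
    return alpha, float(np.sum((data - alpha) ** 2))

def best_alpha(data, alphas, workers=4):
    # The grid search over alpha is embarrassingly parallel: evaluations
    # are independent, so they map directly onto worker processes
    # (analogous to OpenMP threads over the outer loop).
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(cgmm_objective, alphas, [data] * len(alphas))
        return min(results, key=lambda pair: pair[1])[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=0.3, size=10_000)
    grid = np.linspace(0.0, 1.0, 33)
    print(best_alpha(data, grid))
```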
de Siqueira, Alexandre Fioravante; Cabrera, Flávio Camargo; Nakasuga, Wagner Massayuki; Pagamisse, Aylton; Job, Aldo Eloizo
2018-01-01
Image segmentation, the process of separating the elements within a picture, is frequently used for obtaining information from photomicrographs. Segmentation methods should be used with caution, since incorrect results can be misleading when interpreting regions of interest (ROI), decreasing the success rate of subsequent procedures. Multi-Level Starlet Segmentation (MLSS) and Multi-Level Starlet Optimal Segmentation (MLSOS) were developed as an alternative to general segmentation tools. These methods gave rise to Jansen-MIDAS, an open-source software with which a scientist can obtain several segmentations of his or her photomicrographs. It is a reliable alternative for processing different types of photomicrographs: previous versions of Jansen-MIDAS were used to segment ROI in photomicrographs of two different materials with an accuracy superior to 89%. © 2017 Wiley Periodicals, Inc.
Peripleural lung disease detection based on multi-slice CT images
NASA Astrophysics Data System (ADS)
Matsuhiro, M.; Suzuki, H.; Kawata, Y.; Niki, N.; Nakano, Y.; Ohmatsu, H.; Kusumoto, M.; Tsuchida, T.; Eguchi, K.; Kaneko, M.
2015-03-01
With the development of multi-slice CT technology, it has become possible to obtain accurate 3D images of the lung field in a short time, and many image processing methods need to be developed to support this. Detecting peripleural lung disease is difficult because it lies outside the lung region extracted by the usual threshold-based processing. The proposed method uses the thoracic inner region, extracted from the inner cavity of the bones as well as the air region, and covers peripleural lung disease cases such as lung nodules, calcification, pleural effusion and pleural plaque. We applied this method to 50 cases, including 39 peripleural lung disease cases; it detected all 39 peripleural lung diseases with 2.9 false positives per case.
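A simplified 2D sketch of the idea, assuming common Hounsfield-unit rules of thumb rather than the paper's actual parameters: threshold the air region, then add the thoracic interior enclosed by bone so that peripleural lesions are not cut off by the lung mask.

```python
import numpy as np
from scipy import ndimage

def thoracic_mask(ct_slice_hu):
    """Lung/air mask plus bone-bounded thoracic interior for one CT slice.
    Thresholds (-400 HU for air, 200 HU for bone) are generic assumptions."""
    air = ct_slice_hu < -400
    air = ndimage.binary_opening(air, iterations=2)
    # discard air connected to the image border (outside the patient)
    labeled, _ = ndimage.label(air)
    edge_labels = np.unique(np.concatenate(
        [labeled[0], labeled[-1], labeled[:, 0], labeled[:, -1]]))
    lungs = air & ~np.isin(labeled, edge_labels[edge_labels > 0])
    # fill the region enclosed by the rib cage to get the thoracic interior,
    # which keeps peripleural structures inside the mask
    bone = ct_slice_hu > 200
    interior = ndimage.binary_fill_holes(
        ndimage.binary_closing(bone, iterations=5)) & ~bone
    return lungs | interior
```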
ERIC Educational Resources Information Center
England, Lenore; Fu, Li
2011-01-01
A critical part of electronic resources management, the electronic resources evaluation process is multi-faceted and includes a seemingly endless range of resources and tools involving numerous library staff. A solution is to build a Web site to bring all of the components together that can be implemented quickly and result in an organizational…
Xi, Beidou; He, Xiaosong; Dang, Qiuling; Yang, Tianxue; Li, Mingxiao; Wang, Xiaowei; Li, Dan; Tang, Jun
2015-11-01
In this study, the PCR-DGGE method was applied to investigate the impact of multi-stage inoculation treatment on the bacterial and fungal community composition during the municipal solid waste (MSW) composting process. The results showed that the high-temperature period was extended by the multi-stage inoculation treatment: 1 day longer than with initial-stage inoculation, and 5 days longer than with no inoculation. The temperature of the secondary fermentation increased to 51°C with the multi-stage inoculation treatment. The multi-stage inoculation method improved the community diversity of bacteria and fungi, whose diversity indexes reached their maxima on days 17 and 20, respectively; it also avoided competition between the inoculants and indigenous microbes and enhanced the growth of dominant microorganisms. DNA sequencing indicated that various kinds of uncultured microorganisms were detected at determined ratios and were the dominant microbes during the whole fermentation process. These findings call for further research on compost microbial cultivation technology. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhu, Ming; Liu, Tingting; Zhang, Xiangqun; Li, Caiyun
2018-01-01
Recently, a decomposition method of acoustic relaxation absorption spectra was used to capture the entire molecular multimode relaxation process of gas. In this method, the acoustic attenuation and phase velocity were measured jointly based on the relaxation absorption spectra. However, fast and accurate measurements of the acoustic attenuation remain challenging. In this paper, we present a method of capturing the molecular relaxation process by only measuring acoustic velocity, without the necessity of obtaining acoustic absorption. The method is based on the fact that the frequency-dependent velocity dispersion of a multi-relaxation process in a gas is the serial connection of the dispersions of interior single-relaxation processes. Thus, one can capture the relaxation times and relaxation strengths of N decomposed single-relaxation dispersions to reconstruct the entire multi-relaxation dispersion using the measurements of acoustic velocity at 2N + 1 frequencies. The reconstructed dispersion spectra are in good agreement with experimental data for various gases and mixtures. The simulations also demonstrate the robustness of our reconstructive method.
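A minimal sketch of the reconstruction idea in Python: model the squared phase velocity as a low-frequency limit plus N single-relaxation dispersion terms, and fit the 2N + 1 unknowns to velocity measurements at 2N + 1 frequencies. The functional form is the standard single-relaxation dispersion; the symbols and synthetic numbers below are assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def multi_relaxation_v2(omega, v0_sq, *params):
    """Squared phase velocity: v0^2 + sum_i eps_i * (w*tau_i)^2 / (1 + (w*tau_i)^2)."""
    eps = np.array(params[0::2])     # relaxation strengths
    tau = np.array(params[1::2])     # relaxation times
    wt = np.outer(omega, tau)
    return v0_sq + (eps * wt**2 / (1.0 + wt**2)).sum(axis=1)

# Synthetic "measurements" from a two-relaxation gas model (N = 2),
# sampled at 2N + 1 = 5 angular frequencies.
omega = np.logspace(3, 8, 5)
true = (350.0**2, 800.0, 1e-6, 2500.0, 1e-4)   # v0^2, eps1, tau1, eps2, tau2
v_meas = np.sqrt(multi_relaxation_v2(omega, *true))

# Fit the 2N + 1 unknowns from velocity alone; no absorption data needed.
popt, _ = curve_fit(lambda w, *p: np.sqrt(multi_relaxation_v2(w, *p)),
                    omega, v_meas, p0=(340.0**2, 500.0, 1e-6, 2000.0, 1e-4))
print(np.round(popt, 3))
```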
Uniform competency-based local feature extraction for remote sensing images
NASA Astrophysics Data System (ADS)
Sedaghat, Amin; Mohammadi, Nazila
2018-01-01
Local feature detectors are widely used in many photogrammetry and remote sensing applications. The quantity and distribution of the local features play a critical role in the quality of the image matching process, particularly for multi-sensor high resolution remote sensing image registration. However, conventional local feature detectors cannot extract desirable matched features either in terms of the number of correct matches or the spatial and scale distribution in multi-sensor remote sensing images. To address this problem, this paper proposes a novel method for uniform and robust local feature extraction for remote sensing images, which is based on a novel competency criterion and scale and location distribution constraints. The proposed method, called uniform competency (UC) local feature extraction, can be easily applied to any local feature detector for various kinds of applications. The proposed competency criterion is based on a weighted ranking process using three quality measures, including robustness, spatial saliency and scale parameters, which is performed in a multi-layer gridding schema. For evaluation, five state-of-the-art local feature detector approaches, namely, scale-invariant feature transform (SIFT), speeded up robust features (SURF), scale-invariant feature operator (SFOP), maximally stable extremal region (MSER) and hessian-affine, are used. The proposed UC-based feature extraction algorithms were successfully applied to match various synthetic and real satellite image pairs, and the results demonstrate its capability to increase matching performance and to improve the spatial distribution. The code to carry out the UC feature extraction is available from https://www.researchgate.net/publication/317956777_UC-Feature_Extraction.
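A single-layer sketch of the competency idea, assuming invented weights, grid size and quality measures (the full UC method uses a multi-layer grid plus scale and location constraints): score each keypoint by a weighted combination of quality measures and keep the top-ranked points per grid cell, which enforces spatial uniformity.

```python
import numpy as np

def uniform_select(xy, measures, img_shape, grid=(8, 8), per_cell=5,
                   weights=(0.5, 0.3, 0.2)):
    """Keep the best-scoring keypoints in each grid cell.
    xy: (n, 2) keypoint coordinates; measures: (n, 3) quality measures
    standing in for robustness, spatial saliency and scale."""
    xy = np.asarray(xy, float)
    q = np.asarray(measures, float)
    # normalize each quality measure to [0, 1] and combine by weighted sum
    q = (q - q.min(axis=0)) / (np.ptp(q, axis=0) + 1e-12)
    score = q @ np.asarray(weights)
    # assign each keypoint to a grid cell
    rows = np.minimum((xy[:, 1] / img_shape[0] * grid[0]).astype(int), grid[0] - 1)
    cols = np.minimum((xy[:, 0] / img_shape[1] * grid[1]).astype(int), grid[1] - 1)
    cell = rows * grid[1] + cols
    keep = []
    for c in np.unique(cell):
        idx = np.flatnonzero(cell == c)
        keep.extend(idx[np.argsort(score[idx])[::-1][:per_cell]])
    return np.sort(np.array(keep))

# xy could come from any detector (SIFT, SURF, SFOP, ...); the returned
# indices select a spatially uniform, high-quality subset of keypoints.
```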
Chalmers, Rachel M; Pérez-Cordón, Gregorio; Cacció, Simone M; Klotz, Christian; Robertson, Lucy J
2018-06-13
Due to the occurrence of genetic recombination, a reliable and discriminatory method to genotype Cryptosporidium isolates at the intra-species level requires the analysis of multiple loci, but a standardised scheme is not currently available. A workshop was held at the Robert Koch Institute, Berlin in 2016 that gathered 23 scientists with appropriate expertise (in either Cryptosporidium genotyping and/or surveillance, epidemiology or outbreaks) to discuss the processes for the development of a robust, standardised, multi-locus genotyping (MLG) scheme and propose an approach. The background evidence and main conclusions were outlined in a previously published report; the objectives of this further report are to describe 1) the current use of Cryptosporidium genotyping, 2) the elicitation and synthesis of the participants' opinions, and 3) the agreed processes and criteria for the development, evaluation and validation of a standardised MLG scheme for Cryptosporidium surveillance and outbreak investigations. Cryptosporidium was characterised to the species level in 7/12 (58%) participating European countries, mostly for human outbreak investigations. Further genotyping was mostly by sequencing the gp60 gene. A ranking exercise of performance and convenience criteria found that portability, biological robustness, typeability, and discriminatory power were considered by participants as the most important attributes in developing a multilocus scheme. The major barrier to implementation was lack of funding. A structured process for marker identification, evaluation, validation, implementation, and maintenance was proposed and outlined for application to Cryptosporidium, with prioritisation of Cryptosporidium parvum to support investigation of transmission in Europe. Copyright © 2018 Elsevier Inc. All rights reserved.
Miller-Graff, Laura E; Cummings, E Mark; Bergman, Kathleen N
2016-10-01
The role of emotional security in promoting positive adjustment following exposure to marital conflict has been identified in a large number of empirical investigations, yet to date, no interventions have explicitly addressed the processes that predict child adjustment after marital conflict. The current study evaluated a randomized controlled trial of a family intervention program aimed at promoting constructive marital conflict behaviors, thereby increasing adolescent emotional security and adjustment. Families (n = 225) were randomized into 1 of 4 conditions: Parent-Adolescent (n = 75), Parent-Only (n = 75), Self-Study (n = 38) and No Treatment (n = 37). Multi-informant and multi-method assessments were conducted at baseline, post-treatment and 6-month follow-up. Effects of treatment on destructive and constructive conflict behaviors were evaluated using multilevel models in which observations were nested within individuals over time. Process models assessing the impact of constructive and destructive conflict behaviors on emotional insecurity and adolescent adjustment were evaluated using path modeling. Results indicated that the treatment was effective in increasing constructive conflict behaviors (d = 0.89) and decreasing destructive conflict behaviors (d = -0.30). For the Parent-Only Group, post-test constructive conflict behaviors directly predicted lower levels of adolescent externalizing behaviors at 6-month follow-up. Post-test constructive conflict skills also indirectly affected adolescent internalizing behaviors through adolescent emotional security. These findings support the use of a brief psychoeducational intervention in improving post-treatment conflict and emotional security about interparental relationships.