Sample records for proposed methodology based

  1. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
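
    As a rough illustration of the generic PSO stage only (not the authors' consensus mechanism or the Trust-Tech refinement), the following minimal NumPy sketch may help; all hyperparameter values and the sphere test function are assumptions.

    ```python
    import numpy as np

    def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
        """Generic global-best PSO; all hyperparameters are assumed values."""
        lo, hi = bounds
        x = np.random.uniform(lo, hi, (n_particles, dim))   # particle positions
        v = np.zeros_like(x)                                # particle velocities
        pbest = x.copy()
        pbest_val = np.apply_along_axis(f, 1, x)
        g = pbest[pbest_val.argmin()].copy()                # global best position
        for _ in range(iters):
            r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            val = np.apply_along_axis(f, 1, x)
            better = val < pbest_val
            pbest[better], pbest_val[better] = x[better], val[better]
            g = pbest[pbest_val.argmin()].copy()
        return g, pbest_val.min()

    best_x, best_f = pso(lambda z: np.sum(z**2), dim=10)    # sphere test function
    print("best value found:", best_f)
    ```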

  2. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite-element-based engineering practice.

  3. A Proposed Theory Seeded Methodology for Design Based Research into Effective Use of MUVES in Vocational Education Contexts

    ERIC Educational Resources Information Center

    Cochrane, Todd; Davis, Niki; Morrow, Donna

    2013-01-01

A methodology for design based research (DBR) into the effective development and use of Multi-User Virtual Environments (MUVE) in vocational education is proposed. It blends software development with DBR, with two theories selected to inform the methodology. Legitimate peripheral participation (LPP; Lave & Wenger, 1991) provides a filter when…

  4. A deviation based assessment methodology for multiple machine health patterns classification and fault detection

    NASA Astrophysics Data System (ADS)

    Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay

    2018-01-01

Successful applications of Diffusion Maps (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus reveals more knowledge about the machine under monitoring. In this paper, a DM based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation based methodology is then proposed to accommodate further dimension reduction methods. In this work, the incorporation of Laplacian Eigenmaps and Principal Component Analysis (PCA) is explored; the latter algorithm is named PCA-Dev and is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated. Improved results are reported by benchmarking against other machine learning algorithms.

  5. Automatic food intake detection based on swallowing sounds.

    PubMed

    Makeyev, Oleksandr; Lopez-Meyer, Paulo; Schuckers, Stephanie; Besio, Walter; Sazonov, Edward

    2012-11-01

This paper presents a novel fully automatic food intake detection methodology, an important step toward objective monitoring of ingestive behavior. The aim of such monitoring is to improve our understanding of eating behaviors associated with obesity and eating disorders. The proposed methodology consists of two stages. First, acoustic detection of swallowing instances is performed, based on mel-scale Fourier spectrum features and classification using support vector machines. Principal component analysis and a smoothing algorithm are used to improve swallowing detection accuracy. Second, the frequency of swallowing is used as a predictor for detection of food intake episodes. The proposed methodology was tested on data collected from 12 subjects with various degrees of adiposity. Average accuracies of >80% and >75% were obtained for intra-subject and inter-subject models, respectively, with a temporal resolution of 30 s. Results obtained on 44.1 hours of data with a total of 7305 swallows show that detection accuracies are comparable for obese and lean subjects. They also suggest the feasibility of food intake detection based on swallowing sounds and the potential of the proposed methodology for automatic monitoring of ingestive behavior. Based on a wearable, non-invasive acoustic sensor, the proposed methodology may potentially be used in free-living conditions.
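
    A hedged sketch of the two-stage pipeline described above, using scikit-learn: stage one classifies sound frames as swallows from precomputed mel-scale spectrum features (feature extraction and smoothing are omitted), and stage two flags a 30 s epoch as food intake when the swallowing frequency is high. The feature shapes, threshold, and placeholder data are illustrative assumptions, not the authors' trained models.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC

    # Placeholder training data: rows are sound frames, columns are precomputed
    # mel-scale spectrum features; labels mark swallow (1) vs. non-swallow (0).
    rng = np.random.default_rng(0)
    X_train = rng.random((500, 128))
    y_train = rng.integers(0, 2, 500)

    # Stage 1: PCA for dimensionality reduction, then an SVM classifier.
    stage1 = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
    stage1.fit(X_train, y_train)

    def intake_detected(frames, frames_per_epoch=300, swallow_threshold=3):
        """Stage 2: flag a 30 s epoch as food intake if enough frames are swallows."""
        swallows = int(stage1.predict(frames).sum())
        rate = swallows * frames_per_epoch / len(frames)  # swallows per 30 s epoch
        return rate >= swallow_threshold                  # assumed decision threshold
    ```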

  6. Automatic food intake detection based on swallowing sounds

    PubMed Central

    Makeyev, Oleksandr; Lopez-Meyer, Paulo; Schuckers, Stephanie; Besio, Walter; Sazonov, Edward

    2012-01-01

This paper presents a novel fully automatic food intake detection methodology, an important step toward objective monitoring of ingestive behavior. The aim of such monitoring is to improve our understanding of eating behaviors associated with obesity and eating disorders. The proposed methodology consists of two stages. First, acoustic detection of swallowing instances is performed, based on mel-scale Fourier spectrum features and classification using support vector machines. Principal component analysis and a smoothing algorithm are used to improve swallowing detection accuracy. Second, the frequency of swallowing is used as a predictor for detection of food intake episodes. The proposed methodology was tested on data collected from 12 subjects with various degrees of adiposity. Average accuracies of >80% and >75% were obtained for intra-subject and inter-subject models, respectively, with a temporal resolution of 30 s. Results obtained on 44.1 hours of data with a total of 7305 swallows show that detection accuracies are comparable for obese and lean subjects. They also suggest the feasibility of food intake detection based on swallowing sounds and the potential of the proposed methodology for automatic monitoring of ingestive behavior. Based on a wearable, non-invasive acoustic sensor, the proposed methodology may potentially be used in free-living conditions. PMID:23125873

  7. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks.

    PubMed

    Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua

    2015-12-17

Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and the Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. To quantify security levels, the Node Minimal Effort (NME) is defined as a measure of attack cost, and system security levels are derived from it. Thirdly, to calculate the NME of an attack graph in a way that takes the dynamic factors of SDN-MNs into consideration, AHP is integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). Finally, a case study is offered to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.
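
    The TOPSIS step lends itself to a compact sketch. The following generic NumPy implementation ranks alternatives by relative closeness to the ideal solution; the decision matrix, AHP-style weights, and criteria senses shown are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Rank alternatives (rows) over criteria (columns) by relative closeness."""
        norm = matrix / np.linalg.norm(matrix, axis=0)       # vector normalization
        v = norm * weights                                   # weighted matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        return d_neg / (d_pos + d_neg)                       # closeness in [0, 1]

    m = np.array([[0.7, 3.0, 2.0],
                  [0.4, 5.0, 1.0],
                  [0.9, 2.0, 4.0]])                          # alternatives x criteria
    w = np.array([0.5, 0.3, 0.2])                            # e.g., AHP-derived weights
    print(topsis(m, w, benefit=np.array([True, False, False])))
    ```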

  8. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks

    PubMed Central

    Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua

    2015-01-01

Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and the Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. To quantify security levels, the Node Minimal Effort (NME) is defined as a measure of attack cost, and system security levels are derived from it. Thirdly, to calculate the NME of an attack graph in a way that takes the dynamic factors of SDN-MNs into consideration, AHP is integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). Finally, a case study is offered to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism. PMID:26694409

  9. An ontological case base engineering methodology for diabetes management.

    PubMed

    El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema

    2014-08-01

Ontology engineering covers issues related to ontology development and use. In Case Based Reasoning (CBR) systems, ontology plays two main roles: the first as the case base and the second as the domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It mainly focuses on case representation in the form of an ontology to support semantic case retrieval and enhance all knowledge-intensive CBR processes. A case study on a diabetes diagnosis case base is provided to evaluate the proposed methodology.

  10. [Methodological and operational notes for the assessment and management of the risk of work-related stress].

    PubMed

    De Ambrogi, Francesco; Ratti, Elisabetta Ceppi

    2011-01-01

    Today the Italian national debate over the Work-Related Stress Risk Assessment methodology is rather heated. Several methodological proposals and guidelines have been published in recent months, not least those by the "Commissione Consultiva". But despite this wide range of proposals, it appears that there is still a lack of attention to some of the basic methodological issues that must be taken into account in order to correctly implement the above-mentioned guidelines. The aim of this paper is to outline these methodological issues. In order to achieve this, the most authoritative methodological proposals and guidelines have been reviewed. The study focuses in particular on the methodological issues that could lead to important biases if not considered properly. The study leads to some considerations about the methodological validity of a Work-Related Stress Risk Assessment based exclusively on the literal interpretation of the considered proposals. Furthermore, the study provides some hints and working hypotheses on how to overcome these methodological limits. This study should be considered as a starting point for further investigations and debate on the Work-Related Stress Risk Assessment methodology on a national level.

  11. Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs

    NASA Astrophysics Data System (ADS)

    Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna

    2017-11-01

The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted at the input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II model rainfall, are no longer sufficient, and clarification is urgently needed regarding a methodology for standardized rainfall hyetographs that takes into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of a collection of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs achieved as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.

  12. A Methodological Conundrum: Comparing Schools in Scotland and England

    ERIC Educational Resources Information Center

    Marshall, Bethan; Gibbons, Simon

    2015-01-01

This article considers a conundrum in research methodology: the fact that, in the main, you have to use a social science based research methodology if you want to look at what goes on in a classroom. This article instead proposes an alternative arts-based research method, grounded in the work of Eisner and, before him, Dewey, where one can use the more…

  13. Damage detection methodology on beam-like structures based on combined modal Wavelet Transform strategy

    NASA Astrophysics Data System (ADS)

    Serra, Roger; Lopez, Lautaro

    2018-05-01

Different approaches to the detection of damage based on dynamic measurement of structures have appeared in recent decades. They were based, amongst others, on changes in natural frequencies, modal curvatures, strain energy or flexibility. Wavelet analysis has also been used to detect the abnormalities in mode shapes induced by damage. However, the majority of previous work used signals uncorrupted by noise, and the damage influence on each mode shape was studied separately. This paper proposes a new methodology based on a combined modal wavelet transform strategy that copes with noisy signals while at the same time extracting the relevant information from each mode shape. The proposed methodology is then compared with the most frequently used and widely studied methods from the literature. To evaluate the performance of each method, their capacity to detect and localize damage is analyzed in different cases. The comparison is done by simulating the oscillations of a cantilever steel beam with and without a defect as a numerical case. The proposed methodology proved to outperform classical methods on noisy signals.
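
    A hedged sketch of the classical single-mode baseline that the paper benchmarks against (not its combined modal strategy): apply a continuous wavelet transform to a noisy mode shape and take a coefficient peak as the damage location. The synthetic mode shape, noise level, and damage position below are assumptions.

    ```python
    import numpy as np
    import pywt

    x = np.linspace(0.0, 1.0, 500)                # normalized beam coordinate
    mode = np.sin(np.pi * x / 2.0)                # idealized first mode shape
    mode[250:] += 1e-3 * (x[250:] - x[250])       # small stiffness kink at x = 0.5
    mode += np.random.normal(0.0, 1e-5, x.size)   # measurement noise

    coefs, _ = pywt.cwt(mode, scales=np.arange(1, 33), wavelet="mexh")
    row = np.abs(coefs[8])                        # coefficients at one scale
    peak = row[25:-25].argmax() + 25              # ignore border distortion
    print("suspected damage near x =", x[peak])
    ```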

  14. Analysis of a Proposal to Implement the Readiness Based Sparing Process in the Brazilian Navy

    DTIC Science & Technology

    2017-06-01

...determine inventory levels. This research investigates whether implementing the U.S. DOD readiness-based sparing (RBS) methodology could provide the Brazilian Navy with greater... It is suggested to apply the methodology first for determining reparable spares initial provisioning.

  15. Mechanical modulation method for ultrasensitive phase measurements in photonics biosensing.

    PubMed

    Patskovsky, S; Maisonneuve, M; Meunier, M; Kabashin, A V

    2008-12-22

A novel polarimetry methodology for phase-sensitive measurements in single reflection geometry is proposed for applications in optical transduction based biological sensing. The methodology uses alternating step-like chopper-based mechanical phase modulation for the orthogonal s- and p-polarizations of light reflected from the sensing interface, and the extraction of phase information at different harmonics of the modulation. We show that even under a relatively simple experimental arrangement, the methodology provides a phase measurement resolution as low as 0.007 deg. We also examine the proposed approach using Total Internal Reflection (TIR) and Surface Plasmon Resonance (SPR) geometries. For TIR geometry, the response appears to be strongly dependent on the prism material, with the best values for high refractive index Si. The detection limit for Si-based TIR is estimated as 10^-5 in terms of Refractive Index Units (RIU). SPR geometry offers a much stronger phase response due to much sharper phase characteristics. With a detection limit of 3.2x10^-7 RIU, the proposed methodology provides one of the best sensitivities among phase-sensitive SPR devices. Advantages of the proposed method include high sensitivity, simplicity of the experimental setup, and noise immunity as a result of high-stability modulation.

  16. A temperature compensation methodology for piezoelectric based sensor devices

    NASA Astrophysics Data System (ADS)

    Wang, Dong F.; Lou, Xueqiao; Bao, Aijian; Yang, Xu; Zhao, Ji

    2017-08-01

A temperature compensation methodology that combines a negative temperature coefficient thermistor with the temperature characteristics of a piezoelectric material is proposed to improve the measurement accuracy of piezoelectric sensing based devices. The piezoelectric element is characterized using a disk-shaped structure, which is also used to verify the effectiveness of the proposed compensation method. With the proposed temperature compensation method, the measured output voltage shows a nearly linear relationship with the applied pressure over a temperature range of 25-65 °C. As a result, the maximum measurement accuracy is observed to improve by 40%, and the higher the temperature, the more effective the method. The effective temperature range of the proposed method is theoretically analyzed by introducing the constant coefficient of the thermistor (B), the resistance at the initial temperature (R0), and the parallel resistance (Rx). The proposed methodology not only eliminates the influence of the piezoelectric material's temperature dependence on sensing accuracy but also decreases the power consumption of piezoelectric sensing based devices through the simplified sensing structure.
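
    The thermistor-side arithmetic admits a small sketch. Assuming the usual NTC model R(T) = R0*exp(B*(1/T - 1/T0)) placed in parallel with a fixed resistor Rx, the code below evaluates the combined resistance over the 25-65 °C range mentioned in the abstract; the component values and divider topology are illustrative assumptions, not the authors' circuit.

    ```python
    import numpy as np

    B, R0, T0 = 3950.0, 10e3, 298.15   # NTC constants (assumed values), T0 in kelvin
    Rx = 10e3                           # fixed parallel resistor (assumed)

    def r_parallel(t_celsius):
        """Combined resistance of the NTC thermistor in parallel with Rx."""
        T = t_celsius + 273.15
        r_ntc = R0 * np.exp(B * (1.0 / T - 1.0 / T0))
        return r_ntc * Rx / (r_ntc + Rx)

    for t in (25, 45, 65):              # compensation range from the abstract
        print(f"{t} degC: {r_parallel(t):8.1f} ohm")
    ```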

  17. On multi-site damage identification using single-site training data

    NASA Astrophysics Data System (ADS)

    Barthorpe, R. J.; Manson, G.; Worden, K.

    2017-11-01

This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology represents an advancement on the state of the art in multi-site damage identification, which requires either (1) full damaged state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required to train a multi-site damage locator, without recourse to physics-based model predictions. It is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed (see the sketch below). The proposed methodology is then demonstrated and evaluated through application to a real engineering structure - a Piper Tomahawk trainer aircraft wing - with its performance compared to classifiers trained using the full damaged-state dataset.
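
    A minimal sketch of the binary decomposition idea, assuming scikit-learn and synthetic features: one Support Vector Classifier per candidate site is trained only on healthy versus that site's single-site damaged data, and a simple rule combines their outputs. The data, kernel choice, and combination rule are assumptions, one simple option among those the paper discusses.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    n_sites, n_features = 3, 12
    rng = np.random.default_rng(0)

    # One binary classifier per site, trained on healthy vs. that site's
    # single-site damaged-state features (synthetic placeholders here).
    classifiers = []
    for site in range(n_sites):
        X_healthy = rng.normal(0.0, 1.0, (100, n_features))
        X_damaged = rng.normal(0.8, 1.0, (100, n_features))
        X = np.vstack([X_healthy, X_damaged])
        y = np.r_[np.zeros(100), np.ones(100)]
        classifiers.append(SVC(kernel="rbf").fit(X, y))

    def locate(x):
        """Multi-site label: the set of sites whose binary classifier fires."""
        return [s for s, clf in enumerate(classifiers) if clf.predict(x[None, :])[0] == 1]

    print(locate(rng.normal(0.8, 1.0, n_features)))   # example query feature vector
    ```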

  18. A new methodology to integrate planetary quarantine requirements into mission planning, with application to a Jupiter orbiter

    NASA Technical Reports Server (NTRS)

    Howard, R. A.; North, D. W.; Pezier, J. P.

    1975-01-01

    A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.

  19. Surrogate based wind farm layout optimization using manifold mapping

    NASA Astrophysics Data System (ADS)

    Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester

    2016-09-01

The high computational cost associated with high fidelity wake models such as RANS or LES is the primary bottleneck to performing direct high fidelity wind farm layout optimization (WFLO) using accurate CFD based wake models. Therefore, a surrogate based multi-fidelity WFLO methodology (SWFLO) is proposed. The surrogate model is built using an SBO method referred to as manifold mapping (MM). As a verification, optimization of the spacing between two staggered wind turbines was performed using the proposed surrogate based methodology and its performance compared with that of direct optimization using the high fidelity model. A significant reduction in computational cost was achieved using MM: a maximum computational cost reduction of 65%, while arriving at the same optimum as direct high fidelity optimization. The similarity between the responses of the models and the number and position of mapping points strongly influence the computational efficiency of the proposed method. As a proof of concept, realistic WFLO of a small 7-turbine wind farm is performed using the proposed surrogate based methodology. Two variants of the Jensen wake model with different decay coefficients were used as the fine and coarse models. The proposed SWFLO method arrived at the same optimum as the fine model with far fewer fine model simulations.
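
    Since the fine and coarse models here are both Jensen wake model variants distinguished only by their decay coefficient, a small sketch of that model may be useful. The turbine parameters and the two k values below are illustrative assumptions.

    ```python
    import numpy as np

    def jensen_deficit(x_down, r_rotor, ct=0.8, k=0.075):
        """Fractional velocity deficit at axial distance x_down behind the rotor."""
        if x_down <= 0.0:
            return 0.0
        return (1.0 - np.sqrt(1.0 - ct)) / (1.0 + k * x_down / r_rotor) ** 2

    r0, u0 = 40.0, 8.0                      # rotor radius [m], free stream [m/s]
    for k in (0.04, 0.075):                 # e.g., coarse vs. fine decay coefficients
        u = u0 * (1.0 - jensen_deficit(7 * 2 * r0, r0, k=k))
        print(f"k = {k}: wind speed 7 diameters downstream = {u:.2f} m/s")
    ```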

  20. 77 FR 55737 - Small Business Size Standards: Finance and Insurance and Management of Companies and Enterprises

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

...The U.S. Small Business Administration (SBA) proposes to increase small business size standards for 37 industries in North American Industry Classification System (NAICS) Sector 52, Finance and Insurance, and for two industries in NAICS Sector 55, Management of Companies and Enterprises. In addition, SBA proposes to change the measure of size from average assets to average receipts for NAICS 522293, International Trade Financing. As part of its ongoing comprehensive size standards review, SBA evaluated all receipts based and assets based size standards in NAICS Sectors 52 and 55 to determine whether they should be retained or revised. This proposed rule is one of a series of proposed rules that will review size standards of industries grouped by NAICS Sector. SBA issued a White Paper entitled "Size Standards Methodology" and published a notice in the October 21, 2009 issue of the Federal Register to advise the public that the document is available on its Web site at www.sba.gov/size for public review and comments. The "Size Standards Methodology" White Paper explains how SBA establishes, reviews, and modifies its receipts based and employee based small business size standards. In this proposed rule, SBA has applied its methodology that pertains to establishing, reviewing, and modifying a receipts based size standard.

  1. A Methodological Proposal for Learning Games Selection and Quality Assessment

    ERIC Educational Resources Information Center

    Dondi, Claudio; Moretti, Michela

    2007-01-01

    This paper presents a methodological proposal elaborated in the framework of two European projects dealing with game-based learning, both of which have focused on "quality" aspects in order to create suitable tools that support European educators, practitioners and lifelong learners in selecting and assessing learning games for use in…

  2. Generalized Predictive and Neural Generalized Predictive Control of Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul G.

    2000-01-01

The research work presented in this thesis addresses the problem of robust control of uncertain linear and nonlinear systems using a Neural network-based Generalized Predictive Control (NGPC) methodology. A brief overview of predictive control and its comparison with Linear Quadratic (LQ) control is given to emphasize the advantages and drawbacks of predictive control methods. It is shown that the Generalized Predictive Control (GPC) methodology overcomes the drawbacks associated with traditional LQ control as well as with conventional predictive control methods. It is also shown that in spite of its model-based nature, GPC has good robustness properties, being a special case of receding horizon control. The conditions for choosing GPC tuning parameters that ensure closed-loop stability are derived. A neural network-based GPC architecture is proposed for the control of linear and nonlinear uncertain systems. A methodology to account for parametric uncertainty in the system is proposed, using the on-line training capability of a multi-layer neural network. Several simulation examples and results from real-time experiments are given to demonstrate the effectiveness of the proposed methodology.

  3. Substructuring of multibody systems for numerical transfer path analysis in internal combustion engines

    NASA Astrophysics Data System (ADS)

    Acri, Antonio; Offner, Guenter; Nijman, Eugene; Rejlek, Jan

    2016-10-01

Noise legislation and increasing customer demands drive the Noise, Vibration and Harshness (NVH) development of modern commercial vehicles. In order to meet the stringent legislative requirements for vehicle noise emission, exact knowledge of all vehicle noise sources and their acoustic behavior is required. Transfer path analysis (TPA) is a fairly well established technique for estimating and ranking individual low-frequency noise or vibration contributions via the different transmission paths. Transmission paths from different sources to target points of interest and their contributions can be analyzed by applying TPA. This technique is applied to test measurements, which are only available on prototypes at the end of the design process. In order to overcome the limits of TPA, a numerical transfer path analysis methodology based on the substructuring of a multibody system is proposed in this paper. Being based on numerical simulation, this methodology can be performed from the first steps of the design process. The main target of the proposed methodology is to obtain information on the noise source contributions of a dynamic system, considering the possibility of multiple forces acting on the system simultaneously. The contributions of these forces are investigated with particular focus on distributed or moving forces. In this paper, the mathematical basics of the proposed methodology and its advantages in comparison with TPA are discussed. Then, a dynamic system is investigated with a combination of two methods. Being based on the dynamic substructuring (DS) of the investigated model, the proposed methodology requires the evaluation of the contact forces at interfaces, which are computed with a flexible multi-body dynamic (FMBD) simulation. Then, the structure-borne noise paths are computed with the wave based method (WBM). As an example application, a 4-cylinder engine is investigated and the proposed methodology is applied to the engine block. The aim is to obtain accurate and clear relationships between the excitations and responses of the simulated dynamic system, analyzing the noise and vibration sources inside a car engine, and showing the main advantages of a numerical methodology.

  4. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    NASA Astrophysics Data System (ADS)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

In previous works a methodology was defined based on the design of a GAP genetic algorithm and an incremental training technique adapted to learning series of stock market values. The GAP technique is a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the obtained return and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for an eight-year period of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules with returns in the testing period far superior to those obtained by the usual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market than in previous work.

  5. A methodology of SiP testing based on boundary scan

    NASA Astrophysics Data System (ADS)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

System in Package (SiP) plays an important role in portable, aerospace and military electronics thanks to its microminiaturization, light weight, high density, and high reliability. At present, SiP system testing faces problems of system complexity and malfunction location as the system scale increases exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary scan technology. Combining the characteristics of SiP systems with the boundary scan theory of PCB circuits and embedded core test, a specific testing methodology and process have been developed. The hardware requirements for the SiP system under test are given, and the hardware platform for the testing has been constructed. The testing methodology offers high test efficiency and accurate malfunction location.

  6. Local deformation for soft tissue simulation

    PubMed Central

    Omar, Nadzeri; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2016-01-01

This paper presents a new methodology to localize the deformation range to improve the computational efficiency of soft tissue simulation. This methodology identifies the local deformation range from the stress distribution in soft tissues due to an external force. A stress estimation method based on elastic theory is used to estimate the stress in soft tissues according to the depth from the contact surface. The proposed methodology can be used with both mass-spring and finite element modeling approaches for soft tissue deformation. Experimental results show that the proposed methodology can improve the computational efficiency while maintaining the modeling realism. PMID:27286482

  7. The Common Topoi of STEM Discourse: An Apologia and Methodological Proposal, with Pilot Survey

    ERIC Educational Resources Information Center

    Walsh, Lynda

    2010-01-01

    In this article, the author proposes a methodology for the rhetorical analysis of scientific, technical, mathematical, and engineering (STEM) discourse based on the common topics (topoi) of this discourse. Beginning with work by Miller, Prelli, and other rhetoricians of STEM discourse--but factoring in related studies in cognitive linguistics--she…

  8. Optimization of lamp arrangement in a closed-conduit UV reactor based on a genetic algorithm.

    PubMed

    Sultan, Tipu; Ahmad, Zeshan; Cho, Jinsoo

    2016-01-01

The choice of lamp arrangement in a closed-conduit ultraviolet (CCUV) reactor significantly affects its performance. However, a systematic methodology for the optimal lamp arrangement within the chamber of the CCUV reactor is not well established in the literature. In this research work, we propose a viable systematic methodology for the lamp arrangement based on a genetic algorithm (GA). In addition, we analyze the impact of the diameter, angle, and symmetry of the lamp arrangement on the reduction equivalent dose (RED). The results are compared based on the simulated RED values and evaluated using the computational fluid dynamics simulation software ANSYS FLUENT. The fluence rate was calculated using the commercial software UVCalc3D, and the GA-based lamp arrangement optimization was performed in MATLAB. The simulation results provide detailed information about the GA-based methodology for the lamp arrangement, the pathogen transport, and the simulated RED values. A significant increase in the RED values was achieved by using the GA-based lamp arrangement methodology. This increase in RED value was highest for the asymmetric lamp arrangement within the chamber of the CCUV reactor. These results demonstrate that the proposed GA-based methodology for symmetric and asymmetric lamp arrangement provides a viable technical solution to the design and optimization of the CCUV reactor.
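
    A hedged sketch of a GA over lamp positions: the paper's true objective is the simulated RED from CFD and fluence computations (ANSYS FLUENT, UVCalc3D) with optimization in MATLAB, whereas the Python placeholder objective below simply rewards well-spread lamps inside the chamber radius, and all GA settings are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N_LAMPS, POP, GENS = 4, 40, 100        # lamp count and GA sizes (assumed)
    R_CHAMBER = 0.5                        # chamber radius [m] (assumed)

    def fitness(genome):
        """Placeholder for the simulated RED: reward well-spread, in-bounds lamps."""
        lamps = genome.reshape(N_LAMPS, 2)
        if np.any(np.linalg.norm(lamps, axis=1) > R_CHAMBER):
            return -np.inf                 # a lamp left the chamber cross-section
        d = np.linalg.norm(lamps[:, None] - lamps[None, :], axis=-1)
        return d[np.triu_indices(N_LAMPS, 1)].min()

    pop = rng.uniform(-R_CHAMBER, R_CHAMBER, (POP, N_LAMPS * 2))
    for _ in range(GENS):
        fit = np.array([fitness(g) for g in pop])
        parents = pop[np.argsort(fit)[-POP // 2:]]            # truncation selection
        kids = parents[rng.integers(0, len(parents), POP - len(parents))].copy()
        kids += rng.normal(0.0, 0.02, kids.shape)             # Gaussian mutation
        pop = np.vstack([parents, kids])

    best = pop[np.argmax([fitness(g) for g in pop])].reshape(N_LAMPS, 2)
    print("best lamp (x, y) positions:\n", best)
    ```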

  9. Applying operational research and data mining to performance based medical personnel motivation system.

    PubMed

    Niaksu, Olegas; Zaptorius, Jonas

    2014-01-01

This paper presents a methodology suitable for creating a performance related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased weighting of performance criteria, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluating the outcomes of the motivation system was proposed. The described methodology for calculating performance related payment needs practical validation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step would be validation of the methodology in a healthcare facility.

  10. Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site

    NASA Astrophysics Data System (ADS)

    Albarello, D.; Mucciarelli, M.

A new approach to seismic hazard estimation is proposed, based on documentary data concerning the local history of seismic effects. The adopted methodology allows the use of "poor" data, such as macroseismic data, within a formally coherent approach that overcomes a number of problems connected with forcing the available information into the frame of "standard" methodologies calibrated on instrumental data. The proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.

  11. A new approach to subjectively assess quality of plenoptic content

    NASA Astrophysics Data System (ADS)

    Viola, Irene; Řeřábek, Martin; Ebrahimi, Touradj

    2016-09-01

Plenoptic content is becoming increasingly popular thanks to the availability of acquisition and display devices. Through image-based rendering techniques, plenoptic content can be rendered in real time in an interactive manner, allowing virtual navigation through the captured scenes. This way of consuming content enables new experiences, and therefore introduces several challenges in terms of plenoptic data processing, transmission and, consequently, visual quality evaluation. In this paper, we propose a new methodology to subjectively assess the visual quality of plenoptic content. We also introduce prototype software to perform subjective quality assessment according to the proposed methodology. The proposed methodology is further applied to assess the visual quality of a light field compression algorithm. Results show that this methodology can be successfully used to assess the visual quality of plenoptic content.

  12. Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter

    PubMed Central

    Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Gu, Chengfan

    2018-01-01

This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate the globally optimal state estimation by fusion of local estimations. The proposed methodology effectively suppresses the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation. PMID:29415509
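
    The top-level fusion rule can be sketched compactly. Assuming uncorrelated local errors, linear-minimum-variance fusion reduces to information-weighted averaging of the local estimates; the bottom-level adaptive fading UKF that would produce each local estimate and covariance is not reproduced here, and the numbers are placeholders.

    ```python
    import numpy as np

    def fuse(estimates, covariances):
        """Linear-minimum-variance fusion of local estimates (x_i, P_i)."""
        infos = [np.linalg.inv(P) for P in covariances]   # information matrices
        P_fused = np.linalg.inv(sum(infos))
        x_fused = P_fused @ sum(I @ x for I, x in zip(infos, estimates))
        return x_fused, P_fused

    x1, P1 = np.array([1.00, 0.50]), np.diag([0.04, 0.09])  # e.g., INS/GNSS local filter
    x2, P2 = np.array([1.10, 0.45]), np.diag([0.09, 0.04])  # e.g., INS/CNS local filter
    x, P = fuse([x1, x2], [P1, P2])
    print("fused state:", x)
    ```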

  13. Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter.

    PubMed

    Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Zhong, Yongmin; Gu, Chengfan

    2018-02-06

This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate the globally optimal state estimation by fusion of local estimations. The proposed methodology effectively suppresses the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation.

  14. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    USGS Publications Warehouse

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed: geospatial service search is implemented to find coarse services on the web, and ontology reasoning is designed to derive refined services from the coarse ones. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on search engines and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  15. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology for realising an HPS platform. This research is driven by issues concerning large-scale simulation resource deployment, complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping, and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) greatly reduce deployment time and increase flexibility in simulation environment construction, and (3) achieve fault-tolerant simulation.

  16. 76 FR 72134 - Annual Charges for Use of Government Lands

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-22

    ... revise the methodology used to compute these annual charges. Under the proposed rule, the Commission would create a fee schedule based on the U.S. Bureau of Land Management's (BLM) methodology for calculating rental rates for linear rights of way. This methodology includes a land value per acre, an...

  17. A risk-based decision support framework for selection of appropriate safety measure system for underground coal mines.

    PubMed

    Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar

    2017-03-01

In the context of the underground coal mining industry, increased economic pressures around implementing additional safety measure systems, along with growing public awareness of the need to ensure a high level of worker safety, have put great pressure on managers to find solutions that are both safe and economically viable. Risk-based decision support systems play an important role in finding such solutions amongst candidate alternatives with respect to multiple decision criteria. Therefore, in this paper, a unified risk-based decision-making methodology is proposed for selecting an appropriate safety measure system for an underground coal mining operation with respect to multiple risk criteria such as financial risk, operating risk, and maintenance risk. The proposed methodology uses interval-valued fuzzy set theory for modelling the vagueness and subjectivity in the estimates of fuzzy risk ratings to support appropriate decisions. The methodology is based on aggregative fuzzy risk analysis and multi-criteria decision making. Selection decisions are made with an understanding of the total integrated risk likely to be incurred by adopting a particular safety system alternative. The effectiveness of the proposed methodology has been validated through a real-time case study, and the resulting final priority ranking appears fairly consistent.

  18. The Development of a Checklist to Enhance Methodological Quality in Intervention Programs.

    PubMed

    Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2016-01-01

The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed.

  19. The Development of a Checklist to Enhance Methodological Quality in Intervention Programs

    PubMed Central

    Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa

    2016-01-01

The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed. PMID:27917143

  20. Diffusion and decay chain of radioisotopes in stagnant water in saturated porous media.

    PubMed

    Guzmán, Juan; Alvarez-Ramirez, Jose; Escarela-Pérez, Rafael; Vargas, Raúl Alejandro

    2014-09-01

The analysis of the diffusion of radioisotopes in stagnant water in saturated porous media is important to validate the performance of barrier systems used in radioactive repositories. In this work a methodology is developed to determine the radioisotope concentration in a two-reservoir configuration: a saturated porous medium with stagnant water surrounded by two reservoirs. The concentrations are obtained for all the radioisotopes of the decay chain using the concept of overvalued concentration. A methodology based on the variable separation method is proposed for the solution of the transport equation. Its novelty lies in the factorization of the overvalued concentration into two factors: one that describes diffusion without decay and another that describes decay without diffusion. With the proposed methodology it is possible to determine the time required for the injective and diffusive concentrations in the reservoirs to become equal; this time is inversely proportional to the diffusion coefficient. In addition, the proposed methodology yields the time required to reach a linear, constant spatial distribution of the concentration in the porous medium, which is likewise inversely proportional to the diffusion coefficient. In order to validate the proposed methodology, the distributions of the radioisotope concentrations are compared with other experimental and numerical works. Copyright © 2014 Elsevier Ltd. All rights reserved.
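
    A hedged numerical illustration of the factorization idea for a single isotope: if the reservoir concentrations themselves decay as exp(-lambda*t), the decaying solution equals the no-decay diffusion solution multiplied by exp(-lambda*t). The diffusion factor below is computed with a simple explicit finite-difference scheme between two fixed-concentration reservoirs; the geometry, D, and lambda values are assumptions.

    ```python
    import numpy as np

    L, n = 0.1, 101                         # domain length [m], grid nodes
    D = 1e-9                                # diffusion coefficient [m^2/s] (assumed)
    lam = np.log(2.0) / 8.0e8               # decay constant [1/s] (assumed half-life)
    dx = L / (n - 1)
    dt = 0.4 * dx**2 / D                    # stable explicit time step

    u = np.zeros(n)
    u[0], u[-1] = 1.0, 0.0                  # fixed reservoir boundary values

    t = 0.0
    for _ in range(20000):                  # march the no-decay diffusion factor
        u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        t += dt

    c = np.exp(-lam * t) * u                # decay factor x diffusion factor
    print("elapsed time [s]:", t, " mid-plane concentration:", c[n // 2])
    ```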

  1. Design and analysis of sustainable computer mouse using design for disassembly methodology

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were evaluated to determine their environmental impact. Sustainability analysis was conducted using the software SolidWorks. As a result, high-density PE gives the lowest environmental impact while still offering a high maximum stress value.

  2. A new hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1988-01-01

    This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.

  3. Methodological challenges to human medical study.

    PubMed

    Zhong, Yixin; Liu, Baoyan; Qu, Hua; Xie, Qi

    2014-09-01

With the transformation of the modern medical paradigm, medical studies are confronted with methodological challenges. By analyzing the two methodologies that exist in the study of physical matter systems and information systems, the article points out that traditional Chinese medicine (TCM), especially treatment based on syndrome differentiation, embodies the information conception of methodological positions, while western medicine represents the matter conception. It proposes a new way of thinking about the combination of TCM and western medicine by combining the two kinds of methodologies.

  4. Proposed Objective Odor Control Test Methodology for Waste Containment

    NASA Technical Reports Server (NTRS)

    Vos, Gordon

    2010-01-01

The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentration. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 PPM, and a limit of quantitation of 15 PPB.

  5. A 3D model retrieval approach based on Bayesian networks lightfield descriptor

    NASA Astrophysics Data System (ADS)

    Xiao, Qinhan; Li, Yanjun

    2009-12-01

    A new 3D model retrieval methodology is proposed by exploiting a novel Bayesian networks lightfield descriptor (BNLD). There are two key novelties in our approach: (1) a BN-based method for building the lightfield descriptor; and (2) a 3D model retrieval scheme based on the proposed BNLD. To overcome the disadvantages of existing 3D model retrieval methods, we explore BNs for building a new lightfield descriptor. First, the 3D model is placed in a lightfield and about 300 binary views are obtained along a sphere; Fourier descriptors and Zernike moment descriptors are then calculated from these binary views, and the shape feature sequence is learned into a BN model using a BN learning algorithm. Second, we propose a new 3D model retrieval method that calculates the Kullback-Leibler divergence (KLD) between BNLDs. Benefiting from statistical learning, our BNLD is more robust to noise than existing descriptors. A comparison between our method and the lightfield descriptor-based approach is conducted to demonstrate the effectiveness of our proposed methodology.
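
    As an illustration of the retrieval step, the sketch below ranks stored models by a symmetrised Kullback-Leibler divergence between descriptor distributions. The histograms stand in for the learned BNLD parameters, which the paper derives by Bayesian network learning; all names and numbers here are hypothetical.

    ```python
    import numpy as np

    def kld(p, q, eps=1e-12):
        """Kullback-Leibler divergence D(p || q) between discrete distributions."""
        p = np.asarray(p, dtype=float) + eps
        q = np.asarray(q, dtype=float) + eps
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)))

    def rank_models(query_hist, db_hists):
        """Rank database models by symmetrised KLD to the query descriptor."""
        scores = {name: 0.5 * (kld(query_hist, h) + kld(h, query_hist))
                  for name, h in db_hists.items()}
        return sorted(scores, key=scores.get)          # most similar first

    db = {"chair": [0.20, 0.50, 0.30], "car": [0.70, 0.10, 0.20]}
    print(rank_models([0.65, 0.15, 0.20], db))         # ['car', 'chair']
    ```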

  6. Guidelines for reporting evaluations based on observational methodology.

    PubMed

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  7. Finding user personal interests by tweet-mining using advanced machine learning algorithm in R

    NASA Astrophysics Data System (ADS)

    Krithika, L. B.; Roy, P.; Asha Jerlin, M.

    2017-11-01

    Social media plays a key role in every individual's life as a channel for expressing personal views, likes, and dislikes. A novel mechanism is proposed to infer the topics of interest of individual users in the Twitter social network. It has been observed that a Twitter user generally follows experts on the topics of his or her interest in order to acquire information on those topics. A methodology based on social annotations is therefore used to first deduce the topical expertise of popular Twitter users and then transitively infer the interests of the users who follow them. This methodology is a sharp departure from the traditional techniques of inferring the interests of a user from the tweets that he or she posts or receives. It is shown that the topics of interest inferred by the proposed methodology are far superior to the topics extracted by state-of-the-art techniques such as topic models (Labelled LDA) applied to tweets. Based upon the proposed methodology, a system, "Who is interested in what", has been built, which can infer the interests of millions of Twitter users.
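
    A minimal sketch of the transitive inference idea, assuming expert topic labels have already been deduced from social annotations; the accounts and labels below are hypothetical.

    ```python
    from collections import Counter

    # Hypothetical expert topic labels deduced from social annotations
    expert_topics = {"@nasa": ["space"], "@bbchealth": ["health"],
                     "@espn": ["sports"], "@nature": ["science", "space"]}

    def infer_interests(followings, top_k=3):
        """Transitively infer a user's interests from the experts they follow."""
        counts = Counter(t for e in followings for t in expert_topics.get(e, []))
        return [topic for topic, _ in counts.most_common(top_k)]

    print(infer_interests(["@nasa", "@nature", "@espn"]))  # ['space', 'science', 'sports']
    ```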

  8. Computerized methodology for micro-CT and histological data inflation using an IVUS based translation map.

    PubMed

    Athanasiou, Lambros S; Rigas, George A; Sakellarios, Antonis I; Exarchos, Themis P; Siogkas, Panagiotis K; Naka, Katerina K; Panetta, Daniele; Pelosi, Gualtiero; Vozzi, Federico; Michalis, Lampros K; Parodi, Oberdan; Fotiadis, Dimitrios I

    2015-10-01

    A framework for the inflation of micro-CT and histology data using intravascular ultrasound (IVUS) images is presented. The proposed methodology consists of three steps. In the first step, the micro-CT/histological images are manually co-registered with IVUS by experts using fiducial points as landmarks. In the second step, the lumens of both the micro-CT/histological images and the IVUS images are automatically segmented. Finally, in the third step, the micro-CT/histological images are inflated by applying a transformation method to each image. The transformation method is based on the contour difference between the IVUS and micro-CT/histological images. In order to validate the proposed image inflation methodology, plaque areas in the inflated micro-CT and histological images are compared with those in the IVUS images. The proposed methodology for inflating micro-CT/histological images increases the sensitivity of plaque area matching between the inflated and the IVUS images (by 7% and 22% in histological and micro-CT images, respectively). Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.

    PubMed

    Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph

    2010-06-01

    We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and performing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified that reach overall significance and would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.

  10. Determination of smoke plume and layer heights using scanning lidar data

    Treesearch

    Vladimir A. Kovalev; Alexander Petkov; Cyle Wold; Shawn Urbanski; Wei Min Hao

    2009-01-01

    The methodology of using mobile scanning lidar data for investigation of smoke plume rise and high-resolution smoke dispersion is considered. The methodology is based on the lidar-signal transformation proposed recently [Appl. Opt. 48, 2559 (2009)]. In this study, similar methodology is used to create the atmospheric heterogeneity height indicator (HHI...

  11. Implementation Proposal of Computer-Based Office Automation for Republic of Korea Army Intelligence Corps (ROKAIC).

    DTIC Science & Technology

    1987-03-01

    contends his soft systems methodology is such an approach. [Ref. 2: pp. 105-107] Overview of this Methodology is meant for addressing fuzzy, ill...could form the basis of office systems development: Checkland's (1981) soft systems methodology, Pava's (1983) sociotechnical design, and Mumford and

  12. An Overview of Prognosis Health Management Research at Glenn Research Center for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  13. An Overview of Prognosis Health Management Research at GRC for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include 1) diagnostic/detection methodology, 2) prognosis/lifing methodology, 3) diagnostic/prognosis linkage, 4) experimental validation and 5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multi-mechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  14. Introducing a new bond reactivity index: Philicities for natural bond orbitals.

    PubMed

    Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel

    2017-12-22

    In the present work, a new methodology for obtaining reactivity indices (philicities) is proposed. It is based on reactivity functions such as the Fukui function or the dual descriptor, and makes it possible to project the information from reactivity functions onto molecular orbitals, instead of onto the atoms of the molecule (atomic reactivity indices). The methodology focuses on the molecules' natural bond orbitals (bond reactivity indices) because these orbitals have the advantage of being localized, allowing the reaction site of an electrophile or nucleophile to be determined within a very precise molecular region. This methodology provides a "philicity" index for every NBO, and a representative set of molecules has been used to test the new definition. A new methodology has also been developed to compare the "finite difference" and the "frontier molecular orbital" approximations. To facilitate their use, the proposed methodology as well as the possibility of calculating the new indices have been implemented in a new version of the UCA-FUKUI software. In addition, condensation schemes based on atomic populations of the "atoms in molecules" theory, the Hirshfeld population analysis, the approximation of Mulliken (with a minimal basis set) and electrostatic potential-derived charges have also been implemented, including the calculation of "bond reactivity indices" defined in previous studies. Graphical abstract: A new methodology for obtaining bond reactivity indices (philicities) is proposed, which makes it possible to project the information from reactivity functions onto molecular orbitals. The proposed methodology, as well as the calculation of the new indices, has been implemented in a new version of the UCA-FUKUI software, which can also use new atomic condensation schemes and includes new "utilities".

  15. Advanced Analytical Methodologies Based on Raman Spectroscopy to Detect Prebiotic and Biotic Molecules: Applicability in the Study of the Martian Nakhlite NWA 6148 Meteorite

    NASA Astrophysics Data System (ADS)

    Madariaga, J. M.; Torre-Fdez, I.; Ruiz-Galende, P.; Aramendia, J.; Gomez-Nubla, L.; Fdez-Ortiz de Vallejuelo, S.; Maguregui, M.; Castro, K.; Arana, G.

    2018-04-01

    Advanced methodologies based on Raman spectroscopy are proposed to detect prebiotic and biotic molecules in returned samples from Mars: (a) optical microscopy with confocal micro-Raman, (b) the SCA instrument, (c) Raman Imaging. Examples for NWA 6148.

  16. Ambient Vibration Testing for Story Stiffness Estimation of a Heritage Timber Building

    PubMed Central

    Min, Kyung-Won; Kim, Junhee; Park, Sung-Ah; Park, Chan-Soo

    2013-01-01

    This paper investigates the dynamic characteristics of a historic wooden structure by ambient vibration testing, presenting a novel story stiffness estimation methodology for the purpose of vibration-based structural health monitoring. For the ambient vibration testing, measured structural responses are analyzed by two output-only system identification methods (i.e., frequency domain decomposition and stochastic subspace identification) to estimate modal parameters. The proposed story stiffness estimation methodology is based on an eigenvalue problem derived from a vibratory rigid body model. Using the identified natural frequencies, the eigenvalue problem is efficiently solved and uniquely yields the story stiffnesses. It is noteworthy that application of the proposed methodology is not necessarily confined to the wooden structure exampled in the paper. PMID:24227999
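
    The eigenvalue problem can be inverted numerically for a simple two-story shear-frame stand-in for the rigid body model: given known story masses and the natural frequencies identified from ambient vibration, solve for the story stiffnesses. The masses and frequencies below are illustrative, not values from the paper.

    ```python
    import numpy as np
    from scipy.linalg import eigh
    from scipy.optimize import fsolve

    # Known story masses (kg) and identified natural frequencies (Hz)
    m = np.array([2.0e4, 1.5e4])
    w2_meas = (2 * np.pi * np.array([1.8, 4.6])) ** 2    # target eigenvalues (rad/s)^2

    def residual(k):
        """Mismatch between model and identified eigenvalues for a 2-story shear frame."""
        k1, k2 = k
        K = np.array([[k1 + k2, -k2], [-k2, k2]])
        w2 = eigh(K, np.diag(m), eigvals_only=True)      # solve K x = w^2 M x
        return w2 - w2_meas

    k_est = fsolve(residual, x0=[5e6, 5e6])
    print("estimated story stiffnesses (N/m):", k_est)
    ```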

  17. KSC management training system project

    NASA Technical Reports Server (NTRS)

    Sepulveda, Jose A.

    1993-01-01

    The stated objectives for the summer of 1993 were: to review the Individual Development Plan Surveys for 1994 in order to automate the analysis of the Needs Assessment effort; and to develop and implement evaluation methodologies to perform ongoing program-wide course-to-course assessment. This includes the following: to propose a methodology to develop and implement objective, performance-based assessment instruments for each training effort; to mechanize course evaluation forms and develop software to facilitate the data gathering, analysis, and reporting processes; and to implement the methodology, forms, and software in at least one training course or seminar selected among those normally offered in the summer at KSC. Section two of this report addresses the work done in regard to the Individual Development Plan Surveys for 1994. Section three presents the methodology proposed to develop and implement objective, performance-based assessment instruments for each training course offered at KSC.

  18. Modeling Single-Event Transient Propagation in a SiGe BiCMOS Direct-Conversion Receiver

    NASA Astrophysics Data System (ADS)

    Ildefonso, Adrian; Song, Ickhyun; Tzintzarov, George N.; Fleetwood, Zachary E.; Lourenco, Nelson E.; Wachter, Mason T.; Cressler, John D.

    2017-08-01

    The propagation of single-event transient (SET) signals in a silicon-germanium direct-conversion receiver carrying modulated data is explored. A theoretical analysis of transient propagation, verified by simulation, is presented. A new methodology to characterize and quantify the impact of SETs in communication systems carrying modulated data is proposed. The proposed methodology uses a pulsed radiation source to induce distortions in the signal constellation. The error vector magnitude due to SETs can then be calculated to quantify errors. Two different modulation schemes were simulated: QPSK and 16-QAM. The distortions in the constellation diagram agree with the presented circuit theory. Furthermore, the proposed methodology was applied to evaluate the improvements in the SET response due to a known radiation-hardening-by-design (RHBD) technique, where the common-base device of the low-noise amplifier was operated in inverse mode. The proposed methodology can be a valid technique to determine the most sensitive parts of a system carrying modulated data.
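
    A minimal sketch of the EVM computation at the heart of the proposed methodology: the RMS error vector between received and reference constellation points, normalised by the reference RMS power (normalisation conventions vary). The transient perturbation injected below is purely illustrative.

    ```python
    import numpy as np

    def evm_percent(received, reference):
        """RMS error vector magnitude as a percentage of reference RMS power."""
        received = np.asarray(received, dtype=complex)
        reference = np.asarray(reference, dtype=complex)
        err = received - reference
        return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) /
                               np.mean(np.abs(reference) ** 2))

    # QPSK symbols with a transient-like perturbation on a few samples
    ref = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j] * 64) / np.sqrt(2)
    rx = ref.copy()
    rx[100:104] += 0.4 * np.exp(1j * 0.3)   # hypothetical SET-induced deviation
    print(f"EVM = {evm_percent(rx, ref):.2f}%")
    ```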

  19. A methodology for the design and evaluation of user interfaces for interactive information systems. Ph.D. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Farooq, Mohammad U.

    1986-01-01

    The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.

  20. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network

    PubMed Central

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using EEMD alone, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data is used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information. PMID:25938760
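
    A sketch of the feature-extraction step, assuming the third-party PyEMD package for the EEMD decomposition; the synthetic signal stands in for gear pump vibration data.

    ```python
    import numpy as np
    from PyEMD import EEMD  # assumes the PyEMD package is installed

    def imf_energy_features(signal, t=None, max_imf=6):
        """Decompose a vibration signal with EEMD and return normalised IMF energies."""
        eemd = EEMD()
        imfs = eemd.eemd(signal, t, max_imf=max_imf)
        energies = np.array([np.sum(imf ** 2) for imf in imfs])
        return energies / energies.sum()    # feature vector for the fault-feature layer

    # Synthetic gear-pump-like signal: two tones plus noise
    fs, n = 2000, 4000
    t = np.arange(n) / fs
    s = (np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 380 * t)
         + 0.2 * np.random.randn(n))
    print(imf_energy_features(s, t))
    ```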

  1. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    PubMed

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using EEMD alone, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data is used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.

  2. Privacy-Aware Relevant Data Access with Semantically Enriched Search Queries for Untrusted Cloud Storage Services.

    PubMed

    Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong

    2016-01-01

    Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. A subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting a search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and leverages authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a conspired attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider.
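
    For context, a minimal sketch of the baseline trapdoor-style search that the paper generalises: deterministic keyword trapdoors (here HMAC digests under a hypothetical subscriber key) index documents so the provider can match queries without seeing plaintext keywords. The paper's index-based scheme goes beyond this by supporting semantic and conjunctive queries.

    ```python
    import hmac, hashlib

    KEY = b"subscriber-secret"   # hypothetical shared secret

    def trapdoor(keyword: str) -> str:
        """Deterministic keyword trapdoor (baseline scheme the paper extends)."""
        return hmac.new(KEY, keyword.lower().encode(), hashlib.sha256).hexdigest()

    def build_index(docs: dict) -> dict:
        """Map trapdoors to document ids; the provider never sees plaintext keywords."""
        index = {}
        for doc_id, words in docs.items():
            for w in words:
                index.setdefault(trapdoor(w), set()).add(doc_id)
        return index

    index = build_index({"d1": ["cloud", "privacy"], "d2": ["privacy", "semantics"]})
    print(index.get(trapdoor("privacy"), set()))   # {'d1', 'd2'}
    ```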

  3. Privacy-Aware Relevant Data Access with Semantically Enriched Search Queries for Untrusted Cloud Storage Services

    PubMed Central

    Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong

    2016-01-01

    Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. A subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting a search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and leverages authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a conspired attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider. PMID:27571421

  4. Toward the Long-Term Scientific Study of Encounter Group Phenomena: I. Methodological Considerations.

    ERIC Educational Resources Information Center

    Diamond, Michael Jay; Shapiro, Jerrold Lee

    This paper proposes a model for the long-term scientific study of encounter, T-, and sensitivity groups. The authors see the need for overcoming major methodological and design inadequacies of such research. They discuss major methodological flaws in group outcome research as including: (1) lack of adequate base rate or pretraining measures; (2)…

  5. A Negative Selection Immune System Inspired Methodology for Fault Diagnosis of Wind Turbines.

    PubMed

    Alizadeh, Esmaeil; Meskin, Nader; Khorasani, Khashayar

    2017-11-01

    High operational and maintenance costs represent major economic constraints in the wind turbine (WT) industry. These concerns have made the investigation of fault diagnosis for WT systems an extremely important and active area of research. In this paper, an immune system (IS) inspired methodology for performing fault detection and isolation (FDI) of a WT system is proposed and developed. The proposed scheme is based on the self/nonself discrimination paradigm of a biological IS. Specifically, the negative selection mechanism [negative selection algorithm (NSA)] of the human body is utilized. In this paper, a hierarchical bank of NSAs is designed to detect and isolate both individual as well as simultaneously occurring faults common to WTs. A smoothing moving window filter is then utilized to further improve the reliability and performance of the FDI scheme. Moreover, the performance of our proposed scheme is compared with another state-of-the-art data-driven technique, namely support vector machines (SVMs), to demonstrate and illustrate the superiority and advantages of our proposed NSA-based FDI scheme. Finally, a nonparametric statistical comparison test is implemented to evaluate our proposed methodology against the SVM under various fault severities.
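
    A toy negative selection algorithm in the spirit of the proposed scheme: detectors are generated at random and retained only if they do not match "self" (healthy) samples; a sample covered by any detector is flagged as nonself (faulty). Radii, dimensions and data here are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def train_detectors(self_samples, n_detectors=200, radius=0.12, dim=2):
        """Generate random detectors and keep those that do NOT match 'self' data."""
        detectors = []
        while len(detectors) < n_detectors:
            d = rng.random(dim)
            if np.min(np.linalg.norm(self_samples - d, axis=1)) > radius:
                detectors.append(d)
        return np.array(detectors)

    def is_fault(sample, detectors, radius=0.12):
        """A sample covered by any detector is flagged as nonself (faulty)."""
        return bool(np.any(np.linalg.norm(detectors - sample, axis=1) <= radius))

    healthy = rng.normal(0.5, 0.05, size=(500, 2))     # normal-operation features
    detectors = train_detectors(healthy)
    print(is_fault(np.array([0.5, 0.52]), detectors))  # typically False (self)
    print(is_fault(np.array([0.9, 0.10]), detectors))  # typically True (nonself)
    ```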

  6. A New Methodology for 3D Target Detection in Automotive Radar Applications

    PubMed Central

    Baselice, Fabio; Ferraioli, Giampaolo; Lukin, Sergyi; Matuozzo, Gianfranco; Pascazio, Vito; Schirinzi, Gilda

    2016-01-01

    Today there is a growing interest in automotive sensor monitoring systems. One of the main challenges is to make them an effective and valuable aid in dangerous situations, improving transportation safety. The main limitation of visual aid systems is that they do not produce accurate results in critical visibility conditions, such as in the presence of rain, fog or smoke. Radar systems can greatly help in overcoming such limitations. In particular, imaging radar is gaining interest in the framework of Driver Assistance Systems (DAS). In this manuscript, a new methodology able to reconstruct the 3D imaged scene and to detect the presence of multiple targets within each line of sight is proposed. The technique is based on the use of Compressive Sensing (CS) theory and estimates, for each line of sight, the multiple targets present, their range distances and their reflectivities. Moreover, a fast approach for 2D focusing based on the FFT algorithm is proposed. After the description of the proposed methodology, different simulated case studies are reported in order to evaluate the performance of the proposed approach. PMID:27136558
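
    A minimal CS-style sketch of recovering a sparse range profile per line of sight via orthogonal matching pursuit (a standard CS solver, not necessarily the one used in the paper); the measurement matrix and target positions are synthetic.

    ```python
    import numpy as np

    def omp(A, y, k):
        """Orthogonal matching pursuit: recover a k-sparse x from y = A x."""
        residual, support = y.copy(), []
        x = np.zeros(A.shape[1])
        for _ in range(k):
            support.append(int(np.argmax(np.abs(A.T @ residual))))
            sub = A[:, support]
            coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
            residual = y - sub @ coef
        x[support] = coef
        return x

    rng = np.random.default_rng(1)
    m, n, k = 40, 120, 3                  # few measurements, sparse range profile
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[[10, 55, 90]] = [1.0, -0.7, 0.4]   # 3 targets along the line of sight
    x_hat = omp(A, A @ x_true, k)
    print(np.nonzero(x_hat)[0])           # recovered target range bins
    ```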

  7. Pressure-based high-order TVD methodology for dynamic stall control

    NASA Astrophysics Data System (ADS)

    Yang, H. Q.; Przekwas, A. J.

    1992-01-01

    The quantitative prediction of the dynamics of separating unsteady flows, such as dynamic stall, is of crucial importance. This six-month SBIR Phase 1 study has developed several new pressure-based methodologies for solving the 3D Navier-Stokes equations in both stationary and moving (body-conforming) coordinates. The present pressure-based algorithm is equally efficient for low speed incompressible flows and high speed compressible flows. The discretization of convective terms by the presently developed high-order TVD schemes requires no artificial dissipation and can properly resolve the concentrated vortices in the wing-body flow with minimal numerical diffusion. It is demonstrated that the proposed Newton's iteration technique not only increases the convergence rate but also strongly couples the iteration between pressure and velocities. The proposed hyperbolization of the pressure correction equation is shown to increase the solver's efficiency. The above proposed methodologies were implemented in an existing CFD code, REFLEQS. The modified code was used to simulate both static and dynamic stall on two- and three-dimensional wing-body configurations. Three-dimensional effects and flow physics are discussed.
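
    As an illustration of a high-order TVD discretisation of a convective term (here 1D linear advection with a minmod-limited Lax-Wendroff-type flux, a standard textbook scheme rather than the REFLEQS implementation), note how the limiter suppresses overshoots at the square-wave discontinuities:

    ```python
    import numpy as np

    def minmod(a, b):
        return np.where(a * b > 0.0,
                        np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

    def tvd_advect(u, c, steps):
        """Limited Lax-Wendroff-type TVD update for u_t + a u_x = 0 (c = a dt/dx > 0)."""
        for _ in range(steps):
            du = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)  # limited slope
            F = c * (u + 0.5 * (1.0 - c) * du)                  # flux through i+1/2
            u = u - (F - np.roll(F, 1))
        return u

    x = np.linspace(0.0, 1.0, 200, endpoint=False)
    u0 = np.where((x > 0.3) & (x < 0.5), 1.0, 0.0)              # square wave
    print(tvd_advect(u0, c=0.4, steps=250).max())               # stays <= 1: no overshoot
    ```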

  8. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems.

    PubMed

    Aguilar-López, Ricardo; Mata-Machuca, Juan L

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and a master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme.
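
    A simple numerical sketch of master-slave synchronization of Lorenz systems by output injection on the measured x state. This only illustrates the observer-based idea: it is an asymptotic, not the paper's finite-time, design, and the gain and initial conditions are arbitrary.

    ```python
    import numpy as np

    sigma, rho, beta, L = 10.0, 28.0, 8.0 / 3.0, 20.0   # L: observer injection gain

    def lorenz(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def slave(s_hat, x_meas):
        """Observer driven by the master's measured output x (output injection)."""
        f = lorenz(s_hat)
        f[0] += L * (x_meas - s_hat[0])
        return f

    dt, steps = 1e-3, 20000
    s = np.array([1.0, 1.0, 1.0])                 # master state
    s_hat = np.array([-5.0, 5.0, 20.0])           # deliberately wrong initial guess
    for _ in range(steps):                        # forward-Euler integration
        s = s + dt * lorenz(s)
        s_hat = s_hat + dt * slave(s_hat, s[0])
    print("final synchronization error:", np.linalg.norm(s - s_hat))
    ```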

  9. A New Finite-Time Observer for Nonlinear Systems: Applications to Synchronization of Lorenz-Like Systems

    PubMed Central

    Aguilar-López, Ricardo

    2016-01-01

    This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and a master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme. PMID:27738651

  10. Evaluating Payments for Environmental Services: Methodological Challenges

    PubMed Central

    2016-01-01

    Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It revises and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850

  11. Using State Estimation Residuals to Detect Abnormal SCADA Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Jian; Chen, Yousu; Huang, Zhenyu

    2010-04-30

    Detection of abnormal supervisory control and data acquisition (SCADA) data is critically important for safe and secure operation of modern power systems. In this paper, a methodology of abnormal SCADA data detection based on state estimation residuals is presented. Preceded with a brief overview of outlier detection methods and bad SCADA data detection for state estimation, the framework of the proposed methodology is described. Instead of using original SCADA measurements as the bad data sources, the residuals calculated based on the results of the state estimator are used as the input for the outlier detection algorithm. The BACON algorithm is applied to the outlier detection task. The IEEE 118-bus system is used as a test base to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
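
    A simplified univariate BACON-style pass over state estimation residuals, compared with a plain 3-σ rule. The real BACON algorithm works on multivariate data with Mahalanobis distances, so this is only a sketch of its grow-a-clean-subset idea; the residuals and injected outliers are synthetic.

    ```python
    import numpy as np
    from scipy import stats

    def bacon_outliers(r, m0=20, alpha=0.05, max_iter=50):
        """Simplified univariate BACON: grow a 'clean' subset, flag what stays outside."""
        r = np.asarray(r, dtype=float)
        subset = np.argsort(np.abs(r - np.median(r)))[:m0]   # start nearest the median
        for _ in range(max_iter):
            mu, sd = r[subset].mean(), r[subset].std(ddof=1)
            cut = stats.norm.ppf(1 - alpha / (2 * len(r)))   # Bonferroni-style cutoff
            new = np.where(np.abs(r - mu) <= cut * sd)[0]
            if len(new) == len(subset):
                break
            subset = new
        return np.setdiff1d(np.arange(len(r)), subset)

    residuals = np.random.default_rng(2).normal(0.0, 1.0, 300)
    residuals[[10, 42]] = [8.0, -7.5]                        # injected bad SCADA points
    print("BACON flags:", bacon_outliers(residuals))
    print("3-sigma flags:", np.where(np.abs(residuals) > 3 * residuals.std())[0])
    ```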

  12. TRI Relative Risk-Based Environmental Indicators: Summary of Comments Received on the Draft 1992 Methodology and Responses to Comment, May 1997

    EPA Pesticide Factsheets

    This document contains general comments on the original Indicators methodology, the toxicity weighting, the chronic ecological indicator and other issues. OPPT's responses and proposed changes are also discussed.

  13. Development of risk-based decision methodology for facility design.

    DOT National Transportation Integrated Search

    2014-06-01

    This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...

  14. The substantiation of methodical instrumentation to increase the tempo of high-rise construction in region

    NASA Astrophysics Data System (ADS)

    Belyaeva, Svetlana; Makeeva, Tatyana; Chugunov, Andrei; Andreeva, Peraskovya

    2018-03-01

    One of the important conditions for the effective renovation of housing in a region through high-rise construction projects is the attraction of investment by forming a favorable investment climate, as well as the reduction of administrative barriers in construction and the renewal of the fixed assets of housing and communal services. The article proposes methodological bases for assessing the state of the investment climate in the region, as well as a methodology for the formation and evaluation of the investment program of a housing and communal services enterprise. The proposed methodologies are tested on the example of the Voronezh region. The authors also show the necessity and expediency of using a consulting mechanism in the development of state and non-state investment projects and programs.

  15. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis while considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in truly representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while its accuracy decreases, with an error of 24%.
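
    A compact FORM sketch: the HL-RF iteration locating the design point of an explicit performance function in standard normal space. The quadratic g below is a toy stand-in for the response surface fitted to the numerical slope model, with purely illustrative coefficients.

    ```python
    import numpy as np
    from scipy.stats import norm

    def g(u):
        """Toy limit-state function in standard normal space (failure when g <= 0)."""
        return 6.0 - 2.0 * u[0] - u[1] - 0.3 * u[0] * u[1]

    def grad(f, u, h=1e-6):
        """Central-difference gradient."""
        return np.array([(f(u + h * e) - f(u - h * e)) / (2 * h)
                         for e in np.eye(len(u))])

    u = np.array([0.1, 0.1])
    for _ in range(100):                       # HL-RF iteration for the design point
        gv, gg = g(u), grad(g, u)
        u_new = (gg @ u - gv) * gg / (gg @ gg)
        if np.linalg.norm(u_new - u) < 1e-9:
            u = u_new
            break
        u = u_new

    beta = np.linalg.norm(u)                   # reliability index
    print(f"beta = {beta:.3f}, Pf ~ {norm.cdf(-beta):.3e}")
    ```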

  16. Using Geographic Information Systems to measure retail food environments: Discussion of methodological considerations and a proposed reporting checklist (Geo-FERN).

    PubMed

    Wilkins, Emma L; Morris, Michelle A; Radley, Duncan; Griffiths, Claire

    2017-03-01

    Geographic Information Systems (GIS) are widely used to measure retail food environments. However, the methods used are heterogeneous, limiting the collation and interpretation of evidence. This problem is amplified by unclear and incomplete reporting of methods. This discussion (i) identifies common dimensions of methodological diversity across GIS-based food environment research (data sources, data extraction methods, food outlet construct definitions, geocoding methods, and access metrics), (ii) reviews the impact of different methodological choices, and (iii) highlights areas where reporting is insufficient. On the basis of this discussion, the Geo-FERN reporting checklist is proposed to support methodological reporting and interpretation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Performance analysis of complex repairable industrial systems using PSO and fuzzy confidence interval based methodology.

    PubMed

    Garg, Harish

    2013-03-01

    The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find the most optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs by utilizing available resources and uncertain data. For this, an availability-cost optimization model has been constructed for determining the optimal design parameters that improve the system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters which affect the system performance are obtained in the form of fuzzy membership functions by the proposed confidence interval based fuzzy Lambda-Tau (CIBFLT) methodology. The computed results from CIBFLT are compared with the existing fuzzy Lambda-Tau methodology. Sensitivity analysis on the system MTBF has also been addressed. The methodology is illustrated through a case study of the washing unit, the main part of a paper mill. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  18. A methodology for the evaluation of the human-bioclimatic performance of open spaces

    NASA Astrophysics Data System (ADS)

    Charalampopoulos, Ioannis; Tsiros, Ioannis; Chronopoulou-Sereli, Aik.; Matzarakis, Andreas

    2017-05-01

    The purpose of this paper is to present a simple methodology to improve the evaluation of the human-biometeorological benefits of open spaces. It is based on two groups of new indices built on the well-known physiologically equivalent temperature (PET) index. This simple methodology, along with the accompanying indices, allows a qualitative and quantitative evaluation of the climatic behavior of the selected sites. The proposed methodology was applied in a human-biometeorological study in the city of Athens, Greece. The results of this study are in line with those of other related studies, indicating the considerable influence of the sky view factor (SVF), the presence of vegetation, and the building materials on human-biometeorological conditions. The proposed methodology may provide new insights into the decision-making process related to the best configuration of urban open spaces.

  19. A Novel Clustering Methodology Based on Modularity Optimisation for Detecting Authorship Affinities in Shakespearean Era Plays

    PubMed Central

    Craig, Hugh; Berretta, Regina; Moscato, Pablo

    2016-01-01

    In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in information theory. Moreover, we use graph-theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset containing the frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), in order to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method in identifying high-quality clusters that reflect the commonalities in the literary style of the plays. PMID:27571416
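
    A small sketch of the pipeline, assuming the scipy and networkx libraries: word-frequency profiles are compared with the Jensen-Shannon distance, a proximity graph is thresholded, and communities are found by greedy modularity maximisation (a stand-in for the paper's memetic iMA-Net). The plays, frequencies and threshold are hypothetical.

    ```python
    import numpy as np
    import networkx as nx
    from scipy.spatial.distance import jensenshannon
    from networkx.algorithms.community import greedy_modularity_communities

    # Hypothetical word-frequency profiles for four plays
    plays = {"A": [30, 5, 10], "B": [28, 6, 12], "C": [3, 40, 9], "D": [4, 38, 11]}
    names = list(plays)
    P = [np.array(v, dtype=float) / np.sum(v) for v in plays.values()]

    # Proximity graph: connect plays whose Jensen-Shannon distance is small
    G = nx.Graph()
    G.add_nodes_from(names)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            d = jensenshannon(P[i], P[j])
            if d < 0.3:                       # hypothetical proximity threshold
                G.add_edge(names[i], names[j], weight=1.0 - d)

    # Modularity-maximising communities of stylistically similar plays
    print([sorted(c) for c in greedy_modularity_communities(G, weight="weight")])
    ```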

  20. Analysis of torque transmitting behavior and wheel slip prevention control during regenerative braking for high speed EMU trains

    NASA Astrophysics Data System (ADS)

    Xu, Kun; Xu, Guo-Qing; Zheng, Chun-Hua

    2016-04-01

    The wheel-rail adhesion control for regenerative braking systems of high speed electric multiple unit trains is crucial to maintaining the stability, improving the adhesion utilization, and achieving deep energy recovery. There remain technical challenges mainly because of the nonlinear, uncertain, and varying features of wheel-rail contact conditions. This research analyzes the torque transmitting behavior during regenerative braking, and proposes a novel methodology to detect the wheel-rail adhesion stability. Then, applications to the wheel slip prevention during braking are investigated, and the optimal slip ratio control scheme is proposed, which is based on a novel optimal reference generation of the slip ratio and a robust sliding mode control. The proposed methodology achieves the optimal braking performance without the wheel-rail contact information. Numerical simulation results for uncertain slippery rails verify the effectiveness of the proposed methodology.

  1. Using experts feedback in clinical case resolution and arbitration as accuracy diagnosis methodology.

    PubMed

    Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner

    2013-09-01

    This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback and opinions of medical experts. The methodology is based on a comparison between the expert feedback that has helped solve different clinical cases and the expert system that has evaluated these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, providing a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper. A diagnosis system was evaluated by means of this methodology, yielding positive (statistically significant) results when comparing the system with the assessors that participated in its evaluation, through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
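
    The PRAS-M metrics are straightforward to compute once the arbitration process has fixed a confusion matrix; a minimal sketch with hypothetical counts:

    ```python
    def prasm(tp, fp, tn, fn):
        """Precision, Recall, Accuracy, Specificity and MCC from a confusion matrix."""
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        accuracy = (tp + tn) / (tp + fp + tn + fn)
        specificity = tn / (tn + fp)
        mcc = (tp * tn - fp * fn) / (
            ((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5)
        return dict(P=precision, R=recall, A=accuracy, S=specificity, MCC=mcc)

    # Hypothetical counts: system diagnoses versus the arbitrated expert consensus
    print(prasm(tp=84, fp=9, tn=120, fn=17))
    ```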

  2. A neural network based methodology to predict site-specific spectral acceleration values

    NASA Astrophysics Data System (ADS)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
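
    A minimal sketch of the idea using scikit-learn, with a single hidden layer as in the paper; the input parameters and the synthetic ground-motion data below are purely illustrative, not Delhi-specific.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    n = 500
    # Hypothetical inputs: magnitude, epicentral distance (km), Vs30 (m/s), period (s)
    X = np.column_stack([rng.uniform(5, 8, n), rng.uniform(10, 300, n),
                         rng.uniform(150, 760, n), rng.uniform(0.05, 2.0, n)])
    # Synthetic spectral acceleration following a simple attenuation-like trend
    y = np.exp(1.2 * X[:, 0] - 1.1 * np.log(X[:, 1]) - 0.002 * X[:, 2]
               - 0.5 * np.log(X[:, 3] + 0.1)) * rng.lognormal(0.0, 0.3, n)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000,
                                       random_state=0))    # one hidden layer
    model.fit(X, np.log(y))                                # regress log(Sa)
    print(np.exp(model.predict([[7.0, 120.0, 300.0, 0.5]])))
    ```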

  3. Use of evidence-based practice in an aid organisation: a proposal to deal with the variety in terminology and methodology.

    PubMed

    De Buck, Emmy; Pauwels, Nele S; Dieltjens, Tessa; Vandekerckhove, Philippe

    2014-03-01

    As part of its strategy Belgian Red Cross-Flanders underpins all its activities with evidence-based guidelines and systematic reviews. The aim of this publication is to describe in detail the methodology used to achieve this goal within an action-oriented organisation, in a timely and cost-effective way. To demonstrate transparency in our methods, we wrote a methodological charter describing the way in which we develop evidence-based materials to support our activities. Criteria were drawn up for deciding on project priority and the choice of different types of projects (scoping reviews, systematic reviews and evidence-based guidelines). While searching for rigorous and realistically attainable methodological standards, we encountered a wide variety in terminology and methodology used in the field of evidence-based practice. Terminologies currently being used by different organisations and institutions include systematic reviews, systematic literature searches, evidence-based guidelines, rapid reviews, pragmatic systematic reviews, and rapid response service. It is not always clear what the definition and methodology is behind these terms and whether they are used consistently. We therefore describe the terminology and methodology used by Belgian Red Cross-Flanders; criteria for making methodological choices and details on the methodology we use are given. In our search for an appropriate methodology, taking into account time and resource constraints, we encountered an enormous variety of methodological approaches and terminology used for evidence-based materials. In light of this, we recommend that authors of evidence-based guidelines and reviews are transparent and clear about the methodology used. To be transparent about our approach, we developed a methodological charter. This charter may inspire other organisations that want to use evidence-based methodology to support their activities.

  4. A methodology for secure recovery of spacecrafts based on a trusted hardware platform

    NASA Astrophysics Data System (ADS)

    Juliato, Marcio; Gebotys, Catherine

    2017-02-01

    This paper proposes a methodology for the secure recovery of spacecraft and of their cryptographic capabilities in emergency scenarios resulting from major unintentional failures and malicious attacks. The proposed approach employs trusted modules to achieve higher reliability and security levels in space missions due to the presence of integrity check capabilities as well as secure recovery mechanisms. Additionally, several recovery protocols are thoroughly discussed and analyzed against a wide variety of attacks. Exhaustive search attacks are analyzed in a wide variety of contexts and shown to be infeasible, regardless of the computational power of attackers. Experimental results have shown that the proposed methodology allows for the fast and secure recovery of spacecraft, demanding minimal implementation area, power consumption and bandwidth.

  5. Economic feasibility study for new technological alternatives in wastewater treatment processes: a review.

    PubMed

    Molinos-Senante, María; Hernández-Sancho, Francesc; Sala-Garrido, Ramón

    2012-01-01

    The concept of sustainability involves the integration of economic, environmental, and social aspects and this also applies in the field of wastewater treatment. Economic feasibility studies are a key tool for selecting the most appropriate option from a set of technological proposals. Moreover, these studies are needed to assess the viability of transferring new technologies from pilot-scale to full-scale. In traditional economic feasibility studies, the benefits that have no market price, such as environmental benefits, are not considered and are therefore underestimated. To overcome this limitation, we propose a new methodology to assess the economic viability of wastewater treatment technologies that considers internal and external impacts. The estimation of the costs is based on the use of cost functions. To quantify the environmental benefits from wastewater treatment, the distance function methodology is proposed to estimate the shadow price of each pollutant removed in the wastewater treatment. The application of this methodological approach by decision makers enables the calculation of the true costs and benefits associated with each alternative technology. The proposed methodology is presented as a useful tool to support decision making.

  6. Enhanced High Performance Power Compensation Methodology by IPFC Using PIGBT-IDVR

    PubMed Central

    Arumugom, Subramanian; Rajaram, Marimuthu

    2015-01-01

    Currently, power systems are controlled without high-speed control devices and are frequently re-initiated, resulting in a slow process when compared with static electronic devices. Among the various power interruptions in power supply systems, voltage dips play a central role in causing disruption. The dynamic voltage restorer (DVR) is a voltage-control-based process that compensates for line transients in the distributed system. To overcome these issues and to achieve higher speed, a new methodology called the Parallel IGBT-Based Interline Dynamic Voltage Restorer (PIGBT-IDVR) method has been proposed, which mainly focuses on the dynamic processing of energy reloads in common dc-linked energy storage with less adaptive transition. The interline power flow controller (IPFC) scheme has been employed to manage the power transmission between the lines and the restorer method for controlling the reactive power in the individual lines. By employing the proposed methodology, failure of the distributed system is avoided, and better performance than the existing methodologies is achieved. PMID:26613101

  7. Modelling Ecological Cognitive Rehabilitation Therapies for Building Virtual Environments in Brain Injury.

    PubMed

    Martínez-Moreno, J M; Sánchez-González, P; Luna, M; Roig, T; Tormos, J M; Gómez, E J

    2016-01-01

    Brain Injury (BI) has become one of the most common causes of neurological disability in developed countries. Cognitive disorders result in a loss of independence and of patients' quality of life. Cognitive rehabilitation aims to promote patients' skills to achieve their highest degree of personal autonomy. New technologies such as virtual reality or interactive video allow the development of rehabilitation therapies based on reproducible Activities of Daily Living (ADLs), increasing the ecological validity of the therapy. However, the lack of frameworks to formalize and represent the definition of this kind of therapy can be a barrier to the widespread use of interactive virtual environments in clinical routine. The objective is to provide neuropsychologists with a methodology and an instrument to design and evaluate cognitive rehabilitation therapeutic intervention strategies based on ADLs performed in interactive virtual environments. The proposed methodology is used to model therapeutic interventions during virtual ADLs, considering cognitive deficits, expected abnormal interactions and therapeutic hypotheses. It allows identifying abnormal behavioural patterns and designing intervention strategies in order to achieve errorless-based rehabilitation. An ADL case study ('buying bread') is defined according to the guidelines established by the ADL intervention model. This case study is developed, as a proof of principle, using interactive video technology and is used to assess the feasibility of the proposed methodology in the definition of therapeutic intervention procedures. The proposed methodology provides neuropsychologists with an instrument to design and evaluate ADL-based therapeutic intervention strategies, addressing current limitations of virtual scenarios, for use in the ecological rehabilitation of cognitive deficits in daily clinical practice. The developed case study proves the potential of the methodology to design therapeutic intervention strategies; our current work is devoted to designing more experiments in order to provide more evidence of its value.

  8. Developing a stochastic conflict resolution model for urban runoff quality management: Application of info-gap and bargaining theories

    NASA Astrophysics Data System (ADS)

    Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra

    2016-02-01

    In this paper, two deterministic and stochastic multilateral, multi-issue, non-cooperative bargaining methodologies are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which have been defined considering several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing the location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties of some key parameters of SWMM are analyzed using the info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on the utility functions of the different stakeholders. Then, to find the best solution, the bargaining model is applied considering a combination of the robustness and opportuneness criteria for each scenario based on the utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed, located in the northeastern part of Tehran, the capital city of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model when the robustness and opportuneness criteria are considered. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, provides more reliable results.
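
    The scenario-selection step lends itself to a compact illustration. The sketch below is a minimal, purely hypothetical example: the utility values, disagreement point and scenario names are invented, and the info-gap robustness/opportuneness computation of the paper is not reproduced. It picks the management scenario that maximizes the Nash product of stakeholder utility gains.

```python
import numpy as np

# rows: management scenarios, columns: stakeholders (all values invented)
utilities = np.array([
    [0.62, 0.55, 0.71],   # scenario 1, e.g. bioretention cells
    [0.58, 0.66, 0.64],   # scenario 2, e.g. porous pavement
    [0.70, 0.49, 0.60],   # scenario 3, e.g. combined LID layout
])
disagreement = np.array([0.30, 0.30, 0.30])   # utility if bargaining fails

# Nash bargaining solution: maximize the product of utility gains over the
# disagreement point, restricted to scenarios where every stakeholder gains.
gains = utilities - disagreement
feasible = (gains > 0).all(axis=1)
nash_product = np.where(feasible, gains.clip(min=1e-12).prod(axis=1), -np.inf)
best = int(np.argmax(nash_product))
print(f"selected scenario: {best + 1} (Nash product = {nash_product[best]:.4f})")
```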

  9. Using State Estimation Residuals to Detect Abnormal SCADA Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Jian; Chen, Yousu; Huang, Zhenyu

    2010-06-14

    Detection of manipulated supervisory control and data acquisition (SCADA) data is critically important for the safe and secure operation of modern power systems. In this paper, a methodology for detecting manipulated SCADA data based on state estimation residuals is presented. A framework of the proposed methodology is described. Instead of using the original SCADA measurements as the bad data sources, the residuals calculated from the results of the state estimator are used as the input for the outlier detection process. The BACON algorithm is applied to detect outliers in the state estimation residuals. The IEEE 118-bus system is used as a test case to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
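
    Since the abstract names the BACON algorithm and the 3-σ rule without detailing either, the following sketch contrasts them on synthetic state-estimation residuals. It is a simplified univariate rendering of BACON (the published algorithm is multivariate and uses chi-square-based thresholds), and the data and thresholds are illustrative.

```python
import numpy as np

def three_sigma_outliers(x):
    """Classic 3-sigma rule applied to a residual vector."""
    mu, sigma = x.mean(), x.std(ddof=1)
    return np.abs(x - mu) > 3.0 * sigma

def bacon_outliers(x, m=None, c=3.0, max_iter=50):
    """Simplified univariate BACON: grow a 'basic subset' of clean points."""
    n = len(x)
    m = m or max(4, n // 20)
    # initial basic subset: the m points closest to the median (robust start)
    subset = np.sort(np.argsort(np.abs(x - np.median(x)))[:m])
    for _ in range(max_iter):
        mu, sigma = x[subset].mean(), x[subset].std(ddof=1)
        new_subset = np.where(np.abs(x - mu) <= c * sigma)[0]
        if np.array_equal(new_subset, subset):
            break
        subset = new_subset
    outliers = np.ones(n, dtype=bool)
    outliers[subset] = False
    return outliers

rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 0.01, 500)   # synthetic state-estimation residuals
residuals[[10, 42, 99]] += 0.15          # injected bad SCADA data
print("3-sigma flags:", np.where(three_sigma_outliers(residuals))[0])
print("BACON flags:  ", np.where(bacon_outliers(residuals))[0])
```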

  10. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
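
    A minimal numerical sketch of the empirical (non-Bayesian) hazard-curve construction described above, under invented assumptions: a synthetic event set with a lognormal inundation-depth distribution and a Poisson occurrence model.

```python
import numpy as np

rng = np.random.default_rng(1)
annual_rate = 0.1                 # assumed mean annual rate of tsunamigenic events
n_events = 10_000                 # stochastic event set from simulation
depths = rng.lognormal(mean=0.0, sigma=1.0, size=n_events)  # simulated depths (m)

thresholds = np.linspace(0.1, 10.0, 100)
# mean annual rate of exceedance: event rate times fraction of events exceeding
exceedance_rate = annual_rate * np.array([(depths > im).mean() for im in thresholds])

# probability of at least one exceedance in a 50-year window (Poisson assumption)
prob_50yr = 1.0 - np.exp(-exceedance_rate * 50.0)
for im, lam, p in zip(thresholds[::25], exceedance_rate[::25], prob_50yr[::25]):
    print(f"depth > {im:5.2f} m : rate = {lam:.2e}/yr, P(50 yr) = {p:.3f}")
```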

  11. Reliability based impact localization in composite panels using Bayesian updating and the Kalman filter

    NASA Astrophysics Data System (ADS)

    Morse, Llewellyn; Sharif Khodaei, Zahra; Aliabadi, M. H.

    2018-01-01

    In this work, a reliability-based impact detection strategy for a sensorized composite structure is proposed. Impacts are localized using Artificial Neural Networks (ANNs), with the recorded guided waves from impacts used as inputs. To account for variability in the recorded data under operational conditions, Bayesian updating and Kalman filter techniques are applied to improve the reliability of the detection algorithm. The possibility of having one or more faulty sensors is considered, and a decision fusion algorithm based on sub-networks of sensors is proposed to improve the application of the methodology to real structures. A strategy for reliably categorizing impacts into high energy impacts, which are likely to cause damage in the structure (true impacts), and low energy non-damaging impacts (false impacts), has also been proposed to reduce the false alarm rate. The proposed strategy involves employing classification ANNs with different features extracted from the captured signals used as inputs. The proposed methodologies are validated by experimental results on a quasi-isotropic composite coupon impacted with a range of impact energies.
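
    The localization step can be sketched with a small regression network. The example below is hypothetical throughout: the sensor layout, group velocity and noise level are invented, times of arrival stand in for the paper's guided-wave features, and scikit-learn's MLPRegressor stands in for the ANN.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n_impacts, n_sensors = 400, 8
sensor_xy = rng.uniform(0, 0.5, size=(n_sensors, 2))   # sensor layout (m)
impact_xy = rng.uniform(0, 0.5, size=(n_impacts, 2))   # true impact locations (m)
wave_speed = 1500.0                                    # assumed group velocity (m/s)

# times of arrival at each sensor, plus measurement noise
toa = np.linalg.norm(impact_xy[:, None, :] - sensor_xy[None, :, :], axis=2) / wave_speed
toa += rng.normal(0, 2e-6, toa.shape)

X_tr, X_te, y_tr, y_te = train_test_split(toa, impact_xy, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
err = np.linalg.norm(ann.predict(X_te) - y_te, axis=1)
print(f"mean localization error: {err.mean() * 1000:.1f} mm")
```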

  12. A proposal on teaching methodology: cooperative learning by peer tutoring based on the case method

    NASA Astrophysics Data System (ADS)

    Pozo, Antonio M.; Durbán, Juan J.; Salas, Carlos; del Mar Lázaro, M.

    2014-07-01

    The European Higher Education Area (EHEA) proposes substantial changes in the teaching-learning model, moving from a model based mainly on the activity of teachers to a model in which the true protagonist is the student. This new framework requires that students develop new abilities and acquire specific skills. This also implies that the teacher should incorporate new methodologies in class. In this work, we present a proposal on teaching methodology based on cooperative learning and peer tutoring by case study. A noteworthy aspect of the case-study method is that it presents situations that can occur in real life. Therefore, students can acquire certain skills that will be useful in their future professional practice. An innovative aspect in the teaching methodology that we propose is to form work groups consisting of students from different levels in the same major. In our case, the teaching of four subjects would be involved: one subject of the 4th year, one subject of the 3rd year, and two subjects of the 2nd year of the Degree in Optics and Optometry of the University of Granada, Spain. Each work group would consist of a professor and a student of the 4th year, a professor and a student of the 3rd year, and two professors and two students of the 2nd year. Each work group would have a tutoring process from each professor for the corresponding student, and a 4th-year student providing peer tutoring for the students of the 2nd and 3rd year.

  13. Effects of Assuming Independent Component Failure Times, If They Are Actually Dependent, in a Series System.

    DTIC Science & Technology

    1985-11-26

    etc.). Major decisions involving reliability studies, based on competing risk methodology, have been made in the past and will continue to be made... censoring mechanism. In such instances, the methodology for estimating relevant reliability probabilities has received considerable attention (cf. David... proposal for a discussion of the general methodology).

  14. Background for Joint Systems Aspects of AIR 6000

    DTIC Science & Technology

    2000-04-01

    Checkland’s Soft Systems Methodology [7, 8, 9]. The analytical techniques that are proposed for joint systems work are based on calculating probability... SLMP Structural Life Management Plan; SOW Stand-Off Weapon; SSM Soft Systems Methodology; UAV Uninhabited Aerial... Checkland, Soft Systems Methodology in Action, John Wiley & Sons, Chichester, 1990. [10] Pearl, Judea, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible...

  15. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology.

    PubMed

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-09-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.

  17. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical facility for better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling; examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes the best use of recent advancements in computer software and hardware, and is well structured for implementation with conventional GIS tools.

  18. 78 FR 26371 - Notice of Hearing: Reconsideration of Disapproval of Kentucky State Plan Amendments (SPA) 10-007

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

    ... Community Mental Health Clinics (CMHCs). At issue in the hearing is whether the proposed cost-based Medicaid... proposed a payment methodology based on actual, incurred costs for services provided by Community Mental... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services Notice of Hearing...

  19. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    NASA Astrophysics Data System (ADS)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

    In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models is statistically combined to generate a discrepancy signal which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
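
    The core of the discrepancy-signal idea can be sketched with a single hidden Markov model trained on healthy-condition features (the paper statistically combines separate machine- and operating-condition HMMs, which is not reproduced here). The hmmlearn package and all data below are assumptions for illustration.

```python
import numpy as np
from hmmlearn import hmm   # assumed dependency; pip install hmmlearn

rng = np.random.default_rng(3)
healthy = rng.normal(0.0, 1.0, size=(2000, 3))   # healthy-condition features
faulty = rng.normal(0.8, 1.4, size=(500, 3))     # features after fault onset

# fit an HMM to the healthy data only
model = hmm.GaussianHMM(n_components=3, covariance_type="full", random_state=0)
model.fit(healthy)

def discrepancy(window):
    # negative average log-likelihood under the healthy model:
    # rises when the machine condition departs from the trained state
    return -model.score(window) / len(window)

for name, data in [("healthy", healthy[-500:]), ("faulty", faulty)]:
    print(name, f"discrepancy = {discrepancy(data):.3f}")
```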

  20. Combining users' activity survey and simulators to evaluate human activity recognition systems.

    PubMed

    Azkune, Gorka; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2015-04-08

    Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies, where experiments with humans are run with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome the enumerated problems, which is based on surveys of users and a synthetic dataset generator tool. Surveys allow capturing how different users perform activities of daily living, while the synthetic dataset generator is used to create properly labelled activity datasets modelled with the information extracted from the surveys. Important aspects, such as sensor noise, varying time lapses and erratic user behaviour, can also be simulated using the tool. The proposed methodology is shown to have very important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset by computing the similarity between sensor occurrence frequencies. It is concluded that the similarity between the two datasets is substantial.
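
    The dataset-comparison step reduces to comparing sensor occurrence frequencies. A minimal sketch, with invented sensor names and event lists, is given below using cosine similarity as the measure.

```python
import numpy as np
from collections import Counter

# invented sensor-event streams from a real and a synthetic dataset
real_events = ["kitchen_pir", "fridge", "kitchen_pir", "tap", "fridge", "toaster"]
synthetic_events = ["kitchen_pir", "fridge", "tap", "kitchen_pir",
                    "toaster", "fridge", "tap"]

sensors = sorted(set(real_events) | set(synthetic_events))

def freq_vector(events):
    """Normalized occurrence frequency of each sensor."""
    counts = Counter(events)
    v = np.array([counts[s] for s in sensors], dtype=float)
    return v / v.sum()

a, b = freq_vector(real_events), freq_vector(synthetic_events)
cosine = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"sensor occurrence frequency similarity: {cosine:.3f}")
```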

  1. High Resolution Higher Energy X-ray Microscope for Mesoscopic Materials

    NASA Astrophysics Data System (ADS)

    Snigireva, I.; Snigirev, A.

    2013-10-01

    We developed a novel X-ray microscopy technique to study mesoscopically structured materials, employing compound refractive lenses. A clear advantage of the lens-based methodology is the possibility of retrieving a high-resolution diffraction pattern and real-space images in the same experimental setup. Methodologically, the proposed approach is similar to the study of crystals by high-resolution transmission electron microscopy. The proposed microscope was applied to the study of mesoscopic materials such as natural and synthetic opals and inverted photonic crystals.

  2. Automatic recognition of postural allocations.

    PubMed

    Sazonov, Edward; Krishnamurthy, Vidya; Makeyev, Oleksandr; Browning, Ray; Schutz, Yves; Hill, James

    2007-01-01

    A significant part of daily energy expenditure may be attributed to non-exercise activity thermogenesis and exercise activity thermogenesis. Automatic recognition of postural allocations such as standing or sitting can be used in behavioral modification programs aimed at minimizing static postures. In this paper we propose a shoe-based device and related pattern recognition methodology for recognition of postural allocations. Inexpensive technology allows implementation of this methodology as a part of footwear. The experimental results suggest high efficiency and reliability of the proposed approach.

  3. Methodologies for validating ray-based forward model using finite element method in ultrasonic array data simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul

    2018-04-01

    In this paper, a methodology for using a finite element (FE) model to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate the signal contributions from different interactions in the FE results so that each individual component of the ray-based model results can be compared more easily. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed techniques allow the interactions from a large number of different ray paths to be isolated in the FE results and compared directly to the results from a ray-based forward model.
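
    The isolation idea, subtracting the response of a baseline FE model from that of a model with one additional geometric feature on the identical mesh, can be illustrated with synthetic time traces; the tonebursts below merely stand in for FE output, and all arrival times and amplitudes are invented.

```python
import numpy as np

t = np.linspace(0.0, 20e-6, 2000)   # time axis (s)

def toneburst(t, t0, f0=5e6, n_cyc=5, amp=1.0):
    """Hann-windowed toneburst arriving at t0 (stand-in for an FE time trace)."""
    tau = t - t0
    env = np.where((tau >= 0) & (tau <= n_cyc / f0),
                   0.5 * (1 - np.cos(2 * np.pi * f0 * tau / n_cyc)), 0.0)
    return amp * env * np.sin(2 * np.pi * f0 * tau)

baseline = toneburst(t, 8e-6)                                   # backwall echo only
with_defect = toneburst(t, 8e-6) + toneburst(t, 5e-6, amp=0.2)  # adds a defect echo

# same mesh, same excitation: the difference isolates the defect contribution,
# which can then be compared against the corresponding ray path
defect_only = with_defect - baseline
print("isolated defect echo peaks near t =",
      t[np.argmax(np.abs(defect_only))], "s")
```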

  4. Post-decomposition optimizations using pattern matching and rule-based clustering for multi-patterning technology

    NASA Astrophysics Data System (ADS)

    Wang, Lynn T.-N.; Madhavan, Sriram

    2018-03-01

    A pattern matching and rule-based polygon clustering methodology with DFM scoring is proposed to detect decomposition-induced manufacturability detractors and fix the layout designs prior to manufacturing. A pattern matcher scans the layout for pre-characterized patterns from a library. If a pattern is detected, rule-based clustering identifies the neighboring polygons that interact with those captured by the pattern. Then, DFM scores are computed for the possible layout fixes, and the fix with the best score is applied. The proposed methodology was applied to two 20nm products with a chip area of 11 mm² on the metal 2 layer. All the hotspots were resolved, and the number of DFM spacing violations decreased by 7-15%.
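
    The fix-selection step reduces to scoring candidate layout fixes and applying the best one. The toy sketch below is purely illustrative: the fix descriptions, the score scale and the `Fix`/`resolve_hotspot` names are invented, not the tool's actual API.

```python
from dataclasses import dataclass

@dataclass
class Fix:
    description: str
    dfm_score: float   # higher is better (hypothetical scale)

def resolve_hotspot(candidate_fixes):
    """Apply the candidate fix with the best DFM score."""
    best = max(candidate_fixes, key=lambda f: f.dfm_score)
    print(f"applying: {best.description} (score {best.dfm_score:.2f})")
    return best

resolve_hotspot([
    Fix("widen spacing between clustered polygons", 0.82),
    Fix("reassign polygon to alternate mask color", 0.91),
    Fix("reroute segment on metal 3", 0.64),
])
```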

  5. VALIDATION OF AMBIENT WATER QUALITY CRITERIA (AWQC) BIOACCUMULATION METHODOLOGY USING FIELD DATA FROM GREEN BAY AND THE HUDSON RIVER

    EPA Science Inventory

    In 1998, EPA published its draft revision to the methodology for deriving ambient water quality criteria to protect human health. Four methods were proposed to determine lipid-normalized bioaccumulation factors based on freely-dissolved water concentrations (BAFs) for nonpolar or...

  6. A Proposed Performance-Based System for Teacher Interactive Electronic Continuous Professional Development (TIE-CPD)

    ERIC Educational Resources Information Center

    Razak, Rafiza Abdul; Yusop, Farrah Dina; Idris, Aizal Yusrina; Al-Sinaiyah, Yanbu; Halili, Siti Hajar

    2016-01-01

    The paper introduces Teacher Interactive Electronic Continuous Professional Development (TIE-CPD), an online interactive training system. The framework and methodology of TIE-CPD are designed with functionalities comparable with existing e-training systems. The system design and development literature offers several methodology and framework…

  7. Bearing damage assessment using Jensen-Rényi Divergence based on EEMD

    NASA Astrophysics Data System (ADS)

    Singh, Jaskaran; Darpe, A. K.; Singh, S. P.

    2017-03-01

    An Ensemble Empirical Mode Decomposition (EEMD) and Jensen-Rényi divergence (JRD) based methodology is proposed for the degradation assessment of rolling element bearings using vibration data. The EEMD decomposes vibration signals into a set of intrinsic mode functions (IMFs). A systematic methodology for selecting IMFs that are sensitive and closely related to the fault is proposed in the paper. The change in the probability distribution of the energies of the sensitive IMFs is measured through the JRD, which acts as a damage identification parameter. Evaluating the JRD on the sensitive IMFs makes it largely unaffected by changes and fluctuations in operating conditions. Further, an algorithm based on Chebyshev's inequality is applied to the JRD to identify exact points of change in bearing health and to remove outliers. The identified change points are investigated for fault classification as possible locations where specific defect initiation could have taken place. For fault classification, two new parameters are proposed: the 'α value' and the Probable Fault Index, which together classify the fault. To standardize the degradation process, a Confidence Value parameter is proposed to quantify the bearing degradation value in a range of zero to unity. A simulation study is first carried out to demonstrate the robustness of the proposed JRD parameter under variable operating conditions of load and speed. The proposed methodology is then validated on experimental data (seeded defect data and accelerated bearing life test data). The first validation, on two different vibration datasets (inner/outer) obtained from seeded defect experiments, demonstrates the effectiveness of the JRD parameter in detecting a change in health state as the severity of the fault changes. The second validation is on two accelerated life tests. The results demonstrate the proposed approach as a potential tool for bearing performance degradation assessment.
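
    The JRD computation itself is compact. The sketch below evaluates the Jensen-Rényi divergence, built from Rényi entropies of order α, between the IMF energy distributions of a reference state and a current state; the distributions are invented, and the EEMD step that would produce them is assumed to have been run already.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (bits) of a discrete distribution, alpha != 1."""
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(dists, weights, alpha=2.0):
    """JRD: entropy of the weighted mixture minus weighted entropies of the parts."""
    mixture = weights @ dists
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, dists))

# invented energy distributions of the sensitive IMFs: a healthy reference
# state versus a current state with energy shifted toward other IMFs
healthy = np.array([0.50, 0.30, 0.15, 0.05])
current = np.array([0.20, 0.25, 0.35, 0.20])
weights = np.array([0.5, 0.5])
jrd = jensen_renyi_divergence(np.vstack([healthy, current]), weights)
print(f"JRD(healthy, current) = {jrd:.4f}")   # larger = greater degradation
```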

  8. New methodology to reconstruct in 2-D the cuspal enamel of modern human lower molars.

    PubMed

    Modesto-Mata, Mario; García-Campos, Cecilia; Martín-Francés, Laura; Martínez de Pinillos, Marina; García-González, Rebeca; Quintino, Yuliet; Canals, Antoni; Lozano, Marina; Dean, M Christopher; Martinón-Torres, María; Bermúdez de Castro, José María

    2017-08-01

    In recent years, different methodologies have been developed to reconstruct worn teeth. In this article, we propose a new 2-D methodology to reconstruct the worn enamel of lower molars. Our main goals are to reconstruct molars with a high level of accuracy when measuring relevant histological variables and to validate the methodology by calculating the errors associated with the measurements. This methodology is based on polynomial regression equations and has been validated using two different dental variables: the cuspal enamel thickness and the crown height of the protoconid. In order to perform the validation process, simulated worn modern human molars were employed. The errors associated with the measurements were also estimated by applying methodologies previously proposed by other authors. The mean percentage error estimated in reconstructed molars for these two variables, in comparison with their real values, is -2.17% for the cuspal enamel thickness of the protoconid and -3.18% for the crown height of the protoconid. This significantly improves on the results of other methodologies, both in the interobserver error and in the accuracy of the measurements. The new methodology based on polynomial regressions can be confidently applied to the reconstruction of the cuspal enamel of lower molars, as it improves the accuracy of the measurements and reduces the interobserver error. The present study shows that it is important to validate all methodologies in order to know their associated errors. This new methodology can be easily exported to other modern human populations, the human fossil record and the forensic sciences. © 2017 Wiley Periodicals, Inc.
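
    A minimal sketch of the underlying idea, fitting a polynomial to the preserved enamel outline and extrapolating it across the worn region, is shown below. The coordinates, polynomial degree and wear span are invented and do not reproduce the paper's validated regression equations.

```python
import numpy as np

# preserved (unworn) points of a 2-D protoconid outline, in mm (invented)
x_preserved = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 5.0, 5.5, 6.0, 6.5, 7.0])
y_preserved = np.array([1.0, 2.1, 3.0, 3.7, 4.2, 4.1, 3.6, 2.9, 2.0, 0.9])

# fit a polynomial to the preserved outline and evaluate it over the worn span
coeffs = np.polyfit(x_preserved, y_preserved, deg=4)
x_worn = np.linspace(2.0, 5.0, 50)
y_reconstructed = np.polyval(coeffs, x_worn)

apex = x_worn[np.argmax(y_reconstructed)]
print(f"reconstructed crown height: {y_reconstructed.max():.2f} mm "
      f"at x = {apex:.2f} mm")
```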

  9. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for the probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
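
    Steps (a)-(c) can be sketched in a few lines: lognormal response-based fragility curves for component capacity, correlated demands standing in for response-history results, and Monte Carlo sampling of damage states. All parameter values below are invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_sims = 100_000

# demands on two components, correlated because they share the same motion
mean_ln = np.log([0.8, 1.1])                     # median demands
cov = np.array([[0.30**2, 0.7 * 0.30 * 0.35],
                [0.7 * 0.30 * 0.35, 0.35**2]])   # log-space covariance (rho = 0.7)
demands = np.exp(rng.multivariate_normal(mean_ln, cov, size=n_sims))

# response-based fragility: P(damage | demand d) = Phi(ln(d / median) / beta)
medians, betas = np.array([1.5, 1.8]), np.array([0.4, 0.4])
p_damage = norm.cdf(np.log(demands / medians) / betas)
damaged = rng.random((n_sims, 2)) < p_damage     # Monte Carlo damage states

print("P(component damaged):", damaged.mean(axis=0))
print("P(both damaged)     :", damaged.all(axis=1).mean())
```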

  10. Characterizing Postural Sway during Quiet Stance Based on the Intermittent Control Hypothesis

    NASA Astrophysics Data System (ADS)

    Nomura, Taishin; Nakamura, Toru; Fukada, Kei; Sakoda, Saburo

    2007-07-01

    This article illustrates a signal processing methodology for time series of postural sway and the accompanying electromyographs from the lower limb muscles during quiet stance. It is shown that the proposed methodology is capable of identifying the underlying postural control mechanisms. A preliminary application of the methodology provided evidence that supports the intermittent control hypothesis as an alternative to the conventional stiffness control hypothesis during human quiet upright stance.

  11. Towards more sustainable management of European food waste: Methodological approach and numerical application.

    PubMed

    Manfredi, Simone; Cristobal, Jorge

    2016-09-01

    Trying to respond to the latest policy needs, the work presented in this article aims at developing a life-cycle based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining boundaries and scope of the evaluation, evaluating environmental and economic impacts and identifying best performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted based on different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for a straightforward identification of the most sustainable options for food waste, thus can provide factual support to decision/policy making. However, it was also observed that results markedly depend on a number of user-defined assumptions, for example on the choice of the indicators to express the environmental and economic performance. © The Author(s) 2016.

  12. Self-Calibration and Optimal Response in Intelligent Sensors Design Based on Artificial Neural Networks

    PubMed Central

    Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto

    2007-01-01

    The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems should ideally spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves the analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. The comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. This paper also shows that the proposed method turned out to have better overall accuracy than the other two methods. Besides the experimental results and the analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over the classic autocalibration methodologies, because it impacts the design process of intelligent sensors, autocalibration methodologies and their associated factors, like time and cost.
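
    A minimal sketch of the autocalibration idea, assuming a distorted sensor characteristic with offset, gain error and a quadratic nonlinearity: a small network is trained to invert the characteristic from a handful of calibration points. Scikit-learn's MLPRegressor stands in here for the paper's ANN and its MCU implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# calibration points: known true temperatures and the distorted sensor output
true_temp = np.linspace(0.0, 100.0, 15)                    # degC
raw = 0.8 * true_temp + 5.0 + 0.002 * true_temp**2         # offset, gain, nonlinearity
raw += np.random.default_rng(6).normal(0, 0.2, raw.shape)  # measurement noise

# learn the inverse mapping raw reading -> calibrated temperature
net = MLPRegressor(hidden_layer_sizes=(16,), activation="tanh",
                   max_iter=20_000, random_state=0)
net.fit(raw.reshape(-1, 1), true_temp)

reading = np.array([[52.3]])   # new raw reading
print(f"calibrated temperature: {net.predict(reading)[0]:.2f} degC")
```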

  13. Methodology for estimating human perception to tremors in high-rise buildings

    NASA Astrophysics Data System (ADS)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception to tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception to tremors for occupants living in high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. The recorded ground motions in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level to tremors based on a proposed ground motion intensity parameter—the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive the tremors at a given ground motion intensity. Furthermore, the models are validated with two recent tremor events reportedly felt in Singapore. It is found that the estimated results match reasonably well with the reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
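
    The proposed intensity parameter is straightforward to compute once a response spectrum is available. The sketch below averages a toy spectrum over the 0.1-2.0 s period band and feeds it to a hypothetical logistic perception model; the spectrum shape and the coefficients a and b are invented, not the paper's fitted values.

```python
import numpy as np

# toy response spectrum: spectral value versus period (shape is invented)
periods = np.linspace(0.02, 5.0, 250)                        # s
sa = 2.0 * np.exp(-0.5 * (np.log(periods / 0.6) / 0.9)**2)   # arbitrary units

# average response spectrum intensity in the 0.1-2.0 s band
band = (periods >= 0.1) & (periods <= 2.0)
asi = np.trapz(sa[band], periods[band]) / (2.0 - 0.1)

# hypothetical logistic model: percentage of occupants perceiving the tremor
a, b = 3.0, -2.5   # illustrative coefficients only
perceived_pct = 100.0 / (1.0 + np.exp(-(a * np.log(asi) + b)))
print(f"ASI = {asi:.3f}, predicted occupants perceiving tremor: {perceived_pct:.1f}%")
```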

  14. Equivalent orthotropic elastic moduli identification method for laminated electrical steel sheets

    NASA Astrophysics Data System (ADS)

    Saito, Akira; Nishikawa, Yasunari; Yamasaki, Shintaro; Fujita, Kikuo; Kawamoto, Atsushi; Kuroishi, Masakatsu; Nakai, Hideo

    2016-05-01

    In this paper, a combined numerical-experimental methodology for the identification of elastic moduli of orthotropic media is presented. Special attention is given to the laminated electrical steel sheets, which are modeled as orthotropic media with nine independent engineering elastic moduli. The elastic moduli are determined specifically for use with finite element vibration analyses. We propose a three-step methodology based on a conventional nonlinear least squares fit between measured and computed natural frequencies. The methodology consists of: (1) successive augmentations of the objective function by increasing the number of modes, (2) initial condition updates, and (3) appropriate selection of the natural frequencies based on their sensitivities on the elastic moduli. Using the results of numerical experiments, it is shown that the proposed method achieves more accurate converged solution than a conventional approach. Finally, the proposed method is applied to measured natural frequencies and mode shapes of the laminated electrical steel sheets. It is shown that the method can successfully identify the orthotropic elastic moduli that can reproduce the measured natural frequencies and frequency response functions by using finite element analyses with a reasonable accuracy.
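
    The identification core is a nonlinear least-squares fit between measured and computed natural frequencies. In the sketch below a cheap algebraic surrogate stands in for the finite element modal analysis, and the measured frequencies, scaling factors and initial guesses are all invented.

```python
import numpy as np
from scipy.optimize import least_squares

measured = np.array([212.0, 455.0, 630.0, 840.0, 1010.0])   # Hz (illustrative)

def predicted_frequencies(moduli):
    # stand-in for an FE modal analysis: frequencies scale with sqrt(stiffness)
    e1, e2, g12 = moduli
    return np.sqrt(np.array([0.20 * e1, 0.41 * e2, 0.36 * e1 + 0.05 * g12,
                             0.62 * e2 + 0.08 * g12, 0.90 * e1]))

def residuals(moduli):
    return predicted_frequencies(moduli) - measured

x0 = np.array([180e3, 480e3, 60e3])   # initial guesses for the moduli
sol = least_squares(residuals, x0, bounds=(1e3, 1e7))
print("identified moduli:", sol.x)
print("residual cost    :", sol.cost)
```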

  15. Measuring Value Added in Higher Education: A Proposed Methodology for Developing a Performance Indicator Based on the Economic Value Added to Graduates

    ERIC Educational Resources Information Center

    Rodgers, Timothy

    2007-01-01

    The 2003 UK higher education White Paper suggested that the sector needed to re-examine the potential of the value added concept. This paper describes a possible methodology for developing a performance indicator based on the economic value added to graduates. The paper examines how an entry-quality-adjusted measure of a graduate's…

  16. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    NASA Astrophysics Data System (ADS)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents key issues from the preliminary stage of a proposed extended equivalence assessment for new portable devices: the comparability of hourly PM10 concentration series with reference station measurements using statistical methods. The technical aspects of the new portable meters are presented. Emphasis is placed on assessing the comparability of the results using a methodology based on stochastic and exploratory methods. The concept rests on the observation that a simple comparison of the result series in the time domain is insufficient; the comparison of regularity should be carried out in three complementary fields of statistical modelling: time, frequency and space. The proposal is based on modelling results for five annual series of measurements from the new mobile devices and from a WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The results obtained indicate both the completeness of the comparison methodology and the high correspondence between the measurements from the new devices and the reference station.

  17. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

    An object-oriented methodology is proposed to harmonize several different markup languages in this research. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and the process of the harmonization process between the eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.

  18. A Model-Free Diagnostic for Single-Peakedness of Item Responses Using Ordered Conditional Means

    ERIC Educational Resources Information Center

    Polak, Marike; De Rooij, Mark; Heiser, Willem J.

    2012-01-01

    In this article we propose a model-free diagnostic for single-peakedness (unimodality) of item responses. Presuming a unidimensional unfolding scale and a given item ordering, we approximate item response functions of all items based on ordered conditional means (OCM). The proposed OCM methodology is based on Thurstone & Chave's (1929) "criterion…

  19. Stochastic HKMDHE: A multi-objective contrast enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2018-02-01

    This contribution proposes a novel extension of the existing 'Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm for the multi-objective contrast enhancement of biomedical images. A novel modified objective function has been formulated by jointly optimizing the individual histogram equalization objectives. The adequacy of the proposed methodology with respect to image quality metrics such as brightness preservation, peak signal-to-noise ratio (PSNR), the Structural Similarity Index (SSIM) and the universal image quality metric has been experimentally validated. A performance analysis of the proposed Stochastic HKMDHE against existing histogram equalization methodologies, namely Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE), is given for comparative evaluation.
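
    The comparative-evaluation step can be reproduced with standard tools. The sketch below enhances a stock grayscale image with GHE and CLAHE and scores both with PSNR and SSIM using scikit-image; the Stochastic HKMDHE algorithm itself is not reproduced, and the clip limit is an arbitrary choice.

```python
import numpy as np
from skimage import data, exposure
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

img = data.camera() / 255.0   # grayscale test image scaled to [0, 1]

ghe = exposure.equalize_hist(img)                            # global HE
clahe = exposure.equalize_adapthist(img, clip_limit=0.02)    # CLAHE

for name, out in [("GHE", ghe), ("CLAHE", clahe)]:
    psnr = peak_signal_noise_ratio(img, out, data_range=1.0)
    ssim = structural_similarity(img, out, data_range=1.0)
    print(f"{name:6s} PSNR = {psnr:5.2f} dB, SSIM = {ssim:.3f}")
```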

  20. Leveraging the Zachman framework implementation using action - research methodology - a case study: aligning the enterprise architecture and the business goals

    NASA Astrophysics Data System (ADS)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following the explanation of cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise: PyME CREATIVA, using action-research approach.

  1. Model identification methodology for fluid-based inerters

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially the behaviour caused by the hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established, which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed, which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced, in which two pressure gauges are used to accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of the remaining discrepancies are further analysed.

  2. Software Risk Identification for Interplanetary Probes

    NASA Technical Reports Server (NTRS)

    Dougherty, Robert J.; Papadopoulos, Periklis E.

    2005-01-01

    The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that use increasingly complex and critical software. Several probe failures are examined which suggest that more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have led to the development of a practical methodology to identify risks for interplanetary probes. The proposed methodology is based on tailoring the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. The use of this methodology will ensure a more consistent and complete identification of software risks in these probes.

  3. A new hyperspectral image compression paradigm based on fusion

    NASA Astrophysics Data System (ADS)

    Guerra, Raúl; Melián, José; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    The on-board compression of remotely sensed hyperspectral images is an important task nowadays. One of the main difficulties is that the compression of these images must be performed in the satellite which carries the hyperspectral sensor. Hence, this process must be performed by space-qualified hardware, with area, power and speed limitations. Moreover, it is important to achieve high compression ratios without compromising the quality of the decompressed image. In this manuscript we propose a new methodology for compressing hyperspectral images based on hyperspectral image fusion concepts. The proposed compression process has two independent steps. The first is to spatially degrade the remotely sensed hyperspectral image to obtain a low-resolution hyperspectral image. The second is to spectrally degrade the remotely sensed hyperspectral image to obtain a high-resolution multispectral image. These two degraded images are then sent to the Earth's surface, where they must be fused using a fusion algorithm for hyperspectral and multispectral images in order to recover the remotely sensed hyperspectral image. The main advantage of the proposed methodology is that the compression process, which must be performed on board, becomes very simple, with the fusion process used to reconstruct the image being the more complex one. An extra advantage is that the compression ratio can be fixed in advance. Many simulations have been performed using different fusion algorithms and different methodologies for degrading the hyperspectral image. The results obtained in the simulations corroborate the benefits of the proposed methodology.
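
    The two on-board degradation steps are simple averaging operations, which is precisely why the on-board stage is cheap. A minimal sketch with an invented cube size and degradation factors follows; the ground-side fusion stage is not shown.

```python
import numpy as np

rng = np.random.default_rng(7)
cube = rng.random((128, 128, 200))   # rows x cols x spectral bands (invented size)

def spatial_degrade(c, factor=4):
    """Average over factor x factor spatial blocks: low-res hyperspectral image."""
    r, w, b = c.shape
    return c.reshape(r // factor, factor, w // factor, factor, b).mean(axis=(1, 3))

def spectral_degrade(c, n_bands=10):
    """Average groups of adjacent bands: high-res multispectral image."""
    r, w, b = c.shape
    return c.reshape(r, w, n_bands, b // n_bands).mean(axis=3)

low_res_hsi = spatial_degrade(cube)    # 32 x 32 x 200
high_res_msi = spectral_degrade(cube)  # 128 x 128 x 10
ratio = cube.size / (low_res_hsi.size + high_res_msi.size)
print(f"fixed compression ratio: {ratio:.1f}:1")
```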

  4. New methodology for fast prediction of wheel wear evolution

    NASA Astrophysics Data System (ADS)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics with time. However, one of the principal drawbacks of the existing methodologies for calculating the wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is the substitution of the calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is the substitution of the dynamic calculation (time integration) by a quasi-static calculation (the solution of the quasi-static situation of the vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction of the computational cost to be obtained while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow the conclusion that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.

  5. Methodology based on genetic heuristics for in-vivo characterizing the patient-specific biomechanical behavior of the breast tissues

    PubMed Central

    Lago, M. A.; Rúperez, M. J.; Martínez-Martínez, F.; Martínez-Sanchis, S.; Bakic, P. R.; Monserrat, C.

    2015-01-01

    This paper presents a novel methodology for the in-vivo estimation of the elastic constants of a constitutive model proposed to characterize the mechanical behavior of the breast tissues. An iterative search algorithm based on genetic heuristics was constructed to estimate these parameters in vivo using only medical images, thus avoiding invasive measurements of the mechanical response of the breast tissues. For the first time, a combination of overlap and distance coefficients was used for the evaluation of the similarity between a deformed MRI of the breast and a simulation of that deformation. The methodology was validated using breast software phantoms for virtual clinical trials, compressed to mimic MRI-guided biopsies. The biomechanical model chosen to characterize the breast tissues was an anisotropic neo-Hookean hyperelastic model. Results from this analysis showed that the algorithm is able to find the elastic constants of the constitutive equations of the proposed model with a mean relative error of about 10%. Furthermore, the overlap between the reference deformation and the simulated deformation was around 95%, showing the good performance of the proposed methodology. This methodology can be easily extended to characterize the real biomechanical behavior of the breast tissues, which represents a significant novelty in the field of breast behavior simulation for applications such as surgical planning, surgical guidance or cancer diagnosis. This reveals the impact and relevance of the presented work. PMID:27103760
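
    The genetic search itself can be sketched compactly. In the toy example below the objective function is a placeholder standing in for the FE simulation plus the overlap/distance image comparison, and the parameter ranges, population size and mutation scale are all invented.

```python
import numpy as np

rng = np.random.default_rng(8)
true_params = np.array([4.5, 1.2])   # hypothetical elastic constants (kPa)

def fitness(p):
    # stand-in objective: in the real methodology this would run the FE
    # simulation and score overlap/distance against the deformed MRI
    return -np.sum((p - true_params) ** 2)

pop = rng.uniform(0.1, 10.0, size=(40, 2))        # initial population
for gen in range(60):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]       # truncation selection
    # offspring: resampled parents plus Gaussian mutation
    children = parents[rng.integers(0, 20, 40)] + rng.normal(0, 0.2, (40, 2))
    children[:5] = parents[-5:]                   # elitism keeps the best five
    pop = np.clip(children, 0.1, 10.0)

best = pop[np.argmax([fitness(p) for p in pop])]
print("identified elastic constants:", best.round(3))
```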

  7. Performance evaluation methodology for historical document image binarization.

    PubMed

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
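
    The unweighted core of pixel-based recall and precision is a few lines of array logic. The paper's weighting scheme, which diminishes evaluation bias, is omitted from this sketch, and the toy images are invented.

```python
import numpy as np

def pixel_recall_precision(ground_truth, result):
    """Pixel-based recall and precision (True = text pixel)."""
    tp = np.logical_and(ground_truth, result).sum()
    recall = tp / ground_truth.sum()
    precision = tp / result.sum()
    return recall, precision

gt = np.zeros((10, 10), dtype=bool)
gt[3:7, 2:9] = True                    # toy ground-truth stroke
out = np.zeros_like(gt)
out[3:7, 2:7] = True                   # binarization misses part of the stroke
out[0, 0] = True                       # one false alarm

r, p = pixel_recall_precision(gt, out)
f1 = 2 * r * p / (r + p)
print(f"recall = {r:.3f}, precision = {p:.3f}, F-measure = {f1:.3f}")
```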

  8. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on an analysis of typical project management concepts and processes and of existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.

  9. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains when he received the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and previous bin size and logical approaches. These two new criteria are implemented in a prototype system Cellsecu 2.0. Preliminary system performance evaluation is conducted and reviewed.

  10. Methodological issues in the design of a rheumatoid arthritis activity score and its cut-offs.

    PubMed

    Collignon, Olivier

    2014-01-01

    The activity of rheumatoid arthritis (RA) can be evaluated using several scoring scales based on clinical features. The most widely used is the Disease Activity Score involving 28 joint counts (DAS28), for which cut-offs were proposed to help physicians classify patients. However, inaccurate scoring can lead to inappropriate medical decisions. In this article, some methodological issues in the design of such a score and its cut-offs are highlighted in order to propose a strategy to overcome them. As long as the issues reviewed in this article are not addressed, the results of studies based on standard disease activity scores such as DAS28 should be considered with caution.
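
    For reference, the standard published DAS28(ESR) formula and its commonly used cut-offs, which are the objects the article scrutinizes rather than values derived in it, can be computed as follows:

```python
import math

def das28_esr(tjc28, sjc28, esr, gh):
    """Standard DAS28(ESR): tender/swollen joint counts out of 28, erythrocyte
    sedimentation rate (mm/h) and global health on a 0-100 visual analogue scale.
    DAS28 = 0.56*sqrt(TJC28) + 0.28*sqrt(SJC28) + 0.70*ln(ESR) + 0.014*GH
    """
    return (0.56 * math.sqrt(tjc28) + 0.28 * math.sqrt(sjc28)
            + 0.70 * math.log(esr) + 0.014 * gh)

score = das28_esr(tjc28=6, sjc28=4, esr=28, gh=50)
# commonly used cut-offs: <2.6 remission, <=3.2 low, <=5.1 moderate, >5.1 high
activity = ("remission" if score < 2.6 else "low" if score <= 3.2
            else "moderate" if score <= 5.1 else "high")
print(f"DAS28 = {score:.2f} -> {activity} disease activity")
```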

  11. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, the verification of the system's design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSNs' requirements, structure and behaviour. It then translates the SysML elements to an analytic model, specifically a Deterministic and Stochastic Petri Net. The proposed approach allows WSNs to be designed and their behaviour and energy performance to be studied.

  12. The influence of biodegradability of sewer solids for the management of CSOs.

    PubMed

    Sakrabani, R; Ashley, R M; Vollertsen, J

    2005-01-01

    The re-suspension of sediments in combined sewers, and of the associated pollutants, into the bulk water during wet weather flows can cause pollutants to be carried further downstream to receiving waters or discharged via Combined Sewer Overflows (CSOs). A typical pollutograph shows the trend of released bulk pollutants with time but does not provide information on the biodegradability of these pollutants. A new prediction methodology based on Oxygen Utilisation Rate (a respirometric method) and Erosionmeter (a laboratory device replicating in-sewer erosion) experiments is proposed, which is able to predict the trends in biodegradability during in-sewer sediment erosion in wet weather conditions. The proposed new prediction methodology is also based on COD fractionation techniques.

  13. Integral Methodological Pluralism in Science Education Research: Valuing Multiple Perspectives

    ERIC Educational Resources Information Center

    Davis, Nancy T.; Callihan, Laurie P.

    2013-01-01

    This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in a theory of everything. Shambhala, Boston, 2000) values all forms of research as…

  14. Classification of physical activities based on body-segments coordination.

    PubMed

    Fradet, Laetitia; Marin, Frederic

    2016-09-01

    Numerous innovations based on connected objects and physical activity (PA) monitoring have been proposed. However, recognition of PAs requires a robust algorithm and methodology. The current study presents an innovative approach for PA recognition, based on the heuristic definition of postures and on body-segment coordination obtained through external sensors. The first part of this study presents the methodology used to define the set of accelerations most appropriate for representing the particular body-segment coordination involved in the chosen PAs (here walking, running, and cycling). For that purpose, subjects of different ages and heterogeneous physical conditions walked, ran, cycled, and performed daily activities at different paces. From the 3D motion capture, vertical and horizontal accelerations of 8 anatomical landmarks representative of the body were computed. Then, the 680 combinations of up to 3 accelerations were compared to identify the set of accelerations that best discriminates the PAs in terms of body-segment coordination. The discrimination was based on the maximal Hausdorff distance obtained between the different sets of accelerations. The vertical accelerations of both knees demonstrated the best PA discrimination. The second step was the proof of concept, implementing the proposed algorithm to classify the PAs of a new group of subjects. The originality of the proposed algorithm is the possibility to use the subject's own measures as reference data. With the proposed algorithm, 94% of the trials were correctly classified. In conclusion, our study proposed a flexible and extendable methodology. At the current stage, the algorithm has been shown to be valid for heterogeneous subjects, which suggests that it could be deployed in clinical or health-related applications regardless of the subjects' physical abilities or characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.
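
    As a concrete illustration of the classification principle described above, the sketch below assigns a trial to the activity whose reference knee-acceleration trace is nearest in the symmetric Hausdorff distance. The data layout and reference construction are illustrative assumptions, not the authors' implementation.

    ```python
    # Minimal sketch of Hausdorff-distance activity classification, assuming
    # `references` maps activity labels to subject-specific reference traces
    # of the two vertical knee accelerations (hypothetical data layout).
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def hausdorff(u, v):
        """Symmetric Hausdorff distance between two 2-D point sets."""
        return max(directed_hausdorff(u, v)[0], directed_hausdorff(v, u)[0])

    def classify(trial, references):
        # trial and references[label]: arrays of shape (n_samples, 2) holding
        # the vertical accelerations of the left and right knee.
        return min(references, key=lambda lbl: hausdorff(trial, references[lbl]))

    rng = np.random.default_rng(0)
    refs = {"walk": rng.normal(0.0, 1.0, (200, 2)),
            "run":  rng.normal(0.0, 3.0, (200, 2))}
    print(classify(rng.normal(0.0, 2.9, (150, 2)), refs))
    ```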

  15. A Bootstrap Based Measure Robust to the Choice of Normalization Methods for Detecting Rhythmic Features in High Dimensional Data.

    PubMed

    Larriba, Yolanda; Rueda, Cristina; Fernández, Miguel A; Peddada, Shyamal D

    2018-01-01

    Motivation: Gene-expression data obtained from high-throughput technologies are subject to various sources of noise, and accordingly the raw data are pre-processed before being formally analyzed. Normalization of the data is a key pre-processing step, since it removes systematic variations across arrays. There are numerous normalization methods available in the literature. Based on our experience, in the context of oscillatory systems, such as the cell cycle, circadian clock, etc., the choice of the normalization method may substantially impact the determination of a gene to be rhythmic. Thus rhythmicity of a gene can purely be an artifact of how the data were normalized. Since the determination of rhythmic genes is an important component of modern toxicological and pharmacological studies, it is important to determine truly rhythmic genes that are robust to the choice of normalization method. Results: In this paper we introduce a rhythmicity measure and a bootstrap methodology to detect rhythmic genes in an oscillatory system. Although the proposed methodology can be used for any high-throughput gene expression data, in this paper we illustrate it using several publicly available circadian clock microarray gene-expression datasets. We demonstrate that the choice of normalization method has very little effect on the proposed methodology: for any pair of normalization methods considered in this paper, the resulting values of the rhythmicity measure are highly correlated. This suggests that the proposed measure is robust to the choice of normalization method, and consequently that the rhythmicity of a gene is not a mere artifact of the normalization method used. Lastly, as demonstrated in the paper, the proposed bootstrap methodology can also be used for simulating data for genes participating in an oscillatory system using a reference dataset. Availability: A user-friendly code implemented in the R language can be downloaded from http://www.eio.uva.es/~miguel/robustdetectionprocedure.html.
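
    The authors' rhythmicity measure is not reproduced here; as a hedged stand-in, the sketch below illustrates the general bootstrap idea on a single expression profile: fit a 24 h cosinor by least squares, then compare its amplitude to a resampling-based null obtained by permuting the residuals.

    ```python
    # Illustrative permutation/bootstrap rhythmicity test (a stand-in, not
    # the paper's exact measure).
    import numpy as np

    def cosinor_amplitude(t, y, period=24.0):
        X = np.column_stack([np.ones_like(t), np.cos(2*np.pi*t/period),
                             np.sin(2*np.pi*t/period)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.hypot(beta[1], beta[2]), y - X @ beta

    def bootstrap_pvalue(t, y, n_boot=2000, seed=0):
        rng = np.random.default_rng(seed)
        amp, resid = cosinor_amplitude(t, y)
        null = [cosinor_amplitude(t, rng.permutation(resid))[0]
                for _ in range(n_boot)]
        return float(np.mean(np.asarray(null) >= amp))

    t = np.arange(0, 48, 4.0)   # sampling times in hours
    y = 2*np.cos(2*np.pi*t/24) + np.random.default_rng(1).normal(0, 1, t.size)
    print(bootstrap_pvalue(t, y))   # a small p-value suggests rhythmicity
    ```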

  16. A Bootstrap Based Measure Robust to the Choice of Normalization Methods for Detecting Rhythmic Features in High Dimensional Data

    PubMed Central

    Larriba, Yolanda; Rueda, Cristina; Fernández, Miguel A.; Peddada, Shyamal D.

    2018-01-01

    Motivation: Gene-expression data obtained from high-throughput technologies are subject to various sources of noise, and accordingly the raw data are pre-processed before being formally analyzed. Normalization of the data is a key pre-processing step, since it removes systematic variations across arrays. There are numerous normalization methods available in the literature. Based on our experience, in the context of oscillatory systems, such as the cell cycle, circadian clock, etc., the choice of the normalization method may substantially impact the determination of a gene to be rhythmic. Thus rhythmicity of a gene can purely be an artifact of how the data were normalized. Since the determination of rhythmic genes is an important component of modern toxicological and pharmacological studies, it is important to determine truly rhythmic genes that are robust to the choice of normalization method. Results: In this paper we introduce a rhythmicity measure and a bootstrap methodology to detect rhythmic genes in an oscillatory system. Although the proposed methodology can be used for any high-throughput gene expression data, in this paper we illustrate it using several publicly available circadian clock microarray gene-expression datasets. We demonstrate that the choice of normalization method has very little effect on the proposed methodology: for any pair of normalization methods considered in this paper, the resulting values of the rhythmicity measure are highly correlated. This suggests that the proposed measure is robust to the choice of normalization method, and consequently that the rhythmicity of a gene is not a mere artifact of the normalization method used. Lastly, as demonstrated in the paper, the proposed bootstrap methodology can also be used for simulating data for genes participating in an oscillatory system using a reference dataset. Availability: A user-friendly code implemented in the R language can be downloaded from http://www.eio.uva.es/~miguel/robustdetectionprocedure.html PMID:29456555

  17. Optical modeling based on mean free path calculations for quantum dot phosphors applied to optoelectronic devices.

    PubMed

    Shin, Min-Ho; Kim, Hyo-Jun; Kim, Young-Joo

    2017-02-20

    We proposed an optical simulation model for quantum dot (QD) nanophosphors based on the mean free path concept, to precisely understand the optical performance of optoelectronic devices. A measurement methodology was also developed to obtain the optical characteristics, such as the mean free path and absorption spectra, of the QD nanophosphors to be incorporated into the simulation. The simulation results for QD-based white LED and OLED displays show good agreement with the experimental values from the fabricated devices in terms of spectral power distribution, chromaticity coordinates, CCT, and CRI. The proposed simulation model and measurement methodology can be readily applied to the design of a wide range of optoelectronic devices using QD nanophosphors to obtain high efficiency and the desired color characteristics.
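
    A minimal sketch of the mean-free-path idea (not the authors' simulation model): photons are propagated with exponentially distributed free paths, and the fraction interacting inside a film of given thickness reproduces the Beer-Lambert prediction. All numbers are invented.

    ```python
    # Toy Monte Carlo illustration of the mean free path concept for a QD film.
    import numpy as np

    def absorbed_fraction(mfp_um, d_um, n_photons=100_000, seed=0):
        rng = np.random.default_rng(seed)
        free_paths = rng.exponential(mfp_um, n_photons)  # distance to 1st event
        return float(np.mean(free_paths < d_um))         # event inside the film

    # Beer-Lambert check: absorbed fraction ~= 1 - exp(-d / mfp).
    print(absorbed_fraction(mfp_um=50.0, d_um=100.0))
    print(1 - np.exp(-100.0 / 50.0))
    ```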

  18. A Metadata-Based Approach for Analyzing UAV Datasets for Photogrammetric Applications

    NASA Astrophysics Data System (ADS)

    Dhanda, A.; Remondino, F.; Santana Quintero, M.

    2018-05-01

    This paper proposes a methodology for pre-processing and analysing Unmanned Aerial Vehicle (UAV) datasets before photogrammetric processing. In cases where images are gathered without a detailed flight plan and at regular acquisition intervals, the datasets can be quite large and time-consuming to process. This paper proposes a method to calculate the image overlap and filter out images in order to reduce large block sizes and speed up photogrammetric processing. The Python-based algorithm that implements this methodology leverages the metadata in each image to determine the end and side overlap of grid-based UAV flights. Utilizing user input, the algorithm filters out images that are not needed for photogrammetric processing. The result is an algorithm that can speed up photogrammetric processing and provide valuable information to the user about the flight path.
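
    A hedged sketch of the overlap computation the abstract describes: the ground footprint of a nadir image follows from flying height and camera geometry, and end overlap from the spacing of consecutive image centres. The function names and camera parameters are hypothetical, and EXIF parsing is omitted.

    ```python
    # End-overlap estimate for nadir images on a grid flight (illustrative).
    def footprint_m(height_m, focal_mm, sensor_mm):
        return height_m * sensor_mm / focal_mm      # ground size along one axis

    def end_overlap(centre_dist_m, height_m, focal_mm=8.8, sensor_along_mm=8.8):
        fp = footprint_m(height_m, focal_mm, sensor_along_mm)
        return max(0.0, 1.0 - centre_dist_m / fp)   # fraction of shared ground

    # Two images 20 m apart taken at 100 m AGL -> roughly 80% end overlap.
    print(end_overlap(centre_dist_m=20.0, height_m=100.0))
    ```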

  19. A methodology for risk analysis based on hybrid Bayesian networks: application to the regasification system of liquefied natural gas onboard a floating storage and regasification unit.

    PubMed

    Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López

    2014-12-01

    This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distribution. This makes the methodology particularly useful in cases where the available data for quantifying hazardous event probabilities are scarce or nonexistent, where there is dependence among events, or where nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating storage and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping, and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
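
    As a toy stand-in for the hybrid BNs used in the paper, the fragment below enumerates a two-parent discrete node to obtain a marginal release probability; all probabilities are invented for illustration.

    ```python
    # Minimal discrete Bayesian-network fragment: P(release) by enumeration.
    p_valve_fail = 1e-3    # assumed prior probabilities of the parent events
    p_sensor_fail = 5e-4
    # Conditional probability table for a gas release given the parents.
    p_release = {(True, True): 0.9, (True, False): 0.2,
                 (False, True): 0.05, (False, False): 1e-5}

    total = 0.0
    for valve in (True, False):
        for sensor in (True, False):
            p_parents = ((p_valve_fail if valve else 1 - p_valve_fail) *
                         (p_sensor_fail if sensor else 1 - p_sensor_fail))
            total += p_parents * p_release[(valve, sensor)]
    print(f"P(release) = {total:.3e}")
    ```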

  20. [Cooperative learning for improving healthy housing conditions in Bogota: a case study].

    PubMed

    Torres-Parra, Camilo A; García-Ubaque, Juan C; García-Ubaque, César A

    2014-01-01

    This was a community-based effort at constructing an educational proposal oriented towards self-empowerment, aimed at improving the target population's sanitary, housing and living conditions through cooperative learning. A constructivist approach was adopted, based on a programme called "Habitat community manager". The project involved working with fifteen families living in the Mochuelo Bajo barrio in Ciudad Bolívar in Bogotá, Colombia, to identify the sanitary aspects most relevant to improving their homes and to propose a methodology and organisation for an educational proposal. Twenty-one poor-housing-related epidemiological indicators were identified, which formed the basis for defining specific problems and establishing a methodology for designing an educational proposal. The course which emerged from the cooperative learning experience was designed to promote the community's skills and health education, aimed at improving households' living conditions and ensuring a healthy environment which would allow them to develop an immediate habitat ensuring their own welfare and dignity.

  1. Evaluating and redesigning teaching learning sequences at the introductory physics level

    NASA Astrophysics Data System (ADS)

    Guisasola, Jenaro; Zuza, Kristina; Ametller, Jaume; Gutierrez-Berraondo, José

    2017-12-01

    In this paper we put forward a proposal for the design and evaluation of teaching and learning sequences in upper secondary school and university. We connect our proposal with relevant contributions on the design of teaching sequences, ground it in the design-based research methodology, and discuss how teaching and learning sequences designed according to our proposal relate to learning progressions. An iterative methodology for evaluating and redesigning the teaching and learning sequence (TLS) is presented. The proposed assessment strategy focuses on three aspects: (a) evaluation of the activities of the TLS, (b) evaluation of the learning achieved by students in relation to the intended objectives, and (c) a document gathering the difficulties found when implementing the TLS, to serve as a guide for teachers. Discussion of this guide with external teachers provides feedback used for the TLS redesign. The context of our implementation and evaluation is an innovative calculus-based physics course for first-year engineering and science degree students at the University of the Basque Country.

  2. Regression Tree-Based Methodology for Customizing Building Energy Benchmarks to Individual Commercial Buildings

    NASA Astrophysics Data System (ADS)

    Kaskhedikar, Apoorva Prakash

    According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to the medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Significant correlations between EUIs and CBECS variables were then identified. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. This tool relies on standard linear regression methods, which can only handle continuous variables. The model proposed uses data-mining techniques and was found to perform slightly better than Portfolio Manager. The broader impact of the proposed benchmarking methodology is that it allows for identifying important categorical variables and then incorporating them in a local, as opposed to a global, model framework for the EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance in practical implementation of benchmarking tools, which rely on query-based building and HVAC variable filters specified by the user.
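
    A compact sketch of the two-step idea under stated assumptions (synthetic data in place of CBECS): a Random Forest ranks candidate variables, and a shallow regression tree partitions buildings into peer groups whose mean EUI serves as the benchmark.

    ```python
    # Regression-tree benchmarking sketch on synthetic CBECS-like data.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))        # e.g. floor area, workers, PCs, HDD
    eui = 50 + 10*X[:, 0] + 5*X[:, 1] + rng.normal(0, 2, 500)

    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, eui)
    print("importances:", rf.feature_importances_.round(2))

    tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, eui)
    leaf = tree.apply(X)                 # leaf id = peer group of a building
    benchmark = {g: eui[leaf == g].mean() for g in np.unique(leaf)}
    print("building 0 EUI vs peer benchmark:", eui[0], benchmark[leaf[0]])
    ```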

  3. Regional Differences in Demand for Coal as A Basis for Development of A Product Distribution Model for Mining Companies in the Individual Customers Segment

    NASA Astrophysics Data System (ADS)

    Magda, Roman; Bogacz, Paweł; Franik, Tadeusz; Celej, Maciej; Migza, Marcin

    2014-10-01

    The article presents a proposed methodology, based on relationship marketing, for determining the level of demand for coal in the individual customer segment, together with a fuel distribution model for this customer group in Poland developed on the basis of this methodology. It also includes selected results of tests carried out using the proposed methods. These proposals were defined on the basis of market capacity indicators, which can be determined at the district level from data published by the Polish Central Statistical Office. The study also employed linear programming based on the cost of coal logistics and on data concerning the railway, road and storage infrastructure present on the Polish market, while taking legal aspects into account. The presented results may provide a basis for mining companies to develop a system of coal distribution management in the locations with the highest demand.
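
    The linear programming component can be illustrated as a standard transportation problem; the sketch below, with invented costs, supply limits and district demands, minimises total logistics cost with SciPy.

    ```python
    # Transportation-style LP sketch: coal from two mines to three districts.
    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([[20, 30, 10],       # cost per tonne, mine i -> district j
                     [25, 15, 35]]).ravel()
    A_ub = [[1, 1, 1, 0, 0, 0],          # supply limit of mine 1
            [0, 0, 0, 1, 1, 1]]          # supply limit of mine 2
    b_ub = [600, 400]
    A_eq = [[1, 0, 0, 1, 0, 0],          # demand of each district
            [0, 1, 0, 0, 1, 0],
            [0, 0, 1, 0, 0, 1]]
    b_eq = [300, 300, 200]
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
    print(res.x.reshape(2, 3), res.fun)
    ```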

  4. Structural mapping from MSS-LANDSAT imagery: A proposed methodology for international geological correlation studies

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Crepani, E.; Martini, P. R.

    1980-01-01

    A methodology is proposed for international geological correlation studies based on LANDSAT-MSS imagery, Bullard's model of continental fit, and compatible structural trends between Northeast Brazil and its West African counterpart. Six extensive lineaments in the Brazilian study area are mapped and discussed according to their regional behavior and in relation to the adjacent continental margin. Among the first conclusions, correlations were found between the Sobral Pedro II Lineament and the megafaults that surround the West African craton, and between the Pernambuco Lineament and the Ngaoundere Lineament in Cameroon. Ongoing research to complete the methodological stages includes the mapping of the West African structural framework, reconstruction of the pre-drift puzzle, and an analysis of the counterpart correlations.

  5. A decomposition model and voxel selection framework for fMRI analysis to predict neural response of visual stimuli.

    PubMed

    Raut, Savita V; Yadav, Dinkar M

    2018-03-28

    This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual-information-based voxel selection framework. Previously, fMRI signal analysis had been conducted using the empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former loses frequency content, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology: frequency content is preserved by decomposing the raw fMRI signal using the geometric rather than the arithmetic mean, and voxels are selected using GMCD components rather than the raw fMRI signal. The proposed methodologies are adopted for predicting the neural response. Experiments are conducted on the openly available fMRI data of six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effect of the number of selected voxels and of the selection constraints is analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.

  6. Improving electrical power systems reliability through locally controlled distributed curtailable load

    NASA Astrophysics Data System (ADS)

    Dehbozorgi, Mohammad Reza

    2000-10-01

    Improvements in power system reliability have always been of interest to both power companies and customers. Since there are no sizable electrical energy storage elements in electrical power systems, generated power must match load demand at any given time. Failure to meet this balance may cause severe system problems, including loss of generation and system blackouts. This thesis proposes a methodology which can respond to either loss of generation or loss of load; it is based on switching electric water heaters using power system frequency as the controlling signal. The proposed methodology encounters, and the thesis addresses, the following associated problems: the controller must be interfaced with the existing thermostat control; when it is necessary to switch on loads, the water in the tank should not be overheated; and rapid switching of blocks of load, or chattering, must be considered. The contributions of the thesis are: (A) A system has been proposed which causes a significant portion of the distributed loads connected to a power system to behave in a predetermined manner, improving the power system response during disturbances. (B) The action of the proposed system is transparent to the customers. (C) The thesis proposes a simple analysis for determining the amount of such loads which might be switched and relates this amount to the size of the disturbances which can occur in the utility. (D) The proposed system acts without any formal communication links, solely using the embedded information present system-wide. (E) The methodology of the thesis proposes switching of water heater loads based on a simple, localized frequency set-point controller. The thesis identifies the consequent problem of rapid switching of distributed loads, referred to as chattering. (F) Two approaches have been proposed to reduce chattering to tolerable levels. (G) A frequency controller has been designed and built according to the specifications required to switch electric water heater loads in response to power system disturbances. (H) A cost analysis for building and installing the distributed frequency controller has been carried out. (I) The proposed equipment and methodology have been implemented and tested successfully. (Abstract shortened by UMI.)
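
    A conceptual sketch of the local frequency set-point control with two plausible anti-chatter ingredients, a dead band and a randomised delay; all thresholds and class names here are assumptions, not the thesis design.

    ```python
    # Local under-frequency load-shedding controller with hysteresis (sketch).
    import random

    class WaterHeaterController:
        F_SHED, F_RESTORE = 59.90, 59.97    # Hz thresholds forming a dead band

        def __init__(self):
            self.enabled = True
            self.delay = random.uniform(0.0, 2.0)  # staggers switching actions

        def update(self, freq_hz, elapsed_s):
            if elapsed_s < self.delay:
                return self.enabled             # wait out the randomised delay
            if self.enabled and freq_hz < self.F_SHED:
                self.enabled = False            # shed load on under-frequency
            elif not self.enabled and freq_hz > self.F_RESTORE:
                self.enabled = True             # restore once frequency recovers
            return self.enabled

    ctrl = WaterHeaterController()
    for f in (60.0, 59.88, 59.93, 59.98):
        print(f, ctrl.update(f, elapsed_s=5.0))
    ```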

  7. Control and communication co-design: analysis and practice on performance improvement in distributed measurement and control system based on fieldbus and Ethernet.

    PubMed

    Liang, Geng

    2015-01-01

    In this paper, improving the control performance of a networked control system by reducing DTD from a different perspective is investigated. Two different network architectures for system implementation are presented, and analysis and improvements dealing with DTD for the experimental control system are expounded. The effects of control scheme configuration on DTD in the form of FBs (function blocks) are investigated, and corresponding improvements through reallocation of FBs and rearrangement of the schedule table are proposed. Issues of DTD in a hybrid network are investigated, and corresponding approaches to improve performance, including (1) reducing DTD in the PLC or PAC by way of IEC 61499 and (2) cascade Smith predictive control with BPNN-based identification, are proposed and investigated. Control effects under the proposed methodologies are also given. Experimental and field practice validated these methodologies. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  8. Variance change point detection for fractional Brownian motion based on the likelihood ratio test

    NASA Astrophysics Data System (ADS)

    Kucharczyk, Daniel; Wyłomańska, Agnieszka; Sikora, Grzegorz

    2018-01-01

    Fractional Brownian motion is one of the main stochastic processes used for describing the long-range dependence phenomenon in self-similar processes. It appears that for many real time series, the characteristics of the data change significantly over time. Such behaviour can be observed in many applications, including physical and biological experiments. In this paper, we present a new technique for critical change point detection for cases where the data under consideration are driven by fractional Brownian motion with a time-changed diffusion coefficient. The proposed methodology is based on the likelihood ratio approach and represents an extension of a similar methodology used for Brownian motion, a process with independent increments. We also propose a statistical test for the significance of the estimated critical point. In addition, an extensive simulation study is provided to test the performance of the proposed method.
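
    A hedged sketch of a likelihood-ratio scan for a variance change point, applied to the increments of the observed path under a Gaussian working model; the paper's fBm-specific statistic and significance procedure are not reproduced here.

    ```python
    # Likelihood-ratio scan for a change in increment variance (sketch).
    import numpy as np

    def variance_change_point(x):
        inc = np.diff(x)                 # increments of the observed path
        n = inc.size
        s_all = np.var(inc)
        best_k, best_stat = None, -np.inf
        for k in range(10, n - 10):      # keep both segments non-trivial
            s1, s2 = np.var(inc[:k]), np.var(inc[k:])
            stat = n*np.log(s_all) - k*np.log(s1) - (n - k)*np.log(s2)
            if stat > best_stat:
                best_k, best_stat = k, stat
        return best_k, best_stat         # assess significance via simulation

    rng = np.random.default_rng(0)
    x = np.cumsum(np.r_[rng.normal(0, 1, 500), rng.normal(0, 3, 500)])
    print(variance_change_point(x))      # change point near index 500
    ```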

  9. Reflexive Audiovisual Methodology: The Emergence of "Minority Practices" among Pluriactive Stock Farmers

    ERIC Educational Resources Information Center

    Stassart, Pierre Marie; Mathieu, Valerie; Melard, Francois

    2011-01-01

    This paper proposes a new way for sociology, through both methodology and theory, to understand the reality of social groups and their "minority practices." It is based on an experiment that concerns a very specific category of agriculturalists called "pluriactive" stock farmers. These stock farmers, who engage in raising livestock part-time…

  10. International Management Skills for Success in Asia: A Needs-Based Determination of Skills for Foreign Managers and Local Managers

    ERIC Educational Resources Information Center

    Neupert, Kent E.; Baughn, C. Cristopher; Dao, Thi Thanh Lam

    2005-01-01

    Purpose: This paper identifies skills necessary to succeed in Vietnam and proposes a training program to develop such skills. Design/methodology/approach: To determine the necessary skills, 74 managers were interviewed using the critical incident methodology to identify training needs. The critical incident approach asks respondents to describe the…

  11. International Harmonization of Training and Qualification in the Manufacturing Industry

    ERIC Educational Resources Information Center

    Quintino, L.; Fernandes, I.; Miranda, R. M.

    2011-01-01

    Purpose: The aim of this paper is to propose a model for international harmonization of the training and qualification of human resources for industrial professions. The outcome is a system based on training guidelines and a quality assurance methodology that is now in use in 42 countries around the world. Design/methodology/approach: The paper…

  12. Evaluation Tool for the Application of Discovery Teaching Method in the Greek Environmental School Projects

    ERIC Educational Resources Information Center

    Kalathaki, Maria

    2015-01-01

    The Greek school community emphasizes the discovery direction of teaching methodology in school Environmental Education (EE) in order to promote Education for Sustainable Development (ESD). In ESD school projects, the methodology used is experiential teamwork for inquiry-based learning. The proposed tool checks whether and how a school…

  13. Implementation of an active instructional design for teaching the concepts of current, voltage and resistance

    NASA Astrophysics Data System (ADS)

    Orlaineta-Agüero, S.; Del Sol-Fernández, S.; Sánchez-Guzmán, D.; García-Salcedo, R.

    2017-01-01

    In the present work we show the implementation of a learning sequence based on an active learning methodology for teaching Physics. This proposal aims to promote better learning in high school students through the use of a comic book, combined with different low-cost experimental activities, for teaching the electrical concepts of current, resistance and voltage. We consider that this kind of strategy can be easily extrapolated to higher education levels, such as engineering college/university level, and to other disciplines of science. To evaluate this proposal, we used some conceptual questions from the Electric Circuits Concept Evaluation survey developed by Sokoloff; the results were analysed with the normalized conceptual gain proposed by Hake and the concentration factor proposed by Bao and Redish, to identify the effectiveness of the methodology and the models that the students presented before and after the instruction, respectively. We found that this methodology was more effective than traditional lectures alone. These results cannot be generalized, but they gave us the opportunity to view many important approaches in Physics Education; finally, we will continue to apply the same experiment with more students, at the same and upper levels of education, to confirm and validate the effectiveness of this methodological proposal.
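
    For reference, Hake's normalized gain is <g> = (post - pre) / (100 - pre) for class-average scores in percent; a short worked example with invented scores:

    ```python
    # Worked example of Hake's normalized gain (scores in percent).
    def normalized_gain(pre_pct, post_pct):
        return (post_pct - pre_pct) / (100.0 - pre_pct)

    print(normalized_gain(35.0, 65.0))   # 0.46, a relatively high gain
    print(normalized_gain(35.0, 45.0))   # 0.15, a relatively low gain
    ```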

  14. Rolling element bearing fault diagnosis based on Over-Complete rational dilation wavelet transform and auto-correlation of analytic energy operator

    NASA Astrophysics Data System (ADS)

    Singh, Jaskaran; Darpe, A. K.; Singh, S. P.

    2018-02-01

    Local damage in rolling element bearings usually generates periodic impulses in vibration signals. The severity, repetition frequency and the resonance zone excited by these impulses are the key indicators for diagnosing bearing faults. In this paper, a methodology based on the overcomplete rational dilation wavelet transform (ORDWT) is proposed, as it enjoys good shift invariance. The ORDWT offers flexibility in partitioning the frequency spectrum to generate a number of subbands (filters) with diverse bandwidths. The selection of the optimal filter that best overlaps the resonance zone excited by the bearing fault is based on the maximization of a proposed impulse detection measure, the "temporal energy operated auto-correlated kurtosis". The proposed indicator is robust and consistent in evaluating the impulsiveness of fault signals in the presence of interfering vibration such as heavy background noise or sporadic shocks unrelated to the fault or to normal operation. The structure of the proposed indicator makes it sensitive to fault severity. For enhanced fault classification, an autocorrelation of the energy time series of the signal filtered through the optimal subband is proposed. The application of the proposed methodology is validated on simulated and experimental data. The study shows that the performance of the proposed technique is more robust and consistent than that of the original fast kurtogram and the wavelet kurtogram.
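
    The published indicator is not reproduced exactly; the sketch below assembles its named ingredients, a Teager-style energy operator, its autocorrelation, and a kurtosis-type ratio, to show why periodic impulses raise the score. Parameters are arbitrary.

    ```python
    # Energy-operator + autocorrelation + kurtosis-type indicator (sketch).
    import numpy as np

    def teager(x):
        return x[1:-1]**2 - x[:-2] * x[2:]   # discrete Teager energy operator

    def indicator(x, skip=10):
        e = teager(x)
        e = e - e.mean()
        ac = np.correlate(e, e, mode="full")[e.size - 1:]   # one-sided autocorr
        ac = ac[skip:] / ac[0]               # drop the dominant zero-lag region
        return np.mean(ac**4) / np.mean(ac**2)**2           # kurtosis-like ratio

    rng = np.random.default_rng(0)
    noise = rng.normal(0, 1, 4096)
    impulses = noise.copy()
    impulses[::256] += 8.0                   # periodic fault-like impacts
    print(indicator(noise), indicator(impulses))  # larger for the impulsive case
    ```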

  15. The integrative review: updated methodology.

    PubMed

    Whittemore, Robin; Knafl, Kathleen

    2005-12-01

    The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.

  16. Two-stage commercial evaluation of engineering systems production projects for high-rise buildings

    NASA Astrophysics Data System (ADS)

    Bril, Aleksander; Kalinina, Olga; Levina, Anastasia

    2018-03-01

    The paper is devoted to the current and much-debated problem of how to select effective innovative enterprises for venture financing. A two-stage system of commercial innovation evaluation based on the UNIDO methodology is proposed. Engineering systems account for 25 to 40% of the cost of high-rise residential buildings, and this proportion increases with the use of new construction technologies. Analysis of the construction market in Russia showed that the production of internal engineering system elements based on innovative technologies has a growth trend. The production of simple elements is organized in small enterprises on the basis of new technologies. The most attractive development route is venture financing of small innovative businesses. To improve the efficiency of these operations, the paper proposes a methodology for a two-stage evaluation of small business development projects. A two-stage system of commercial evaluation of innovative projects makes it possible to create an information base for informed and coordinated decision-making on venture financing of enterprises that produce engineering system elements for the construction business.

  17. Application of multimedia models for screening assessment of long-range transport potential and overall persistence.

    PubMed

    Klasmeier, Jörg; Matthies, Michael; Macleod, Matthew; Fenner, Kathrin; Scheringer, Martin; Stroebe, Maximilian; Le Gall, Anne Christine; Mckone, Thomas; Van De Meent, Dik; Wania, Frank

    2006-01-01

    We propose a multimedia model-based methodology to evaluate whether a chemical substance qualifies as POP-like based on overall persistence (Pov) and potential for long-range transport (LRTP). It relies upon screening chemicals against the Pov and LRTP characteristics of selected reference chemicals with well-established environmental fates. Results indicate that chemicals of high and low concern in terms of persistence and long-range transport can be consistently identified by eight contemporary multimedia models using the proposed methodology. Model results for three hypothetical chemicals illustrate that the model-based classification of chemicals according to Pov and LRTP is not always consistent with the single-media half-life approach proposed by the UNEP Stockholm Convention, and that the models provide additional insight into the likely long-term hazards associated with chemicals in the environment. We suggest that this model-based classification method be adopted as a complement to screening against defined half-life criteria in the initial stages of tiered assessments designed to identify POP-like chemicals and to prioritize further environmental fate studies for new and existing chemicals.

  18. Methodology for the Evaluation of the Algorithms for Text Line Segmentation Based on Extended Binary Classification

    NASA Astrophysics Data System (ADS)

    Brodic, D.

    2011-01-01

    Text line segmentation is a key element of the optical character recognition process; hence, testing of text line segmentation algorithms has substantial relevance. All previously proposed testing methods deal mainly with a text database as a template, used both for testing and for evaluating the text segmentation algorithm. In this manuscript, a methodology for evaluating text segmentation algorithms based on extended binary classification is proposed. It is built on various multiline text samples linked with text segmentation, whose results are distributed according to a binary classification. The final result is obtained by comparative analysis of the cross-linked data. Its suitability for different types of scripts represents its main advantage.

  19. Data Sufficiency Assessment and Pumping Test Design for Groundwater Prediction Using Decision Theory and Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    McPhee, J.; William, Y. W.

    2005-12-01

    This work presents a methodology for pumping test design based on the reliability requirements of a groundwater model. The reliability requirements take into consideration the application of the model results in groundwater management, expressed in this case as a multiobjective management model. The pumping test design is formulated as a mixed-integer nonlinear programming (MINLP) problem and solved using a combination of a genetic algorithm (GA) and gradient-based optimization. Bayesian decision theory provides a formal framework for assessing the influence of parameter uncertainty on the reliability of the proposed pumping test. The proposed methodology is useful for selecting a robust design that will outperform all other candidate designs under most potential 'true' states of the system.

  20. Bridge condition assessment based on long-term strain monitoring

    NASA Astrophysics Data System (ADS)

    Sun, LiMin; Sun, Shouwang

    2011-04-01

    In consideration of the important role that bridges play as transportation infrastructure, their safety, durability and serviceability have always been of deep concern. Structural Health Monitoring Systems (SHMS) have been installed on many long-span bridges to provide bridge engineers with the information needed to make rational maintenance decisions. However, SHMS also confront bridge engineers with the challenge of using the monitoring data efficiently. Thus, methodologies which are robust to random disturbance and sensitive to damage have become a focus of much research in structural condition assessment. In this study, an innovative probabilistic approach for the condition assessment of bridge structures is proposed on the basis of long-term strain monitoring of the steel girder of a cable-stayed bridge. First, the methodology of damage detection in the vicinity of a monitoring point using strain-based indices is investigated. Then, the composition of the strain response of the bridge under operational loads is analyzed. Thirdly, the influence of temperature and wind on the strains is eliminated, yielding the strain fluctuation under vehicle loads. Finally, damage evolution assessment is carried out based on the statistical characteristics of rain-flow cycles derived from the strain fluctuation under vehicle loads. The research conducted indicates that the proposed methodology is qualified for structural condition assessment in the following respects: (a) capability of revealing structural deterioration; (b) immunity to the influence of environmental variation; (c) adaptability to the random characteristics exhibited by long-term monitoring data. Further examination of the applicability of the proposed methodology to aging bridges may provide a more convincing validation.
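
    The rain-flow step can be sketched as follows, using the third-party `rainflow` PyPI package on a synthetic corrected strain record; the paper's statistical condition indicators are reduced here to a cycle count and mean range for illustration.

    ```python
    # Rain-flow counting of a corrected strain record (illustrative).
    import numpy as np
    import rainflow

    rng = np.random.default_rng(0)
    strain = 50*np.sin(np.linspace(0, 40*np.pi, 4000)) + rng.normal(0, 5, 4000)

    cycles = rainflow.count_cycles(strain)     # [(range, count), ...]
    ranges = np.array([r for r, _ in cycles])
    counts = np.array([c for _, c in cycles])
    mean_range = np.average(ranges, weights=counts)
    print(f"{counts.sum():.0f} cycles, mean range {mean_range:.1f} microstrain")
    ```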

  1. Optimized planning methodologies of ASON implementation

    NASA Astrophysics Data System (ADS)

    Zhou, Michael M.; Tamil, Lakshman S.

    2005-02-01

    Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation based on qualitative analysis and mathematical modeling are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network modeling presented here computes the network elements required to optimize resource allocation.

  2. EEMD-MUSIC-Based Analysis for Natural Frequencies Identification of Structures Using Artificial and Natural Excitations

    PubMed Central

    Amezquita-Sanchez, Juan P.; Romero-Troncoso, Rene J.; Osornio-Rios, Roque A.; Garcia-Perez, Arturo

    2014-01-01

    This paper presents a new EEMD-MUSIC- (ensemble empirical mode decomposition-multiple signal classification-) based methodology to identify modal frequencies in structures from free and ambient vibration signals produced by artificial and natural excitations, also considering several factors, such as nonstationary effects, close modal frequencies, and noisy environments, which are common situations in which several techniques reported in the literature fail. The EEMD and MUSIC methods are used to decompose the vibration signal into a set of IMFs (intrinsic mode functions) and to identify the natural frequencies of a structure, respectively. The effectiveness of the proposed methodology has been validated and tested with synthetic signals and under real operating conditions. The experiments focus on extracting the natural frequencies of a truss-type scaled structure and of a bridge used by both highway traffic and pedestrians. Results show the proposed methodology to be a suitable solution for the identification of natural frequencies of structures from free and ambient vibration signals. PMID:24683346
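
    The MUSIC half of the pipeline can be sketched as below (the EEMD stage is omitted): eigendecompose the covariance of delayed signal snapshots and scan the noise-subspace pseudospectrum for modal peaks. The window length and frequency grid are arbitrary choices, not the authors' settings.

    ```python
    # Compact MUSIC pseudospectrum for frequency identification (sketch).
    import numpy as np
    from scipy.signal import find_peaks

    def music_spectrum(x, n_sources, m=60, fs=1.0, n_grid=500):
        snaps = np.lib.stride_tricks.sliding_window_view(x, m)
        R = snaps.T @ snaps / snaps.shape[0]      # sample covariance, m x m
        w, V = np.linalg.eigh(R)                  # eigenvalues ascending
        En = V[:, : m - 2 * n_sources]            # noise subspace (2 per sine)
        freqs = np.linspace(0.01, 0.5, n_grid) * fs
        k = np.arange(m)
        P = np.array([1.0 / np.linalg.norm(En.T @ np.exp(-2j*np.pi*f/fs*k))**2
                      for f in freqs])
        return freqs, P

    fs = 100.0
    t = np.arange(0, 10, 1/fs)
    x = np.sin(2*np.pi*3.1*t) + 0.5*np.sin(2*np.pi*7.4*t)
    x += np.random.default_rng(0).normal(0, 0.3, t.size)
    f, P = music_spectrum(x, n_sources=2, fs=fs)
    peaks, _ = find_peaks(P)
    print(np.sort(f[peaks[np.argsort(P[peaks])[-2:]]]))  # near 3.1 and 7.4 Hz
    ```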

  3. EEMD-MUSIC-based analysis for natural frequencies identification of structures using artificial and natural excitations.

    PubMed

    Camarena-Martinez, David; Amezquita-Sanchez, Juan P; Valtierra-Rodriguez, Martin; Romero-Troncoso, Rene J; Osornio-Rios, Roque A; Garcia-Perez, Arturo

    2014-01-01

    This paper presents a new EEMD-MUSIC- (ensemble empirical mode decomposition-multiple signal classification-) based methodology to identify modal frequencies in structures from free and ambient vibration signals produced by artificial and natural excitations, also considering several factors, such as nonstationary effects, close modal frequencies, and noisy environments, which are common situations in which several techniques reported in the literature fail. The EEMD and MUSIC methods are used to decompose the vibration signal into a set of IMFs (intrinsic mode functions) and to identify the natural frequencies of a structure, respectively. The effectiveness of the proposed methodology has been validated and tested with synthetic signals and under real operating conditions. The experiments focus on extracting the natural frequencies of a truss-type scaled structure and of a bridge used by both highway traffic and pedestrians. Results show the proposed methodology to be a suitable solution for the identification of natural frequencies of structures from free and ambient vibration signals.

  4. Determining radiated sound power of building structures by means of laser Doppler vibrometry

    NASA Astrophysics Data System (ADS)

    Roozen, N. B.; Labelle, L.; Rychtáriková, M.; Glorieux, C.

    2015-06-01

    This paper introduces a methodology that makes use of laser Doppler vibrometry to assess the acoustic insulation performance of a building element. The sound power radiated by the surface of the element is determined numerically from the vibrational pattern, offering an alternative to classical microphone measurements. Compared to the latter, the proposed analysis is not sensitive to room acoustical effects. This allows the proposed methodology to be used at low frequencies, where the standardized microphone-based approach suffers from high uncertainty due to low acoustic modal density. Standardized measurements as well as laser Doppler vibrometry measurements and computations have been performed on two test panels, a light-weight wall and a gypsum block wall, and are compared and discussed in this paper. The proposed methodology offers an adequate solution for the assessment of the acoustic insulation of building elements at low frequencies. This is crucial in the framework of recent proposals for acoustic standards, covering measurement approaches and single-number sound insulation performance ratings, to take frequencies down to 50 Hz into account.

  5. Problems related to the integration of fault tolerant aircraft electronic systems

    NASA Technical Reports Server (NTRS)

    Bannister, J. A.; Adlakha, V.; Triyedi, K.; Alspaugh, T. A., Jr.

    1982-01-01

    Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described, and specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault-tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of integrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real-time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault-tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.

  6. ATS-F and Man: A Course of Study: An Experiment in Satellite Application to Statewide Instructional Methodology.

    ERIC Educational Resources Information Center

    Gaven, Patricia; Williams, R. David

    An experiment is proposed which will study the advantages of satellite technology as a means for the standardization of teaching methodology in an attempt to socially integrate the rural Alaskan native. With "Man: A Course of Study" as the curricular base of the experiment, there will be a Library Experiment Program for Adults using…

  7. Drawing and Emerging Research: The Acquisition of Experiential Knowledge through Drawing as a Methodological Strategy

    ERIC Educational Resources Information Center

    Roberts, Amanda; Riley, Howard

    2014-01-01

    The paper proposes the activity of drawing as a methodological strategy within a university research context. It is illustrated with examples from one of the authors' (Roberts) practice-based PhD research. The paper argues that drawing as a research method can be validated in a manner akin to the more established research methods associated…

  8. Development of a methodology for strategic environmental assessment: application to the assessment of golf course installation policy in Taiwan.

    PubMed

    Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min

    2009-01-01

    Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase in pursuit of sustainable development. However, Taiwan's system contained only some sketchy steps focusing on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on theoretical considerations from systems thinking and on the regulatory requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, the Delphi technique, and systems analysis to develop the sustainable assessment framework. An SEA management system was built on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied to the SEA of the golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 golf courses existing in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better based on the environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed, but the implementation phase was not begun because the related legislative procedure could not be arranged owing to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001, and 27 future courses could then be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can effectively and efficiently assist the related authorities with SEA.

  9. District Heating Systems Performance Analyses. Heat Energy Tariff

    NASA Astrophysics Data System (ADS)

    Ziemele, Jelena; Vigants, Girts; Vitolins, Valdis; Blumberga, Dagnija; Veidenbergs, Ivars

    2014-12-01

    The paper addresses an important element of the European energy sector: the evaluation of district heating (DH) system operations from the standpoint of increasing energy efficiency and the use of renewable energy resources. This has been done by developing a new methodology for the evaluation of the heat tariff. The paper presents an algorithm for this methodology, which includes not only a database and systems of calculation equations, but also an integrated multi-criteria analysis module using MADM/MCDM (Multi-Attribute Decision Making / Multi-Criteria Decision Making) based on TOPSIS (Technique for Order Preference by Similarity to Ideal Solution). The results of the multi-criteria analysis are used to set the tariff benchmarks. The evaluation methodology has been tested on Latvian heat tariffs, and the results show that only half of the heating companies reach a benchmark value of 0.5 for the closeness-to-the-ideal-solution efficiency indicator. This means that the proposed evaluation methodology would not only allow companies to determine how they perform against the proposed benchmark, but also help them identify the restructuring needed to reach the level of a low-carbon business.
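
    A minimal TOPSIS sketch with invented criteria and weights, showing how the closeness coefficient behind the 0.5 benchmark is computed; the paper's actual criteria set is not reproduced.

    ```python
    # TOPSIS closeness coefficient for ranking DH companies (illustrative).
    import numpy as np

    def topsis(X, weights, benefit):
        # X: alternatives x criteria; benefit[j] is True if larger is better.
        Z = X / np.linalg.norm(X, axis=0)          # vector normalisation
        V = Z * weights
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - worst, axis=1)
        return d_neg / (d_pos + d_neg)             # closeness in [0, 1]

    # columns: efficiency (%), renewable share (%), tariff (EUR/MWh, a cost)
    X = np.array([[85.0, 40.0, 55.0],
                  [78.0, 10.0, 48.0],
                  [90.0, 60.0, 62.0]])
    scores = topsis(X, np.array([0.4, 0.3, 0.3]), np.array([True, True, False]))
    print(scores.round(3))                         # benchmark: score >= 0.5
    ```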

  10. Radiological Characterization Methodology of INEEL Stored RH-TRU Waste from ANL-E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajiv N. Bhatt

    2003-02-01

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor physics principles are used to obtain these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using this methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.

  11. Radiological Characterization Methodology for INEEL-Stored Remote-Handled Transuranic (RH TRU) Waste from Argonne National Laboratory-East

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuan, P.; Bhatt, R.N.

    2003-01-14

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor physics principles are used to obtain these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using the methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.

  12. Conjugate gradient based projection - A new explicit methodology for frictional contact

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

    With special attention towards applicability to parallel computation and vectorization, a new and effective explicit approach for linear complementarity formulations, involving a conjugate gradient based projection methodology, is proposed in this study for contact problems with Coulomb friction. The overall objective is to provide an explicit computational methodology for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementarity formulations stems from an established search direction which is projected onto a feasible region determined by the non-negativity constraint; this direction is then applied within the Fletcher-Reeves conjugate gradient method, resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics and fast computational speed, and is relatively simple to implement for contact problems involving Coulomb friction.

  13. Design and implementation of the tree-based fuzzy logic controller.

    PubMed

    Liu, B D; Huang, C Y

    1997-01-01

    In this paper, a tree-based approach is proposed for designing fuzzy logic controllers. Based on the proposed methodology, the fuzzy logic controller has the following merits: the fuzzy control rules can be extracted automatically from the input-output data of the system, and the extraction can be done in one pass; owing to the fuzzy tree inference structure, the search space of the fuzzy inference process is greatly reduced; the inference process can be simplified to a one-dimensional matrix operation because of the fuzzy tree approach; and the controller has regular and modular properties, so it is easy to implement in hardware. Furthermore, the proposed fuzzy tree approach has been applied to the design of a color reproduction system to verify the proposed methodology. The color reproduction system is mainly used to obtain a color image through the printer that is identical to the original one. In addition to the software simulation, an FPGA was used to implement a prototype hardware system for real-time application. Experimental results show that the color correction is quite effective and that the prototype hardware system operates correctly at a 30 MHz clock rate.

  14. Parametric representation of weld fillets using shell finite elements—a proposal based on minimum stiffness and inertia errors

    NASA Astrophysics Data System (ADS)

    Echer, L.; Marczak, R. J.

    2018-02-01

    The objective of the present work is to introduce a methodology capable of modelling welded components for structural stress analysis. The modelling technique was based on the recommendations of the International Institute of Welding; however, some geometrical features of the weld fillet were used as design parameters in an optimization problem. Namely, the weld leg length and thickness of the shell elements representing the weld fillet were optimized in such a way that the first natural frequencies were not changed significantly when compared to a reference result. Sequential linear programming was performed for T-joint structures corresponding to two different structural details: with and without full penetration weld fillets. Both structural details were tested in scenarios of various plate thicknesses and depths. Once the optimal parameters were found, a modelling procedure was proposed for T-shaped components. Furthermore, the proposed modelling technique was extended for overlapped welded joints. The results obtained were compared to well-established methodologies presented in standards and in the literature. The comparisons included results for natural frequencies, total mass and structural stress. By these comparisons, it was observed that some established practices produce significant errors in the overall stiffness and inertia. The methodology proposed herein does not share this issue and can be easily extended to other types of structure.

  15. Feature extraction through parallel Probabilistic Principal Component Analysis for heart disease diagnosis

    NASA Astrophysics Data System (ADS)

    Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan

    2017-09-01

    Automatic diagnosis of human diseases is mostly achieved through decision support systems, whose performance depends mainly on the selection of the most relevant features. This becomes harder when the dataset contains missing values for some features. Probabilistic Principal Component Analysis (PPCA) is well suited to dealing with missing attribute values. This research presents a methodology that takes the results of medical tests as input, extracts a reduced-dimensional feature subset and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection using PPCA, which yields the projection vectors contributing the highest covariance; these vectors are used to reduce the feature dimension, and their number is selected through Parallel Analysis (PA). The reduced-dimension feature subset is provided to a radial basis function (RBF) kernel based Support Vector Machine (SVM), which classifies subjects into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results are presented in comparison to existing research, and the proposed technique achieved accuracies of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
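
    A rough sketch of this pipeline is shown below, using standard PCA from scikit-learn as a stand-in for PPCA (which additionally handles missing values), a simple Parallel Analysis step to pick the number of components, and an RBF-kernel SVM. The synthetic data and all figures are invented for illustration.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(297, 3))                 # hypothetical hidden factors
    X = latent @ rng.normal(size=(3, 13)) + 0.5 * rng.normal(size=(297, 13))
    y = (latent[:, 0] > 0).astype(int)                 # 1 = Heart Patient, 0 = Normal Subject

    # Parallel Analysis: retain components whose eigenvalues exceed the 95th
    # percentile of eigenvalues obtained from random data of the same shape.
    Xs = StandardScaler().fit_transform(X)
    real_eig = np.linalg.eigvalsh(np.cov(Xs.T))[::-1]
    rand_eig = np.array([np.linalg.eigvalsh(np.cov(rng.normal(size=Xs.shape).T))[::-1]
                         for _ in range(200)])
    n_keep = max(int(np.sum(real_eig > np.percentile(rand_eig, 95, axis=0))), 1)

    # Reduced-dimension projection fed to an RBF-kernel SVM classifier.
    clf = make_pipeline(StandardScaler(), PCA(n_components=n_keep), SVC(kernel="rbf"))
    clf.fit(X, y)
    print(n_keep, clf.score(X, y))
    ```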

  16. Decomposition of timed automata for solving scheduling problems

    NASA Astrophysics Data System (ADS)

    Nishi, Tatsushi; Wakatake, Masato

    2014-03-01

    A decomposition algorithm for scheduling problems based on a timed automata (TA) model is proposed. The problem is represented as an optimal state transition problem for TA, where the model comprises the parallel composition of submodels such as jobs and resources. The procedure of the proposed methodology consists of two steps. The first step decomposes the TA model into several submodels using a decomposability condition. The second step combines the individual solutions of the subproblems for the decomposed submodels by the penalty function method. A feasible solution for the entire model is derived by iteratively solving the subproblem for each submodel. The proposed methodology is applied to flowshop and jobshop scheduling problems. Computational experiments demonstrate the effectiveness of the proposed algorithm compared with a conventional TA scheduling algorithm without decomposition.

  17. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
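
    The rank-ordering step reduces to a simple computation once benefits and costs are estimated; the sketch below uses invented concept names and figures purely to illustrate the sorting by benefit-to-cost ratio and the cumulative ratio.

    ```python
    # Hypothetical concepts with assumed (benefit, cost) figures, only to
    # illustrate the rank-ordering and cumulative benefit-to-cost computation.
    concepts = [
        ("wingtip devices", 40.0, 10.0),
        ("engine retrofit", 90.0, 45.0),
        ("avionics upgrade", 25.0, 5.0),
    ]

    ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)
    cum_benefit = cum_cost = 0.0
    for name, benefit, cost in ranked:
        cum_benefit += benefit
        cum_cost += cost
        print(f"{name:18s} B/C={benefit / cost:5.2f}  "
              f"cumulative B/C={cum_benefit / cum_cost:5.2f}")
    ```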

  18. Classification of malignant and benign lung nodules using taxonomic diversity index and phylogenetic distance.

    PubMed

    de Sousa Costa, Robherson Wector; da Silva, Giovanni Lucca França; de Carvalho Filho, Antonio Oseas; Silva, Aristófanes Corrêa; de Paiva, Anselmo Cardoso; Gattass, Marcelo

    2018-05-23

    Lung cancer is the leading cause of cancer death worldwide and has one of the lowest post-diagnosis survival rates. This study therefore proposes a methodology, based on image processing and pattern recognition techniques, for classifying lung nodules as benign or malignant. Mean phylogenetic distance (MPD) and the taxonomic diversity index (Δ) were used as texture descriptors. Finally, a genetic algorithm in conjunction with a support vector machine was applied to select the best training model. The proposed methodology was tested on computed tomography (CT) images from the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), achieving a best sensitivity of 93.42%, specificity of 91.21%, accuracy of 91.81%, and area under the ROC curve of 0.94. The results demonstrate the promising performance of texture extraction based on mean phylogenetic distance and the taxonomic diversity index combined with phylogenetic trees. Graphical abstract: stages of the proposed methodology.

  19. Evaluations and Comparisons of Treatment Effects Based on Best Combinations of Biomarkers with Applications to Biomedical Studies

    PubMed Central

    Chen, Xiwei; Yu, Jihnhee

    2014-01-01

    Many clinical and biomedical studies evaluate treatment effects based on multiple biomarkers that commonly consist of pre- and post-treatment measurements. Some biomarkers can show significant positive treatment effects, while others can reflect no effect or even negative effects, giving rise to the need for methodologies that correctly and efficiently evaluate treatment effects based on multiple biomarkers as a whole. In the setting of pre- and post-treatment measurements of multiple biomarkers, we propose to apply a receiver operating characteristic (ROC) curve methodology based on the combination of biomarkers that maximizes an area under the ROC curve (AUC)-type criterion among all possible linear combinations. In the particular case of independent pre- and post-treatment measurements, we show that the proposed method recovers the well-known result of Su and Liu (1993). Further, proceeding from the derived best combinations of biomarker measurements, we propose an efficient technique via likelihood ratio tests to compare treatment effects. An extensive Monte Carlo study confirms the superiority of the proposed test for comparing treatment effects based on multiple biomarkers in a paired data setting. For practical applications, the proposed method is illustrated with a randomized trial of chlorhexidine gluconate on oral bacterial pathogens in mechanically ventilated patients, as well as a treatment study of children with attention deficit-hyperactivity disorder and severe mood dysregulation. PMID:25019920
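
    A small sketch of the best-combination idea follows, under the binormal assumption of Su and Liu (1993): the coefficients a = (S0 + S1)^(-1)(m1 - m0) maximize the AUC among all linear combinations. The two-group data here are synthetic stand-ins, not study data.

    ```python
    import numpy as np

    def best_linear_combination(X0, X1):
        """Su-Liu (1993) coefficients: under multivariate normality,
        a = (S0 + S1)^{-1} (m1 - m0) maximizes the AUC among all
        linear combinations of the biomarkers."""
        m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
        S0 = np.cov(X0, rowvar=False)
        S1 = np.cov(X1, rowvar=False)
        return np.linalg.solve(S0 + S1, m1 - m0)

    def empirical_auc(s0, s1):
        # Mann-Whitney estimate of P(score_1 > score_0)
        return (s1[:, None] > s0[None, :]).mean()

    rng = np.random.default_rng(1)
    X0 = rng.normal(0.0, 1.0, size=(60, 3))                 # hypothetical control group
    X1 = rng.normal([0.5, 0.0, -0.2], 1.0, size=(60, 3))    # hypothetical treated group
    a = best_linear_combination(X0, X1)
    print(empirical_auc(X0 @ a, X1 @ a))
    ```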

  20. When is good, good enough? Methodological pragmatism for sustainable guideline development.

    PubMed

    Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C

    2015-03-06

    Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.

  1. New insights into soil temperature time series modeling: linear or nonlinear?

    NASA Astrophysics Data System (ADS)

    Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mohammadian, Abdolmajid; Gharabaghi, Bahram

    2018-03-01

    Soil temperature (ST) is an important dynamic parameter whose prediction is a major research topic in various fields, including agriculture, because ST plays a critical role in hydrological processes at the soil surface. In this study, a new linear methodology based on stochastic methods is proposed for modeling daily soil temperature (DST). With this approach, the ST series components are determined for modeling and spectral analysis. The results are compared with two linear methods based on seasonal standardization and seasonal differencing, using four DST series measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all the ST series reviewed, the periodic term is the most robust component. Comparing the three methods applied to the various series components, spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to the comparison with linear methods, the ST modeling results were compared with two nonlinear methods in two forms: considering hydrological variables (HV) as input variables, and modeling DST as a time series. In a previous study at the same sites, Kim and Singh (Theor Appl Climatol 118:465-479, 2014) applied the popular Multilayer Perceptron (MLP) neural network and the Adaptive Neuro-Fuzzy Inference System (ANFIS), two nonlinear methods, with HV as input variables. The comparison shows that the relative error in estimating DST with the proposed methodology was about 6%, versus over 15% with MLP and ANFIS. Moreover, MLP and ANFIS were also employed for DST time series modeling. Because these models performed relatively poorly compared with the proposed methodology, two hybrid models were implemented as well, in which the weights of the MLP and the membership functions of the ANFIS were optimized with the particle swarm optimization (PSO) algorithm in conjunction with the wavelet transform (Wavelet-MLP and Wavelet-ANFIS). Compared with the individual and hybrid nonlinear models in predicting the DST time series, the proposed methodology yields the lowest Akaike Information Criterion (AIC) value, which accounts for model simplicity and accuracy simultaneously, at the different depths and stations. The methodology presented in this study can thus serve as an excellent alternative to the complex nonlinear methods normally employed to examine DST.
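
    For readers unfamiliar with the two linear baselines, the sketch below shows seasonal standardization and seasonal differencing on a synthetic daily series; the synthetic temperatures and the 365-day period are assumptions of the sketch, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    doy = np.arange(4 * 365) % 365                       # day of year over 4 synthetic years
    st = 12 + 10 * np.sin(2 * np.pi * doy / 365) + rng.normal(0, 1.5, doy.size)

    # Seasonal standardization: remove the day-of-year mean and divide by the
    # day-of-year standard deviation (both estimated across years).
    mu = np.array([st[doy == d].mean() for d in range(365)])
    sd = np.array([st[doy == d].std() for d in range(365)])
    standardized = (st - mu[doy]) / sd[doy]

    # Seasonal differencing: subtract the value one seasonal period earlier.
    differenced = st[365:] - st[:-365]
    print(standardized.std(), differenced.std())
    ```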

  2. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    PubMed

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of the ultrasonic signals and the application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain sizes, with and without defects. The influence of probe frequency and signal data length on the EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the 30-210 μm range investigated in this study. The methodology is successfully employed for detection of defects in 50 mm thick coarse-grained austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat-bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional SNR enhancement is achieved compared with the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal thus proves effective for adaptive signal reconstruction with improved signal-to-noise ratio. The methodology is further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
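
    A minimal sketch of the EEMD-and-reconstruct step is given below using the third-party PyEMD package (distributed as "EMD-signal"); the synthetic A-scan, the noise level and the IMF indices kept are illustrative assumptions, and the paper's minimisation algorithm is not reproduced here.

    ```python
    import numpy as np
    from PyEMD import EEMD   # third-party "EMD-signal" package, used as a stand-in

    rng = np.random.default_rng(2)
    t = np.linspace(0, 20e-6, 2000)                      # hypothetical 100 MS/s A-scan
    echo = np.exp(-((t - 10e-6) / 0.5e-6) ** 2) * np.sin(2 * np.pi * 5e6 * t)
    signal = echo + 0.8 * rng.normal(size=t.size)        # grain-noise stand-in

    eemd = EEMD(trials=50)
    imfs = eemd.eemd(signal, t)

    # Reconstruct from a fixed range of IMFs; the abstract reports that, for a
    # given sampling rate and probe frequency, the same IMF range applies across
    # grain sizes (the indices below are illustrative, not the paper's).
    recon = imfs[1:4].sum(axis=0)

    def snr_db(clean, noisy):
        noise = noisy - clean
        return 10 * np.log10((clean ** 2).sum() / (noise ** 2).sum())

    print(snr_db(echo, signal), snr_db(echo, recon))
    ```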

  3. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    NASA Astrophysics Data System (ADS)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field; however, determining where and when to obtain such data is not straightforward. A number of data-worth and experimental design strategies have been developed for this purpose, but they often ignore issues arising in real-world groundwater models, such as computational expense, existing observation data and high parameter dimension. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which a wealth of observation data is available. The method utilizes the well-established D-optimality criterion and the minimax criterion for robust sampling strategies. The Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification, and a heuristic methodology based on the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model and subsequently applied to an existing, complex regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
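
    The greedy D-optimality step can be sketched compactly: given a Jacobian of candidate observation sensitivities, repeatedly add the observation that most increases the log-determinant of the information matrix. The Jacobian below is random and hypothetical; a robust (minimax) variant would take the worst-case gain over the posterior parameter samples rather than a single Jacobian.

    ```python
    import numpy as np

    def greedy_d_optimal(J, n_pick, ridge=1e-8):
        """Greedy D-optimal selection sketch. J[i] is the sensitivity row of
        candidate observation i with respect to the model parameters."""
        n_cand, n_par = J.shape
        chosen = []
        F = ridge * np.eye(n_par)              # regularized information matrix
        for _ in range(n_pick):
            gains = []
            for i in range(n_cand):
                if i in chosen:
                    gains.append(-np.inf)      # already selected
                    continue
                _, logdet = np.linalg.slogdet(F + np.outer(J[i], J[i]))
                gains.append(logdet)
            best = int(np.argmax(gains))
            chosen.append(best)
            F += np.outer(J[best], J[best])    # update information matrix
        return chosen

    rng = np.random.default_rng(3)
    J = rng.normal(size=(50, 6))   # hypothetical sensitivities of 50 candidate observations
    print(greedy_d_optimal(J, n_pick=5))
    ```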

  4. CNN based approach for activity recognition using a wrist-worn accelerometer.

    PubMed

    Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R

    2017-07-01

    In recent years, significant advances have been made in human activity recognition using various machine learning approaches. Conventional methods, however, are dominated by hand-crafted feature engineering and the difficult process of optimal feature selection. This problem is mitigated here by a methodology based on a deep learning framework, which automatically extracts useful features and reduces computational cost. As a proof of concept, we design a generalized model for recognizing three fundamental movements of the human forearm performed in daily life, with data collected from four subjects using a single wrist-worn accelerometer. The proposed model is validated under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that the proposed methodology achieves an average recognition rate of 99.8%, compared with conventional methods based on K-means clustering, linear discriminant analysis and support vector machines.
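
    A minimal 1-D CNN of the kind described could look like the PyTorch sketch below; the window length, layer sizes and class count are assumptions for illustration, not the paper's architecture.

    ```python
    import torch
    import torch.nn as nn

    class ActivityCNN(nn.Module):
        """Sketch: 1-D CNN over windows from a tri-axial wrist accelerometer,
        classifying three forearm movements (sizes are assumed, not the paper's)."""
        def __init__(self, n_classes=3, window=128):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(3, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            )
            self.classifier = nn.Linear(32 * (window // 4), n_classes)

        def forward(self, x):              # x: (batch, 3 axes, window samples)
            z = self.features(x)
            return self.classifier(z.flatten(1))

    model = ActivityCNN()
    x = torch.randn(8, 3, 128)             # a batch of hypothetical accelerometer windows
    logits = model(x)
    print(logits.shape)                    # torch.Size([8, 3])
    ```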

  5. The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Huiling; Fan, Delin; Zhang, Yizhuo

    This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) and cascade control based on a Smith predictor. A case study of a particleboard glue supplying and dosing system defines the problem and the solution using the proposed methodology. Process variations in the glue dosing stage of particleboard production usually make the dosed glue volume inaccurate. To solve this problem, TRIZ technical contradictions and inventive principles were applied to improve this key process of particleboard production. The inaccuracy problem was mapped to a TRIZ technical contradiction, and the 'prior action' principle led to a Smith predictor as the control algorithm for the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve users' ability to address difficult or recurring problems, and also testifies to the practicality and validity of TRIZ. Several suggestions on how to approach this problem are presented.

  6. Global Artificial Boundary Conditions for Computation of External Flow Problems with Propulsive Jets

    NASA Technical Reports Server (NTRS)

    Tsynkov, Semyon; Abarbanel, Saul; Nordstrom, Jan; Ryabenkii, Viktor; Vatsa, Veer

    1998-01-01

    We propose new global artificial boundary conditions (ABC's) for computation of flows with propulsive jets. The algorithm is based on application of the difference potentials method (DPM). Previously, similar boundary conditions have been implemented for calculation of external compressible viscous flows around finite bodies. The proposed modification substantially extends the applicability range of the DPM-based algorithm. In the paper, we present the general formulation of the problem, describe our numerical methodology, and discuss the corresponding computational results. The particular configuration that we analyze is a slender three-dimensional body with boat-tail geometry and supersonic jet exhaust in a subsonic external flow under zero angle of attack. Similarly to the results obtained earlier for the flows around airfoils and wings, current results for the jet flow case corroborate the superiority of the DPM-based ABC's over standard local methodologies from the standpoints of accuracy, overall numerical performance, and robustness.

  7. A Model for Oil-Gas Pipelines Cost Prediction Based on a Data Mining Process

    NASA Astrophysics Data System (ADS)

    Batzias, Fragiskos A.; Spanidis, Phillip-Mark P.

    2009-08-01

    This paper addresses the problems associated with the cost estimation of oil/gas pipelines during the elaboration of feasibility assessments. Techno-economic parameters, i.e., cost, length and diameter, are critical for such studies at the preliminary design stage. A methodology for the development of a cost prediction model based on a Data Mining (DM) process is proposed. The design and implementation of a Knowledge Base (KB), maintaining data collected from various disciplines of the pipeline industry, are presented. The formulation of a cost prediction equation is demonstrated by applying multiple regression analysis to data sets extracted from the KB. Following the proposed methodology, a learning context is inductively developed as background pipeline data are acquired, grouped and stored in the KB and, through a linear regression model, provides statistically sound results useful for project managers and decision makers.
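
    The regression step amounts to fitting cost against length and diameter by least squares; the sketch below uses invented figures purely to show the shape of such a cost prediction equation.

    ```python
    import numpy as np

    # Hypothetical training set extracted from the KB: length (km),
    # diameter (inches) and cost (M$) are illustrative figures only.
    length = np.array([120.0, 75.0, 200.0, 50.0, 160.0])
    diameter = np.array([24.0, 16.0, 36.0, 12.0, 30.0])
    cost = np.array([95.0, 40.0, 220.0, 22.0, 150.0])

    # Multiple regression: cost = b0 + b1*length + b2*diameter (least squares).
    A = np.column_stack([np.ones_like(length), length, diameter])
    b, *_ = np.linalg.lstsq(A, cost, rcond=None)
    print("coefficients:", b)
    print("predicted cost for a 100 km, 20-inch line:", b @ [1.0, 100.0, 20.0])
    ```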

  8. Authoring Multimedia Learning Material Using Open Standards and Free Software

    ERIC Educational Resources Information Center

    Tellez, Alberto Gonzalez

    2007-01-01

    Purpose: The purpose of this paper is to describe the case of synchronized multimedia presentations. Design/methodology/approach: The proposal is based on SMIL as composition language. Particularly, the paper reuses and customizes the SMIL template used by INRIA on their technical presentations. It also proposes a set of free tools to produce…

  9. Content-Based Indexing and Teaching Focus Mining for Lecture Videos

    ERIC Educational Resources Information Center

    Lin, Yu-Tzu; Yen, Bai-Jang; Chang, Chia-Hu; Lee, Greg C.; Lin, Yu-Chih

    2010-01-01

    Purpose: The purpose of this paper is to propose an indexing and teaching focus mining system for lecture videos recorded in an unconstrained environment. Design/methodology/approach: By applying the proposed algorithms in this paper, the slide structure can be reconstructed by extracting slide images from the video. Instead of applying…

  10. Towards a Multidimensional Measure of Governance

    ERIC Educational Resources Information Center

    Mitra, Shabana

    2013-01-01

    This paper proposes a new index of governance based on the Alkire-Foster methodology and compares it with the Mo Ibrahim Index of African Governance. The proposed new index improves on existing measures of governance in two ways. First, it is able to incorporate both cardinal and ordinal variables without having to assign cardinal meaning to…

  11. A Study of Best Practices in Training Transfer and Proposed Model of Transfer

    ERIC Educational Resources Information Center

    Burke, Lisa A.; Hutchins, Holly M.

    2008-01-01

    Data were gathered from a sample of training professionals of an American Society of Training and Development (ASTD) chapter in the southern United States regarding best practices for supporting training transfer. Content analysis techniques, based on a rigorous methodology proposed by Insch, Moore, & Murphy (1997), were used to analyze the…

  12. Damping mathematical modelling and dynamic responses for FRP laminated composite plates with polymer matrix

    NASA Astrophysics Data System (ADS)

    Liu, Qimao

    2018-02-01

    This paper proposes the assumption that the fibres are elastic and the polymer matrix is viscoelastic, so that energy dissipation during dynamic response depends only on the polymer matrix. The damping force vectors of FRP (Fibre-Reinforced Polymer matrix) laminated composite plates are derived in the frequency and time domains based on this assumption. The governing equations of FRP laminated composite plates are formulated in both domains. The direct inversion method and the direct time integration method for nonviscously damped systems are employed to solve the governing equations and obtain the dynamic responses in the frequency and time domains, respectively. The computational procedure is given in detail. Finally, dynamic responses (frequency responses with nonzero and zero initial conditions, free vibration, and forced vibrations with nonzero and zero initial conditions) of an FRP laminated composite plate are computed using the proposed methodology, which can easily be incorporated into commercial finite element analysis software. The proposed assumption, based on the theory of material mechanics, still requires experimental validation.

  13. Approach to fitting parameters and clustering for characterising measured voltage dips based on two-dimensional polarisation ellipses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Sánchez, Tania; Gómez-Lázaro, Emilio; Muljadi, E.

    An alternative approach to characterise real voltage dips is proposed and evaluated in this study. The proposed methodology is based on voltage-space vector solutions, identifying the parameters of ellipse trajectories by applying the least-squares algorithm on a sliding window along the disturbance. The most likely patterns are then estimated through a clustering process based on the k-means algorithm. The objective is to offer an efficient and easily implemented alternative for characterising faults and visualising the most likely instantaneous phase-voltage evolution during events through the corresponding voltage-space vector trajectories. This solution minimises the data to be stored while maintaining extensive information about the dips, including starting and ending transients. The proposed methodology has been applied satisfactorily to real voltage dips obtained from intensive field-measurement campaigns carried out in a Spanish wind power plant over a period of several years. A comparison to traditional minimum root mean square voltage and time-duration classifications is also included in this study.
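
    The two core steps, least-squares conic fitting per window and k-means over the fitted parameters, can be sketched as follows; the trajectories are synthetic, the conic normalisation is one of several possible choices, and the cluster count is an assumption.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def fit_conic(x, y):
        """Least-squares fit of a general conic a x^2 + b xy + c y^2 + d x + e y = 1
        to one sliding window of the voltage-space vector trajectory."""
        A = np.column_stack([x * x, x * y, y * y, x, y])
        theta, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
        return theta

    # Hypothetical trajectories: each dip window yields one 5-parameter vector.
    rng = np.random.default_rng(5)
    params = []
    for _ in range(40):
        phi = rng.uniform(0, 2 * np.pi, 200)
        a_ax, b_ax = rng.uniform(0.5, 1.0), rng.uniform(0.2, 0.6)
        x = a_ax * np.cos(phi) + 0.01 * rng.normal(size=200)
        y = b_ax * np.sin(phi) + 0.01 * rng.normal(size=200)
        params.append(fit_conic(x, y))

    # k-means over the ellipse parameter vectors to find the most likely patterns.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(np.array(params))
    print(np.bincount(labels))
    ```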

  14. Accurate Energy Transaction Allocation using Path Integration and Interpolation

    NASA Astrophysics Data System (ADS)

    Bhide, Mandar Mohan

    This thesis investigates many of the popular cost allocation methods that are based on actual usage of the transmission network. The Energy Transaction Allocation (ETA) method, originally proposed by A. Fradi, S. Brigonne and B. Wollenberg, which offers the unique advantage of accurately allocating transmission network usage, is discussed subsequently. A modified calculation of ETA based on a simple interpolation technique is then proposed. The proposed methodology not only increases the accuracy of the calculation but also reduces the number of calculations to less than half of that required by the original ETA method.

  15. Methodology for processing pressure traces used as inputs for combustion analyses in diesel engines

    NASA Astrophysics Data System (ADS)

    Rašić, Davor; Vihar, Rok; Žvar Baškovič, Urban; Katrašnik, Tomaž

    2017-05-01

    This study proposes a novel methodology for designing an optimum equiripple finite impulse response (FIR) filter for processing in-cylinder pressure traces of a diesel internal combustion engine, which serve as inputs for high-precision combustion analyses. The proposed automated workflow is based on an innovative approach to determining the transition band frequencies and the optimum filter order. The methodology first uses discrete Fourier transform analysis to estimate the location of the pass-band and stop-band frequencies, and then refines these estimates with short-time Fourier transform analysis. The resulting pass-band and stop-band frequencies are used to determine the most appropriate FIR filter order. The most widely used existing methods for estimating the FIR filter order are not effective in suppressing the oscillations in the rate-of-heat-release (ROHR) trace, thus hindering the accuracy of combustion analyses. To address this problem, an innovative method for determining the order of an FIR filter is proposed, based on minimizing the integral of normalized signal-to-noise differences between the stop-band frequency and the Nyquist frequency. The developed filters were validated using spectral analysis and calculation of the ROHR; the filters designed using the proposed method were superior to those obtained with existing methods for all analyzed cases. Highlights:
    • Pressure traces of a diesel engine were processed by finite impulse response (FIR) filters of different orders.
    • Transition band frequencies were determined with an innovative method based on the discrete Fourier transform and the short-time Fourier transform.
    • Spectral analyses showed deficiencies of existing methods in determining the FIR filter order.
    • A new method of determining the FIR filter order for processing pressure traces was proposed.
    • The efficiency of the new method was demonstrated by spectral analyses and calculations of rate-of-heat-release traces.
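
    For orientation, an equiripple FIR low-pass of this kind can be designed with the Parks-McClellan (remez) algorithm in SciPy; the sampling rate, transition band and filter order below are placeholders standing in for the values the paper derives from its DFT/STFT analysis and order criterion.

    ```python
    import numpy as np
    from scipy.signal import remez, filtfilt

    fs = 100_000.0                       # hypothetical sampling rate (Hz)
    f_pass, f_stop = 3_000.0, 5_000.0    # transition band, as if found by DFT/STFT analysis
    numtaps = 101                        # filter order; the paper's criterion is not reproduced

    # Optimum equiripple FIR low-pass via the Parks-McClellan (remez) algorithm.
    taps = remez(numtaps, [0, f_pass, f_stop, fs / 2], [1, 0], fs=fs)

    rng = np.random.default_rng(6)
    t = np.arange(0, 0.02, 1 / fs)
    pressure = 50 + 30 * np.sin(2 * np.pi * 200 * t) + rng.normal(0, 0.5, t.size)
    smoothed = filtfilt(taps, [1.0], pressure)   # zero-phase filtering of the trace
    print(smoothed.std(), pressure.std())
    ```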

  16. Enhanced methods for determining operational capabilities and support costs of proposed space systems

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles

    1993-01-01

    This report documents the work accomplished during the first two years of research to provide support to NASA in predicting operational and support parameters and costs of proposed space systems. The first year's research developed a methodology for deriving reliability and maintainability (R & M) parameters based upon the use of regression analysis to establish empirical relationships between performance and design specifications and corresponding mean times of failure and repair. The second year focused on enhancements to the methodology, increased scope of the model, and software improvements. This follow-on effort expands the prediction of R & M parameters and their effect on the operations and support of space transportation vehicles to include other system components such as booster rockets and external fuel tanks. It also increases the scope of the methodology and the capabilities of the model as implemented by the software. The focus is on the failure and repair of major subsystems and their impact on vehicle reliability, turn times, maintenance manpower, and repairable spares requirements. The report documents the data utilized in this study, outlines the general methodology for estimating and relating R&M parameters, presents the analyses and results of application to the initial data base, and describes the implementation of the methodology through the use of a computer model. The report concludes with a discussion on validation and a summary of the research findings and results.

  17. 76 FR 50993 - Agency Information Collection Activities: Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...

  18. A novel neural network based image reconstruction model with scale and rotation invariance for target identification and classification for Active millimetre wave imaging

    NASA Astrophysics Data System (ADS)

    Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad

    2014-12-01

    Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security checks, standoff personal screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, hence medically safe. Despite these favourable attributes, MMW imaging remains a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. In view of these challenges, an MMW active imaging radar system at 60 GHz was designed for standoff imaging. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology: for identification of regular-shape targets, a mean-standard deviation based segmentation technique was formulated and validated using a different target shape; for classification, a probability density function based target material discrimination methodology was proposed and validated on a different dataset. Lastly, a novel artificial neural network based, scale and rotation invariant image reconstruction methodology is proposed to counter distortions in the image caused by noise, rotation or scale variations. Once trained with sample images, the neural network automatically accounts for these deformations and successfully reconstructs the corrected image for the test targets. The techniques developed in this paper are tested and validated using four regular shapes, viz. rectangle, square, triangle and circle.
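
    The mean-standard deviation segmentation step reduces to a simple statistical threshold; the sketch below applies it to a synthetic C-scan, with the tuning constant k and the scene being assumptions of the sketch.

    ```python
    import numpy as np

    def mean_std_segment(image, k=1.5):
        """Mean-standard-deviation segmentation sketch: flag pixels whose
        reflectivity deviates from the image mean by more than k standard
        deviations (k is an assumed tuning constant)."""
        mu, sigma = image.mean(), image.std()
        return image > mu + k * sigma

    rng = np.random.default_rng(7)
    scan = rng.normal(0.2, 0.05, size=(64, 64))     # hypothetical MMW C-scan background
    scan[20:30, 25:45] += 0.4                       # embedded rectangular target
    mask = mean_std_segment(scan)
    print(mask.sum(), "target pixels flagged")
    ```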

  19. A Proposal for Methodology for Negotiation Practicum with Effective Use of ICT: A Technology Enhanced Course for Communication for Trust Building

    ERIC Educational Resources Information Center

    Yamamoto, Tosh; Tagami, Masanori; Nakazawa, Minoru

    2012-01-01

    This paper demonstrates a problem-solving case from the development of a methodology in which the quality of the negotiation practicum is maintained or raised without sacrificing the class contact hours for reading comprehension lessons, on which the essence of the negotiation practicum is based. In this…

  20. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  1. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search.

    PubMed

    Villagra, Andrea; Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been used successfully in the past to solve hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose the use of active components taken from Scatter Search (SS) to improve the cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) shows encouraging results relative to earlier applications of our methodology.

  2. A goal programming approach for a joint design of macroeconomic and environmental policies: a methodological proposal and an application to the Spanish economy.

    PubMed

    André, Francisco J; Cardenete, M Alejandro; Romero, Carlos

    2009-05-01

    Economic policy needs to pay increasing attention to environmental issues, which requires the development of methodologies able to incorporate environmental as well as macroeconomic goals in the design of public policies. Starting from this observation, this article proposes a methodology, based upon a Simonian satisficing logic made operational with the help of goal programming (GP) models, to address the joint design of macroeconomic and environmental policies. The methodology is applied to the Spanish economy, where a joint policy is elicited taking into consideration macroeconomic goals (economic growth, inflation, unemployment, public deficit) and environmental goals (CO2, NOx and SOx emissions) within the context of a computable general equilibrium model. The results show how the government can 'fine-tune' its policy according to different criteria using GP models. The resulting policies aggregate the environmental and economic goals in different ways: maximum aggregate performance, maximum balance and a lexicographic hierarchy of the goals.
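
    A tiny weighted goal programming sketch with scipy.optimize.linprog follows; the two goals, all coefficients, targets and weights are invented for illustration and bear no relation to the Spanish CGE model.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Weighted goal programming sketch with two decision variables and two goals:
    #   growth goal:    2 x1 + 1 x2 + n1 - p1 = 6   (shortfall n1 penalized)
    #   emissions goal: 1 x1 + 3 x2 + n2 - p2 = 9   (excess p2 penalized)
    # Variables: [x1, x2, n1, p1, n2, p2]; minimize w1*n1 + w2*p2.
    c = np.array([0, 0, 1.0, 0, 0, 2.0])
    A_eq = np.array([
        [2, 1, 1, -1, 0, 0],
        [1, 3, 0, 0, 1, -1],
    ])
    b_eq = np.array([6.0, 9.0])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
    print("decision variables:", res.x[:2], "unwanted deviations:", res.fun)
    ```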

  3. A multichannel model-based methodology for extubation readiness decision of patients on weaning trials.

    PubMed

    Casaseca-de-la-Higuera, Pablo; Simmross-Wattenberg, Federico; Martín-Fernández, Marcos; Alberola-López, Carlos

    2009-07-01

    Discontinuation of mechanical ventilation is a challenging task that involves a number of subtle clinical issues. The gradual removal of respiratory support (referred to as weaning) should be performed as soon as autonomous respiration can be sustained. However, based on previous studies, the prediction rate of successful extubation is still below 25%. An automatic system that provides information on extubation readiness is thus desirable. Recent work has demonstrated that breathing pattern variability is a useful indicator of extubation readiness, with improved performance when multiple respiratory signals are processed jointly. However, existing methods for predictor extraction present several drawbacks when length-limited time series from heterogeneous groups of patients are to be processed. In this paper, we propose a model-based methodology for automatic readiness prediction, intended to deal with multichannel, nonstationary, short records of the breathing pattern. Results on experimental data yield 87.27% successful readiness prediction, which is in line with the best figures reported in the literature. A comparative analysis shows that our methodology overcomes the shortcomings of previously proposed methods when applied to length-limited records from heterogeneous groups of patients.

  4. Soft tissue deformation modelling through neural dynamics-based reaction-diffusion mechanics.

    PubMed

    Zhang, Jinao; Zhong, Yongmin; Gu, Chengfan

    2018-05-30

    Soft tissue deformation modelling forms the basis of development of surgical simulation, surgical planning and robotic-assisted minimally invasive surgery. This paper presents a new methodology for modelling soft tissue deformation based on reaction-diffusion mechanics via neural dynamics. The potential energy stored in soft tissues by a mechanical load that deforms tissues away from their rest state is treated as the equivalent transmembrane potential energy, and it is distributed in the tissue masses in the manner of reaction-diffusion propagation of nonlinear electrical waves. The reaction-diffusion propagation of mechanical potential energy and nonrigid mechanics of motion are combined to model soft tissue deformation and its dynamics, both of which are further formulated as the dynamics of cellular neural networks to achieve real-time computational performance. The proposed methodology is implemented with a haptic device for interactive soft tissue deformation with force feedback. Experimental results demonstrate that the proposed methodology exhibits a nonlinear force-displacement relationship for nonlinear soft tissue deformation. Homogeneous, anisotropic and heterogeneous soft tissue material properties can be modelled through the inherent physical properties of mass points. Graphical abstract: Soft tissue deformation modelling with haptic feedback via neural dynamics-based reaction-diffusion mechanics.

  5. One Controller at a Time (1-CAT): A mimo design methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Lucas, J. C.

    1987-01-01

    The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.

  6. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    PubMed

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
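
    The core idea, partitioning the training set and summing partial error and gradient contributions computed in parallel, can be sketched with Python's multiprocessing as a modern stand-in for the parallel virtual machine (PVM); a plain least-squares linear model replaces the paper's neural network for brevity.

    ```python
    import numpy as np
    from multiprocessing import Pool

    def partial_grad(args):
        """Error and gradient on one partition of the training set
        (least-squares linear model as a stand-in for the neural network)."""
        w, X, y = args
        r = X @ w - y
        return X.T @ r, 0.5 * (r ** 2).sum()

    if __name__ == "__main__":
        rng = np.random.default_rng(8)
        X, y = rng.normal(size=(10_000, 20)), rng.normal(size=10_000)
        w = np.zeros(20)
        parts = [(w, Xp, yp)
                 for Xp, yp in zip(np.array_split(X, 4), np.array_split(y, 4))]
        with Pool(4) as pool:
            results = pool.map(partial_grad, parts)   # distributed evaluation
        grad = sum(g for g, _ in results)             # exact full-batch gradient
        err = sum(e for _, e in results)              # exact full-batch error
        print(err, np.linalg.norm(grad))
    ```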

  7. Day-Ahead Crude Oil Price Forecasting Using a Novel Morphological Component Analysis Based Model

    PubMed Central

    Zhu, Qing; Zou, Yingchao; Lai, Kin Keung

    2014-01-01

    As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict, and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that the multiscale data characteristics in the price movement are another important stylized fact. The incorporation of a mixture of data characteristics in the time scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of a multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm incorporating the heterogeneous data characteristics, against benchmark random walk, ARMA, and SVR models, is also attributed to the innovative methodology proposed to incorporate this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure with economically viable interpretations. PMID:25061614

  8. Day-ahead crude oil price forecasting using a novel morphological component analysis based model.

    PubMed

    Zhu, Qing; He, Kaijian; Zou, Yingchao; Lai, Kin Keung

    2014-01-01

    As a typical nonlinear and dynamic system, the crude oil price movement is difficult to predict, and its accurate forecasting remains the subject of intense research activity. Recent empirical evidence suggests that the multiscale data characteristics in the price movement are another important stylized fact. The incorporation of a mixture of data characteristics in the time scale domain during the modelling process can lead to significant performance improvement. This paper proposes a novel morphological component analysis based hybrid methodology for modeling the multiscale heterogeneous characteristics of the price movement in the crude oil markets. Empirical studies in two representative benchmark crude oil markets reveal the existence of a multiscale heterogeneous microdata structure. The significant performance improvement of the proposed algorithm incorporating the heterogeneous data characteristics, against benchmark random walk, ARMA, and SVR models, is also attributed to the innovative methodology proposed to incorporate this important stylized fact during the modelling process. Meanwhile, work in this paper offers additional insights into the heterogeneous market microstructure with economically viable interpretations.

  9. DB4US: A Decision Support System for Laboratory Information Management.

    PubMed

    Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-11-14

    Background: Until recently, laboratory automation has focused primarily on improving hardware. Future advances will concentrate on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective: To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods: We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators and offers the possibility to drill down from high-level metrics to more detailed summaries. The indicators are calculated beforehand, using a set of parallel processes, so that they are ready when the user needs them. The application displays information related to tests, requests, samples, and turn-around times, and the dashboard is designed to show the set of indicators on a single screen. Results: DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. Our evaluation shows the positive impact of this methodology for laboratory professionals: the use of the application has reduced the time needed to elaborate the different statistical indicators and has provided information used to optimize the usage of laboratory resources through the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install additional software. Conclusions: The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time as well as other laboratory resources.

  10. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    NASA Astrophysics Data System (ADS)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrates the relationship between the seismic capacity of buildings and seismic action. The estimated results are compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.

  11. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    PubMed

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating the perceptions of stakeholders (qualitative) into formal simulation models (quantitative), with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. On-line capillary electrophoresis-based enzymatic methodology for the study of polymer-drug conjugates.

    PubMed

    Coussot, G; Ladner, Y; Bayart, C; Faye, C; Vigier, V; Perrin, C

    2015-01-09

    This work studies the potential of an on-line capillary electrophoresis (CE)-based digestion methodology for evaluating the degradability of polymer-drug conjugates in the presence of free trypsin (in-solution digestion). A sandwich plug injection scheme with transverse diffusion of laminar flow profiles (TDLFP) was used to achieve on-line digestion. Electrophoretic separation conditions were established using poly-L-lysine (PLL) as the reference substrate. A comparison with off-line digestion was carried out to demonstrate the feasibility of the proposed methodology, whose applicability was then evaluated for two PLL-drug conjugates and for the first four generations of dendrigrafts of lysine (DGL). Different electrophoretic profiles showing the formation of di-, tri-, and tetralysine were observed for the PLL-drug conjugates and DGL. These findings are in good agreement with the nature of the linker used to attach the drug to the PLL structure and with the predicted degradability of DGL. The applicability of the on-line methodology was also successfully demonstrated for the hydrolysis of protein conjugates. In summary, the described methodology provides a powerful tool for the rapid study of biodegradable polymers. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng

    2016-05-01

    Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model is developed to quantify the number of pulses (#Pulse) the cell can bear before disturbance occurs under various sub-switching voltage stresses based on physical understanding. An evaluation methodology is proposed to assess the disturb behavior of half-selected (HS) cells in cross-point RRAM arrays by combining the analytical model and SPICE simulation. The characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles and total error bits are evaluated by the methodology. A possible solution to mitigate disturbance is proposed.

  14. An almost-parameter-free harmony search algorithm for groundwater pollution source identification.

    PubMed

    Jiang, Simin; Zhang, Yali; Wang, Pei; Zheng, Maohui

    2013-01-01

    The spatiotemporal characterization of unknown sources of groundwater pollution is frequently encountered in environmental problems. This study adopts a simulation-optimization approach that combines a contaminant transport simulation model with a heuristic harmony search algorithm to identify unknown pollution sources. In the proposed methodology, an almost-parameter-free harmony search algorithm is developed. The performance of this methodology is evaluated on an illustrative groundwater pollution source identification problem; the results indicate that the proposed optimization model can give satisfactory estimates, even when irregular geometry, erroneous monitoring data, and a shortage of prior information on potential locations are considered.
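
    For reference, a standard harmony search loop is sketched below on a toy objective; the paper's "almost-parameter-free" variant removes most of the tuning knobs shown here (HMCR, PAR, memory size), which are kept with fixed assumed values for simplicity.

    ```python
    import numpy as np

    def harmony_search(f, lo, hi, hms=20, hmcr=0.9, par=0.3, iters=5000, seed=0):
        """Standard harmony search sketch (parameter values are assumptions)."""
        rng = np.random.default_rng(seed)
        dim = lo.size
        hm = rng.uniform(lo, hi, size=(hms, dim))      # harmony memory
        fit = np.array([f(h) for h in hm])
        for _ in range(iters):
            pick = hm[rng.integers(hms, size=dim), np.arange(dim)]  # memory consideration
            new = np.where(rng.random(dim) < hmcr, pick, rng.uniform(lo, hi))
            adjust = rng.random(dim) < par                          # pitch adjustment
            new[adjust] += 0.01 * (hi - lo)[adjust] * rng.standard_normal(int(adjust.sum()))
            new = np.clip(new, lo, hi)
            f_new = f(new)
            worst = int(fit.argmax())
            if f_new < fit[worst]:                     # replace the worst harmony
                hm[worst], fit[worst] = new, f_new
        best = int(fit.argmin())
        return hm[best], fit[best]

    sphere = lambda x: float((x ** 2).sum())
    x_best, f_best = harmony_search(sphere, np.full(5, -10.0), np.full(5, 10.0))
    print(x_best, f_best)
    ```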

  15. Determination of fragrance content in perfume by Raman spectroscopy and multivariate calibration

    NASA Astrophysics Data System (ADS)

    Godinho, Robson B.; Santos, Mauricio C.; Poppi, Ronei J.

    2016-03-01

    An alternative methodology is proposed for the determination of fragrance content in perfumes and their classification according to the guidelines established by fine perfume manufacturers. The methodology is based on Raman spectroscopy associated with multivariate calibration, allowing the determination of fragrance content in a fast, nondestructive, and sustainable manner. The results were consistent with the conventional method, with standard errors of prediction below 1.0%. This indicates that the proposed technology is a feasible analytical tool for the determination of fragrance content in hydro-alcoholic solutions, for use in manufacturing, quality control and by regulatory agencies.
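
    A common choice for such multivariate calibration is partial least squares (PLS) regression, sketched below on synthetic spectra; PLS, the component count and all data are assumptions of the sketch, since the abstract does not name the calibration model.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Hypothetical data: rows are Raman spectra of perfumes, y is the
    # fragrance content (%); a single Gaussian band stands in for the
    # fragrance-related signal.
    rng = np.random.default_rng(9)
    y = rng.uniform(5, 30, size=40)                        # fragrance content (%)
    wavenumbers = np.linspace(400, 3200, 600)
    peak = np.exp(-((wavenumbers - 1650) / 25) ** 2)       # fragrance-related band
    X = np.outer(y, peak) + rng.normal(0, 0.3, (40, 600))  # noisy spectra

    pls = PLSRegression(n_components=4)
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()      # cross-validated predictions
    sep = np.sqrt(np.mean((y_cv - y) ** 2))
    print(f"standard error of prediction: {sep:.2f} %")
    ```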

  16. On the spot damage detection methodology for highway bridges during natural crises : tech transfer summary.

    DOT National Transportation Integrated Search

    2010-07-01

    The objective of this work was to develop a low-cost portable damage detection tool to assess and predict damage areas in highway bridges. The proposed tool was based on standard vibration-based damage identification (VBDI) techniques but...

  17. Resistance Curves in the Tensile and Compressive Longitudinal Failure of Composites

    NASA Technical Reports Server (NTRS)

    Camanho, Pedro P.; Catalanotti, Giuseppe; Davila, Carlos G.; Lopes, Claudio S.; Bessa, Miguel A.; Xavier, Jose C.

    2010-01-01

    This paper presents a new methodology to measure the crack resistance curves associated with fiber-dominated failure modes in polymer-matrix composites. These crack resistance curves not only characterize the fracture toughness of the material, but are also the basis for the identification of the parameters of the softening laws used in the analytical and numerical simulation of fracture in composite materials. The method proposed is based on the identification of the crack tip location by the use of Digital Image Correlation and the calculation of the J-integral directly from the test data using a simple expression derived for cross-ply composite laminates. It is shown that the results obtained using the proposed methodology yield crack resistance curves similar to those obtained using FEM-based methods in compact tension carbon-epoxy specimens. However, it is also shown that the Digital Image Correlation based technique can be used to extract crack resistance curves in compact compression tests for which FEM-based techniques are inadequate.

  18. Translational behavioral medicine for population and individual health: gaps, opportunities, and vision for practice-based translational behavior change research.

    PubMed

    Ma, Jun; Lewis, Megan A; Smyth, Joshua M

    2018-04-12

    In this commentary, we propose a vision for "practice-based translational behavior change research," which we define as clinical and public health practice-embedded research on the implementation, optimization, and fundamental mechanisms of behavioral interventions. This vision intends to be inclusive of important research elements for behavioral intervention development, testing, and implementation. We discuss important research gaps and conceptual and methodological advances in three key areas along the discovery (development) to delivery (implementation) continuum of evidence-based interventions to improve behavior and health that could help achieve our vision of practice-based translational behavior change research. We expect our proposed vision to be refined and evolve over time. Through highlighting critical gaps that can be addressed by integrating modern theoretical and methodological approaches across disciplines in behavioral medicine, we hope to inspire the development and funding of innovative research on more potent and implementable behavior change interventions for optimal population and individual health.

  19. Differentiation of seborrheic keratosis from basal cell carcinoma, nevi and melanoma by RGB autofluorescence imaging

    PubMed Central

    Lihachev, Alexey; Lihacova, Ilze; Plorina, Emilija V.; Lange, Marta; Derjabo, Alexander; Spigulis, Janis

    2018-01-01

    A clinical trial on the autofluorescence imaging of skin lesions comprising 16 dermatologically confirmed pigmented nevi, 15 seborrheic keratoses, 2 dysplastic nevi, 17 histologically confirmed basal cell carcinomas and 1 melanoma was performed. The autofluorescence spatial properties of the skin lesions were acquired with a smartphone RGB camera under 405 nm LED excitation. The diagnostic criterion is based on the calculation of the mean autofluorescence intensity of the examined lesion in the spectral range of 515 nm–700 nm. The proposed methodology is able to differentiate seborrheic keratosis from basal cell carcinoma, pigmented nevi and melanoma. The sensitivity and specificity of the proposed method were estimated as being close to 100%. The proposed methodology and potential clinical applications are discussed in this article. PMID:29675324
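
    A minimal sketch of the stated criterion, assuming that behind a long-pass barrier filter the camera's green and red channels jointly approximate the 515-700 nm band; the image and lesion mask below are placeholders.

        import numpy as np

        def lesion_af_score(rgb, mask):
            """Mean autofluorescence intensity of a masked lesion in an RGB image."""
            red = rgb[..., 0].astype(float)
            green = rgb[..., 1].astype(float)
            af = (red + green) / 2.0          # proxy for the 515-700 nm band
            return af[mask].mean()

        img = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # placeholder frame
        lesion = np.zeros((480, 640), dtype=bool)
        lesion[200:280, 300:380] = True                                 # placeholder mask
        print(lesion_af_score(img, lesion))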

  20. Differentiation of seborrheic keratosis from basal cell carcinoma, nevi and melanoma by RGB autofluorescence imaging.

    PubMed

    Lihachev, Alexey; Lihacova, Ilze; Plorina, Emilija V; Lange, Marta; Derjabo, Alexander; Spigulis, Janis

    2018-04-01

    A clinical trial on the autofluorescence imaging of skin lesions comprising 16 dermatologically confirmed pigmented nevi, 15 seborrheic keratoses, 2 dysplastic nevi, 17 histologically confirmed basal cell carcinomas and 1 melanoma was performed. The autofluorescence spatial properties of the skin lesions were acquired with a smartphone RGB camera under 405 nm LED excitation. The diagnostic criterion is based on the calculation of the mean autofluorescence intensity of the examined lesion in the spectral range of 515 nm-700 nm. The proposed methodology is able to differentiate seborrheic keratosis from basal cell carcinoma, pigmented nevi and melanoma. The sensitivity and specificity of the proposed method were estimated as being close to 100%. The proposed methodology and potential clinical applications are discussed in this article.

  1. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    PubMed

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology is based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor, followed by measurement of the luminescence of the bacteria. The assays were conducted with contact times of 5, 15, and 30 min by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provides precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to evaluate the impact of a set of ionic liquids (ILs) on V. fischeri, and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidence the influence of different cation head groups and anion moieties on the toxicity of ILs. In general, aromatic cations and fluorine-containing anions had a greater impact on V. fischeri, as evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested on further classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
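
    The EC50 values mentioned above are conventionally obtained by fitting a dose-response curve to the luminescence inhibition data; a minimal sketch with invented inhibition data follows.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical luminescence inhibition (%) of V. fischeri after 15 min
        # of contact with increasing ionic-liquid concentrations (mg/L).
        conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
        inhib = np.array([2, 8, 21, 45, 71, 88, 96], dtype=float)

        def hill(c, ec50, slope):
            """Two-parameter log-logistic dose-response (0-100 % inhibition)."""
            return 100.0 / (1.0 + (ec50 / c) ** slope)

        (ec50, slope), _ = curve_fit(hill, conc, inhib, p0=(50.0, 1.0))
        print(f"EC50 = {ec50:.1f} mg/L, Hill slope = {slope:.2f}")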

  2. Semantic Network Adaptation Based on QoS Pattern Recognition for Multimedia Streams

    NASA Astrophysics Data System (ADS)

    Exposito, Ernesto; Gineste, Mathieu; Lamolle, Myriam; Gomez, Jorge

    This article proposes an ontology-based pattern recognition methodology to compute and represent common QoS properties of the Application Data Units (ADU) of multimedia streams. The use of this ontology by mechanisms located at different layers of the communication architecture allows fine-grained per-packet self-optimization of communication services with respect to actual application requirements. A case study showing how this methodology is used by error control mechanisms in the context of wireless networks is presented to demonstrate the feasibility and advantages of the approach.

  3. A Learning Framework for Control-Oriented Modeling of Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and can be adapted continuously to account for changing conditions as new data become available. The data-driven modeling techniques investigated so far, while promising in the context of buildings, have been unable to satisfy all of these requirements simultaneously. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology significantly outperforms other data-driven modeling techniques. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework that can drive several use cases related to building energy management.

  4. An LMI approach for the Integral Sliding Mode and H∞ State Feedback Control Problem

    NASA Astrophysics Data System (ADS)

    Bezzaoucha, Souad; Henry, David

    2015-11-01

    This paper deals with the state feedback control problem for linear uncertain systems subject to both matched and unmatched perturbations. The proposed control law is based on an Integral Sliding Mode Control (ISMC) approach to tackle matched perturbations, together with the H∞ paradigm for robustness against unmatched perturbations. The proposed method parallels the work presented in [1], which addressed the same problem and proposed a solution involving an Algebraic Riccati Equation (ARE)-based formulation. The contribution of this paper is the establishment of a Linear Matrix Inequality (LMI)-based solution, which offers the possibility of considering other types of constraints such as 𝓓-stability constraints (pole assignment-like constraints). The proposed methodology is applied to a pilot three-tank system, and experimental results illustrate its feasibility. Note that real experiments with SMC have rarely been reported in the past, owing to the highly energetic behaviour of the control signal. It is important to point out that the paper does not aim at proposing an LMI formulation of an ARE: this has been done since 1971 [2] and is further discussed in [3], where the link between AREs and ARIs (algebraic Riccati inequalities) is established for the H∞ control problem. The main contribution of this paper is to establish the adequate LMI-based methodology (changes of matrix variables) so that the ARE corresponding to the particular mixed ISMC/H∞ structure proposed in [1] can be re-formulated within the LMI paradigm.

  5. Evidence-Based Administration for Decision Making in the Framework of Knowledge Strategic Management

    ERIC Educational Resources Information Center

    Del Junco, Julio Garcia; Zaballa, Rafael De Reyna; de Perea, Juan Garcia Alvarez

    2010-01-01

    Purpose: This paper seeks to present a model based on evidence-based administration (EBA), which aims to facilitate the creation, transformation and diffusion of knowledge in learning organizations. Design/methodology/approach: A theoretical framework is proposed based on EBA and the case method. Accordingly, an empirical study was carried out in…

  6. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics.

    PubMed

    Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo

    2016-01-01

    The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART, which requires reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement were investigated. The operational risk was evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising for promoting the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest in on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
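
    The NIOSH lifting equation referred to above combines a 23 kg load constant with six reduction multipliers; a compact sketch follows, with the frequency and coupling multipliers assumed to be unity rather than read from the NIOSH tables.

        def niosh_rwl(H, V, D, A, FM=1.0, CM=1.0):
            """Recommended Weight Limit (kg) from the revised NIOSH lifting equation.

            H: horizontal location (cm), V: vertical height (cm), D: vertical
            travel distance (cm), A: asymmetry angle (deg). FM and CM are the
            frequency and coupling multipliers, normally taken from tables.
            """
            LC = 23.0                                 # load constant, kg
            HM = min(1.0, 25.0 / max(H, 25.0))        # horizontal multiplier
            VM = 1.0 - 0.003 * abs(V - 75.0)          # vertical multiplier
            DM = min(1.0, 0.82 + 4.5 / max(D, 25.0))  # distance multiplier
            AM = 1.0 - 0.0032 * A                     # asymmetric multiplier
            return LC * HM * VM * DM * AM * FM * CM

        load = 12.0                                   # actual load handled, kg
        rwl = niosh_rwl(H=40, V=50, D=60, A=30)
        print(f"RWL = {rwl:.1f} kg, lifting index = {load / rwl:.2f}")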

  7. Evaluating and Redesigning Teaching Learning Sequences at the Introductory Physics Level

    ERIC Educational Resources Information Center

    Guisasola, Jenaro; Zuza, Kristina; Ametller, Jaume; Gutierrez-Berraondo, José

    2017-01-01

    In this paper we put forward a proposal for the design and evaluation of teaching and learning sequences in upper secondary school and university. We will connect our proposal with relevant contributions on the design of teaching sequences, ground it on the design-based research methodology, and discuss how teaching and learning sequences designed…

  8. Implementation of the Human Talent Management through Competencies Model in a University in Metropolitan Lima

    ERIC Educational Resources Information Center

    Rodríguez, Juan C.

    2015-01-01

    This article is a work proposal that aims to describe the methodology proposed by the Personnel Management Office of a university in Lima to implement a competency-based management model, whose traceability involves various technical HR processes practiced in the organization and is aligned to institutional outcomes defined in the…

  9. Using artificial intelligence to bring evidence-based medicine a step closer to making the individual difference.

    PubMed

    Sissons, B; Gray, W A; Bater, A; Morrey, D

    2007-03-01

    The vision of evidence-based medicine is that of experienced clinicians systematically using the best research evidence to meet the individual patient's needs. This vision remains distant from clinical reality, as no complete methodology exists to apply objective, population-based research evidence to the needs of an individual real-world patient. We describe an approach, based on techniques from machine learning, to bridge this gap between evidence and individual patients in oncology. We examine existing proposals for tackling this gap and the relative benefits and challenges of our proposed k-nearest-neighbour-based approach.
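
    A minimal sketch of the k-nearest-neighbour idea in this setting: the "evidence" for a new patient is read off the k most similar past patients. The features, records and outcomes are all invented for illustration.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical trial records: (age, tumour stage, biomarker level) and
        # whether the patient responded to a given regimen (1 = responded).
        X = np.array([[62, 2, 1.4], [55, 1, 0.7], [70, 3, 2.1],
                      [48, 1, 0.5], [66, 2, 1.9], [59, 3, 2.4]])
        y = np.array([1, 1, 0, 1, 0, 0])

        # Scaling matters: neighbours are meaningless if one feature dominates
        # the distance metric.
        model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
        model.fit(X, y)

        new_patient = np.array([[64, 2, 1.6]])
        print(model.predict_proba(new_patient))   # evidence from the 3 most similar patients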

  10. Application of Observing System Simulation Experiments (OSSEs) to determining science and user requirements for space-based missions

    NASA Astrophysics Data System (ADS)

    Atlas, R. M.

    2016-12-01

    Observing System Simulation Experiments (OSSEs) provide an effective method for evaluating the potential impact of proposed new observing systems, for evaluating trade-offs in observing system design, and for developing and assessing improved methodology for assimilating new observations. As such, OSSEs can be an important tool for determining science and user requirements, and for incorporating these requirements into the planning for future missions. Detailed OSSEs have been conducted at NASA/GSFC and NOAA/AOML in collaboration with Simpson Weather Associates and operational data assimilation centers over the last three decades. These OSSEs correctly determined the quantitative potential for several proposed satellite observing systems to improve weather analysis and prediction prior to their launch, evaluated trade-offs in orbits, coverage and accuracy for space-based wind lidars, and were used in the development of the methodology that led to the first beneficial impacts of satellite surface winds on numerical weather prediction. In this talk, the speaker will summarize the development of OSSE methodology, early and current applications of OSSEs, and how OSSEs will evolve to enhance mission planning.

  11. Development of an electron paramagnetic resonance methodology for studying the photo-generation of reactive species in semiconductor nano-particle assembled films

    NASA Astrophysics Data System (ADS)

    Twardoch, Marek; Messai, Youcef; Vileno, Bertrand; Hoarau, Yannick; Mekki, Djamel E.; Felix, Olivier; Turek, Philippe; Weiss, Jean; Decher, Gero; Martel, David

    2018-06-01

    An experimental approach involving electron paramagnetic resonance is proposed for studying photo-generated reactive species in semiconductor nano-particle-based films deposited on the internal wall of glass capillaries. This methodology is applied here to nano-TiO2 and allows a semi-quantitative analysis of the kinetic evolutions of radical production using a spin scavenger probe.

  12. Methodology for designing psychological habitability for the space station.

    PubMed

    Komastubara, A

    2000-09-01

    Psychological habitability is a critical quality issue for the International Space Station because poor habitability degrades performance shaping factors (PSFs) and increases human error. However, habitability often receives rather limited design attention, based on someone's superficial tastes, because systematic design procedures for habitability quality are lacking. To improve the design treatment of psychological habitability, this paper proposes and discusses a methodology for designing psychological habitability for the International Space Station.

  13. An islanding detection methodology combining decision trees and Sandia frequency shift for inverter-based distributed generations

    DOE PAGES

    Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...

    2017-07-14

    Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.
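
    A rough sketch of the passive stage, assuming (as one plausible choice) frequency deviation, rate of change of frequency and voltage THD as the attributes extracted from local measurements; the samples the tree cannot classify confidently are the ones that would trigger the active Sandia frequency shift probe.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(1)
        n = 400
        islanded = rng.integers(0, 2, n)          # 1 = islanding event, 0 = other event
        # Invented feature distributions for illustration only.
        X = np.column_stack([
            rng.normal(0.02 + 0.3 * islanded, 0.05),   # |frequency deviation| (Hz)
            rng.normal(0.1 + 1.5 * islanded, 0.3),     # |ROCOF| (Hz/s)
            rng.normal(2.0 + 1.0 * islanded, 0.5),     # voltage THD (%)
        ])
        tree = DecisionTreeClassifier(max_depth=4).fit(X, islanded)

        # Hand ambiguous cases over to the active stage instead of deciding.
        proba = tree.predict_proba(X)[:, 1]
        needs_active_probe = (proba > 0.3) & (proba < 0.7)
        print(f"{needs_active_probe.mean():.1%} of events passed to the active stage")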

  14. An islanding detection methodology combining decision trees and Sandia frequency shift for inverter-based distributed generations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azim, Riyasat; Li, Fangxing; Xue, Yaosuo

    Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.

  15. Performance-Driven Hybrid Full-Body Character Control for Navigation and Interaction in Virtual Environments

    NASA Astrophysics Data System (ADS)

    Mousas, Christos; Anagnostopoulos, Christos-Nikolaos

    2017-06-01

    This paper presents a hybrid character control interface that can synthesize a variety of actions in real time based on the user's performance capture. The proposed methodology enables three different performance interaction modules: performance animation control, which directly maps the user's pose to the character; a motion controller, which synthesizes the desired motion of the character based on an activity recognition methodology; and a hybrid control that lies between the performance animation and the motion controller. With the presented methodology, the user has the freedom to interact within the virtual environment, as well as the ability to manipulate the character and to synthesize a variety of actions that cannot be performed directly by the user but which the system synthesizes. The user is therefore able to interact with the virtual environment in a more sophisticated fashion. This paper presents examples of different scenarios based on the three full-body character control methodologies.

  16. Combining Learning and Assessment in Assessment-Based Gaming Environments: A Case Study from a New York City School

    ERIC Educational Resources Information Center

    Zapata-Rivera, Diego; VanWinkle, Waverely; Doyle, Bryan; Buteux, Alyssa; Bauer, Malcolm

    2009-01-01

    Purpose: The purpose of this paper is to propose and demonstrate an evidence-based scenario design framework for assessment-based computer games. Design/methodology/approach: The evidence-based scenario design framework is presented and demonstrated by using BELLA, a new assessment-based gaming environment aimed at supporting student learning of…

  17. Fusion of PAN and multispectral remote sensing images in shearlet domain by considering regional metrics

    NASA Astrophysics Data System (ADS)

    Poobalasubramanian, Mangalraj; Agrawal, Anupam

    2016-10-01

    The presented work proposes the fusion of panchromatic and multispectral images in the shearlet domain. The proposed fusion rules rely on regional considerations, which make the method effective in terms of spatial enhancement. A luminance-hue-saturation-based color conversion is used to avoid spectral distortions. The proposed fusion method is tested on WorldView-2 and Ikonos datasets and compared against other methodologies, performing well in terms of both subjective and objective evaluations.

  18. Modeling Web-Based Educational Systems: Process Design Teaching Model

    ERIC Educational Resources Information Center

    Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis

    2004-01-01

    Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…

  19. Full-field modal analysis during base motion excitation using high-speed 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2017-10-01

    In recent years, many efforts have been made to exploit full-field optical measurement techniques for modal identification. Three-dimensional digital image correlation (DIC) using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically expressed between displacements, in what are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped-edge joint. Full-field transmissibility functions were obtained along the beam and converted into FRF for proper identification, considering a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large number of data points involved in DIC measurements during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparison with those obtained using traditional accelerometers, analytical models and finite element method analyses, using the modal assurance criterion as a quantitative indicator. The results showed a high level of correspondence, consolidating the proposed experimental methodology.
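
    A minimal sketch of estimating a transmissibility function between the base motion and one measurement point, using a standard H1-type cross-spectral estimator on synthetic data with a single mode placed at an assumed 35 Hz.

        import numpy as np
        from scipy.signal import csd, iirpeak, lfilter, welch

        fs = 2000.0                                # assumed camera frame rate, Hz
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(2)
        base = rng.normal(size=t.size)             # random base-motion displacement

        # Response of one measurement point: base motion filtered by a lightly
        # damped resonance near 35 Hz, plus measurement noise.
        b, a = iirpeak(35.0 / (fs / 2), Q=30)
        point = lfilter(b, a, base) + 0.01 * rng.normal(size=t.size)

        # H1 estimate of the transmissibility T(f) = Sxy(f) / Sxx(f).
        f, Sxy = csd(base, point, fs=fs, nperseg=4096)
        _, Sxx = welch(base, fs=fs, nperseg=4096)
        T = Sxy / Sxx
        print(f"peak near {f[np.abs(T).argmax()]:.1f} Hz")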

  20. Application of new methodologies based on design of experiments, independent component analysis and design space for robust optimization in liquid chromatography.

    PubMed

    Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe

    2011-04-08

    HPLC separations of an unknown sample mixture and a pharmaceutical formulation were optimized using a chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs used to model the retention times of the compounds of interest. The prediction accuracy and the robustness of the optimal separation, including an uncertainty study, were then evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were read automatically: peak detection and peak matching were carried out with a previously developed methodology based on independent component analysis, published by B. Debrus et al. in 2009. These successful applications underline the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
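
    The design-space computation reduces to a probability estimate under model uncertainty; a toy Monte Carlo sketch follows, with invented retention-time models and an invented acceptance criterion.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical quadratic models of two compounds' retention times as a
        # function of one coded factor x (e.g. gradient time), with posterior
        # coefficient uncertainty as it would come from the experimental-design fit.
        beta1 = rng.multivariate_normal([5.0, 1.2, 0.3], np.diag([0.05, 0.02, 0.01]), 5000)
        beta2 = rng.multivariate_normal([5.6, 0.6, -0.2], np.diag([0.05, 0.02, 0.01]), 5000)

        def design_space_probability(x):
            """P(separation criterion lies in the acceptance range) at condition x."""
            X = np.array([1.0, x, x * x])
            rt1, rt2 = beta1 @ X, beta2 @ X
            criterion = np.abs(rt2 - rt1)          # separation between the two peaks
            return np.mean(criterion > 0.5)        # acceptance: peaks > 0.5 min apart

        for x in np.linspace(-1, 1, 5):
            print(f"x = {x:+.1f}  P(acceptance) = {design_space_probability(x):.2f}")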

  1. Evolving rule-based systems in two medical domains using genetic programming.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf

    2004-11-01

    To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia subtypes and the classification of pap-smear examinations. The data comprise (a) successful diagnoses of aphasia subtypes obtained from collaborating medical experts through a free interview per patient, and (b) smears (images of cells) stained using the Papanicolaou method and correctly classified by cyto-technologists. First, a hybrid approach is proposed that combines standard genetic programming with heuristic hierarchical construction of crisp rule bases. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is composed of a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. The results demonstrate the effectiveness of the proposed systems, which are also compared, in terms of efficiency, accuracy and comprehensibility, with an inductive machine learning approach and with a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to draw conclusions, even at the expense of a higher classification score.

  2. 76 FR 43360 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    .... The text of the proposed rule change is set forth below. Proposed new language is italicized; proposed... methodology approved by FINRA as announced in a Regulatory Notice (``approved margin methodology''). The... an Approved Margin Methodology. Members shall require as a minimum for computing customer or broker...

  3. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When installing or downloading software, users are presented with a lengthy document stating rights and obligations, which few have the patience to read or understand. This can make users distrust the software. In this paper, we propose an ontology-based trust verification for Software License Agreements. First, an ontology model for the domain of Software License Agreements is proposed. The domain ontology is constructed with the proposed methodology from copyright laws and 30 software license agreements. The license ontology can act as part of a generalized copyright-law knowledge model and can also serve as a visualization of software licenses. Based on this ontology, a license-oriented text summarization approach is proposed whose performance shows improved accuracy in summarizing software licenses. Based on the summarization, the underlying purpose of a software license can be explicitly exposed for trust verification.

  4. A Novel Performance Evaluation Methodology for Single-Target Trackers.

    PubMed

    Kristan, Matej; Matas, Jiri; Leonardis, Ales; Vojir, Tomas; Pflugfelder, Roman; Fernandez, Gustavo; Nebehay, Georg; Porikli, Fatih; Cehovin, Luka

    2016-11-01

    This paper addresses the problem of single-target tracker performance evaluation. We consider the performance measures, the dataset and the evaluation system to be the most important components of tracker evaluation and propose requirements for each of them. The requirements are the basis of a new evaluation methodology that aims at a simple and easily interpretable tracker comparison. The ranking-based methodology addresses tracker equivalence in terms of statistical significance and practical differences. A fully annotated dataset with per-frame annotations of several visual attributes is introduced. The diversity of its visual properties is maximized in a novel way by clustering a large number of videos according to their visual attributes, making it the most systematically constructed and annotated dataset to date. A multi-platform evaluation system allowing easy integration of third-party trackers is presented as well. The proposed evaluation methodology was tested on the VOT2014 challenge on the new dataset and 38 trackers, making it the largest benchmark to date. Most of the tested trackers are indeed state-of-the-art, since they outperform the standard baselines, resulting in a highly challenging benchmark. An exhaustive analysis of the dataset from the perspective of tracking difficulty is carried out. To facilitate tracker comparison, a new performance visualization technique is proposed.

  5. Integrating Design and Manufacturing for a High Speed Civil Transport Wing

    NASA Technical Reports Server (NTRS)

    Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.

    1994-01-01

    The aerospace industry is currently addressing the problem of integrating design and manufacturing. Because of the difficulties associated with using conventional, procedural techniques and algorithms, it is the authors' belief that the only feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors propose a methodology for an aircraft producibility assessment, including a KBS, that addresses both procedural and heuristic aspects of integrating design and manufacturing of a High Speed Civil Transport (HSCT) wing. The HSCT was chosen as the focus of this investigation since it is a current NASA/aerospace industry initiative full of technological challenges involving many disciplines. The paper gives a brief background of selected previous supersonic transport studies followed by descriptions of key relevant design and manufacturing methodologies. Georgia Tech's Concurrent Engineering/Integrated Product and Process Development methodology is discussed with reference to this proposed conceptual producibility assessment. Evaluation criteria are presented that relate pertinent product and process parameters to overall product producibility. In addition, the authors' integration methodology and reasons for selecting a KBS to integrate design and manufacturing are presented in this paper. Finally, a proposed KBS is given, as well as statements of future work and overall investigation objectives.

  6. Power system market implementation in a deregulated environment

    NASA Astrophysics Data System (ADS)

    Silva, Carlos

    2000-10-01

    The opening of the power system markets (also known as deregulation) gives rise to issues never seen before by this industry. One of the most important is the control of information about the cost of generation. Information that used to be common knowledge is now kept private by the new agents of the system (generator companies, distribution companies, etc.). Data such as the generator cost functions are now known only by the owning companies. The result is a new system consisting of a group of independent firms seeking the maximization of their own profit. There have been many proposals to organize the new market in an economically efficient manner. Nevertheless, the uniqueness of the electric power system has prevented the development of such a market. This thesis evaluates the most common proposals using simulations in an auction setting. In addition a new methodology is proposed based on mechanism design, a common technique in economics, that solves some of the practical problems of power system markets (such as the management of limited transmission capacity). In this methodology, when each company acts in its best interest, the outcome is efficient in spite of the information problem cited above. This new methodology, along with the existing methodologies, are tested using simulation and analyzed to create a clear comparison of benefits and disadvantages.

  7. Development of a methodology for the detection of hospital financial outliers using information systems.

    PubMed

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
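
    The paper defines its own case model; as a generic stand-in for "distances between cases in multi-dimensional space", the sketch below flags cases by Mahalanobis distance on synthetic financial indices.

        import numpy as np

        def mahalanobis_outliers(X, threshold=3.0):
            """Flag cases whose distance from the bulk of comparable cases is large.

            X holds one row per case and one column per financial index
            (e.g. cost per stay, length of stay, staffing ratio).
            """
            mu = X.mean(axis=0)
            cov = np.cov(X, rowvar=False)
            inv = np.linalg.pinv(cov)              # pseudo-inverse for stability
            d = np.sqrt(np.einsum('ij,jk,ik->i', X - mu, inv, X - mu))
            return d > threshold, d

        rng = np.random.default_rng(4)
        X = rng.normal(size=(200, 3))
        X[:3] += 6.0                               # three synthetic outlier cases
        flags, dist = mahalanobis_outliers(X)
        print(np.where(flags)[0])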

  8. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    NASA Astrophysics Data System (ADS)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-06-01

    The failure of tractor components and their replacement have become very common in India because of re-cycling, re-sale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change, co-simulated in software. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculation, from which refined topological changes of the R.S. Arm are derived. Several techniques employed in the component are explained for reducing and removing rib material to shift the center of gravity and centroid, using SystemC for mixed-level simulation and faster topological changes. The SystemC design process is compiled and executed with TURBO C7. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.

  9. A proposed reductionist solution to address the methodological challenges of inconsistent reflexology maps and poor experimental controls in reflexology research: a discussion paper.

    PubMed

    Jones, Jenny; Thomson, Patricia; Lauder, William; Leslie, Stephen J

    2013-03-01

    Reflexology is a complex massage intervention, based on the concept that specific areas of the feet (reflex points) correspond to individual internal organs within the body. Reflexologists trained in the popular Ingham reflexology method claim that massage to these points, using massage techniques unique to reflexology, stimulates an increase in blood supply to the corresponding organ. Reflexology researchers face two key methodological challenges that need to be addressed if a specific treatment-related hemodynamic effect is to be scientifically demonstrated. The first is the problem of inconsistent reflexology foot maps; the second is the issue of poor experimental controls. This article proposes a potential experimental solution that we believe can address both methodological challenges and in doing so, allow any specific hemodynamic treatment effect unique to reflexology to experimentally reveal itself.

  10. Updating finite element dynamic models using an element-by-element sensitivity methodology

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel; Hemez, Francois M.

    1993-01-01

    A sensitivity-based methodology for improving the finite element model of a given structure using test modal data and a few sensors is presented. The proposed method searches for both the location and the sources of the mass and stiffness errors and does not interfere with the theory behind the finite element model while correcting these errors. The updating algorithm is derived from the unconstrained minimization of the squared L2 norms of the modal dynamic residuals via an iterative two-step staggered procedure. At each iteration, the measured mode shapes are first expanded assuming that the model is error free; then the model parameters are corrected assuming that the expanded mode shapes are exact. The numerical algorithm is implemented in an element-by-element fashion and is capable of 'zooming' in on the detected error locations. Several simulation examples which demonstrate the potential of the proposed methodology are discussed.
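
    The quantity being minimized is the squared L2 norm of the modal dynamic residual r_i = (K - w_i^2 M) phi_i; a toy two-degree-of-freedom sketch of the residual computation follows.

        import numpy as np

        def modal_dynamic_residuals(K, M, freqs_hz, modes):
            """Squared L2 norms of r_i = (K - w_i^2 M) phi_i, one per measured mode.

            Large contributions to these residuals point at the finite elements
            whose mass or stiffness parameters need correction.
            """
            out = []
            for f, phi in zip(freqs_hz, modes.T):
                w = 2.0 * np.pi * f
                r = (K - w * w * M) @ phi
                out.append(float(r @ r))
            return out

        # Toy 2-DOF example with a deliberately perturbed stiffness matrix.
        M = np.diag([1.0, 1.0])
        K_true = np.array([[2.0, -1.0], [-1.0, 2.0]])
        w2, phi = np.linalg.eigh(K_true)           # exact modes of the true model (M = I)
        K_model = K_true.copy()
        K_model[0, 0] *= 1.2                       # simulated modelling error
        print(modal_dynamic_residuals(K_model, M, np.sqrt(w2) / (2 * np.pi), phi))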

  11. Robust decentralized controller for minimizing coupling effect in single inductor multiple output DC-DC converter operating in continuous conduction mode.

    PubMed

    Medeiros, Renan Landau Paiva de; Barra, Walter; Bessa, Iury Valente de; Chaves Filho, João Edgar; Ayres, Florindo Antonio de Cavalho; Neves, Cleonor Crescêncio das

    2018-02-01

    This paper describes a novel robust decentralized control design methodology for a single inductor multiple output (SIMO) DC-DC converter. Based on a nominal multiple input multiple output (MIMO) plant model and performance requirements, an input-output pairing analysis is performed to select the most suitable input for controlling each output, with the aim of attenuating loop coupling. The plant uncertainty limits are then selected and expressed in interval form with the parameter values of the plant model. A single inductor dual output (SIDO) DC-DC buck converter board was developed for experimental tests. The experimental results show that the proposed methodology maintains the desired performance even in the presence of parametric uncertainties. Furthermore, performance indexes calculated from the experimental data show that the proposed methodology outperforms classical MIMO control techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
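
    The abstract does not name the pairing tool; the relative gain array (RGA) is the classical choice for such an input-output pairing analysis, sketched here on an invented steady-state gain matrix.

        import numpy as np

        def relative_gain_array(G):
            """RGA = G .* (G^-1)^T; elements near 1 mark good input-output pairs."""
            return G * np.linalg.inv(G).T

        # Hypothetical DC gain matrix of a SIDO buck converter:
        # rows = output voltages, columns = duty-cycle inputs.
        G = np.array([[4.0, 1.1],
                      [0.9, 3.5]])
        print(relative_gain_array(G))
        # Diagonal elements close to 1 support the decentralized (diagonal)
        # pairing used to attenuate cross-coupling between the output loops.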

  12. History and theoretical-methodological fundaments of Community Psychology in Ceará.

    PubMed

    Barros, João Paulo Pereira; Ximenes, Verônica Morais

    2016-01-01

    In this article we discuss the historical and theoretical-methodological aspects of the Community Psychology that has been developed in the state of Ceará, in northeastern Brazil, based on the praxis initiated by Professor Cezar Wagner de Lima Góis and further developed by the Community Psychology Nucleus (NUCOM) at the Federal University of Ceará. Important aspects of the beginning of this Community Psychology are presented, highlighting its academic and social perspectives. NUCOM is a space for the development of teaching, research, and outreach activities, which allows the systematization and deepening of this proposal for a different Community Psychology. This Community Psychology rests on five theoretical-methodological foundations: Popular Education, Biodance, Carl Rogers' Humanistic Approach, Cultural-Historical Psychology, and Liberation Psychology. Finally, the article describes the methods comprising this proposal for working in communities, which are sustained by pillars such as participation and problematizing dialogue.

  13. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    NASA Astrophysics Data System (ADS)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-02-01

    The failure of tractor components and their replacement have become very common in India because of re-cycling, re-sale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change, co-simulated in software. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculation, from which refined topological changes of the R.S. Arm are derived. Several techniques employed in the component are explained for reducing and removing rib material to shift the center of gravity and centroid, using SystemC for mixed-level simulation and faster topological changes. The SystemC design process is compiled and executed with TURBO C7. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.

  14. Increasing the applicability of wind power projects via a multi-criteria approach: methodology and case study

    NASA Astrophysics Data System (ADS)

    Polatidis, Heracles; Morales, Jan Borràs

    2016-11-01

    In this paper a methodological framework for increasing the actual applicability of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects, each a variation of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via dedicated software or comparatively evaluated against each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that exhibits increased implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.

  15. Non-Fourier based thermal-mechanical tissue damage prediction for thermal ablation.

    PubMed

    Li, Xin; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2017-01-02

    Prediction of tissue damage under thermal loads plays an important role in thermal ablation planning. A new methodology is presented in this paper by combining non-Fourier bio-heat transfer, constitutive elastic mechanics and non-rigid motion dynamics to predict and analyze the thermal distribution, thermal-induced mechanical deformation and thermal-mechanical damage of soft tissues under thermal loads. Simulations and comparative analysis demonstrate that the proposed methodology based on non-Fourier bio-heat transfer can account for the thermal-induced mechanical behaviors of soft tissues and predicts tissue thermal damage more accurately than the classical Fourier bio-heat transfer based model.

  16. Non-Fourier based thermal-mechanical tissue damage prediction for thermal ablation

    PubMed Central

    Li, Xin; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2017-01-01

    Prediction of tissue damage under thermal loads plays an important role in thermal ablation planning. A new methodology is presented in this paper by combining non-Fourier bio-heat transfer, constitutive elastic mechanics and non-rigid motion dynamics to predict and analyze the thermal distribution, thermal-induced mechanical deformation and thermal-mechanical damage of soft tissues under thermal loads. Simulations and comparative analysis demonstrate that the proposed methodology based on non-Fourier bio-heat transfer can account for the thermal-induced mechanical behaviors of soft tissues and predicts tissue thermal damage more accurately than the classical Fourier bio-heat transfer based model. PMID:27690290

  17. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of an investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of a methodology designed for the environmental impact assessment process will develop assumptions for further improvements or for more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process, which is achieved through the use of risk analysis methods. Highlights: improving existing qualitative and quantitative methods for assessing the impacts; a better understanding of the relations between probabilities and consequences; a methodology for the EIA of flood protection constructions based on risk analysis; creative approaches in the search for environmentally friendly proposed activities.

  18. Comparison of power curve monitoring methods

    NASA Astrophysics Data System (ADS)

    Cambron, Philippe; Masson, Christian; Tahan, Antoine; Torres, David; Pelletier, Francis

    2017-11-01

    Performance monitoring is an important aspect of operating wind farms, and can be done through the power curve monitoring (PCM) of wind turbines (WT). Substantial work on PCM has been conducted in recent years, and various methodologies have been proposed, each with interesting results. However, it is difficult to compare these methods because each was developed on its own data set. The objective of the present work is to compare some of the proposed PCM methods using common data sets. The metric used to compare the PCM methods is the time needed to detect a change in the power curve. Two power curve models are covered to establish the effect the model type has on the monitoring outcome, and each model was tested with two control charts. Other methodologies and metrics proposed in the literature for power curve monitoring, such as areas under the power curve and the use of statistical copulas, are also covered. Results demonstrate that model-based PCM methods are more reliable at detecting a performance change than other methodologies, and that the effectiveness of the control chart depends on the type of shift observed.
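
    As an example of the control-chart stage, the sketch below runs an EWMA chart over synthetic power-curve residuals and reports the detection delay, i.e. the time-to-detection metric used for the comparison; all parameters are illustrative.

        import numpy as np

        def ewma_chart(residuals, lam=0.2, L=3.0):
            """EWMA control chart over power-curve residuals (measured minus modelled kW).

            Returns the index of the first out-of-control sample, or None.
            """
            sigma = residuals[:50].std()           # baseline noise level
            limit = L * sigma * np.sqrt(lam / (2.0 - lam))
            z = 0.0
            for i, r in enumerate(residuals):
                z = lam * r + (1.0 - lam) * z
                if abs(z) > limit:
                    return i
            return None

        rng = np.random.default_rng(5)
        res = rng.normal(0, 20, 500)
        res[300:] -= 30.0                          # simulated 30 kW underperformance
        print(ewma_chart(res))                     # detection shortly after sample 300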

  19. Soil Moisture Content Estimation Based on Sentinel-1 and Auxiliary Earth Observation Products. A Hydrological Approach

    PubMed Central

    Alexakis, Dimitrios D.; Mexis, Filippos-Dimitrios K.; Vozinaki, Anthi-Eirini K.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.

    2017-01-01

    A methodology for processing multi-temporal Sentinel-1 and Landsat 8 satellite images to estimate topsoil Soil Moisture Content (SMC) in support of hydrological simulation studies is proposed. After pre-processing the remote sensing data, the backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation, yielding R2 values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin, which is ungauged for SMC. The results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies. PMID:28635625
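
    A minimal sketch of the ANN regression step with leave-one-out validation, using invented station samples for the four predictors named above.

        import numpy as np
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(6)
        n = 80
        # Hypothetical per-station samples: Sentinel-1 backscatter (dB), NDVI,
        # thermal infrared temperature (K) and local incidence angle (deg).
        X = np.column_stack([rng.uniform(-18, -6, n), rng.uniform(0.1, 0.8, n),
                             rng.uniform(285, 315, n), rng.uniform(30, 45, n)])
        # Invented ground truth loosely tied to backscatter and vegetation.
        smc = 0.05 + 0.02 * (X[:, 0] + 18) - 0.1 * X[:, 1] + rng.normal(0, 0.02, n)

        ann = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                                         random_state=0))
        scores = cross_val_score(ann, X, smc, cv=LeaveOneOut(),
                                 scoring='neg_mean_absolute_error')
        print(f"leave-one-out MAE = {-scores.mean():.3f} m3/m3")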

  20. Soil Moisture Content Estimation Based on Sentinel-1 and Auxiliary Earth Observation Products. A Hydrological Approach.

    PubMed

    Alexakis, Dimitrios D; Mexis, Filippos-Dimitrios K; Vozinaki, Anthi-Eirini K; Daliakopoulos, Ioannis N; Tsanis, Ioannis K

    2017-06-21

    A methodology for processing multi-temporal Sentinel-1 and Landsat 8 satellite images to estimate topsoil Soil Moisture Content (SMC) in support of hydrological simulation studies is proposed. After pre-processing the remote sensing data, the backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation, yielding R² values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin, which is ungauged for SMC. The results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies.

  1. VARIABLE SELECTION FOR REGRESSION MODELS WITH MISSING DATA

    PubMed Central

    Garcia, Ramon I.; Ibrahim, Joseph G.; Zhu, Hongtu

    2009-01-01

    We consider the variable selection problem for a class of statistical models with missing data, including missing covariate and/or response data. We investigate the smoothly clipped absolute deviation penalty (SCAD) and adaptive LASSO and propose a unified model selection and estimation procedure for use in the presence of missing data. We develop a computationally attractive algorithm for simultaneously optimizing the penalized likelihood function and estimating the penalty parameters. Particularly, we propose to use a model selection criterion, called the ICQ statistic, for selecting the penalty parameters. We show that the variable selection procedure based on ICQ automatically and consistently selects the important covariates and leads to efficient estimates with oracle properties. The methodology is very general and can be applied to numerous situations involving missing data, from covariates missing at random in arbitrary regression models to nonignorably missing longitudinal responses and/or covariates. Simulations are given to demonstrate the methodology and examine the finite sample performance of the variable selection procedures. Melanoma data from a cancer clinical trial is presented to illustrate the proposed methodology. PMID:20336190
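
    For orientation, the sketch below shows the adaptive LASSO on complete data via the usual column-rescaling trick; the paper's contribution, handling missing data and selecting the penalty via the ICQ criterion, sits on top of this step.

        import numpy as np
        from sklearn.linear_model import Lasso, LinearRegression

        rng = np.random.default_rng(7)
        n, p = 200, 10
        X = rng.normal(size=(n, p))
        beta = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0])
        y = X @ beta + rng.normal(size=n)

        # Step 1: a root-n-consistent initial estimate (OLS here).
        b0 = LinearRegression().fit(X, y).coef_
        w = 1.0 / (np.abs(b0) + 1e-8)              # adaptive weights

        # Step 2: the weighted L1 problem, solved by rescaling column j by 1/w_j
        # so that the plain Lasso penalty becomes sum_j w_j * |beta_j|.
        lasso = Lasso(alpha=0.05).fit(X / w, y)
        beta_hat = lasso.coef_ / w
        print(np.round(beta_hat, 2))               # zeros on the noise covariates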

  2. Locally-Based Kernel PLS Smoothing to Non-Parametric Regression Curve Fitting

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Trejo, Leonard J.; Wheeler, Kevin; Korsmeyer, David (Technical Monitor)

    2002-01-01

    We present a novel smoothing approach to non-parametric regression curve fitting, based on kernel partial least squares (PLS) regression in a reproducing kernel Hilbert space. Our concern is to apply the methodology to smoothing experimental data where some level of knowledge about the approximate shape, local inhomogeneities, or points where the desired function changes its curvature is known a priori or can be derived from the observed noisy data. We propose locally-based kernel PLS regression that extends the previous kernel PLS methodology by incorporating this knowledge. We compare our approach with existing smoothing splines, hybrid adaptive splines and wavelet shrinkage techniques on two generated data sets.
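
    A rough stand-in for the kernel PLS smoother: running linear PLS on the RBF Gram matrix, so the fitted values are smooth functions of the input; the kernel width and component count are invented rather than tuned.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics.pairwise import rbf_kernel

        rng = np.random.default_rng(8)
        x = np.sort(rng.uniform(0, 1, 120))
        y = np.sin(4 * np.pi * x) + rng.normal(0, 0.3, x.size)  # noisy curve to smooth

        # PLS between the kernel (Gram) matrix and the noisy responses keeps
        # only a few latent directions, which acts as the smoother.
        K = rbf_kernel(x[:, None], x[:, None], gamma=50.0)
        pls = PLSRegression(n_components=4).fit(K, y)
        y_smooth = pls.predict(K).ravel()
        print(float(np.mean((y_smooth - np.sin(4 * np.pi * x)) ** 2)))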

  3. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights: • Prevailing C&D waste quantification methodologies are identified and compared. • No single methodology can fulfill all waste quantification scenarios. • A relevance tree for selecting an appropriate quantification methodology is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including the site visit method, the waste generation rate method, the lifetime analysis method, the classification system accumulation method, the variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies are identified and potential future research directions are suggested.

  4. Qualitative Assessment of Inquiry-Based Teaching Methods

    ERIC Educational Resources Information Center

    Briggs, Michael; Long, George; Owens, Katrina

    2011-01-01

    A new approach to teaching method assessment using student focused qualitative studies and the theoretical framework of mental models is proposed. The methodology is considered specifically for the advantages it offers when applied to the assessment of inquiry-based teaching methods. The theoretical foundation of mental models is discussed, and…

  5. Conducting "Good" Equity Research in Mathematics Education: A Question of Methodology

    ERIC Educational Resources Information Center

    Bullock, Erika C.

    2012-01-01

    Hostetler (2005) describes "good" education research as that which attends to the well-being of students, teachers, and communities. In this essay, the author proposes measuring equity-based research in mathematics education against Hostetler's standard. She argues that current equity-based research has not adequately addressed the…

  6. Web-Based Learning Design Tool

    ERIC Educational Resources Information Center

    Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.

    2012-01-01

    Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and its reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…

  7. Psychological methodology will change profoundly due to the necessity to focus on intra-individual variation.

    PubMed

    Molenaar, Peter C M

    2007-03-01

    I am in general agreement with Toomela's (Integrative Psychological and Behavioral Science doi:10.1007/s12124-007-9004-0, 2007) plea for an alternative psychological methodology inspired by his description of the German-Austrian orientation. I will argue, however, that this alternative methodology has to be based on the classical ergodic theorems, using state-of-the-art statistical time series analysis of intra-individual variation as its main tool. Some more specific points made by Toomela will be criticized, while for others a more extreme elaboration along the lines indicated by Toomela is proposed.

  8. Methodological proposal for validation of the disinfecting efficacy of an automated flexible endoscope reprocessor

    PubMed Central

    Graziano, Kazuko Uchikawa; Pereira, Marta Elisa Auler; Koda, Elaine

    2016-01-01

    Objective: to elaborate and apply a method to assess the efficacy of automated flexible endoscope reprocessors at a time when there is no official method or trained laboratories to comply with the requirements described in specific standards for this type of health product in Brazil. Method: the present methodological study was developed based on the following theoretical references: International Organization for Standardization (ISO) standard ISO 15883-4/2008 and Brazilian Health Surveillance Agency (Agência Nacional de Vigilância Sanitária - ANVISA) Collegiate Board Resolution (Resolução de Diretoria Colegiada - RDC) no. 35/2010 and 15/2012. The proposed method was applied to a commercially available device using a high-level 0.2% peracetic acid-based disinfectant. Results: the proposed method of assessment was found to be robust when the recommendations made in the relevant legislation were incorporated with some adjustments to ensure their feasibility. Application of the proposed method provided evidence of the efficacy of the tested equipment for the high-level disinfection of endoscopes. Conclusion: the proposed method may serve as a reference for the assessment of flexible endoscope reprocessors, thereby providing solid ground for the purchase of this category of health products. PMID:27508915

  9. Strain gage based determination of mixed mode SIFs

    NASA Astrophysics Data System (ADS)

    Murthy, K. S. R. K.; Sarangi, H.; Chakraborty, D.

    2018-05-01

    Accurate determination of mixed mode stress intensity factors (SIFs) is essential for understanding and analyzing mixed mode fracture of engineering components. Only a few strain gage based determinations of mixed mode SIFs are reported in the literature, and those do not prescribe radial locations of the strain gages that ensure measurement accuracy. The present investigation experimentally demonstrates the efficacy of a proposed methodology for the accurate determination of mixed mode I/II SIFs using strain gages. The proposed approach is based on the modified Dally and Berger mixed mode technique. Using the proposed methodology, appropriate gage locations (optimal locations) for a given configuration have also been suggested, ensuring accurate determination of mixed mode SIFs. Experiments have been conducted by locating the gages at optimal and non-optimal locations to study the efficacy of the proposed approach. The experimental results show that highly accurate SIFs (errors as low as 0.064%) can be determined using the proposed approach if the gages are located at the suggested optimal locations. The results also show that very high errors (up to 212.22%) in measured SIFs are possible if the gages are located at non-optimal locations. The present work thus substantiates the importance of knowing the optimal locations of the strain gages a priori for accurate determination of SIFs.

  10. New methodology of designing inexpensive hybrid control-acquisition systems for mechatronic constructions.

    PubMed

    Augustyn, Jacek

    2013-12-13

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) link with a standard commercial off-the-shelf (COTS) component running the Android operating system. This direct connection is utilized, avoiding the use of an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy-efficient solution with easy-to-build user interfaces and easy wireless integration with other computer systems. This paper presents results of a practical implementation and an analysis of experimental real-time performance. It covers the closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance is investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions).

  11. New Methodology of Designing Inexpensive Hybrid Control-Acquisition Systems for Mechatronic Constructions

    PubMed Central

    Augustyn, Jacek

    2013-01-01

    This article presents a new methodology for designing a hybrid control and acquisition system consisting of a 32-bit SoC microsystem connected via a direct Universal Serial Bus (USB) link with a standard commercial off-the-shelf (COTS) component running the Android operating system. This direct connection is utilized, avoiding the use of an additional converter. An Android-based component was chosen to explore the potential for a mobile, compact and energy-efficient solution with easy-to-build user interfaces and easy wireless integration with other computer systems. This paper presents results of a practical implementation and an analysis of experimental real-time performance. It covers the closed control loop time between the sensor/actuator module and the Android operating system as well as the real-time sensor data stream within such a system. Some optimisations are proposed and their influence on real-time performance is investigated. The proposed methodology is intended for acquisition and control of mechatronic systems, especially mobile robots. It can be used in a wide range of control applications as well as embedded acquisition-recording devices, including energy quality measurements, smart grids and medicine. It is demonstrated that the proposed methodology can be employed without developing specific device drivers. The latency achieved was less than 0.5 ms and the sensor data stream throughput was on the order of 750 KB/s (compared to 3 ms latency and 300 KB/s in traditional solutions). PMID:24351633

  12. A Proposed Methodology for the Conceptualization, Operationalization, and Empirical Validation of the Concept of Information Need

    ERIC Educational Resources Information Center

    Afzal, Waseem

    2017-01-01

    Introduction: The purpose of this paper is to propose a methodology to conceptualize, operationalize, and empirically validate the concept of information need. Method: The proposed methodology makes use of both qualitative and quantitative perspectives, and includes a broad array of approaches such as literature reviews, expert opinions, focus…

  13. Students' Performance When Aurally Identifying Musical Harmonic Intervals: Experimentation of a Teaching Innovation Proposal

    ERIC Educational Resources Information Center

    Ponsatí, Imma; Miranda, Joaquim; Amador, Miquel; Godall, Pere

    2016-01-01

    The aim of the study was to measure the performance reached by students (N = 138) when aurally identifying musical harmonic intervals (from m2 to P8) after having experienced a teaching innovation proposal for the Music Conservatories of Catalonia (Spain) based on observational methodology. Its design took into account several issues, which had…

  14. Evaluation of arterial propagation velocity based on the automated analysis of the Pulse Wave Shape

    NASA Astrophysics Data System (ADS)

    Clara, F. M.; Scandurra, A. G.; Meschino, G. J.; Passoni, L. I.

    2011-12-01

    This paper proposes the automatic estimation of the arterial propagation velocity from pulse wave raw records measured in the region of the radial artery. A fully automatic process is proposed to select and analyze typical pulse cycles from the raw data. An adaptive neuro-fuzzy inference system, together with a heuristic search, is used to find a functional approximation of the pulse wave. The estimation of the propagation velocity is carried out via the analysis of the functional approximation obtained with the fuzzy model. The analysis of the pulse wave records with the proposed methodology showed small differences compared with the method used so far, which is based on strong interaction with the user. To evaluate the proposed methodology, we estimated the propagation velocity in a population of healthy men across a wide range of ages. These studies found that propagation velocity increases linearly with age and presents a considerable dispersion of values in healthy individuals. We conclude that this process could be used to indirectly evaluate the propagation velocity of the aorta, which is related to physiological age in healthy individuals and to life expectancy in cardiovascular patients.

  15. A Chip and Pixel Qualification Methodology on Imaging Sensors

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Guertin, Steven M.; Petkov, Mihail; Nguyen, Duc N.; Novak, Frank

    2004-01-01

    This paper presents a qualification methodology for imaging sensors. In addition to overall chip reliability characterization based on the sensor's overall figures of merit, such as Dark Rate, Linearity, Dark Current Non-Uniformity, Fixed Pattern Noise and Photon Response Non-Uniformity, a simulation technique is proposed and used to project pixel reliability. The projected pixel reliability is directly related to imaging quality and provides additional sensor reliability information and performance control.

  16. Determination of fragrance content in perfume by Raman spectroscopy and multivariate calibration.

    PubMed

    Godinho, Robson B; Santos, Mauricio C; Poppi, Ronei J

    2016-03-15

    An alternative methodology is herein proposed for the determination of fragrance content in perfumes and their classification according to the guidelines established by fine perfume manufacturers. The methodology is based on Raman spectroscopy associated with multivariate calibration, allowing the determination of fragrance content in a fast, nondestructive, and sustainable manner. The results were consistent with the conventional method, with standard errors of prediction lower than 1.0%. This indicates that the proposed technology is a feasible analytical tool for the determination of fragrance content in a hydro-alcoholic solution, for use in manufacturing, quality control and regulatory agencies. Copyright © 2015 Elsevier B.V. All rights reserved.
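
    The calibration step can be sketched with PLS regression, assuming scikit-learn; the "spectra" below are synthetic two-band mixtures standing in for measured Raman spectra, and the reported standard error of prediction mirrors the paper's figure of merit.

```python
# Sketch: multivariate calibration of fragrance content from Raman-like
# spectra using PLS regression. The spectra are synthetic mixtures of two
# Gaussian bands plus noise; the real method calibrates measured spectra
# against reference fragrance contents.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
wavenumbers = np.linspace(400, 1800, 350)
band = lambda c, w: np.exp(-0.5 * ((wavenumbers - c) / w) ** 2)

n = 120
content = rng.uniform(5, 25, n)              # fragrance content, % (assumed)
spectra = (content[:, None] * band(1000, 30)             # fragrance band
           + (100 - content)[:, None] * band(1450, 40)   # solvent band
           + rng.normal(0, 0.5, (n, wavenumbers.size)))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, content, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
sep = np.sqrt(mean_squared_error(y_te, pls.predict(X_te).ravel()))
print(f"standard error of prediction: {sep:.2f} %")
```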

  17. Rapid determination of Faraday rotation in optical glasses by means of secondary Faraday modulator.

    PubMed

    Sofronie, M; Elisa, M; Sava, B A; Boroica, L; Valeanu, M; Kuncser, V

    2015-05-01

    A rapid, highly sensitive method for determining the Faraday rotation of optical glasses is proposed. Starting from an experimental setup based on a Faraday rod coupled to a lock-in amplifier in the detection chain, two methodologies were developed for providing reliable results on samples presenting small and large Faraday rotations. The proposed methodologies are critically discussed and compared, via results obtained in transmission geometry, on a new series of aluminophosphate glasses with or without rare-earth doping ions. An example of how the method can be used for rapid examination of the optical homogeneity of a sample with respect to magneto-optical effects is also provided.

  18. Superresolution restoration of an image sequence: adaptive filtering approach.

    PubMed

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method based on adaptive filtering theory for superresolution restoration of continuous image sequences. The proposed methodology employs least squares (LS) estimators that adapt in time, based on adaptive filters: least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space- and time-variant blurring and arbitrary motion, both of them assumed known. The proposed new approach is shown to have relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
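
    The LMS recursion underlying one of the proposed estimators, shown as a generic adaptive FIR identification sketch rather than the paper's full space/time-variant superresolution formulation:

```python
# Generic LMS recursion of the kind underlying the adaptive estimator:
# w <- w + mu * e * x, shown here identifying an unknown FIR filter.
import numpy as np

rng = np.random.default_rng(4)
h_true = np.array([0.5, -0.3, 0.2])          # unknown system to identify
x = rng.normal(size=2000)                    # input signal
d = np.convolve(x, h_true, mode="full")[:x.size] + 0.01 * rng.normal(size=x.size)

mu, m = 0.01, len(h_true)
w = np.zeros(m)
for k in range(m, x.size):
    xk = x[k - m + 1:k + 1][::-1]            # most recent m samples, newest first
    e = d[k] - w @ xk                        # a priori error
    w = w + mu * e * xk                      # LMS update
print("estimated taps:", np.round(w, 3))
```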

  19. A Design Methodology for Medical Processes.

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in a patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful for creating high-quality models and for detecting and taking into account relevant and tricky situations that can occur during process execution.

  20. Relative Panoramic Camera Position Estimation for Image-Based Virtual Reality Networks in Indoor Environments

    NASA Astrophysics Data System (ADS)

    Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.

    2017-09-01

    Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.

  1. Virtual-pulse time integral methodology: A new explicit approach for computational dynamics - Theoretical developments for general nonlinear structural dynamics

    NASA Technical Reports Server (NTRS)

    Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong

    1993-01-01

    The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives for method development and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.

  2. The added value of thorough economic evaluation of telemedicine networks.

    PubMed

    Le Goff-Pronost, Myriam; Sicotte, Claude

    2010-02-01

    This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
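
    The NPV and break-even stages of the framework reduce to a short calculation; the sketch below uses purely illustrative cash flows for a telemedicine network (a setup cost followed by yearly net benefits):

```python
# Sketch of the NPV and break-even stages of the evaluation framework.
# Cash flows are illustrative assumptions: a fixed setup cost followed by
# yearly net benefits (avoided transfers minus operating costs).
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

setup_cost = -150_000
yearly_net_benefit = [30_000, 35_000, 40_000, 45_000]   # years 1..4 (assumed)
flows = [setup_cost] + yearly_net_benefit

print(f"NPV over 4 years at 5%: {npv(0.05, flows):,.0f}")

# Break-even threshold: first year where cumulative discounted flows turn positive.
cum, year = 0.0, None
for t, cf in enumerate(flows):
    cum += cf / (1 + 0.05) ** t
    if cum >= 0 and year is None and t > 0:
        year = t
print("break-even year:", year if year is not None else "not reached within horizon")
```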

  3. Evaluation models of some morphological characteristics for talent scouting in sport.

    PubMed

    Rogulj, Nenad; Papić, Vladan; Cavala, Marijana

    2009-03-01

    In this paper, for the purpose of expert system evaluation within the scientific project "Talent scouting in sport", two methodological approaches for recognizing an athlete's morphological compatibility with various sports have been presented, evaluated and compared. The first approach is based on fuzzy logic and expert opinion about the compatibility of proposed hypothetical morphological models for the 14 different sports which are part of the expert system. The second approach is based on determining the differences between the morphological characteristics of a tested individual and top athletes' morphological characteristics for a particular sport. The logical and mathematical bases of both methodological approaches have been explained in detail. High prognostic efficiency in recognizing an individual's sport has been determined. Some improvements for further development of both methods have been proposed. Results of the research so far suggest that this or similar approaches can be successfully used for detecting an individual's morphological compatibility with different sports. It is also expected to be useful in the selection of young talents for a particular sport.

  4. Thermodynamic analysis of steam-injected advanced gas turbine cycles

    NASA Astrophysics Data System (ADS)

    Pandey, Devendra; Bade, Mukund H.

    2017-12-01

    This paper deals with the thermodynamic analysis of the steam-injected gas turbine (STIGT) cycle. To analyse the thermodynamic performance of STIGT cycles, a methodology based on pinch analysis is proposed. This graphical methodology is a systematic approach for selecting a gas turbine with steam injection. The developed graphs are useful for selecting a STIGT configuration for optimal operation and help the designer take appropriate decisions. The selection of the STIGT cycle can be made either at the minimum steam ratio (ratio of mass flow rate of steam to air) with maximum efficiency or at the maximum steam ratio with maximum net work, depending on the plant designer's objective. Operating the steam-injection-based advanced gas turbine plant at the minimum steam ratio improves efficiency, resulting in a reduction of the pollution caused by the emission of flue gases. On the other hand, operating the plant at the maximum steam ratio results in maximum work output and hence higher available power.

  5. Evolving RBF neural networks for adaptive soft-sensor design.

    PubMed

    Alexandridis, Alex

    2013-12-01

    This work presents an adaptive framework for building soft-sensors based on radial basis function (RBF) neural network models. The adaptive fuzzy means algorithm is utilized in order to evolve an RBF network, which approximates the unknown system based on input-output data from it. The methodology gradually builds the RBF network model, based on two separate levels of adaptation: On the first level, the structure of the hidden layer is modified by adding or deleting RBF centers, while on the second level, the synaptic weights are adjusted with the recursive least squares with exponential forgetting algorithm. The proposed approach is tested on two different systems, namely a simulated nonlinear DC Motor and a real industrial reactor. The results show that the produced soft-sensors can be successfully applied to model the two nonlinear systems. A comparison with two different adaptive modeling techniques, namely a dynamic evolving neural-fuzzy inference system (DENFIS) and neural networks trained with online backpropagation, highlights the advantages of the proposed methodology.
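
    A sketch of the second adaptation level, assuming fixed RBF centers: recursive least squares with exponential forgetting updates the linear output weights from streaming data. The fuzzy-means structure adaptation that adds or deletes centers is not reproduced.

```python
# Sketch of the soft-sensor's second adaptation level: recursive least squares
# with exponential forgetting updating the linear output weights of an RBF
# network. Centers are fixed here; the fuzzy-means structure adaptation from
# the paper is not reproduced. The "plant" is a stand-in for demonstration.
import numpy as np

rng = np.random.default_rng(5)
centers = np.linspace(-2, 2, 9)               # fixed RBF centers (1-D input)
sigma = 0.5

def phi(x):
    """RBF activations for a scalar input x, plus a bias term."""
    g = np.exp(-((x - centers) ** 2) / (2 * sigma ** 2))
    return np.append(g, 1.0)

n_w = centers.size + 1
w = np.zeros(n_w)
P = 1e3 * np.eye(n_w)                         # inverse-correlation estimate
lam = 0.99                                    # forgetting factor

for k in range(2000):                         # streaming input-output data
    x = rng.uniform(-2, 2)
    y = np.sin(2 * x) + 0.05 * rng.normal()   # unknown plant (assumed)
    h = phi(x)
    g = P @ h / (lam + h @ P @ h)             # RLS gain
    w = w + g * (y - h @ w)                   # weight update
    P = (P - np.outer(g, h @ P)) / lam        # covariance update

print("prediction at x=1.0:", w @ phi(1.0), "target:", np.sin(2.0))
```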

  6. Analog design optimization methodology for ultralow-power circuits using intuitive inversion-level and saturation-level parameters

    NASA Astrophysics Data System (ADS)

    Eimori, Takahisa; Anami, Kenji; Yoshimatsu, Norifumi; Hasebe, Tetsuya; Murakami, Kazuaki

    2014-01-01

    A comprehensive design optimization methodology using intuitive nondimensional parameters of inversion-level and saturation-level is proposed, especially for ultralow-power, low-voltage, and high-performance analog circuits with mixed strong, moderate, and weak inversion metal-oxide-semiconductor transistor (MOST) operations. This methodology is based on the synthesized charge-based MOST model composed of Enz-Krummenacher-Vittoz (EKV) basic concepts and advanced-compact-model (ACM) physics-based equations. The key concept of this methodology is that all circuit and system characteristics are described as some multivariate functions of inversion-level parameters, where the inversion level is used as an independent variable representative of each MOST. The analog circuit design starts from the first step of inversion-level design using universal characteristics expressed by circuit currents and inversion-level parameters without process-dependent parameters, followed by the second step of foundry-process-dependent design and the last step of verification using saturation-level criteria. This methodology also paves the way to an intuitive and comprehensive design approach for many kinds of analog circuit specifications by optimization using inversion-level log-scale diagrams and saturation-level criteria. In this paper, we introduce an example of our design methodology for a two-stage Miller amplifier.

  7. A new wave front shape-based approach for acoustic source localization in an anisotropic plate without knowing its material properties.

    PubMed

    Sen, Novonil; Kundu, Tribikram

    2018-07-01

    Estimating the location of an acoustic source in a structure is an important step towards passive structural health monitoring. Techniques for localizing an acoustic source in isotropic structures are well developed in the literature. Development of similar techniques for anisotropic structures, however, has gained attention only in the recent years and has a scope of further improvement. Most of the existing techniques for anisotropic structures either assume a straight line wave propagation path between the source and an ultrasonic sensor or require the material properties to be known. This study considers different shapes of the wave front generated during an acoustic event and develops a methodology to localize the acoustic source in an anisotropic plate from those wave front shapes. An elliptical wave front shape-based technique was developed first, followed by the development of a parametric curve-based technique for non-elliptical wave front shapes. The source coordinates are obtained by minimizing an objective function. The proposed methodology does not assume a straight line wave propagation path and can predict the source location without any knowledge of the elastic properties of the material. A numerical study presented here illustrates how the proposed methodology can accurately estimate the source coordinates. Copyright © 2018 Elsevier B.V. All rights reserved.
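
    The elliptical wave front case reduces to a small least-squares problem: a front with principal group velocities (vx, vy) reaches a sensor at offset (dx, dy) at time t0 + sqrt((dx/vx)² + (dy/vy)²), so the source position, velocities and origin time can be recovered by minimizing the arrival-time residuals. A sketch assuming SciPy, with synthetic sensor data; the paper's parametric-curve generalization for non-elliptical fronts is not shown.

```python
# Sketch of elliptical wave-front source localization: recover source position,
# principal group velocities and origin time from sensor arrival times by
# least-squares minimization. All geometry and values are illustrative.
import numpy as np
from scipy.optimize import minimize

sensors = np.array([[x, y] for x in (0.0, 0.25, 0.5) for y in (0.0, 0.2, 0.4)])
src_true, v_true, t0_true = np.array([0.31, 0.12]), np.array([5.0, 3.0]), 0.002

def arrival_times(src, v, t0):
    d = sensors - src
    return t0 + np.sqrt((d[:, 0] / v[0]) ** 2 + (d[:, 1] / v[1]) ** 2)

rng = np.random.default_rng(6)
t_obs = arrival_times(src_true, v_true, t0_true) + 1e-5 * rng.normal(size=len(sensors))

def objective(p):
    src, v, t0 = p[:2], p[2:4], p[4]
    return np.sum((arrival_times(src, v, t0) - t_obs) ** 2)

p0 = np.array([0.25, 0.2, 4.0, 4.0, 0.0])     # initial guess
res = minimize(objective, p0, method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-16, "maxiter": 20000})
print("estimated source:", res.x[:2])
```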

  8. The microwave-assisted ionic-liquid method: a promising methodology in nanomaterials.

    PubMed

    Ma, Ming-Guo; Zhu, Jie-Fang; Zhu, Ying-Jie; Sun, Run-Cang

    2014-09-01

    In recent years, the microwave-assisted ionic-liquid method has been accepted as a promising methodology for the preparation of nanomaterials and cellulose-based nanocomposites. Applications of this method in the preparation of cellulose-based nanocomposites comply with the major principles of green chemistry, that is, they use an environmentally friendly method in environmentally preferable solvents to make use of renewable materials. This minireview focuses on the recent development of the synthesis of nanomaterials and cellulose-based nanocomposites by means of the microwave-assisted ionic-liquid method. We first discuss the preparation of nanomaterials including noble metals, metal oxides, complex metal oxides, metal sulfides, and other nanomaterials by means of this method. Then we provide an overview of the synthesis of cellulose-based nanocomposites by using this method. The emphasis is on the synthesis, microstructure, and properties of nanostructured materials obtained through this methodology. Our recent research on nanomaterials and cellulose-based nanocomposites by this rapid method is summarized. In addition, the formation mechanisms involved in the microwave-assisted ionic-liquid synthesis of nanostructured materials are discussed briefly. Finally, the future perspectives of this methodology in the synthesis of nanostructured materials are proposed. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. A novel multi-model neuro-fuzzy-based MPPT for three-phase grid-connected photovoltaic system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaouachi, Aymen; Kamel, Rashad M.; Nagasaka, Ken

    This paper presents a novel methodology for Maximum Power Point Tracking (MPPT) of a grid-connected 20 kW photovoltaic (PV) system using a neuro-fuzzy network. The proposed method predicts the reference PV voltage guaranteeing optimal power transfer between the PV generator and the main utility grid. The neuro-fuzzy network is composed of a fuzzy rule-based classifier and three multi-layered feed-forward Artificial Neural Networks (ANN). Inputs of the network (irradiance and temperature) are classified before they are fed into the appropriate ANN for either the training or estimation process, while the output is the reference voltage. The main advantage of the proposed methodology, compared to a conventional single neural network-based approach, is its distinct generalization ability with regard to the nonlinear and dynamic behavior of a PV generator. In fact, the neuro-fuzzy network is a neural-network-based multi-model machine learning approach that defines a set of local models emulating the complex and nonlinear behavior of a PV generator under a wide range of operating conditions. Simulation results under several rapid irradiance variations proved that the proposed MPPT method achieved the highest efficiency compared to a conventional single neural network and the Perturb and Observe (P&O) algorithm. (author)

  10. A new methodology for determining dispersion coefficient using ordinary and partial differential transport equations.

    PubMed

    Cho, Kyung Hwa; Lee, Seungwon; Ham, Young Sik; Hwang, Jin Hwan; Cha, Sung Min; Park, Yongeun; Kim, Joon Ha

    2009-01-01

    The present study proposes a methodology for determining the effective dispersion coefficient based on field measurements performed in Gwangju (GJ) Creek in South Korea, which is environmentally degraded by artificial interferences such as weirs and culverts. Many previous works determining the dispersion coefficient were limited in application due to the complexity and artificial interferences in natural streams. Therefore, the sequential combination of the N-Tank-In-Series (NTIS) model and the Advection-Dispersion-Reaction (ADR) model is proposed for evaluating the dispersion process in complex stream channels in this study. A series of water quality data was intensively monitored in the field to determine the effective dispersion coefficient of E. coli on a rainy day. As a result, the suggested methodology reasonably estimates the dispersion coefficient for GJ Creek as 1.25 m²/s. The sequential combined method also provides Number of tanks-Velocity-Dispersion coefficient (NVD) curves for convenient evaluation of the dispersion coefficients of other rivers or streams. Compared with previous studies, the present methodology is quite general and simple for determining effective dispersion coefficients, which makes it applicable to other rivers and streams.
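
    The ADR stage can be illustrated by fitting D in the analytic advection-dispersion-reaction solution for an instantaneous release to a downstream concentration series; the NTIS coupling and the NVD curves are not reproduced, and all numbers below are illustrative (the synthetic truth reuses the paper's 1.25 m²/s):

```python
# Sketch of the ADR stage: fit the dispersion coefficient D by matching the
# analytic advection-dispersion-reaction solution for an instantaneous release,
#   c(x,t) = M/(A*sqrt(4*pi*D*t)) * exp(-(x-U*t)^2/(4*D*t)) * exp(-k*t),
# to a downstream concentration time series. All values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

x_obs = 500.0          # distance of monitoring point from release (m, assumed)
U, k = 0.4, 1e-4       # velocity (m/s) and first-order decay rate (1/s), assumed
M_over_A = 100.0       # released mass per cross-sectional area (arbitrary units)

def adr(t, D):
    return (M_over_A / np.sqrt(4 * np.pi * D * t)
            * np.exp(-(x_obs - U * t) ** 2 / (4 * D * t))
            * np.exp(-k * t))

t = np.linspace(300, 3000, 60)                       # observation times (s)
rng = np.random.default_rng(7)
c_obs = adr(t, 1.25) * (1 + 0.05 * rng.normal(size=t.size))  # synthetic data

(D_hat,), _ = curve_fit(adr, t, c_obs, p0=[0.5], bounds=(0.01, 10.0))
print(f"estimated dispersion coefficient: {D_hat:.2f} m^2/s")
```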

  11. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search

    PubMed Central

    Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to find solutions to hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose here the use of active components taken from Scatter Search (SS) to improve the cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) has shown encouraging results with regard to earlier applications of our methodology. PMID:27403153

  12. Report on FY17 testing in support of integrated EPP-SMT design methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanli .; Jetter, Robert I.; Sham, T. -L.

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology, avoiding the separate evaluation of creep and fatigue damage and eliminating the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated-temperature cyclic service. The purpose of this methodology is to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, thermomechanical testing continued in FY17. This report presents the recent test results for Type 1 SMT specimens on Alloy 617 with long hold times, pressurization SMT results on Alloy 617, and two-bar thermal ratcheting test results on SS316H in the temperature range of 405 °C to 705 °C. Preliminary EPP strain range analyses of the two-bar tests are critically evaluated and compared with the experimental results.

  13. Methodology for the Assessment of the Ecotoxicological Potential of Construction Materials

    PubMed Central

    Rodrigues, Patrícia; Silvestre, José D.; Flores-Colen, Inês; Viegas, Cristina A.; de Brito, Jorge; Kurad, Rawaz; Demertzi, Martha

    2017-01-01

    Innovation in construction materials (CM) implies changing their composition by incorporating raw materials, usually non-traditional ones, which confer the desired characteristics. However, this practice may have unknown risks. This paper discusses the ecotoxicological potential associated with raw and construction materials, and proposes and applies a methodology for the assessment of their ecotoxicological potential. This methodology is based on existing laws, such as Regulation (European Commission) No. 1907/2006 (REACH—Registration, Evaluation, Authorization and Restriction of Chemicals) and Regulation (European Commission) No. 1272/2008 (CLP—Classification, Labelling and Packaging). Its application and validation showed that raw material without clear evidence of ecotoxicological potential, but with some ability to release chemicals, can lead to the formulation of a CM with a slightly lower hazardousness in terms of chemical characterization despite a slightly higher ecotoxicological potential than the raw materials. The proposed methodology can be a useful tool for the development and manufacturing of products and the design choice of the most appropriate CM, aiming at the reduction of their environmental impact and contributing to construction sustainability. PMID:28773011

  14. DB4US: A Decision Support System for Laboratory Information Management

    PubMed Central

    Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-01-01

    Background: Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective: To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods: We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results: DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions: The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745

  15. Building a common pipeline for rule-based document classification.

    PubMed

    Patterson, Olga V; Ginter, Thomas; DuVall, Scott L

    2013-01-01

    Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS based pipeline for classification. Our proposed methodology coupled with the general-purpose solution provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.

  16. A data-driven feature extraction framework for predicting the severity of condition of congestive heart failure patients.

    PubMed

    Sideris, Costas; Alshurafa, Nabil; Pourhomayoun, Mohammad; Shahmohammadi, Farhad; Samy, Lauren; Sarrafzadeh, Majid

    2015-01-01

    In this paper, we propose a novel methodology for utilizing disease diagnostic information to predict the severity of condition of Congestive Heart Failure (CHF) patients. Our methodology relies on a novel, clustering-based, feature extraction framework using disease diagnostic information. To reduce the dimensionality, we identify disease clusters using co-occurrence frequencies. We then utilize these clusters as features to predict patient severity of condition. We build our clustering and feature extraction algorithm using the 2012 National Inpatient Sample (NIS), Healthcare Cost and Utilization Project (HCUP), which contains 7 million discharge records with ICD-9-CM codes. The proposed framework is tested on Ronald Reagan UCLA Medical Center Electronic Health Records (EHR) from 3041 patients. We compare our cluster-based feature set with another that incorporates the Charlson comorbidity score as a feature and demonstrate an accuracy improvement of up to 14% in predicting the severity of condition.

  17. Telemedicine security: a systematic review.

    PubMed

    Garg, Vaibhav; Brewer, Jeffrey

    2011-05-01

    Telemedicine is a technology-based alternative to traditional health care delivery. However, poor security measures in telemedicine services can have an adverse impact on the quality of care provided, regardless of the chronic condition being studied. We undertook a systematic review of 58 journal articles pertaining to telemedicine security. These articles were selected based on a keyword search on 14 relevant journals. The articles were coded to evaluate the methodology and to identify the key areas of research in security that are being reviewed. Seventy-six percent of the articles defined the security problem they were addressing, and only 47% formulated a research question pertaining to security. Sixty-one percent proposed a solution, and 20% of these tested the security solutions that they proposed. Prior research indicates inadequate reporting of methodology in telemedicine research. We found that to be true for security research as well. We also identified other issues such as using outdated security standards. © 2011 Diabetes Technology Society.

  18. Telemedicine Security: A Systematic Review

    PubMed Central

    Garg, Vaibhav; Brewer, Jeffrey

    2011-01-01

    Telemedicine is a technology-based alternative to traditional health care delivery. However, poor security measures in telemedicine services can have an adverse impact on the quality of care provided, regardless of the chronic condition being studied. We undertook a systematic review of 58 journal articles pertaining to telemedicine security. These articles were selected based on a keyword search on 14 relevant journals. The articles were coded to evaluate the methodology and to identify the key areas of research in security that are being reviewed. Seventy-six percent of the articles defined the security problem they were addressing, and only 47% formulated a research question pertaining to security. Sixty-one percent proposed a solution, and 20% of these tested the security solutions that they proposed. Prior research indicates inadequate reporting of methodology in telemedicine research. We found that to be true for security research as well. We also identified other issues such as using outdated security standards. PMID:21722592

  19. Report on an Assessment of the Application of EPP Results from the Strain Limit Evaluation Procedure to the Prediction of Cyclic Life Based on the SMT Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jetter, R. I.; Messner, M. C.; Sham, T. -L.

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology, avoiding the separate evaluation of creep and fatigue damage and eliminating the requirement for stress classification in current methods, thus greatly simplifying the evaluation of elevated-temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705 °C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950 °C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress-strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of machine precision in EPP calculations.

  20. Development of a Design Methodology for Reconfigurable Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; McLean, C.

    2000-01-01

    A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.

  1. Stability and performance of propulsion control systems with distributed control architectures and failures

    NASA Astrophysics Data System (ADS)

    Belapurkar, Rohit K.

    Future aircraft engine control systems will be based on a distributed architecture, in which the sensors and actuators will be connected to the Full Authority Digital Engine Control (FADEC) through an engine area network. A distributed engine control architecture will allow the implementation of advanced, active control techniques along with achieving weight reduction, improvement in performance and lower life cycle cost. The performance of a distributed engine control system is predominantly dependent on the performance of the communication network. Due to the serial data transmission policy, network-induced time delays and sampling jitter are introduced between the sensor/actuator nodes and the distributed FADEC. Communication network faults and transient node failures may result in data dropouts, which may not only degrade the control system performance but may even destabilize the engine control system. Three different architectures for a turbine engine control system based on a distributed framework are presented. A partially distributed control system for a turbo-shaft engine is designed based on the ARINC 825 communication protocol. Stability conditions and a control design methodology are developed for the proposed partially distributed turbo-shaft engine control system to guarantee the desired performance in the presence of network-induced time delay and random data loss due to transient sensor/actuator failures. A fault-tolerant control design methodology is proposed to benefit from the availability of additional system bandwidth and from the broadcast feature of the data network. It is shown that a reconfigurable fault-tolerant control design can help to reduce the performance degradation in the presence of node failures. A T-700 turbo-shaft engine model is used to validate the proposed control methodology based on both single-input and multiple-input multiple-output control design techniques.

  2. A new methodology for automatic detection of reference points in 3D cephalometry: A pilot study.

    PubMed

    Ed-Dhahraouy, Mohammed; Riri, Hicham; Ezzahmouly, Manal; Bourzgui, Farid; El Moutaoukkil, Abdelmajid

    2018-04-05

    The aim of this study was to develop a new method for the automatic detection of reference points in 3D cephalometry to overcome the limits of 2D cephalometric analyses. A specific application was designed using the C++ language for automatic and manual identification of 21 reference points on the craniofacial structures. Our algorithm is based on the implementation of an anatomical and geometrical network adapted to the craniofacial structure. This network was constructed based on anatomical knowledge of the 3D cephalometric reference points. The proposed algorithm was tested on five CBCT images. The proposed approach for automatic 3D cephalometric identification was able to detect the 21 points with a mean error of 2.32 mm. In this pilot study, we propose an automated methodology for the identification of the 3D cephalometric reference points. A larger sample will be used in future work to assess the method's validity and reliability. Copyright © 2018 CEO. Published by Elsevier Masson SAS. All rights reserved.

  3. Developing mobile- and BIM-based integrated visual facility maintenance management system.

    PubMed

    Lin, Yu-Cheng; Su, Yu-Chih

    2013-01-01

    Facility maintenance management (FMM) has become an important topic for research on the operation phase of the construction life cycle. Managing FMM effectively is extremely difficult owing to various factors and environments. One of the difficulties is the performance of 2D graphics when depicting maintenance service. Building information modeling (BIM) uses precise geometry and relevant data to support the maintenance service of facilities depicted in 3D object-oriented CAD. This paper proposes a new and practical methodology with application to FMM using BIM technology. Using BIM technology, this study proposes a BIM-based facility maintenance management (BIMFMM) system for maintenance staff in the operation and maintenance phase. The BIMFMM system is then applied in selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FMM practice. Using the BIMFMM system, maintenance staff can access and review 3D BIM models for updating related maintenance records in a digital format. Moreover, this study presents a generic system architecture and its implementation. The combined results demonstrate that a BIMFMM-like system can be an effective visual FMM tool.

  4. Predicting the graft survival for heart-lung transplantation patients: an integrated data mining methodology.

    PubMed

    Oztekin, Asil; Delen, Dursun; Kong, Zhenyu James

    2009-12-01

    Predicting the survival of heart-lung transplant patients has the potential to play a critical role in understanding and improving the matching procedure between the recipient and graft. Although voluminous data related to the transplantation procedures is being collected and stored, only a small subset of the predictive factors has been used in modeling heart-lung transplantation outcomes. The previous studies have mainly focused on applying statistical techniques to a small set of factors selected by the domain-experts in order to reveal the simple linear relationships between the factors and survival. The collection of methods known as 'data mining' offers significant advantages over conventional statistical techniques in dealing with the latter's limitations such as normality assumption of observations, independence of observations from each other, and linearity of the relationship between the observations and the output measure(s). There are statistical methods that overcome these limitations. Yet, they are computationally more expensive and do not provide fast and flexible solutions as do data mining techniques in large datasets. The main objective of this study is to improve the prediction of outcomes following combined heart-lung transplantation by proposing an integrated data-mining methodology. A large and feature-rich dataset (16,604 cases with 283 variables) is used to (1) develop machine learning based predictive models and (2) extract the most important predictive factors. Then, using three different variable selection methods, namely, (i) machine learning methods driven variables-using decision trees, neural networks, logistic regression, (ii) the literature review-based expert-defined variables, and (iii) common sense-based interaction variables, a consolidated set of factors is generated and used to develop Cox regression models for heart-lung graft survival. The predictive models' performance in terms of 10-fold cross-validation accuracy rates for two multi-imputed datasets ranged from 79% to 86% for neural networks, from 78% to 86% for logistic regression, and from 71% to 79% for decision trees. The results indicate that the proposed integrated data mining methodology using Cox hazard models better predicted the graft survival with different variables than the conventional approaches commonly used in the literature. This result is validated by the comparison of the corresponding Gains charts for our proposed methodology and the literature review based Cox results, and by the comparison of Akaike information criteria (AIC) values received from each. Data mining-based methodology proposed in this study reveals that there are undiscovered relationships (i.e. interactions of the existing variables) among the survival-related variables, which helps better predict the survival of the heart-lung transplants. It also brings a different set of variables into the scene to be evaluated by the domain-experts and be considered prior to the organ transplantation.
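
    The final modeling stage, a Cox proportional hazards fit on a consolidated variable set, can be sketched with the lifelines package on a toy survival frame; the column names are illustrative, not the registry's variables:

```python
# Sketch of the final modeling stage: a Cox proportional hazards model fitted
# on a consolidated variable set, here with the `lifelines` package on a toy
# survival data frame. Column names and data are illustrative assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 500
df = pd.DataFrame({
    "recipient_age": rng.uniform(20, 65, n),
    "ischemic_time_h": rng.uniform(2, 8, n),
    "donor_recipient_match": rng.integers(0, 2, n),
})
# Synthetic survival times driven by the covariates (for demonstration only).
hazard = np.exp(0.03 * df["recipient_age"] + 0.2 * df["ischemic_time_h"]
                - 0.5 * df["donor_recipient_match"])
df["survival_months"] = rng.exponential(60.0 / hazard.to_numpy())
df["event_observed"] = (df["survival_months"] < 120).astype(int)
df.loc[df["event_observed"] == 0, "survival_months"] = 120.0  # censor at 10 yrs

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="event_observed")
cph.print_summary()
```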

  5. A methodology to design heuristics for model selection based on the characteristics of data: Application to investigate when the Negative Binomial Lindley (NB-L) is preferred over the Negative Binomial (NB).

    PubMed

    Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy

    2017-10-01

    Safety analysts usually use post-modeling methods, such as Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competing distributions or models. Such metrics require all competing distributions to be fitted to the data before any comparisons can be accomplished. Given the continuous growth in the introduction of new statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition into why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology to design heuristics for model selection based on the characteristics of the data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools, (1) Monte Carlo simulations and (2) machine learning classifiers, to design easy-to-use heuristics that predict the label of the 'most-likely-true' distribution for the data at hand. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of the data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use and free of any post-modeling inputs, but, using these heuristics, the analyst can also attain useful information about why the NB-L is preferred over the NB - or vice versa - when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
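
    The sketch below illustrates the two analytic tools the paper combines, Monte Carlo simulation and a machine-learning classifier, on the NB vs. NB-L question. The NB-L sampler uses one standard mixture construction (NB success probability p = exp(-lambda) with lambda Lindley-distributed); the summary statistics, parameter ranges and tree depth are illustrative choices, not the ones prescribed in the paper.

        import numpy as np
        from scipy import stats
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(1)

        def sample_lindley(theta, size):
            # Lindley(theta) as a mixture: Exp(theta) w.p. theta/(1+theta), else Gamma(2, 1/theta)
            use_exp = rng.random(size) < theta / (1.0 + theta)
            return np.where(use_exp, rng.exponential(1/theta, size), rng.gamma(2, 1/theta, size))

        def sample_counts(dist, size):
            r = rng.uniform(0.5, 3.0)
            if dist == "NB":
                return rng.negative_binomial(r, rng.uniform(0.1, 0.9), size)
            lam = sample_lindley(rng.uniform(0.5, 3.0), size)
            return rng.negative_binomial(r, np.exp(-lam))   # NB-L mixture

        def summaries(yv):
            return [yv.mean(), yv.var(), stats.skew(yv), np.mean(yv == 0)]

        X, labels = [], []
        for _ in range(2000):                                # Monte Carlo loop
            dist = rng.choice(["NB", "NB-L"])
            X.append(summaries(sample_counts(dist, 200)))
            labels.append(dist)

        clf = DecisionTreeClassifier(max_depth=3).fit(X, labels)
        # The shallow tree *is* the heuristic: readable rules over summary statistics
        print(export_text(clf, feature_names=["mean", "var", "skew", "pct_zeros"]))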

  6. Modeling and clustering water demand patterns from real-world smart meter data

    NASA Astrophysics Data System (ADS)

    Cheifetz, Nicolas; Noumir, Zineb; Samé, Allou; Sandraz, Anne-Claire; Féliers, Cédric; Heim, Véronique

    2017-08-01

    Nowadays, drinking water utilities need an acute understanding of the water demand on their distribution networks in order to optimize the use of resources, manage billing and propose new customer services. With the emergence of smart grids based on automated meter reading (AMR), a finer-grained understanding of consumption modes is now accessible for smart cities. In this context, this paper evaluates a novel methodology for identifying relevant usage profiles from the water consumption data produced by smart meters. The methodology is fully data-driven, using the consumption time series, which are seen as functions or curves observed with an hourly time step. First, a Fourier-based additive time series decomposition model is introduced to extract seasonal patterns from the time series. These patterns are intended to represent the customers' habits in terms of water consumption. Two functional clustering approaches are then used to classify the extracted seasonal patterns: the functional version of K-means, and the Fourier REgression Mixture (FReMix) model. The K-means approach produces a hard segmentation and K representative prototypes. On the other hand, FReMix is a generative model and also produces K profiles, as well as a soft segmentation based on the posterior probabilities. The proposed approach is applied to a smart grid deployed on the largest water distribution network (WDN) in France. The two clustering strategies are evaluated and compared. Finally, a realistic interpretation of the consumption habits is given for each cluster. The extensive experiments and the qualitative interpretation of the resulting clusters highlight the effectiveness of the proposed methodology.
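
    A compact sketch of the first, K-means branch of this pipeline: each meter's hourly series is reduced to a weekly seasonal pattern via truncated Fourier (harmonic) regression, and the fitted coefficients are clustered. The number of harmonics, the weekly period and the cluster count are illustrative assumptions, and the FReMix soft-clustering branch is not reproduced here.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        period = 24 * 7                       # weekly pattern, hourly time step
        t = np.arange(period * 8)             # eight weeks of hourly readings per meter

        def fourier_design(t, period, n_harmonics=3):
            cols = [np.ones_like(t, dtype=float)]
            for k in range(1, n_harmonics + 1):
                cols += [np.cos(2*np.pi*k*t/period), np.sin(2*np.pi*k*t/period)]
            return np.column_stack(cols)

        F = fourier_design(t, period)
        # Synthetic stand-in for AMR series: two consumption habits plus noise
        meters = [np.maximum(0, 1 + np.sin(2*np.pi*(t % period)/period + ph)
                             + 0.3*rng.standard_normal(t.size))
                  for ph in rng.choice([0.0, np.pi/2], size=60)]

        # Least-squares Fourier coefficients = the extracted seasonal pattern per meter
        coefs = np.array([np.linalg.lstsq(F, y, rcond=None)[0] for y in meters])

        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(coefs)
        print("cluster sizes:", np.bincount(km.labels_))
        # km.cluster_centers_ @ F[:period].T rebuilds a weekly prototype per cluster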

  7. Human-Centered Systems Analysis of Aircraft Separation from Adverse Weather: Implications for Icing Remote Sensing

    NASA Technical Reports Server (NTRS)

    Vigeant-Langlois, Laurence; Hansman, R. John, Jr.

    2003-01-01

    The objective of this project was to propose a means of improving aviation weather information and training procedures based on a human-centered systems approach. The methodology comprised a cognitive analysis of pilots' tasks; a trajectory-based approach to weather information; contingency planning support; and implications for improving weather information.

  8. Towards an Intelligent Possibilistic Web Information Retrieval Using Multiagent System

    ERIC Educational Resources Information Center

    Elayeb, Bilel; Evrard, Fabrice; Zaghdoud, Montaceur; Ahmed, Mohamed Ben

    2009-01-01

    Purpose: The purpose of this paper is to make a scientific contribution to web information retrieval (IR). Design/methodology/approach: A multiagent system for web IR is proposed based on new technologies: Hierarchical Small-Worlds (HSW) and Possibilistic Networks (PN). This system is based on a possibilistic qualitative approach which extends the…

  9. Broad-Based National Education in Globalisation: Conceptualisation, Multiple Functions and Management

    ERIC Educational Resources Information Center

    Cheng, Yin Cheong; Yuen, Timothy W. W.

    2017-01-01

    Purpose: The purpose of this paper is to contribute to the worldwide discussion of conceptualization, multiple functions and management of national education in an era of globalisation by proposing a new comprehensive framework for research, policy analysis and practical implementation. Design/Methodology/Approach: Based on a review of the…

  10. 76 FR 14662 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-17

    ... (STS) based on the Multi-state Average Rate Structure (MARS) plan proposed by Hamilton Relay, Inc., (2... intrastate Internet-Protocol (IP) Captioned Telephone Service (IP CTS) based on the MARS plan, (3) a cost... with the MARS plan cost recovery methodology for compensation from the Fund. Specifically, TRS...

  11. Organising Workplace Learning: An Inter-Organisational Perspective

    ERIC Educational Resources Information Center

    Svensson, Lennart; Randle, Hanne; Bennich, Maria

    2009-01-01

    Purpose: The purpose of this paper is to argue that both the supply-based model and the demand-based form of vocational education and training (VET) have their limitations and propose a "third way" in which reflective learning in the workplace is a central ingredient. Design/methodology/approach: The data was collected from several…

  12. [Discovery-based teaching and learning strategies in health: problematization and problem-based learning].

    PubMed

    Cyrino, Eliana Goldfarb; Toralles-Pereira, Maria Lúcia

    2004-01-01

    Considering the changes in teaching in the health field and the demand for new ways of dealing with knowledge in higher learning, the article discusses two innovative methodological approaches: problem-based learning (PBL) and problematization. Describing the two methods' theoretical roots, the article attempts to identify their main foundations. As distinct proposals, both contribute to a review of the teaching and learning process: problematization, focused on knowledge construction in the context of the formation of a critical awareness; PBL, focused on cognitive aspects in the construction of concepts and the appropriation of basic mechanisms in science. Both problematization and PBL lead to breaks with the traditional way of teaching and learning, stimulating participatory management by the actors involved in the experience and a reorganization of the relationship between theory and practice. A critique of each proposal's possibilities and limits, based on an analysis of their theoretical and methodological foundations, leads us to conclude that pedagogical experiences based on PBL and/or problematization can represent an innovative trend in the context of health education, fostering breaks and more sweeping changes.

  13. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    PubMed

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on the quality control of its products because quality is critical to both the production process and consumer safety. According to the framework of "process analytical technology" (PAT), a complete understanding of the process and a stepwise monitoring of manufacturing are required. Near-infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set, into which physical changes in the samples at each stage are algebraically incorporated. Also, we established a "model space", defined by Hotelling's T(2) and Q-residual statistics, for outlier identification - inside/outside the defined space - in order to objectively select the factors to be used in constructing the calibration set. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
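
    The "model space" idea can be sketched with a PCA fitted to calibration spectra: Hotelling's T(2) measures leverage inside the model plane and Q (the squared reconstruction residual) measures distance from it, and samples outside both limits are flagged as outliers. The random spectra, component count and 95% cut-offs below are illustrative assumptions, not the paper's data or settings.

        import numpy as np
        from scipy import stats
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        X = rng.standard_normal((80, 200))        # stand-in for NIR calibration spectra
        X -= X.mean(axis=0)

        k = 3
        pca = PCA(n_components=k).fit(X)
        scores = pca.transform(X)

        # Hotelling's T^2: leverage within the k-component model space
        t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
        # Q statistic: squared reconstruction residual (distance to the model space)
        resid = X - pca.inverse_transform(scores)
        q = np.sum(resid**2, axis=1)

        n = X.shape[0]
        t2_lim = k*(n-1)/(n-k) * stats.f.ppf(0.95, k, n-k)   # F-based 95% limit
        q_lim = np.quantile(q, 0.95)                         # empirical limit, for brevity
        inside = (t2 <= t2_lim) & (q <= q_lim)
        print(f"{inside.sum()} of {n} samples inside the model space")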

  14. Methodology to incorporate EIA in land-use ordering -- case study: The Cataniapo River basin, Venezuela

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sebastiani, M.; Llambi, L.D.; Marquez, E.

    1998-07-01

    In Venezuela, the idea of tiering information between land-use ordering instruments and impact assessment is absent. In this article the authors explore a methodological alternative to bridge the information presented in land-use ordering instruments with the information requirements for impact assessment. The methodology is based on the steps carried out for an environmental impact assessment as well as on those considered to develop land-use ordering instruments. The methodology is applied to the territorial ordering plan and its proposal for the protection zone of the Cataniapo River basin. The purpose of the protection zone is to preserve the water quality and quantity of the river basin for human consumption.

  15. Image Processing for Planetary Limb/Terminator Extraction

    NASA Technical Reports Server (NTRS)

    Udomkesmalee, S.; Zhu, D. Q.; Chu, C. -C.

    1995-01-01

    A novel image segmentation technique for extracting the limb and terminator of planetary bodies is proposed. Conventional edge-based histogramming approaches are used to trace object boundaries. The limb and terminator bifurcation is achieved by locating the harmonized segment in the two equations representing the 2-D parameterized boundary curve. Real planetary images from Voyager 1 and 2 served as representative test cases to verify the proposed methodology.

  16. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality needed to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
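
    A toy sketch of the MCDM trade-study step: each candidate architecture is scored on normalized mission-performance, risk and cost criteria, and a weighted sum ranks them. The scores and weights below are invented for illustration and do not come from the study.

        import numpy as np

        criteria = ["mission_performance", "risk", "cost"]
        weights = np.array([0.5, 0.3, 0.2])           # stakeholder weights (assumed)

        # Rows: candidate architectures; columns: raw criterion scores (assumed).
        # Risk and cost are "smaller is better", so they are inverted before weighting.
        raw = {
            "monolithic":  np.array([0.70, 0.60, 0.40]),
            "distributed": np.array([0.85, 0.35, 0.55]),
        }
        benefit = np.array([True, False, False])

        def weighted_score(x):
            x = np.where(benefit, x, 1.0 - x)         # convert costs to benefits
            return float(weights @ x)

        for name, x in raw.items():
            print(f"{name:12s} -> {weighted_score(x):.3f}")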

  17. Ultrasensitive low noise voltage amplifier for spectral analysis.

    PubMed

    Giusi, G; Crupi, F; Pace, C

    2008-08-01

    Recently we have proposed several voltage noise measurement methods that allow, at least in principle, the complete elimination of the noise introduced by the measurement amplifier. The most severe drawback of these methods is that they require a multistep measurement procedure. Since environmental conditions may change in the different measurement steps, the final result could be affected by these changes. This problem is solved by the one-step voltage noise measurement methodology based on a novel amplifier topology proposed in this paper. Circuit implementations for the amplifier building blocks based on operational amplifiers are critically discussed. The proposed approach is validated through measurements performed on a prototype circuit.

  18. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.

  19. Improvement of Steam Turbine Operational Performance and Reliability with using Modern Information Technologies

    NASA Astrophysics Data System (ADS)

    Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu

    2017-11-01

    The report reviews methods for improving the design and operation of steam turbine units based on the application of modern information technologies. In accordance with the life-cycle support methodology, a conceptual model of the information support system for the main life-cycle (LC) stages of a steam turbine unit is suggested. A classification system, which ensures the creation of sustainable information links between the engineering team (manufacturing plant) and customer organizations (power plants), is proposed. Within the report, the principle of extending parameterization beyond geometric constructions in the design and improvement of steam turbine unit equipment is proposed, studied and justified. The report presents a steam turbine equipment design methodology based on a brand-new oil-cooler design system that has been developed and implemented by the authors. This design system combines a construction subsystem, characterized by extensive use of family tables and templates, and a computation subsystem, which includes a methodology for thermal-hydraulic zone-by-zone oil-cooler design calculations. The report presents data about the developed software for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.

  20. On sustainable and efficient design of ground-source heat pump systems

    NASA Astrophysics Data System (ADS)

    Grassi, W.; Conti, P.; Schito, E.; Testi, D.

    2015-11-01

    This paper is mainly aimed at stressing some fundamental features of GSHP design and is based on broad research we are performing at the University of Pisa. In particular, we focus the discussion on an environmentally sustainable approach, based on performance optimization over the entire operational life. The proposed methodology investigates design and management strategies to find the optimal level of exploitation of the ground source, relying on other technical means to cover the remaining energy requirements and to modulate the power peaks. The method is holistic, considering the system as a whole rather than focusing only on some components, usually considered the most important ones. Each subsystem is modeled and coupled to the others in a full set of equations, which is used within an optimization routine to reproduce the operative performance of the overall GSHP system. As a matter of fact, the recommended methodology is a 4-in-1 activity, including sizing of components, life-cycle performance evaluation, optimization, and feasibility analysis. The paper also reviews some previous works concerning possible applications of the proposed methodology. In conclusion, we describe ongoing research activities and the objectives of future work.

  1. Large-scale optimization-based non-negative computational framework for diffusion equations: Parallel implementation and performance studies

    DOE PAGES

    Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.

    2016-07-26

    It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.

  2. Use of Cdse/ZnS quantum dots for sensitive detection and quantification of paraquat in water samples.

    PubMed

    Durán, Gema M; Contento, Ana M; Ríos, Ángel

    2013-11-01

    Based on the highly sensitive fluorescence change of water-soluble CdSe/ZnS core-shell quantum dots (QDs) caused by the herbicide paraquat, a simple, rapid and reproducible methodology was developed to selectively determine paraquat (PQ) in water samples. The methodology uses a simple pretreatment procedure based on the water solubilization of CdSe/ZnS QDs with hydrophilic heterobifunctional thiol ligands, such as 3-mercaptopropionic acid (3-MPA), using microwave irradiation. The resulting water-soluble QDs exhibit strong fluorescence emission at 596 nm with high and reproducible photostability. The proposed analytical method thus satisfies the need for a simple, sensitive and rapid methodology to determine paraquat residues in water samples, as required by the increasingly strict health-protection regulations introduced in recent years. The sensitivity of the method, expressed as a detection limit, was as low as 3.0 ng L(-1). The linear range was 10-5×10(3) ng L(-1). Recovery values in the range of 71-102% were obtained. The analytical applicability of the proposed method was demonstrated by analyzing water samples of different provenance. Copyright © 2013 Elsevier B.V. All rights reserved.
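
    Quantification methods of this kind typically rest on a linear calibration of the fluorescence change against concentration, e.g. a Stern-Volmer-type plot F0/F = 1 + Ksv[PQ]; the sketch below fits such a line and inverts it for an unknown. Both the data points and the Stern-Volmer form are assumptions for illustration, not values or a model taken from the paper.

        import numpy as np

        # Assumed calibration data: paraquat concentration (ng/L) vs. quenched intensity
        conc = np.array([10, 50, 100, 500, 1000, 5000], dtype=float)
        f0 = 1000.0                                   # intensity without quencher
        f = np.array([995, 977, 956, 808, 672, 286], dtype=float)

        # Stern-Volmer-type linearization: F0/F - 1 = Ksv * [PQ]
        yv = f0 / f - 1.0
        ksv, = np.linalg.lstsq(conc[:, None], yv, rcond=None)[0]   # slope through origin

        def concentration_from_intensity(f_unknown):
            return (f0 / f_unknown - 1.0) / ksv

        print(f"Ksv = {ksv:.3e} L/ng")
        print(f"unknown at F=900 -> {concentration_from_intensity(900.0):.0f} ng/L")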

  3. 78 FR 4369 - Rates for Interstate Inmate Calling Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-22

    .... Marginal Location Methodology. In 2008, ICS providers submitted the ICS Provider Proposal for ICS rates. The ICS Provider Proposal uses the ``marginal location'' methodology, previously adopted by the... ``marginal location'' methodology provides a ``basis for rates that represent `fair compensation' as set...

  4. 78 FR 48720 - Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...

  5. 78 FR 66954 - Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB No. 1121-NEW] Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological Research To Support the National Crime... related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program: Methodological...

  6. A methodology to model causal relationships on offshore safety assessment focusing on human and organizational factors.

    PubMed

    Ren, J; Jenkinson, I; Wang, J; Xu, D L; Yang, J B

    2008-01-01

    Focusing on people and organizations, this paper aims to contribute to offshore safety assessment by proposing a methodology to model causal relationships. The methodology is proposed in a general sense so that it is capable of accommodating the modeling of multiple risk factors considered in offshore operations and can deal with different types of data that may come from different resources. Reason's "Swiss cheese" model is used to form a generic offshore safety assessment framework, and Bayesian Networks (BNs) are tailored to fit into the framework to construct a causal relationship model. The proposed framework uses a five-level-structure model to address latent failures within the causal sequence of events. The five levels are: Root causes, Trigger events, Incidents, Accidents, and Consequences. To analyze and model the safety of a specified offshore installation, a BN model was established following the guideline of the proposed five-level framework. A range of events was specified, and the related prior and conditional probabilities of the BN model were assigned based on the inherent characteristics of each event. This paper shows that Reason's "Swiss cheese" model and BNs can be jointly used in offshore safety assessment. On the one hand, the five-level conceptual model is enhanced by BNs, which are capable of providing a graphical demonstration of inter-relationships as well as calculating numerical values of occurrence likelihood for each failure event. The Bayesian inference mechanism also makes it possible to monitor how a safety situation changes as information flows forwards and backwards within the networks. On the other hand, BN modeling relies heavily on experts' personal experiences and is therefore highly domain specific. The "Swiss cheese" model, in contrast, is a theoretical framework based on solid behavioral theory and can therefore be used to provide industry with a roadmap for BN modeling and its implications. A case study of the collision risk between a Floating Production, Storage and Offloading (FPSO) unit and authorized vessels caused by human and organizational factors (HOFs) during operations is used to illustrate an industrial application of the proposed methodology.
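
    A minimal sketch of the five-level chain using the pgmpy library, with one binary node per level and invented conditional probabilities; a real offshore model would have many nodes per level and expert-elicited tables. It shows both the forward (predictive) and backward (diagnostic) inference the abstract describes.

        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        # Root cause -> Trigger event -> Incident -> Accident -> Consequence
        model = BayesianNetwork([("Root", "Trigger"), ("Trigger", "Incident"),
                                 ("Incident", "Accident"), ("Accident", "Consequence")])

        cpds = [
            TabularCPD("Root", 2, [[0.9], [0.1]]),            # P(absent), P(present)
            TabularCPD("Trigger", 2, [[0.95, 0.4], [0.05, 0.6]],
                       evidence=["Root"], evidence_card=[2]),
            TabularCPD("Incident", 2, [[0.99, 0.5], [0.01, 0.5]],
                       evidence=["Trigger"], evidence_card=[2]),
            TabularCPD("Accident", 2, [[0.995, 0.7], [0.005, 0.3]],
                       evidence=["Incident"], evidence_card=[2]),
            TabularCPD("Consequence", 2, [[0.999, 0.6], [0.001, 0.4]],
                       evidence=["Accident"], evidence_card=[2]),
        ]
        model.add_cpds(*cpds)
        assert model.check_model()

        infer = VariableElimination(model)
        # Forward: likelihood of a severe consequence given a latent root cause
        print(infer.query(["Consequence"], evidence={"Root": 1}))
        # Backward (diagnostic): which root state best explains an observed accident?
        print(infer.query(["Root"], evidence={"Accident": 1}))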

  7. Predictor-Based Model Reference Adaptive Control

    NASA Technical Reports Server (NTRS)

    Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.

    2009-01-01

    This paper is devoted to robust, Predictor-based Model Reference Adaptive Control (PMRAC) design. The proposed adaptive system is compared with the now-classical Model Reference Adaptive Control (MRAC) architecture. Simulation examples are presented. Numerical evidence indicates that the proposed PMRAC tracking architecture has better transient characteristics than MRAC. In this paper, we present a state-predictor-based direct adaptive tracking design methodology for multi-input dynamical systems with partially known dynamics. The efficiency of the design is demonstrated using the short-period dynamics of an aircraft. Formal proof of the reported PMRAC benefits constitutes future research and will be reported elsewhere.

  8. A new approach to hand-based authentication

    NASA Astrophysics Data System (ADS)

    Amayeh, G.; Bebis, G.; Erol, A.; Nicolescu, M.

    2007-04-01

    Hand-based authentication is a key biometric technology with a wide range of potential applications both in industry and government. Traditionally, hand-based authentication is performed by extracting information from the whole hand. To account for hand and finger motion, guidance pegs are employed to fix the position and orientation of the hand. In this paper, we consider a component-based approach to hand-based verification. Our objective is to investigate the discrimination power of different parts of the hand in order to develop a simpler, faster, and possibly more accurate and robust verification system. Specifically, we propose a new approach which decomposes the hand in different regions, corresponding to the fingers and the back of the palm, and performs verification using information from certain parts of the hand only. Our approach operates on 2D images acquired by placing the hand on a flat lighting table. Using a part-based representation of the hand allows the system to compensate for hand and finger motion without using any guidance pegs. To decompose the hand in different regions, we use a robust methodology based on morphological operators which does not require detecting any landmark points on the hand. To capture the geometry of the back of the palm and the fingers in sufficient detail, we employ high-order Zernike moments which are computed using an efficient methodology. The proposed approach has been evaluated on a database of 100 subjects with 10 images per subject, illustrating promising performance. Comparisons with related approaches using the whole hand for verification illustrate the superiority of the proposed approach. Moreover, qualitative comparisons with state-of-the-art approaches indicate that the proposed approach has comparable or better performance.

  9. Multi-resolution analysis for region of interest extraction in thermographic nondestructive evaluation

    NASA Astrophysics Data System (ADS)

    Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.

    2012-03-01

    Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as smooth contours centered at peaks of stored thermal energy, termed Regions of Interest (ROIs). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks. The methodology deals with the difficulties due to non-uniform heating, which affects low spatial frequencies and hinders the detection of relevant points in the image. Specifically, a methodology for ROI extraction in INDT using multi-resolution analysis is proposed, which is robust to low ROI contrast and non-uniform heating. The methodology includes local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs. We use a Gaussian window because the thermal behavior is well modeled by Gaussian smooth contours. Also, the Gaussian scale is used to analyze details in the image using multi-resolution analysis, avoiding low contrast, non-uniform heating and the selection of the Gaussian window size. Finally, local edge detection is used to provide a good estimation of the boundaries of the ROI. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that is better than or comparable to the other dedicated algorithms proposed in the state of the art.
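
    A condensed sketch of the three steps, local correlation with a Gaussian template, a two-scale Gaussian sweep, and local edge detection, using SciPy on a synthetic thermogram; window sizes, scales and thresholds are illustrative choices, not the paper's settings.

        import numpy as np
        from scipy import ndimage, signal

        rng = np.random.default_rng(4)
        # Synthetic thermogram: smooth non-uniform heating plus one Gaussian "hot spot" ROI
        yy, xx = np.mgrid[0:128, 0:128]
        frame = (0.01 * xx + np.exp(-((xx - 80)**2 + (yy - 40)**2) / (2 * 6.0**2))
                 + 0.05 * rng.standard_normal((128, 128)))

        def gaussian_template(size, sigma):
            r = np.arange(size) - size // 2
            g = np.exp(-(r[:, None]**2 + r[None, :]**2) / (2 * sigma**2))
            return g - g.mean()                   # zero mean: acts as a matched filter

        def local_correlation(img, size, sigma):
            # high-pass first so the slowly varying heating is suppressed
            detrended = img - ndimage.uniform_filter(img, size)
            return signal.fftconvolve(detrended, gaussian_template(size, sigma), mode="same")

        # Scale analysis: keep responses that persist across two Gaussian scales
        response = np.minimum(local_correlation(frame, 21, 6.0),
                              local_correlation(frame, 31, 9.0))
        mask = response > response.mean() + 3 * response.std()

        # Local edge detection delineates ROI boundaries on a lightly smoothed frame
        smooth = ndimage.gaussian_filter(frame, 2.0)
        edges = np.hypot(ndimage.sobel(smooth, axis=0), ndimage.sobel(smooth, axis=1))
        labels, n_roi = ndimage.label(mask)
        print("ROIs found:", n_roi, "| mean boundary strength:", float(edges[mask].mean()))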

  10. Reduced-order modeling of soft robots

    PubMed Central

    Chenevier, Jean; González, David; Aguado, J. Vicente; Chinesta, Francisco

    2018-01-01

    We present a general strategy for the modeling and simulation-based control of soft robots. Although the presented methodology is completely general, we restrict ourselves to the analysis of a model robot made of hyperelastic materials and actuated by cables or tendons. To comply with the stringent real-time constraints imposed by control algorithms, a reduced-order modeling strategy is proposed that minimizes the amount of online CPU cost. Instead, an offline training procedure is used to determine a sort of response surface that characterizes the response of the robot. Contrary to existing strategies, the proposed methodology allows for a fully non-linear modeling of the soft material in a hyperelastic setting as well as a fully non-linear kinematic description of the movement, without any restrictions or simplifying assumptions. Examples of different configurations of the robot were analyzed that show the appeal of the method. PMID:29470496

  11. A novel hybrid ensemble learning paradigm for tourism forecasting

    NASA Astrophysics Data System (ADS)

    Shabri, Ani

    2015-02-01

    In this paper, a hybrid forecasting model based on Empirical Mode Decomposition (EMD) and the Group Method of Data Handling (GMDH) is proposed to forecast tourism demand. This methodology first decomposes the original visitor arrival series into several Intrinsic Mode Function (IMF) components and one residual component using the EMD technique. Then, the IMF components and the residual component are each forecasted using GMDH models whose input variables are selected using the Partial Autocorrelation Function (PACF). The final forecast for the tourism series is produced by aggregating all the component forecasts. For evaluating the performance of the proposed EMD-GMDH methodology, the monthly data of tourist arrivals from Singapore to Malaysia are used as an illustrative example. Empirical results show that the proposed EMD-GMDH model outperforms the EMD-ARIMA as well as the GMDH and ARIMA (Autoregressive Integrated Moving Average) models without time series decomposition.
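
    The decompose-forecast-aggregate pattern can be sketched as below, using the PyEMD package for the EMD step; a simple autoregression from statsmodels stands in for the GMDH forecaster, which is the one substantive substitution here. The data are synthetic, not the Singapore-Malaysia arrivals.

        import numpy as np
        from PyEMD import EMD
        from statsmodels.tsa.ar_model import AutoReg

        rng = np.random.default_rng(5)
        t = np.arange(240)                          # e.g. 20 years of monthly arrivals
        series = 100 + 0.3*t + 10*np.sin(2*np.pi*t/12) + 3*rng.standard_normal(t.size)

        # 1) decompose the arrival series into IMFs (the last row acts as the residual)
        imfs = EMD().emd(series)

        # 2) forecast each component separately (AR stands in for GMDH here)
        horizon = 12
        component_forecasts = [AutoReg(comp, lags=12).fit().forecast(steps=horizon)
                               for comp in imfs]

        # 3) aggregate the component forecasts into the final tourism-demand forecast
        forecast = np.sum(component_forecasts, axis=0)
        print(np.round(forecast, 1))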

  12. Coordinated Dynamic Behaviors for Multirobot Systems With Collision Avoidance.

    PubMed

    Sabattini, Lorenzo; Secchi, Cristian; Fantuzzi, Cesare

    2017-12-01

    In this paper, we propose a novel methodology for achieving complex dynamic behaviors in multirobot systems. In particular, we consider a multirobot system partitioned into two subgroups: 1) dependent and 2) independent robots. Independent robots are utilized as a control input, and their motion is controlled in such a way that the dependent robots solve a tracking problem, that is, following arbitrarily defined setpoint trajectories, in a coordinated manner. The control strategy proposed in this paper explicitly addresses the collision avoidance problem, utilizing a null-space-based behavioral approach: this leads to combining, in a non-conflicting manner, the tracking control law with a collision avoidance strategy. The combination of these control actions allows the robots to execute their task in a safe way. Avoidance of collisions is formally proven in this paper, and the proposed methodology is validated by means of simulations and experiments on real robots.
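
    The null-space-based behavioral (NSB) combination can be sketched in a few lines: the higher-priority collision-avoidance velocity is kept in full, and the tracking velocity is projected onto the null space of the avoidance-task Jacobian so the two never conflict. The 2-D single-robot geometry and gains below are an illustration of the NSB principle, not the paper's multirobot control law.

        import numpy as np

        def nsb_velocity(p, p_goal, p_obstacle, safe_dist=1.0, k_track=1.0, k_avoid=1.0):
            """Combine avoidance (priority 1) and tracking (priority 2), NSB-style."""
            v_track = k_track * (p_goal - p)                 # secondary task
            d_vec = p - p_obstacle
            d = np.linalg.norm(d_vec)
            if d >= safe_dist:                               # avoidance inactive
                return v_track
            J = (d_vec / d).reshape(1, 2)                    # Jacobian of distance task
            v_avoid = k_avoid * (safe_dist - d) * J.ravel()  # push back to safe distance
            N = np.eye(2) - J.T @ J                          # null-space projector
            return v_avoid + N @ v_track                     # avoidance + filtered tracking

        p = np.array([0.0, 0.0])
        goal, obstacle = np.array([5.0, 0.0]), np.array([2.5, 0.1])
        for _ in range(200):                                 # simple Euler integration
            p = p + 0.05 * nsb_velocity(p, goal, obstacle)
        print("final position:", np.round(p, 2))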

  13. 'Worst case' methodology for the initial assessment of societal risk from proposed major accident installations.

    PubMed

    Carter, D A; Hirst, I L

    2000-01-07

    This paper considers the application of one of the weighted risk indicators used by the Major Hazards Assessment Unit (MHAU) of the Health and Safety Executive (HSE) in formulating advice to local planning authorities on the siting of new major accident hazard installations. In such cases the primary consideration is to ensure that the proposed installation would not be incompatible with existing developments in the vicinity, as identified by the categorisation of the existing developments and the estimation of individual risk values at those developments. In addition, a simple methodology, described here, based on MHAU's "Risk Integral" and a single "worst case" event analysis, is used to enable the societal risk aspects of the hazardous installation to be considered at an early stage of the proposal, and to determine the degree of analysis that will be necessary to enable HSE to give appropriate advice.

  14. Fast underdetermined BSS architecture design methodology for real time applications.

    PubMed

    Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R

    2015-01-01

    In this paper, we propose a high-speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high-speed Discrete Hilbert Transform (DHT), targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real-time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute an M-point DHT, which uses an N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology at VDD = 1 V and a 1 MHz clock frequency. The implementation and experimental comparison results show that the proposed DHT design is two times faster than the state-of-the-art architecture.
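
    The DHT building block itself is easy to prototype in software before committing to hardware: the sketch below computes an N-point discrete Hilbert transform with the FFT-based analytic-signal construction and checks it against scipy.signal.hilbert. It mirrors only the mathematics, not the sub-matrix-multiplication hardware architecture of the paper.

        import numpy as np
        from scipy.signal import hilbert

        def dht(x):
            """Discrete Hilbert transform via the analytic-signal FFT construction."""
            n = x.size
            h = np.zeros(n)
            h[0] = 1.0
            if n % 2 == 0:
                h[n // 2] = 1.0
                h[1:n // 2] = 2.0
            else:
                h[1:(n + 1) // 2] = 2.0
            analytic = np.fft.ifft(np.fft.fft(x) * h)
            return analytic.imag                      # Hilbert transform of x

        x = np.cos(2 * np.pi * 5 * np.arange(256) / 256)
        print(np.allclose(dht(x), hilbert(x).imag))   # True: matches SciPy's reference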

  15. A top-down design methodology and its implementation for VCSEL-based optical links design

    NASA Astrophysics Data System (ADS)

    Li, Jiguang; Cao, Mingcui; Cai, Zilong

    2005-01-01

    In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of a complete system is required. It is very important to be able to simulate at both the system level and the detailed model level. This kind of model is feasible thanks to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level to the device level. To design a hierarchical model for VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in the top-down methodology used for optoelectronic systems technology are also presented.

  16. A Wavelet-Based Methodology for Grinding Wheel Condition Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, T. W.; Ting, C.F.; Qu, Jun

    2007-01-01

    Grinding wheel surface condition changes as more material is removed. This paper presents a wavelet-based methodology for grinding wheel condition monitoring based on acoustic emission (AE) signals. Grinding experiments in creep feed mode were conducted to grind alumina specimens with a resinoid-bonded diamond wheel using two different conditions. During the experiments, AE signals were collected when the wheel was 'sharp' and when the wheel was 'dull'. Discriminant features were then extracted from each raw AE signal segment using the discrete wavelet decomposition procedure. An adaptive genetic clustering algorithm was finally applied to the extracted features in order to distinguish different states of grinding wheel condition. The test results indicate that the proposed methodology can achieve 97% clustering accuracy for the high material removal rate condition, 86.7% for the low material removal rate condition, and 76.7% for the combined grinding conditions if the base wavelet, the decomposition level, and the GA parameters are properly selected.
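
    The feature-extraction step can be sketched with PyWavelets: each AE segment is decomposed with a discrete wavelet transform and per-band energies become the discriminant features. Plain K-means stands in here for the paper's adaptive genetic clustering algorithm, and the signals, wavelet and level are illustrative assumptions.

        import numpy as np
        import pywt
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(6)

        def ae_segment(sharp):
            # Synthetic AE stand-in: a "dull" wheel emits more high-frequency burst energy
            t = np.arange(1024)
            burst = (0.2 if sharp else 1.0) * np.sin(2*np.pi*0.35*t) * (rng.random(t.size) < 0.1)
            return np.sin(2*np.pi*0.02*t) + burst + 0.1*rng.standard_normal(t.size)

        def wavelet_energy_features(sig, wavelet="db4", level=4):
            coeffs = pywt.wavedec(sig, wavelet, level=level)   # [cA4, cD4, cD3, cD2, cD1]
            return [np.sum(c**2) for c in coeffs]              # energy per sub-band

        X = [wavelet_energy_features(ae_segment(sharp=s)) for s in [True]*30 + [False]*30]
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print("cluster assignments for sharp segments:", labels[:30])
        print("cluster assignments for dull segments: ", labels[30:])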

  17. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are inefficient and their calculation cost is high, since they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and the tsunami inundation depth (y) as the response variable. Tsunami risk can then be evaluated by conducting a Monte Carlo simulation, assuming that the occurrence of earthquakes follows a Poisson distribution, the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probabilistic distributions for earthquake generation, inundation depth, and vulnerability, and based on these distributions we conducted Monte Carlo simulations of 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface and Monte Carlo simulation without conducting multiple tsunami numerical simulations.
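
    Because the abstract gives the fitted response surface explicitly, the Monte Carlo step can be sketched almost directly; the code below uses the published coefficients but invents the fault-parameter ranges, the annual earthquake rate and the lognormal fragility parameters, none of which the abstract reports.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        a, b, c = 0.2615, 3.1763, -1.1802             # response surface y = a*x1 + b*x2 + c

        n_events = 1_000_000                           # one event per simulated scenario
        x1 = rng.uniform(5.0, 30.0, n_events)          # fault depth, km (assumed range)
        x2 = rng.uniform(0.5, 3.0, n_events)           # fault slip, m  (assumed range)
        depth = np.clip(a * x1 + b * x2 + c, 1e-6, None)   # inundation depth, m

        # Lognormal fragility of a wood building (assumed median and beta)
        median, beta = 2.0, 0.5
        p_damage = stats.norm.cdf(np.log(depth / median) / beta)
        print(f"P(damage | earthquake) ~= {p_damage.mean():.3f}")

        # Combine with a Poisson occurrence model for an annual risk estimate
        annual_rate = 1 / 500                          # assumed mean recurrence interval
        print(f"annual damage probability ~= {1 - np.exp(-annual_rate * p_damage.mean()):.5f}")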

  18. Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach

    NASA Astrophysics Data System (ADS)

    Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh

    2017-03-01

    Retreat mining is always accompanied by a great number of accidents, most of them due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main defect of this method is its ignorance of subjective uncertainties due to the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remove this defect and improve the method, in this paper a novel methodology is presented to assess RFS using a fuzzy approach. The application of the fuzzy approach provides an effective tool to handle the subjective uncertainties. Furthermore, the fuzzy analytical hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors in the development of this method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of the Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.

  19. A Generalizable Methodology for Quantifying User Satisfaction

    NASA Astrophysics Data System (ADS)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors like loudness and sidetone can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
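
    The core of the methodology, treating session time as a survival time, can be sketched with the lifelines library: a Kaplan-Meier curve summarizes how long users stay, and a Cox model ties the hazard (the propensity to leave) to performance factors such as network delay. The data and the delay covariate below are synthetic assumptions, not measurements from the case studies.

        import numpy as np
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        rng = np.random.default_rng(8)
        n = 1000
        delay_ms = rng.uniform(20, 300, n)                 # hypothetical QoS factor
        # Synthetic sessions: higher delay -> higher hazard of leaving -> shorter stays
        session_min = rng.exponential(60.0 / (1.0 + delay_ms / 100.0))
        observed = rng.random(n) < 0.9                     # a few sessions are censored

        km = KaplanMeierFitter().fit(session_min, event_observed=observed)
        print("median session time:", round(km.median_survival_time_, 1), "min")

        df = pd.DataFrame({"t": session_min, "e": observed, "delay_ms": delay_ms})
        cph = CoxPHFitter().fit(df, duration_col="t", event_col="e")
        print(cph.summary[["coef", "exp(coef)", "p"]])     # impact of delay on leaving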

  20. Extending the essential dynamics analysis to investigate molecular properties: application to the redox potential of proteins.

    PubMed

    Zanetti-Polzi, Laura; Corni, Stefano; Daidone, Isabella; Amadei, Andrea

    2016-07-21

    Here, a methodology is proposed to investigate the collective fluctuation modes of an arbitrary set of observables that maximally contribute to the fluctuation of another, functionally relevant observable. The methodology, based on the analysis of fully classical molecular dynamics (MD) simulations, exploits the essential dynamics (ED) method, originally developed to analyse the collective motions in proteins. We apply this methodology to identify the residues that are most relevant in determining the reduction potential (E(0)) of a redox-active protein. To this aim, the fluctuation modes of the single-residue electrostatic potentials mostly contributing to the fluctuations of the total electrostatic potential (the main determinant of E(0)) are investigated for wild-type azurin and two of its mutants with a higher E(0). By comparing the results obtained here with a previous study on the same systems [Zanetti-Polzi et al., Org. Biomol. Chem., 2015, 13, 11003] we show that the proposed methodology is able to identify the key sites that determine E(0). This information can be used for a deeper understanding of the molecular mechanisms underlying the redox properties of the proteins under investigation, as well as for the rational design of mutants with a higher or lower E(0). From the results of the present analysis we propose a new azurin mutant that, according to our calculations, shows a further increase of E(0).
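
    In spirit, the analysis diagonalizes the covariance matrix of the per-residue potentials and ranks the resulting collective modes by how much they contribute to the fluctuation of the total potential, since var(total) = 1^T C 1 = sum_k lambda_k (sum_i u_ki)^2. The sketch below implements that idea on random data with a planted collective mode; it is a schematic of the ED-style analysis, not the authors' exact estimator.

        import numpy as np

        rng = np.random.default_rng(9)
        n_frames, n_res = 5000, 20
        # Stand-in for per-residue electrostatic potentials along an MD trajectory,
        # with two residues fluctuating coherently (a hidden collective mode)
        v = rng.standard_normal((n_frames, n_res))
        shared = rng.standard_normal(n_frames)
        v[:, 3] += 2.0 * shared
        v[:, 11] += 1.5 * shared

        total = v.sum(axis=1)                        # total potential (determines E0)
        dv = v - v.mean(axis=0)
        C = np.cov(dv, rowvar=False)                 # residue-residue covariance

        # ED-style analysis: eigenmodes of C, ranked by contribution to var(total)
        lam, U = np.linalg.eigh(C)
        contrib = lam * (U.sum(axis=0) ** 2)
        top = np.argmax(contrib)
        print("share of var(total) from top mode:", round(contrib[top] / total.var(), 2))
        print("most involved residues:", np.argsort(-np.abs(U[:, top]))[:3])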

  1. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of presenting the results of the analysis of users' information needs and the rationale for the use of classifiers.

  2. General implementation of arbitrary nonlinear quadrature phase gates

    NASA Astrophysics Data System (ADS)

    Marek, Petr; Filip, Radim; Ogawa, Hisashi; Sakaguchi, Atsushi; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira

    2018-02-01

    We propose a general methodology for deterministic single-mode quantum interactions that nonlinearly modify a single quadrature variable of a continuous-variable system. The methodology is based on linear coupling of the system to ancillary systems that are subsequently measured by quadrature detectors. The nonlinear interaction is obtained by using the data from the quadrature detection for dynamical manipulation of the coupling parameters. This measurement-induced methodology enables the direct realization of arbitrary nonlinear quadrature interactions without the need to construct them from lowest-order gates. Such nonlinear interactions are crucial for more practical and efficient manipulation of continuous quadrature variables as well as of qubits encoded in continuous-variable systems.

  3. 78 FR 50111 - Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-16

    ... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research to Support the National...: Methodological Research to Support the National Crime Victimization Survey: Self-Report Data on Rape and Sexual...

  4. Value-Based Assessment of New Medical Technologies: Towards a Robust Methodological Framework for the Application of Multiple Criteria Decision Analysis in the Context of Health Technology Assessment.

    PubMed

    Angelis, Aris; Kanavos, Panos

    2016-05-01

    In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.

  5. Variability-aware compact modeling and statistical circuit validation on SRAM test array

    NASA Astrophysics Data System (ADS)

    Qiao, Ying; Spanos, Costas J.

    2016-03-01

    Variability modeling at the compact transistor model level can enable statistically optimized designs in view of limitations imposed by the fabrication technology. In this work we propose a variability-aware compact model characterization methodology based on stepwise parameter selection. Transistor I-V measurements are obtained from a bit-transistor-accessible SRAM test array fabricated using a collaborating foundry's 28nm FDSOI technology. Our in-house customized Monte Carlo simulation bench can incorporate these statistical compact models, and simulation results on SRAM writability performance are very close to measurements in distribution estimation. Our proposed statistical compact model parameter extraction methodology also has the potential to predict non-Gaussian behavior in statistical circuit performance through mixtures of Gaussian distributions.

  6. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of a numerical simulation by estimating the numerical approximation error, computational-model-induced errors and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.

  7. A Design Methodology for Medical Processes

    PubMed Central

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in a patient's care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer actual needs in healthcare practice. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful for creating high-quality models and for detecting and taking into account relevant and tricky situations that can occur during process execution. PMID:27081415

  8. Kernel PLS-SVC for Linear and Nonlinear Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Trejo, Leonard J.; Matthews, Bryan

    2003-01-01

    A new methodology for discrimination is proposed. It is based on kernel orthonormalized partial least squares (PLS) dimensionality reduction of the original data space followed by support vector machines for classification. The close connection of orthonormalized PLS with Fisher's approach to linear discrimination, or equivalently with canonical correlation analysis, is described. This motivates the preference for orthonormalized PLS over principal component analysis. The good behavior of the proposed method is demonstrated on 13 different benchmark data sets and on the real-world problem of classifying finger-movement periods versus non-movement periods based on electroencephalogram recordings.
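
    The two-stage scheme maps directly onto scikit-learn: a PLS step reduces the data to a few latent components that are maximally covariant with the class labels, and an SVM classifies in that reduced space. The sketch uses the standard PLSRegression (not the kernel/orthonormalized variant of the paper) on a synthetic dataset, so it approximates rather than reproduces the method.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.svm import SVC
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split

        X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                                   random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        # Stage 1: PLS latent scores, maximally covariant with the (numeric) labels
        pls = PLSRegression(n_components=3).fit(X_tr, y_tr.astype(float))
        Z_tr, Z_te = pls.transform(X_tr), pls.transform(X_te)

        # Stage 2: SVM classification in the reduced latent space
        svc = SVC(kernel="rbf", C=1.0).fit(Z_tr, y_tr)
        print("test accuracy:", svc.score(Z_te, y_te).round(3))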

  9. Kepler: Analogies in the search for the law of refraction.

    PubMed

    Cardona, Carlos Alberto

    2016-10-01

    This paper examines the methodology used by Kepler to discover a quantitative law of refraction. The aim is to argue that this methodology follows a heuristic method based on the following two Pythagorean principles: (1) sameness is made known by sameness, and (2) harmony arises from establishing a limit to what is unlimited. We will analyse some of the author's proposed analogies to find the aforementioned law and argue that the investigation's heuristic pursues such principles. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Fuzzy Linear Programming and its Application in Home Textile Firm

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Ganesan, T.; Elamvazuthi, I.

    2011-06-01

    In this paper, a new fuzzy linear programming (FLP) based methodology using a specific membership function, named the modified logistic membership function, is proposed. The modified logistic membership function is first formulated, and its flexibility in capturing vagueness in parameters is established by an analytical approach. The developed FLP methodology provides confidence in its application to real-life industrial production planning problems. This approach to solving industrial production planning problems allows feedback among the decision maker, the implementer and the analyst.
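
    A compact sketch of the max-lambda (Zimmermann-style) formulation with a logistic-type membership function: profit above an aspiration level receives membership near 1, below a tolerance level near 0, and the optimizer maximizes the worst membership degree. The membership parameters and the two-variable production problem are invented, and the generic logistic form below only approximates the paper's modified logistic membership function.

        import numpy as np
        from scipy.optimize import minimize

        def logistic_membership(value, lo, hi, steepness=10.0):
            """Smooth S-shaped membership rising from ~0 at lo to ~1 at hi."""
            mid = 0.5 * (lo + hi)
            return 1.0 / (1.0 + np.exp(-steepness * (value - mid) / (hi - lo)))

        profit = np.array([3.0, 5.0])                 # profit per unit of products x1, x2
        labor = np.array([2.0, 4.0])                  # fuzzy resource: "roughly <= 100" h

        # Variables z = [x1, x2, lam]; maximize lam s.t. every membership >= lam
        cons = [
            {"type": "ineq",
             "fun": lambda z: logistic_membership(profit @ z[:2], 80.0, 140.0) - z[2]},
            {"type": "ineq",
             "fun": lambda z: logistic_membership(-(labor @ z[:2]), -115.0, -100.0) - z[2]},
        ]
        res = minimize(lambda z: -z[2], x0=[10.0, 10.0, 0.5],
                       bounds=[(0, None), (0, None), (0, 1)],
                       constraints=cons, method="SLSQP")

        x, lam = res.x[:2], res.x[2]
        print("plan:", np.round(x, 2), "| profit:", round(profit @ x, 1),
              "| labor:", round(labor @ x, 1), "| lambda:", round(lam, 3))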

  11. The methodology of the search for a correlated signal from a supernova explosion using the data of gravitational wave detectors and neutrino observatories

    NASA Astrophysics Data System (ADS)

    Gromov, M. B.

    2017-11-01

    The proposed methodology, developed in cooperation among the LIGO, VIRGO, Borexino, LVD, and IceCube collaborations, is based on a joint analysis of data from neutrino and gravitational wave detectors, which record the corresponding radiations, almost undistorted by the interstellar medium and propagating with similar speeds. This approach makes it possible to increase the reliability of observations, to detect so-called silent supernovae, and to explore the properties and generation mechanisms of gravitational waves.
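
    The core of such a joint analysis is a temporal coincidence search between independently produced trigger lists; the sketch below, with an invented 10 s window and toy timestamps, shows only that matching step, not the collaborations' full pipeline.

        import numpy as np

        def coincident_events(t_nu, t_gw, window=10.0):
            """Return (i, j) index pairs where a neutrino burst candidate t_nu[i]
            and a gravitational-wave trigger t_gw[j] lie within `window` seconds.

            Both time arrays are assumed sorted (e.g. GPS seconds).
            """
            pairs = []
            lo = np.searchsorted(t_gw, t_nu - window, side="left")
            hi = np.searchsorted(t_gw, t_nu + window, side="right")
            for i, (a, b) in enumerate(zip(lo, hi)):
                pairs.extend((i, j) for j in range(a, b))
            return pairs

        print(coincident_events(np.array([100.0, 500.0]),
                                np.array([96.0, 503.0, 900.0])))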

  12. How to convince your manager to invest in an HIS preimplementation methodology for appraisal of material, process and human costs and benefits.

    PubMed Central

    Bossard, B.; Renard, J. M.; Capelle, P.; Paradis, P.; Beuscart, M. C.

    2000-01-01

    Investing in information technology has become a crucial process in hospital management today. Medical and administrative managers are faced with difficulties in measuring medical information technology costs and benefits due to the complexity of the domain. This paper proposes a preimplementation methodology for evaluating and appraising material, process, and human costs and benefits. Based on an analysis of users' needs and organizational processes, the methodology provides an evaluative set of financial and non-financial indicators that can be integrated into a decision-making and investment-evaluation process. We describe the first results obtained after a few months of operation for the Computer-Based Patient Record (CPR) project. Its full acceptance, in spite of some difficulties, encourages us to extend the method to the entire project. PMID:11079851

  13. Integrating the human element into the systems engineering process and MBSE methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tadros, Michael Samir

    In response to the challenges related to the increasing size and complexity of systems, organizations have recognized the need to integrate human considerations in the beginning stages of systems development. Human Systems Integration (HSI) seeks to accomplish this objective by incorporating human factors within systems engineering (SE) processes and methodologies, which is the focus of this paper. A representative set of HSI methods from multiple sources are organized, analyzed, and mapped to the systems engineering Vee-model. These methods are then consolidated and evaluated against the SE process and the Model-Based Systems Engineering (MBSE) methodology to determine where and how they could integrate within systems development activities in the form of specific enhancements. Overall conclusions based on these evaluations are presented and future research areas are proposed.

  14. Temperature - Emissivity Separation Assessment in a Sub-Urban Scenario

    NASA Astrophysics Data System (ADS)

    Moscadelli, M.; Diani, M.; Corsini, G.

    2017-10-01

    In this paper, a methodology is presented that aims at evaluating the effectiveness of different temperature-emissivity separation (TES) strategies. The methodology takes into account the specific material of interest in the monitored scenario, sensor characteristics, and errors in the atmospheric compensation step. The methodology is proposed in order to predict and analyse algorithm performance during the planning of a remote sensing mission aimed at discovering specific materials of interest in the monitored scenario. As a case study, the proposed methodology is applied to a real airborne data set of a suburban scenario. To solve the TES problem, three state-of-the-art algorithms and a recently proposed one are investigated: the Temperature-Emissivity Separation '98 (TES-98) algorithm, the Stepwise Refining TES (SRTES) algorithm, the Linear Piecewise TES (LTES) algorithm, and the Optimized Smoothing TES (OSTES) algorithm. Finally, the accuracies obtained with real data and those predicted by means of the proposed methodology are compared and discussed.

  15. Operations Optimization of Hybrid Energy Systems under Variable Markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Jun; Garcia, Humberto E.

    Hybrid energy systems (HES) have been proposed as an important element in enabling the increasing penetration of clean energy. This paper investigates the operational flexibility of HES and develops a methodology for operations optimization to maximize their economic value based on predicted renewable generation and market information. The proposed operations optimizer allows systematic control of energy conversion for maximal economic value, and is illustrated by numerical results.
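
    For intuition, the dispatch decision alluded to above can be cast as a small linear program; the sketch below splits a fixed hourly generation profile between grid sales and a secondary product, with all prices and quantities invented for illustration (the paper's actual optimizer and market model are not reproduced).

        import numpy as np
        from scipy.optimize import linprog

        # Toy dispatch: split a fixed hourly generation profile between grid
        # sales (time-varying price) and a constant-price secondary product.
        gen    = np.array([80.0, 100.0, 120.0, 90.0])   # MWh available per hour
        p_grid = np.array([35.0, 60.0, 20.0, 45.0])     # $/MWh electricity price
        p_alt  = 40.0                                    # $/MWh secondary product
        T = len(gen)

        # Variables [x_grid, x_alt]; linprog minimizes, so negate the revenue.
        c = np.concatenate([-p_grid, -p_alt * np.ones(T)])
        A_eq = np.hstack([np.eye(T), np.eye(T)])        # x_grid[t] + x_alt[t] = gen[t]
        res = linprog(c, A_eq=A_eq, b_eq=gen, bounds=[(0, None)] * (2 * T))
        print("max revenue:", -res.fun, "grid sales per hour:", res.x[:T])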

  16. Topology-optimized broadband surface relief transmission grating

    NASA Astrophysics Data System (ADS)

    Andkjær, Jacob; Ryder, Christian P.; Nielsen, Peter C.; Rasmussen, Thomas; Buchwald, Kristian; Sigmund, Ole

    2014-03-01

    We propose a design methodology for systematic design of surface relief transmission gratings with optimized diffraction efficiency. The methodology is based on a gradient-based topology optimization formulation along with 2D frequency domain finite element simulations for TE and TM polarized plane waves. The goal of the optimization is to find a grating design that maximizes diffraction efficiency for the -1st transmission order when illuminated by unpolarized plane waves. Results indicate that a surface relief transmission grating can be designed with a diffraction efficiency of more than 40% in a broadband range going from the ultraviolet region, through the visible region and into the near-infrared region.

  17. SNMG: a social-level norm-based methodology for macro-governing service collaboration processes

    NASA Astrophysics Data System (ADS)

    Gao, Ji; Lv, Hexin; Jin, Zhiyong; Xu, Ping

    2017-08-01

    To adapt to the increasingly open nature of collaborations between enterprises, this paper proposes a Social-level Norm-based methodology for Macro-Governing service collaboration processes, called SNMG, to regulate and control the socially visible macro-behaviors of the individuals participating in collaborations. SNMG not only effectively removes the uncontrollability that hinders open social activities, but also enables cross-management-domain collaborations to be implemented by uniting the centralized controls that social individuals exercise over their respective social activities. This paper thereby provides a new system construction mode to promote the development and large-scale deployment of service collaborations.

  18. Set-membership fault detection under noisy environment with application to the detection of abnormal aircraft control surface positions

    NASA Astrophysics Data System (ADS)

    El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali

    2015-09-01

    The paper develops a set-membership detection methodology, which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained from data recorded in several flight scenarios of a highly representative aircraft benchmark.
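
    A minimal sketch of the set-membership test, under the assumption that the interval predictor supplies a prediction center and half-width and that a data-driven noise bound is available; all thresholds and data are illustrative, not the paper's aircraft benchmark.

        import numpy as np

        def interval_fault_flags(y_meas, y_pred, model_radius, noise_bound):
            """Flag samples whose measurement leaves the predicted interval.

            y_pred / model_radius : center and half-width of the interval predictor;
            noise_bound           : data-driven bound on the measurement noise.
            A fault is declared only when the measurement is inconsistent with
            every value the healthy model could have produced.
            """
            radius = model_radius + noise_bound
            return np.abs(y_meas - y_pred) > radius

        y_pred = np.array([1.0, 1.2, 1.1, 1.3])
        y_meas = np.array([1.05, 1.25, 2.0, 1.28])
        print(interval_fault_flags(y_meas, y_pred, model_radius=0.2, noise_bound=0.1))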

  19. White blood cells identification system based on convolutional deep neural learning networks.

    PubMed

    Shahin, A I; Guo, Yanhui; Amin, K M; Sharawi, Amr A

    2017-11-16

    White blood cell (WBC) differential counting yields valuable information about human health and disease. Currently available automated cell morphology equipment performs differential counts based on blood smear image analysis. Previous identification systems for WBCs consist of successive dependent stages: pre-processing, segmentation, feature extraction, feature selection, and classification. There is a real need to employ deep learning methodologies so that the performance of previous WBC identification systems can be increased. Classifying small, limited datasets through deep learning systems is a major challenge and should be investigated. In this paper, we propose a novel identification system for WBCs based on deep convolutional neural networks. Two methodologies based on transfer learning are followed: transfer learning based on deep activation features, and fine-tuning of existing deep networks. Deep activation features are extracted from several pre-trained networks and employed in a traditional identification system. Moreover, a novel end-to-end convolutional deep architecture called "WBCsNet" is proposed and built from scratch. Finally, classification of a limited, balanced WBC dataset is performed using WBCsNet as a pre-trained network. In our experiments, three different public WBC datasets (2551 images) containing 5 healthy WBC types were used. The overall system accuracy achieved by the proposed WBCsNet is 96.1%, which is higher than that of the different transfer learning approaches and of the previous traditional identification system. We also present visualizations of the WBCsNet activations, which show stronger responses than those of the pre-trained networks. In conclusion, a novel WBC identification system based on deep learning theory is proposed, and the high-performance WBCsNet can be employed as a pre-trained network. Copyright © 2017. Published by Elsevier B.V.
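
    The "deep activation features" route can be sketched with an off-the-shelf ImageNet backbone feeding a conventional classifier. The backbone choice, tensor shapes, and random stand-in data below are assumptions (the paper's WBCsNet architecture is not reproduced), and the weights= argument requires torchvision 0.13 or later.

        import numpy as np
        import torch
        import torchvision
        from sklearn.linear_model import LogisticRegression

        # Pre-trained backbone as a fixed feature extractor (any ImageNet model
        # would do; this is not the paper's WBCsNet).
        backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
        backbone.fc = torch.nn.Identity()       # keep 512-d penultimate activations
        backbone.eval()

        def deep_features(images):              # images: (N, 3, 224, 224) float tensor
            with torch.no_grad():
                return backbone(images).numpy()

        # Placeholder stand-ins for preprocessed blood-smear crops and labels.
        X = deep_features(torch.randn(16, 3, 224, 224))
        y = np.random.randint(0, 5, size=16)    # 5 healthy WBC types, as in the paper

        clf = LogisticRegression(max_iter=1000).fit(X, y)
        print("train accuracy on toy data:", clf.score(X, y))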

  20. Integrated assessment of urban drainage system under the framework of uncertainty analysis.

    PubMed

    Dong, X; Chen, J; Zeng, S; Zhao, D

    2008-01-01

    Due to rapid urbanization and the presence of a large number of aging urban infrastructures in China, the urban drainage system is facing the dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is being planned or re-designed. In this paper, an integrated assessment methodology is proposed based upon the approaches of the analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of the urban drainage system, and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China was implemented to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. By comparing their water quality impacts, ecological impacts, technological feasibility, and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical. (c) IWA Publishing 2008.
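
    The AHP step of such an assessment reduces to extracting priority weights from a pairwise comparison matrix; a minimal sketch follows, with an invented three-criterion judgment matrix (the paper's actual criteria hierarchy and judgments are not reproduced).

        import numpy as np

        def ahp_weights(P):
            """Priority weights from a pairwise comparison matrix (Saaty's AHP):
            principal eigenvector normalized to sum to 1, plus a consistency ratio."""
            vals, vecs = np.linalg.eig(P)
            k = np.argmax(vals.real)
            w = np.abs(vecs[:, k].real)
            w /= w.sum()
            n = P.shape[0]
            ci = (vals[k].real - n) / (n - 1)             # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # tabulated random index
            return w, ci / ri

        # Example: water quality judged 3x as important as cost, 2x as ecology.
        P = np.array([[1.0,   3.0, 2.0],
                      [1/3.0, 1.0, 0.5],
                      [0.5,   2.0, 1.0]])
        w, cr = ahp_weights(P)
        print("weights:", w.round(3), "consistency ratio:", round(cr, 3))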

  1. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imager, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired, time-encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275

  2. Empirically Guided Coordination of Multiple Evidence-Based Treatments: An Illustration of Relevance Mapping in Children's Mental Health Services

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.

    2011-01-01

    Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…

  3. Problem-Based Learning in Tertiary Education: Teaching Old "Dogs" New Tricks?

    ERIC Educational Resources Information Center

    Yeo, Roland K.

    2005-01-01

    Purpose--The paper sets out to explore the challenges of problem-based learning (PBL) in tertiary education and to propose a framework with implications for practice and learning. Design/Methodology/Approach--A total of 18 tertiary students divided into three groups participated in the focus group discussions. A quantitative instrument was used as…

  4. Toward a Web Based Environment for Evaluation and Design of Pedagogical Hypermedia

    ERIC Educational Resources Information Center

    Trigano, Philippe C.; Pacurar-Giacomini, Ecaterina

    2004-01-01

    We are working on a method, called CEPIAH. We propose a web based system used to help teachers to design multimedia documents and to evaluate their prototypes. Our current research objectives are to create a methodology to sustain the educational hypermedia design and evaluation. A module is used to evaluate multimedia software applied in…

  5. Implementing Motivational Features in Reactive Blended Learning: Application to an Introductory Control Engineering Course

    ERIC Educational Resources Information Center

    Mendez, J. A.; Gonzalez, E. J.

    2011-01-01

    This paper presents a significant advance in a reactive blended learning methodology applied to an introductory control engineering course. This proposal was based on the inclusion of a reactive element (a fuzzy-logic-based controller) designed to regulate the workload for each student according to his/her activity and performance. The…

  6. Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills

    ERIC Educational Resources Information Center

    Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko

    2012-01-01

    Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…

  7. Retrieval of an available water-based soil moisture proxy from thermal infrared remote sensing. Part I: Methodology and validation

    USDA-ARS?s Scientific Manuscript database

    A retrieval of soil moisture is proposed using surface flux estimates from satellite-based thermal infrared (TIR) imagery and the Atmosphere-Land-Exchange-Inversion (ALEXI) model. The ability of ALEXI to provide valuable information about the partitioning of the surface energy budget, which can be l...

  8. 75 FR 14165 - National Institute of Child Health and Human Development; Revision to Proposed Collection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... Information Collection: The purpose of the proposed methodological study is to evaluate the feasibility... the NCS, the multiple methodological studies conducted during the Vanguard phase will inform the... methodological study is identification of recruitment strategies and components of recruitment strategies that...

  9. 48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... used. If escalation is included, state the degree (percent) and methodology. The methodology shall.... If so, state the number required, the professional or technical level and the methodology used to... for which the salary is applicable; (C) List of other research Projects or proposals for which...

  10. Management of contaminated marine marketable resources after oil and HNS spills in Europe.

    PubMed

    Cunha, Isabel; Neuparth, Teresa; Moreira, Susana; Santos, Miguel M; Reis-Henriques, Maria Armanda

    2014-03-15

    Different risk evaluation approaches have been used to face oil and hazardous and noxious substances (HNS) spills all over the world. To minimize health risks and mitigate economic losses due to a long-term ban on the sale of sea products after a spill, it is essential to preemptively set risk evaluation criteria and standard methodologies based on previous experience and appropriate, scientifically sound criteria. Standard methodologies are analyzed and proposed in order to improve the definition of criteria for reintegrating previously contaminated marine marketable resources into the commercialization chain in Europe. The criteria used in former spills for closing fisheries and harvesting areas and for lifting bans are analyzed. European legislation was identified regarding food sampling, food chemical analysis, and maximum levels of contaminants allowed in seafood, which ought to be incorporated in the standard methodologies for the evaluation of the decision criteria defined for oil and HNS spills in Europe. A decision flowchart is proposed that opens the current decision criteria to new material that may be incorporated in the decision process. Decision criteria are discussed and compared among countries and incidents. An a priori definition of risk criteria and an elaboration of action plans are proposed to speed up actions that will lead to prompt final decisions. These decisions, based on the best available scientific data and leading to the lifting or imposition of bans on economic activity, will tend to be better understood and respected by citizens. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Wavelet-based de-noising algorithm for images acquired with parallel magnetic resonance imaging (MRI).

    PubMed

    Delakis, Ioannis; Hammad, Omer; Kitney, Richard I

    2007-07-07

    Wavelet-based de-noising has been shown to improve image signal-to-noise ratio in magnetic resonance imaging (MRI) while maintaining spatial resolution. Wavelet-based de-noising techniques typically implemented in MRI require that noise displays uniform spatial distribution. However, images acquired with parallel MRI have spatially varying noise levels. In this work, a new algorithm for filtering images with parallel MRI is presented. The proposed algorithm extracts the edges from the original image and then generates a noise map from the wavelet coefficients at finer scales. The noise map is zeroed at locations where edges have been detected and directional analysis is also used to calculate noise in regions of low-contrast edges that may not have been detected. The new methodology was applied on phantom and brain images and compared with other applicable de-noising techniques. The performance of the proposed algorithm was shown to be comparable with other techniques in central areas of the images, where noise levels are high. In addition, finer details and edges were maintained in peripheral areas, where noise levels are low. The proposed methodology is fully automated and can be applied on final reconstructed images without requiring sensitivity profiles or noise matrices of the receiver coils, therefore making it suitable for implementation in a clinical MRI setting.
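
    For orientation, classical wavelet soft-thresholding under the uniform-noise assumption can be sketched with PyWavelets as below; the paper's actual contribution (a spatially varying noise map zeroed at detected edges, with directional analysis) is not reproduced here.

        import numpy as np
        import pywt

        def wavelet_denoise(img, wavelet="db4", level=3):
            """Soft-threshold detail coefficients using a global noise estimate.

            This is the classical uniform-noise case; the algorithm in the paper
            additionally builds a spatially varying noise map and protects edges.
            """
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            # Robust noise estimate from the finest diagonal subband (MAD / 0.6745).
            sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
            thr = sigma * np.sqrt(2.0 * np.log(img.size))   # universal threshold
            den = [coeffs[0]] + [
                tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
                for detail in coeffs[1:]
            ]
            return pywt.waverec2(den, wavelet)

        noisy = np.random.randn(128, 128)                   # placeholder image
        print(wavelet_denoise(noisy).shape)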

  12. Determination of Time Dependent Virus Inactivation Rates

    NASA Astrophysics Data System (ADS)

    Chrysikopoulos, C. V.; Vogler, E. T.

    2003-12-01

    A methodology is developed for estimating temporally variable virus inactivation rate coefficients from experimental virus inactivation data. The methodology consists of a technique for slope estimation of normalized virus inactivation data in conjunction with a resampling parameter estimation procedure. The slope estimation technique is based on a relatively flexible geostatistical method known as universal kriging. Drift coefficients are obtained by nonlinear fitting of bootstrap samples and the corresponding confidence intervals are obtained by bootstrap percentiles. The proposed methodology yields more accurate time dependent virus inactivation rate coefficients than those estimated by fitting virus inactivation data to a first-order inactivation model. The methodology is successfully applied to a set of poliovirus batch inactivation data. Furthermore, the importance of accurate inactivation rate coefficient determination on virus transport in water saturated porous media is demonstrated with model simulations.
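
    A simple stand-in for the estimation idea, assuming a plain first-order fit with a pairs bootstrap in place of the paper's universal-kriging slope estimation; the time series below is synthetic, not the poliovirus data.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.array([0.0, 2.0, 4.0, 8.0, 12.0, 24.0])              # hours
        logC = np.array([0.0, -0.35, -0.72, -1.5, -2.1, -4.3])      # ln(C/C0), synthetic

        def rate(tt, yy):
            """First-order inactivation: ln(C/C0) = -lambda * t (LS slope)."""
            return -np.polyfit(tt, yy, 1)[0]

        def boot_sample():
            while True:                     # need at least two distinct times
                idx = rng.integers(0, len(t), len(t))
                if np.unique(t[idx]).size > 1:
                    return rate(t[idx], logC[idx])

        boot = np.array([boot_sample() for _ in range(2000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])                   # percentile CI
        print(f"lambda = {rate(t, logC):.3f} /h, 95% CI [{lo:.3f}, {hi:.3f}]")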

  13. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    NASA Astrophysics Data System (ADS)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent-oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation, and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories of AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. They demonstrate how AOM is useful for epidemiology and ecology studies, thereby further validating AOM in a qualitative manner.

  14. Finite Element Method-Based Kinematics and Closed-Loop Control of Soft, Continuum Manipulators.

    PubMed

    Bieze, Thor Morales; Largilliere, Frederick; Kruszewski, Alexandre; Zhang, Zhongkai; Merzouki, Rochdi; Duriez, Christian

    2018-06-01

    This article presents a modeling methodology and experimental validation for soft manipulators to obtain a forward kinematic model (FKM) and an inverse kinematic model (IKM) under quasi-static conditions (in the literature, these manipulators are usually classified as continuum robots; however, their main characteristic of interest in this article is that they create motion by deformation, as opposed to the classical use of articulations). It offers a way to obtain the kinematic characteristics of this type of soft robot that is suitable for offline path planning and position control. The modeling methodology presented relies on continuum mechanics, which does not provide analytic solutions in the general case. Our approach proposes a real-time numerical integration strategy based on the finite element method, with a numerical optimization based on Lagrange multipliers, to obtain the FKM and IKM. To reduce the dimension of the problem, at each step a projection of the model onto the constraint space (gathering actuators, sensors, and end-effector) is performed to obtain the smallest possible number of mathematical equations to be solved. This methodology is applied to obtain the kinematics of two different manipulators with complex structural geometry. An experimental comparison is also performed on one of the robots between two other geometric approaches and the approach showcased in this article. A closed-loop controller based on a state estimator is proposed. The controller is experimentally validated and its robustness is evaluated using the Lyapunov stability method.

  15. A GIS-based time-dependent seismic source modeling of Northern Iran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling, and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear (fault) sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 km by 400 km around Tehran. Previous research and reports were studied to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to a uniform magnitude scale; duplicate events and dependent shocks are removed. The completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
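
    For the area sources, the time-independent recurrence step amounts to fitting a Gutenberg-Richter relation to the catalog; a minimal sketch on a synthetic magnitude list follows (declustering and completeness analysis, which the study performs, are omitted here).

        import numpy as np

        # Synthetic catalog magnitudes, assumed complete above Mc = 4.0.
        mags = np.array([4.1, 4.3, 4.0, 5.2, 4.7, 4.4, 6.1, 4.9, 4.2, 5.5,
                         4.6, 4.0, 4.8, 5.0, 4.3])

        # Gutenberg-Richter law: log10 N(>=M) = a - b*M, fitted by least squares
        # on the empirical cumulative counts.
        m_bins = np.sort(np.unique(mags))
        n_cum = np.array([(mags >= m).sum() for m in m_bins])
        slope, a = np.polyfit(m_bins, np.log10(n_cum), 1)
        print(f"a = {a:.2f}, b = {-slope:.2f}")   # b near 1 is typical tectonically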

  16. Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment

    NASA Technical Reports Server (NTRS)

    Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.

    2009-01-01

    An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.

  17. Diagnosis of retrofit fatigue crack re-initiation and growth in steel-girder bridges for proactive repair and emergency planning.

    DOT National Transportation Integrated Search

    2014-07-01

    This report presents a vibration-based damage-detection methodology that is capable of effectively capturing crack growth near connections and crack re-initiation of retrofitted connections. The proposed damage detection algorithm...

  18. On the Application of Syntactic Methodologies in Automatic Text Analysis.

    ERIC Educational Resources Information Center

    Salton, Gerard; And Others

    1990-01-01

    Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…

  19. A Methodology for Calculating EGS Electricity Generation Potential Based on the Gringarten Model for Heat Extraction From Fractured Rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Augustine, Chad

    Existing methodologies for estimating the electricity generation potential of Enhanced Geothermal Systems (EGS) assume thermal recovery factors of 5% or less, resulting in relatively low volumetric electricity generation potentials for EGS reservoirs. This study proposes and develops a methodology for calculating EGS electricity generation potential based on the Gringarten conceptual model and analytical solution for heat extraction from fractured rock. The electricity generation potential of a cubic kilometer of rock as a function of temperature is calculated assuming limits on the allowed produced water temperature decline and reservoir lifetime based on surface power plant constraints. The resulting estimates of EGS electricity generation potential can be one to nearly two orders of magnitude larger than those from existing methodologies. The flow per unit fracture surface area from the Gringarten solution is found to be a key term in describing the conceptual reservoir behavior. The methodology can be applied to aid in the design of EGS reservoirs by giving minimum reservoir volume, fracture spacing, number of fractures, and flow requirements for a target reservoir power output. Limitations of the idealized model compared to actual reservoir performance and the implications on reservoir design are discussed.
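
    To see why recovery factors drive the headline numbers, a back-of-the-envelope heat-in-place calculation for one cubic kilometer of rock is sketched below; every constant is illustrative, and none of the Gringarten-based machinery of the report is reproduced.

        # Rough electric potential of 1 km^3 of hot rock; all values illustrative.
        rho_c  = 2.55e6      # volumetric heat capacity of granite, J/(m^3*K)
        volume = 1.0e9       # m^3 (one cubic kilometer)
        t_rock = 225.0       # initial rock temperature, deg C
        t_rej  = 140.0       # abandonment / rejection temperature, deg C
        recov  = 0.5         # assumed thermal recovery factor (well above 5%)
        eta    = 0.12        # thermal-to-electric conversion efficiency
        years  = 30.0 * 365.25 * 24 * 3600   # plant lifetime in seconds

        e_th = rho_c * volume * (t_rock - t_rej)   # recoverable heat in place, J
        p_e  = e_th * recov * eta / years          # average electric power, W
        print(f"average output: {p_e / 1e6:.1f} MWe over 30 years")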

  20. 77 FR 67363 - Sunshine Act Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... 20571. Open Agenda Items: Item No. 1: Proposed Economic Impact Procedures and Methodological Guidelines. Documentation including the proposed Economic Impact Procedures and Methodological Guidelines as well as the...

  1. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    PubMed Central

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés and, Luis G.; García Beltrán, Carlos Daniel

    2013-01-01

    This article proposes a virtual sensor for piecewise linear systems based on an observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents an active-mode detector for the case in which the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow estimation of its output. A methodology for testing the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the obtained results. PMID:23447007

  2. A New Local Bipolar Autoassociative Memory Based on External Inputs of Discrete Recurrent Neural Networks With Time Delay.

    PubMed

    Zhou, Caigen; Zeng, Xiaoqin; Luo, Chaomin; Zhang, Huaguang

    In this paper, local bipolar auto-associative memories are presented based on discrete recurrent neural networks with a class of gain type activation function. The weight parameters of neural networks are acquired by a set of inequalities without the learning procedure. The global exponential stability criteria are established to ensure the accuracy of the restored patterns by considering time delays and external inputs. The proposed methodology is capable of effectively overcoming spurious memory patterns and achieving memory capacity. The effectiveness, robustness, and fault-tolerant capability are validated by simulated experiments.

  3. Quad-polarized synthetic aperture radar and multispectral data classification using classification and regression tree and support vector machine-based data fusion system

    NASA Astrophysics Data System (ADS)

    Bigdeli, Behnaz; Pahlavani, Parham

    2017-01-01

    Interpretation of synthetic aperture radar (SAR) data is difficult because the geometry and spectral range of SAR differ from those of optical imagery. Consequently, SAR imaging can be a complementary data source to multispectral (MS) optical remote sensing techniques because it does not depend on solar illumination and weather conditions. This study presents a multisensor fusion of SAR and MS data based on the use of classification and regression trees (CART) and support vector machines (SVM) through a decision fusion system. First, different feature extraction strategies were applied to the SAR and MS data to produce additional spectral and textural information. To overcome redundancy and correlation between features, an intrinsic dimension estimation method based on the noise-whitened Harsanyi-Farrand-Chang approach determines the proper dimension of the features. Then, principal component analysis and independent component analysis were utilized on the stacked feature space of the two data sets. Afterward, SVM and CART classified each reduced feature space. Finally, a fusion strategy was utilized to fuse the classification results. To show the effectiveness of the proposed methodology, single-sensor classification of each data set was compared with the obtained results. A coregistered Radarsat-2 and WorldView-2 data set from San Francisco, USA, was available to examine the effectiveness of the proposed method. The results show that combining SAR data with optical sensor data based on the proposed methodology improves the classification results for most of the classes. The proposed fusion method provided overall accuracies of approximately 93.24% and 95.44% for two different areas of the data set.

  4. An efficient assisted history matching and uncertainty quantification workflow using Gaussian processes proxy models and variogram based sensitivity analysis: GP-VARS

    NASA Astrophysics Data System (ADS)

    Rana, Sachin; Ertekin, Turgay; King, Gregory R.

    2018-05-01

    Reservoir history matching is frequently viewed as an optimization problem that involves minimizing the misfit between simulated and observed data. Many gradient- and evolutionary-strategy-based optimization algorithms have been proposed to solve this problem, typically requiring a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study, which uses forward and inverse Gaussian process (GP) proxy models combined with a novel application of variogram analysis of response surfaces (VARS)-based sensitivity analysis to efficiently solve high-dimensional history matching problems. An empirical Bayes approach is proposed to optimally train the GP proxy models for any given data. History matching solutions are found via Bayesian optimization (BO) on the forward GP models and via predictions of the inverse GP model in an iterative manner. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology to the PUNQ-S3 reservoir is presented, in which it is shown that GP-VARS provides history match solutions in approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF, and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true values for the PUNQ-S3 reservoir.
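
    The forward-proxy half of the loop can be sketched with a GP surrogate and an expected-improvement search; the toy misfit function, kernel choice, and random candidate sampling below are assumptions, and the paper's inverse GP model and VARS step are not shown.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def misfit(x):                       # stand-in for a reservoir-simulation misfit
            return np.sum((x - 0.3) ** 2, axis=-1)

        rng = np.random.default_rng(1)
        X = rng.uniform(0, 1, size=(8, 2))   # initial designs (2 history-match params)
        y = misfit(X)

        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        for _ in range(15):
            gp.fit(X, y)
            cand = rng.uniform(0, 1, size=(512, 2))
            mu, sd = gp.predict(cand, return_std=True)
            best = y.min()
            z = (best - mu) / np.maximum(sd, 1e-9)
            ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
            x_new = cand[np.argmax(ei)]
            X, y = np.vstack([X, x_new]), np.append(y, misfit(x_new))

        print("best misfit:", y.min(), "at", X[np.argmin(y)])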

  5. A desirability-based multi objective approach for the virtual screening discovery of broad-spectrum anti-gastric cancer agents

    PubMed Central

    Sánchez-Rodríguez, Aminael; Tejera, Eduardo; Cruz-Monteagudo, Maykel; Borges, Fernanda; Cordeiro, M. Natália D. S.; Le-Thi-Thu, Huong; Pham-The, Hai

    2018-01-01

    Gastric cancer is the third leading cause of cancer-related mortality worldwide, and despite advances in prevention, diagnosis, and therapy, it is still regarded as a global health concern. The efficacy of therapies for gastric cancer is limited by a poor response to currently available therapeutic regimens. One of the reasons that may explain these poor clinical outcomes is the highly heterogeneous nature of this disease. In this sense, it is essential to discover new molecular agents capable of targeting various gastric cancer subtypes simultaneously. Here, we present a multi-objective approach for the ligand-based virtual screening discovery of chemical compounds simultaneously active against the gastric cancer cell lines AGS, NCI-N87, and SNU-1. The proposed approach relies on a novel methodology based on the development of ensemble models for bioactivity prediction against each individual gastric cancer cell line. The methodology includes the aggregation of one ensemble per cell line, using a desirability-based algorithm, into virtual screening protocols. Our research leads to the proposal of a multi-targeted virtual screening protocol able to achieve high enrichment of known chemicals with anti-gastric cancer activity. Specifically, our results indicate that, using the proposed protocol, it is possible to retrieve almost 20 times more multi-targeted compounds in the first 1% of the ranked list than would be expected from a uniform distribution of the active compounds in the virtual screening database. More importantly, the proposed protocol attains an outstanding initial enrichment of known multi-targeted anti-gastric cancer agents. PMID:29420638
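
    The desirability-based aggregation can be illustrated as a geometric mean of per-cell-line activity predictions, which rewards only compounds predicted active against all three lines; this is a generic Derringer-style sketch, not the authors' exact algorithm.

        import numpy as np

        def overall_desirability(scores):
            """Geometric mean of per-cell-line desirabilities.

            `scores` holds predicted activity probabilities in [0, 1] for each
            compound (rows) against each cell line (columns); a compound scores
            highly only if it is predicted active against all lines.
            """
            return np.prod(scores, axis=1) ** (1.0 / scores.shape[1])

        # Predicted P(active) against AGS, NCI-N87 and SNU-1 for three compounds.
        p = np.array([[0.9, 0.8, 0.7],
                      [0.9, 0.9, 0.1],    # fails on one line -> heavily penalized
                      [0.6, 0.6, 0.6]])
        d = overall_desirability(p)
        print("screening order:", np.argsort(-d), "scores:", d.round(3))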

  6. A new scenario-based approach to damage detection using operational modal parameter estimates

    NASA Astrophysics Data System (ADS)

    Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.

    2017-09-01

    In this paper, a vibration-based damage localization and quantification method based on natural frequencies and mode shapes is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of mass-normalized, experimentally determined mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA), combined with a reasonable finite element (FE) representation of the test structure, and implemented in a scenario-based framework. Besides a review of the basic methodology, this paper addresses fundamental theoretical as well as practical considerations that are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations concerning mass change.

  7. Scoping reviews: time for clarity in definition, methods, and reporting.

    PubMed

    Colquhoun, Heather L; Levac, Danielle; O'Brien, Kelly K; Straus, Sharon; Tricco, Andrea C; Perrier, Laure; Kastner, Monika; Moher, David

    2014-12-01

    The scoping review has become increasingly popular as a form of knowledge synthesis. However, a lack of consensus on scoping review terminology, definition, methodology, and reporting limits the potential of this form of synthesis. In this article, we propose recommendations to further advance the field of scoping review methodology. We summarize the current understanding of scoping review publication rates, terms, definitions, and methods. We propose three recommendations for clarity in terminology, definition, and methodology. We recommend adopting the terms "scoping review" or "scoping study" and the use of a proposed definition. Until such time as further guidance is developed, we recommend the use of the methodological steps outlined in the Arksey and O'Malley framework and further enhanced by Levac et al. The development of reporting guidance for the conduct and reporting of scoping reviews is underway. Consistency in the proposed domains and methodologies of scoping reviews, along with the development of reporting guidance, will facilitate methodological advancement, reduce confusion, facilitate collaboration, and improve knowledge translation of scoping review findings. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. A hybrid system identification methodology for wireless structural health monitoring systems based on dynamic substructuring

    NASA Astrophysics Data System (ADS)

    Dragos, Kosmas; Smarsly, Kay

    2016-04-01

    System identification has been employed in numerous structural health monitoring (SHM) applications. Traditional system identification methods usually rely on centralized processing of structural response data to extract information on structural parameters. However, in wireless SHM systems the centralized processing of structural response data introduces a significant communication bottleneck. Exploiting the merits of decentralization and on-board processing power of wireless SHM systems, many system identification methods have been successfully implemented in wireless sensor networks. While several system identification approaches for wireless SHM systems have been proposed, little attention has been paid to obtaining information on the physical parameters (e.g. stiffness, damping) of the monitored structure. This paper presents a hybrid system identification methodology suitable for wireless sensor networks based on the principles of component mode synthesis (dynamic substructuring). A numerical model of the monitored structure is embedded into the wireless sensor nodes in a distributed manner, i.e. the entire model is segmented into sub-models, each embedded into one sensor node corresponding to the substructure the sensor node is assigned to. The parameters of each sub-model are estimated by extracting local mode shapes and by applying the equations of the Craig-Bampton method on dynamic substructuring. The proposed methodology is validated in a laboratory test conducted on a four-story frame structure to demonstrate the ability of the methodology to yield accurate estimates of stiffness parameters. Finally, the test results are discussed and an outlook on future research directions is provided.

  9. A mixture gatekeeping procedure based on the Hommel test for clinical trial applications.

    PubMed

    Brechenmacher, Thomas; Xu, Jane; Dmitrienko, Alex; Tamhane, Ajit C

    2011-07-01

    When conducting clinical trials with hierarchically ordered objectives, it is essential to use multiplicity adjustment methods that control the familywise error rate in the strong sense while taking into account the logical relations among the null hypotheses. This paper proposes a gatekeeping procedure based on the Hommel (1988) test, which offers power advantages compared with other p-value-based tests proposed in the literature. A general description of the procedure is given, and details are presented on how it can be applied to complex clinical trial designs. Two clinical trial examples are given to illustrate the methodology developed in the paper.
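
    Within a single family of hypotheses, the Hommel adjustment itself is readily available in statsmodels, as sketched below with invented p-values; the gatekeeping logic that propagates significance between ordered families is not shown.

        from statsmodels.stats.multitest import multipletests

        # Raw p-values for one family of endpoints (illustrative numbers only).
        pvals = [0.011, 0.02, 0.045, 0.13]
        reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="hommel")
        print("adjusted p-values:", p_adj.round(4))
        print("rejected hypotheses:", reject)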

  10. Nearest unlike neighbor (NUN): an aid to decision confidence estimation

    NASA Astrophysics Data System (ADS)

    Dasarathy, Belur V.

    1995-09-01

    The concept of the nearest unlike neighbor (NUN), proposed and explored previously in the design of nearest neighbor (NN)-based decision systems, is further exploited in this study to develop a measure of confidence in the decisions made by NN-based decision systems. This measure of confidence, on the basis of comparison with a user-defined threshold, may be used to determine the acceptability of the decision provided by the NN-based decision system. The concepts, the associated methodology, and some illustrative numerical examples using the now-classical Iris data to bring out the ease of implementation and the effectiveness of the proposed innovations are presented.
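
    A minimal reading of the NUN confidence idea: compare each sample's distance to its nearest neighbor with its distance to its nearest unlike neighbor, and flag decisions where the ratio falls below a user-defined threshold. The ratio form below is an assumption, not necessarily the paper's exact measure; the Iris data matches the abstract's example.

        import numpy as np
        from sklearn.datasets import load_iris
        from sklearn.metrics import pairwise_distances

        X, y = load_iris(return_X_y=True)
        D = pairwise_distances(X)
        np.fill_diagonal(D, np.inf)               # exclude each point itself

        def nun_confidence(i):
            """Confidence of the NN decision for sample i: how much farther the
            nearest unlike neighbor (NUN) is than the nearest neighbor (NN)."""
            same = y == y[np.argmin(D[i])]        # class predicted by the NN rule
            d_nn = D[i][same].min()               # distance to nearest like sample
            d_nun = D[i][~same].min()             # distance to nearest unlike sample
            return d_nun / (d_nn + d_nun)         # in (0, 1]; higher means safer

        conf = np.array([nun_confidence(i) for i in range(len(X))])
        print("low-confidence samples:", np.where(conf < 0.6)[0])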

  11. Efficient Surface Enhanced Raman Scattering substrates from femtosecond laser based fabrication

    NASA Astrophysics Data System (ADS)

    Parmar, Vinod; Kanaujia, Pawan K.; Bommali, Ravi Kumar; Vijaya Prakash, G.

    2017-10-01

    A fast and simple femtosecond-laser-based methodology for the fabrication of efficient Surface-Enhanced Raman Scattering (SERS) substrates is proposed. Both nano-scaffold silicon (black silicon) and gold nanoparticles (Au-NPs) are fabricated by a femtosecond-laser-based technique suitable for mass production. The nano-rough silicon scaffold enables large electromagnetic fields from the localized surface plasmons of the decorating metallic nanoparticles. Thus, a giant enhancement (approximately of the order of 10^4) of the Raman signal arises from the mixed effects of electron-photon-phonon coupling, even at nanomolar concentrations of the test organic species (Rhodamine 6G). The proposed process demonstrates the low-cost and label-free applicability of these large-area SERS substrates.

  12. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

  13. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.

  14. Prediction of heat transfer coefficients for forced convective boiling of N2-hydrocarbon mixtures at cryogenic conditions using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Barroso-Maldonado, J. M.; Belman-Flores, J. M.; Ledesma, S.; Aceves, S. M.

    2018-06-01

    A key problem faced in the design of heat exchangers, especially for cryogenic applications, is the determination of convective heat transfer coefficients in two-phase flow such as condensation and boiling of non-azeotropic refrigerant mixtures. This paper proposes and evaluates three models for estimating the convective coefficient during boiling. These models are developed using computational intelligence techniques. The performance of the proposed models is evaluated using the mean relative error (mre), and compared to two existing models: the modified Granryd's correlation and the Silver-Bell-Ghaly method. The three proposed models are distinguished by their architecture. The first is based on directly measured parameters (DMP-ANN), the second is based on equivalent Reynolds and Prandtl numbers (eq-ANN), and the third on effective Reynolds and Prandtl numbers (eff-ANN). The results demonstrate that the proposed artificial neural network (ANN)-based approaches greatly outperform available methodologies. While Granryd's correlation predicts experimental data within a mean relative error mre = 44% and the S-B-G method produces mre = 42%, DMP-ANN has mre = 7.4% and eff-ANN has mre = 3.9%. Considering that eff-ANN has the lowest mean relative error (one tenth of previously available methodologies) and the broadest range of applicability, it is recommended for future calculations. Implementation is straightforward within a variety of platforms and the matrices with the ANN weights are given in the appendix for efficient programming.
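
    A minimal sketch of the regression task, assuming scikit-learn's MLPRegressor in place of the authors' networks and a synthetic Dittus-Boelter-like data set standing in for the cryogenic boiling measurements (the "eq-ANN" variant takes equivalent Reynolds and Prandtl numbers as inputs).

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        # Synthetic stand-in data: equivalent Reynolds / Prandtl numbers as inputs
        # and a Dittus-Boelter-like trend as a fake heat transfer coefficient.
        Re = rng.uniform(1e4, 2e5, 500)
        Pr = rng.uniform(1.0, 6.0, 500)
        h = 0.023 * Re**0.8 * Pr**0.4 + rng.normal(0.0, 2.0, 500)

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16, 16),
                                           max_iter=5000, random_state=0))
        model.fit(np.column_stack([Re, Pr]), h)
        pred = model.predict(np.column_stack([Re, Pr]))
        print(f"training mre = {100 * np.mean(np.abs(pred - h) / h):.1f}%")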

  15. Ex-ante and ex-post measurement of equality of opportunity in health: a normative decomposition.

    PubMed

    Donni, Paolo Li; Peragine, Vito; Pignataro, Giuseppe

    2014-02-01

    This paper proposes and discusses two different approaches to the definition of inequality in health: the ex-ante and the ex-post approach. It proposes strategies for measuring inequality of opportunity in health based on the path-independent Atkinson inequality index. The proposed methodology is illustrated using data from the British Household Panel Survey; the results suggest that in the period 2000-2005, at least one-third of the observed health inequalities in the UK were inequalities of opportunity. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Reflection-induced linear polarization rotation and phase modulation between orthogonal waves for refractive index variation measurement.

    PubMed

    Twu, Ruey-Ching; Wang, Jhao-Sheng

    2016-04-01

    An optical phase interrogation method is proposed to study reflection-induced linear polarization rotation in a common-path homodyne interferometer. This optical methodology can also be applied to the measurement of the refractive index variation of a liquid solution. The performance of the refractive index sensing structure is discussed theoretically, and the experimental results demonstrate the very good capability of the proposed schemes. Compared with a conventional common-path heterodyne interferometer, the proposed homodyne interferometer with only a single channel reduces the usage of optical elements.

  17. A Framework for Spatial Interaction Analysis Based on Large-Scale Mobile Phone Data

    PubMed Central

    Li, Weifeng; Cheng, Xiaoyun; Guo, Gaohua

    2014-01-01

    The overall understanding of spatial interaction and exact knowledge of its dynamic evolution are required in urban planning and transportation planning. This study aimed to analyze spatial interaction based on large-scale mobile phone data. This newly available mass dataset requires a new methodology compatible with its peculiar characteristics. A three-stage framework is proposed in this paper, comprising data preprocessing, critical activity identification, and spatial interaction measurement. The proposed framework introduces frequent pattern mining and measures spatial interaction through the obtained associations. A case study of three communities in Shanghai was carried out as a verification of the proposed method and a demonstration of its practical application. The spatial interaction patterns and the representative features proved the rationality of the proposed framework. PMID:25435865

  18. [Priority research agendas: a strategic resource for health in Latin America].

    PubMed

    Becerra-Posada, Francisco; de Snyder, Nelly Salgado; Cuervo, Luis Gabriel; Montorzi, Gabriela

    2014-12-01

    Understand and analyze the procedures used to create national integrated research agendas from 2007 to 2011 in Argentina, Guatemala, Mexico, Panama, and Paraguay. Descriptive, cross-sectional study using an online survey of agenda preparation processes; specifically, development, integration, implementation, and use and dissemination of the agenda. The 45 respondents reported following specific methodologies for agenda construction and had a good opinion of organizational aspects with regard to the prior information provided and the balance among disciplines and stakeholders. Some 60% considered the coordinators impartial, although 25% mentioned biases favoring some subject; 42% received technical support from consultants, reading matter, and methodological guidelines; 40% engaged in subject-matter priority-setting; and 55% confirmed dissemination and communication of the agenda. However, only 22% reported inclusion of agenda topics in national calls for research proposals. In the countries studied, development of the health research agenda was characterized by prior planning and appropriate organization to achieve consensus-based outcomes. Nevertheless, the agendas were not used in national calls for research proposals, reflecting a lack of coordination in national health research systems and a lack of connection between funders and researchers. It is recommended that stakeholders strengthen integration and advocacy efforts to modify the processes and structures of agenda-based calls for research proposals.

  19. Developing Mobile- and BIM-Based Integrated Visual Facility Maintenance Management System

    PubMed Central

    Su, Yu-Chih

    2013-01-01

    Facility maintenance management (FMM) has become an important topic for research on the operation phase of the construction life cycle. Managing FMM effectively is extremely difficult owing to various factors and environments. One of the difficulties is the limited performance of 2D graphics when depicting maintenance services. Building information modeling (BIM) uses precise geometry and relevant data to support the maintenance service of facilities depicted in 3D object-oriented CAD. This paper proposes a new and practical methodology with application to FMM using BIM technology. Using BIM technology, this study proposes a BIM-based facility maintenance management (BIMFMM) system for maintenance staff in the operation and maintenance phase. The BIMFMM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FMM practice. Using the BIMFMM system, maintenance staff can access and review 3D BIM models for updating related maintenance records in a digital format. Moreover, this study presents a generic system architecture and its implementation. The combined results demonstrate that a BIMFMM-like system can be an effective visual FMM tool. PMID:24227995

  20. Measurement-based auralization methodology for the assessment of noise mitigation measures

    NASA Astrophysics Data System (ADS)

    Thomas, Pieter; Wei, Weigang; Van Renterghem, Timothy; Botteldooren, Dick

    2016-09-01

    The effect of noise mitigation measures is generally expressed by noise levels only, neglecting the listener's perception. In this study, an auralization methodology is proposed that enables an auditory preview of noise abatement measures for road traffic noise, based on the direction-dependent attenuation of a priori recordings made with a dedicated 32-channel spherical microphone array. This measurement-based auralization has the advantage that all non-road traffic sounds that create the listening context are present. The potential of this auralization methodology is evaluated through the assessment of the effect of an L-shaped mound. The angular insertion loss of the mound is estimated by using the ISO 9613-2 propagation model, the Pierce barrier diffraction model and the Harmonoise point-to-point model. The realism of the auralization technique is evaluated by listening tests, indicating that listeners had great difficulty in differentiating between a posteriori recordings and auralized samples, which shows the validity of the followed approaches.

  1. The determination of operational and support requirements and costs during the conceptual design of space systems

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles; Beasley, Kenneth D.

    1992-01-01

    The first year of research to provide NASA support in predicting operational and support parameters and costs of proposed space systems is reported. Some of the specific research objectives were (1) to develop a methodology for deriving reliability and maintainability parameters and, based upon their estimates, determine the operational capability and support costs, and (2) to identify data sources and establish an initial data base to implement the methodology. Implementation of the methodology is accomplished through the development of a comprehensive computer model. While the model appears to work reasonably well when applied to aircraft systems, it was not accurate when used for space systems. The model is dynamic and should be updated as new data become available. It is particularly important to integrate the current aircraft data base with data obtained from the Space Shuttle and other space systems since subsystems unique to a space vehicle require data not available from aircraft. This research only addressed the major subsystems on the vehicle.

  2. An entropy-based method for determining the flow depth distribution in natural channels

    NASA Astrophysics Data System (ADS)

    Moramarco, Tommaso; Corato, Giovanni; Melone, Florisa; Singh, Vijay P.

    2013-08-01

    A methodology for determining the bathymetry of river cross-sections during floods from sampled surface flow velocity and existing low-flow hydraulic data is developed. Similar to Chiu (1988), who proposed an entropy-based velocity distribution, the flow depth distribution in a cross-section of a natural channel is derived by entropy maximization. The depth distribution depends on one parameter, whose estimate is straightforward, and on the maximum flow depth. Applied to a velocity data set of five river gage sites, the method modeled the flow area observed during flow measurements and accurately assessed the corresponding discharge by coupling the flow depth distribution and the entropic relation between mean velocity and maximum velocity. The methodology unfolds a new perspective for flow monitoring by remote sensing, considering that the two main quantities on which the methodology is based, i.e., surface flow velocity and flow depth, might potentially be sensed by new sensors operating aboard an aircraft or satellite.
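
    The entropic relation between mean and maximum velocity that the method couples with the depth distribution is the classical one of Chiu (1988). A minimal sketch, with a hypothetical entropy parameter M, maximum velocity and flow area:

        import numpy as np

        def entropic_velocity_ratio(M):
            """Chiu's (1988) entropic relation:
            u_mean / u_max = e^M / (e^M - 1) - 1 / M."""
            return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

        # Hypothetical gauge-site values: M is calibrated from historical
        # pairs of mean and maximum velocity at the site
        M, u_max, area = 2.1, 3.2, 185.0      # [-], m/s, m^2
        discharge = entropic_velocity_ratio(M) * u_max * area
        print(f"Q = {discharge:.1f} m3/s")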

  3. On Robust Methodologies for Managing Public Health Care Systems

    PubMed Central

    Nimmagadda, Shastri L.; Dreher, Heinz V.

    2014-01-01

    Authors focus on ontology-based multidimensional data warehousing and mining methodologies, addressing various issues on organizing, reporting and documenting diabetic cases and their associated ailments, including causalities. Map and other diagnostic data views, depicting similarity and comparison of attributes, extracted from warehouses, are used for understanding the ailments, based on gender, age, geography, food-habits and other hereditary event attributes. In addition to rigor on data mining and visualization, an added focus is on the value of interpreting data views from processed full-bodied diagnosis, subsequent prescription and appropriate medications. The proposed methodology is a robust back-end application for web-based patient-doctor consultations and e-Health care management systems, through which billions of dollars spent on medical services can be saved, in addition to improving the quality of life and average life span of a person. Government health departments and agencies, private and government medical practitioners including social welfare organizations are typical users of these systems. PMID:24445953

  4. Analyzing parameters optimisation in minimising warpage on side arm using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.

    2017-09-01

    This paper presents a systematic methodology to analyse the warpage of the side arm part using Autodesk Moldflow Insight software. Response Surface Methodology (RSM) was proposed to optimise the processing parameters and efficiently minimise the warpage of the side arm part. The variable parameters considered in this study were based on the most significant parameters affecting warpage reported by previous researchers, namely melt temperature, mould temperature and packing pressure, with packing time and cooling time added as parameters commonly used by researchers. The results show that warpage was improved by 10.15% and that the most significant parameter affecting warpage is packing pressure.
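
    A minimal sketch of the RSM step: fit a full quadratic response surface to designed runs and search it for the minimum-warpage setting. The run matrix and warpage values below are hypothetical (a real Box-Behnken design in five factors has many more runs), and the factor ranges are assumptions.

        import numpy as np
        from itertools import product
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression

        # Hypothetical runs: [melt T (C), mould T (C), packing P (MPa),
        # packing time (s), cooling time (s)] -> warpage (mm)
        X = np.array([[220, 60, 80,  8, 15], [240, 60, 90, 10, 20],
                      [230, 70, 85,  9, 18], [220, 70, 90, 10, 15],
                      [240, 70, 80,  8, 20], [230, 60, 85,  9, 18]], float)
        y = np.array([0.92, 0.71, 0.78, 0.75, 0.84, 0.80])

        poly = PolynomialFeatures(degree=2)   # full quadratic surface
        model = LinearRegression().fit(poly.fit_transform(X), y)

        # Evaluate the fitted surface on a coarse grid and pick the minimum
        levels = [(220, 230, 240), (60, 65, 70), (80, 85, 90),
                  (8, 9, 10), (15, 18, 20)]
        grid = np.array(list(product(*levels)), float)
        pred = model.predict(poly.transform(grid))
        print("best setting:", grid[pred.argmin()], "warpage:", pred.min())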

  5. FY16 Progress Report on Test Results In Support Of Integrated EPP and SMT Design Methods Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanli; Jetter, Robert I.; Sham, T. -L.

    2016-08-08

    The proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology consists of incorporating an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid using the creep-fatigue interaction diagram (the D diagram) and to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed code rules and to verify their applicability, a series of thermomechanical tests have been initiated. This report presents the recent test results for Type 2 SMT specimens on Alloy 617, Pressurization SMT on Alloy 617, Type 1 SMT on Gr. 91, and two-bar thermal ratcheting test results on Alloy 617 with a new thermal loading profile.

  6. Stacked Autoencoders for Outlier Detection in Over-the-Horizon Radar Signals

    PubMed Central

    Protopapadakis, Eftychios; Doulamis, Anastasios; Doulamis, Nikolaos; Dres, Dimitrios; Bimpas, Matthaios

    2017-01-01

    Detection of outliers in radar signals is a considerable challenge in maritime surveillance applications. High-Frequency Surface-Wave (HFSW) radars have attracted significant interest as potential tools for long-range target identification and outlier detection at over-the-horizon (OTH) distances. However, a number of disadvantages, such as their low spatial resolution and presence of clutter, have a negative impact on their accuracy. In this paper, we explore the applicability of deep learning techniques for detecting deviations from the norm in behavioral patterns of vessels (outliers) as they are tracked from an OTH radar. The proposed methodology exploits the nonlinear mapping capabilities of deep stacked autoencoders in combination with density-based clustering. A comparative experimental evaluation shows promising results for the proposed methodology's performance. PMID:29312449
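
    A minimal sketch of the approach's two ingredients, a stacked autoencoder and density-based clustering. The track features are random placeholders, the network is trained end-to-end for brevity rather than layer-wise pretraining, and DBSCAN's noise label is read as "outlier"; all of these are assumptions, not the paper's exact configuration.

        import torch
        import torch.nn as nn
        from sklearn.cluster import DBSCAN

        X = torch.randn(500, 8)            # hypothetical per-track features

        # Two stacked encoding layers compress tracks to a 2-D latent space
        model = nn.Sequential(
            nn.Linear(8, 4), nn.Tanh(),
            nn.Linear(4, 2), nn.Tanh(),    # latent code
            nn.Linear(2, 4), nn.Tanh(),
            nn.Linear(4, 8),
        )
        opt = torch.optim.Adam(model.parameters(), lr=1e-2)
        for _ in range(200):               # reconstruction training
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(X), X)
            loss.backward()
            opt.step()

        with torch.no_grad():              # encode, then density-cluster
            Z = X
            for layer in list(model)[:4]:
                Z = layer(Z)
        labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(Z.numpy())
        print((labels == -1).sum(), "candidate outlier tracks")  # noise = outliers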

  7. Multiqubit subradiant states in N-port waveguide devices: ɛ-and-μ-near-zero hubs and nonreciprocal circulators

    NASA Astrophysics Data System (ADS)

    Liberal, Iñigo; Engheta, Nader

    2018-02-01

    Quantum emitters interacting through a waveguide setup have been proposed as a promising platform for basic research on light-matter interactions and quantum information processing. We propose to augment waveguide setups with the use of multiport devices. Specifically, we demonstrate theoretically the possibility of exciting N-qubit subradiant, maximally entangled, states with the use of suitably designed N-port devices. Our general methodology is then applied to two different devices: an epsilon-and-mu-near-zero waveguide hub and a nonreciprocal circulator. A sensitivity analysis is carried out to assess the robustness of the system against a number of nonidealities. These findings link and merge the designs of devices for quantum state engineering with classical communication network methodologies.

  8. 76 FR 1889 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ... regulatory measurement methodologies; and increase transparency through enhanced disclosures. The proposal... banks \\1\\ entitled International Convergence of Capital Measurement and Capital Standards (1988 Capital... the internal models approach (see http://www.bis.org/press/p970918a.htm ). \\5\\ 61 FR 47358 (September...

  9. Set-Based Discrete Particle Swarm Optimization Based on Decomposition for Permutation-Based Multiobjective Combinatorial Optimization Problems.

    PubMed

    Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun

    2018-07-01

    This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many commonly seen MOCOPs, e.g., the multiobjective traveling salesman problem (MOTSP) and the multiobjective project scheduling problem (MOPSP), belong to this problem class, and they can be very different. However, as the permutation-based MOCOPs share the inherent similarity that the structure of their search space is usually in the shape of a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. In order to coordinate with the property of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach. Through this, feasible solutions under constraints can be generated step by step, following the permutation-tree-shaped structure, and problem-related heuristic information is introduced in the constructive approach for efficiency. In order to address the multiobjective optimization issues, the decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. Besides, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.
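
    The decomposition step can be illustrated with the Tchebycheff scalarization, a standard choice in decomposition-based multiobjective optimizers (the record does not state which scalarization MS-PSO/D uses, so this is an assumption):

        import numpy as np

        def tchebycheff(f, lam, z_star):
            """One weight vector -> one single-objective subproblem:
            g(x | lam, z*) = max_i lam_i * |f_i(x) - z*_i|."""
            return np.max(lam * np.abs(f - z_star))

        # Hypothetical: two tour costs (MOTSP-style), three weight vectors
        f = np.array([120.0, 95.0])        # objectives of one permutation
        z_star = np.array([100.0, 80.0])   # best value seen per objective
        for lam in ([1.0, 0.0], [0.5, 0.5], [0.0, 1.0]):
            print(lam, tchebycheff(f, np.array(lam), z_star))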

  10. An approach to solve replacement problems under intuitionistic fuzzy nature

    NASA Astrophysics Data System (ADS)

    Balaganesan, M.; Ganesan, K.

    2018-04-01

    Because day-to-day problems involve imprecision, researchers use fuzzy sets in their treatments of replacement problems. The aim of this paper is to solve replacement theory problems with triangular intuitionistic fuzzy numbers. An effective methodology based on a fuzziness index and a location index is proposed to determine the optimal solution of the replacement problem. A numerical example is presented to validate the proposed method.

  11. Extension of the firefly algorithm and preference rules for solving MINLP problems

    NASA Astrophysics Data System (ADS)

    Costa, M. Fernanda P.; Francisco, Rogério B.; Rocha, Ana Maria A. C.; Fernandes, Edite M. G. P.

    2017-07-01

    An extension of the firefly algorithm (FA) for solving mixed-integer nonlinear programming (MINLP) problems is presented. Although penalty functions are nowadays frequently used to handle integrality conditions and inequality and equality constraints, this paper proposes the implementation within the FA of a simple rounding-based heuristic and four preference rules to find and converge to MINLP feasible solutions. Preliminary numerical experiments are carried out to validate the proposed methodology.
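
    A minimal sketch of how a rounding-based heuristic can sit inside the FA move: after the standard attractiveness update, integer-constrained coordinates are snapped to the nearest integer. Parameter values and the simple snap-to-nearest rule are assumptions; the paper's heuristic and preference rules are richer.

        import numpy as np

        rng = np.random.default_rng(0)

        def firefly_move(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
            """Standard FA update: move firefly i toward brighter firefly j."""
            r2 = np.sum((xi - xj) ** 2)
            return (xi + beta0 * np.exp(-gamma * r2) * (xj - xi)
                    + alpha * rng.normal(size=xi.size))

        def round_integers(x, int_mask):
            """Rounding heuristic: snap integer-constrained coordinates."""
            y = x.copy()
            y[int_mask] = np.round(y[int_mask])
            return y

        x = round_integers(firefly_move(np.array([2.3, 0.7]),
                                        np.array([1.0, 0.2])),
                           int_mask=np.array([True, False]))
        print(x)   # first coordinate is integer-feasible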

  12. Examination and Implementation of a Proposal for a Ph.D. Program in Administrative Sciences

    DTIC Science & Technology

    1992-03-01

    Review of two proposals recently approved by the Academic Council (i.e., Computer Science and Mathematics Departments). C. SCOPE OF THE STUDY Since WWII...and through the computer age, the application of administrative science theory and methodologies from the behavioral sciences and quantitative...roles in the U.S. Navy and DoD, providing people who firmly understand the technical and organizational aspects of computer-based systems which support

  13. Simultaneous estimation of human and exoskeleton motion: A simplified protocol.

    PubMed

    Alvarez, M T; Torricelli, D; Del-Ama, A J; Pinto, D; Gonzalez-Vargas, J; Moreno, J C; Gil-Agudo, A; Pons, J L

    2017-07-01

    Adequate benchmarking procedures in the area of wearable robots are gaining importance in order to compare different devices on a quantitative basis, improve them and support the standardization and regulation procedures. Performance assessment usually focuses on the execution of locomotion tasks, and is mostly based on kinematic-related measures. Typical drawbacks of marker-based motion capture systems, the gold standard for measuring human limb motion, become challenging when measuring limb kinematics, due to the concomitant presence of the robot. This work answers the question of how to reliably assess the subject's body motion by placing markers over the exoskeleton. Focusing on the ankle joint, the proposed methodology showed that it is possible to reconstruct the trajectory of the subject's joint by placing markers on the exoskeleton, although foot flexibility during walking can impact the reconstruction accuracy. More experiments are needed to confirm this hypothesis, and more subjects and walking conditions are needed to better characterize the errors of the proposed methodology, although our results are promising, indicating small errors.

  14. A new variance stabilizing transformation for gene expression data analysis.

    PubMed

    Kelmansky, Diana M; Martínez, Elena J; Leiva, Víctor

    2013-12-01

    In this paper, we introduce a new family of power transformations, which has the generalized logarithm as one of its members, in the same manner as the usual logarithm belongs to the family of Box-Cox power transformations. Although the new family has been developed for analyzing gene expression data, it covers a wider range of mean-variance related data. We study the analytical properties of the new family of transformations, as well as the mean-variance relationships that are stabilized by using its members. We propose a methodology based on this new family, which includes a simple strategy for selecting the family member adequate for a data set. We evaluate the finite sample behavior of different classical and robust estimators based on this strategy by Monte Carlo simulations. We analyze real genomic data by using the proposed transformation to empirically show how the new methodology allows the variance of these data to be stabilized.
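
    The generalized logarithm mentioned as a member of the family has a simple closed form; a sketch (the tuning constant c, tied to the fitted mean-variance relationship, is a placeholder value here):

        import numpy as np

        def glog(x, c):
            """Generalized logarithm: log((x + sqrt(x^2 + c)) / 2).
            Behaves like log(x) for large x but stays finite at and
            below zero, which the plain logarithm does not."""
            return np.log((x + np.sqrt(x ** 2 + c)) / 2.0)

        x = np.array([-50.0, 0.0, 10.0, 1e4])
        print(glog(x, c=2500.0))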

  15. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
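
    For Hamming similarity over categorical patterns, the classic LSH construction is bit sampling: each hash table keys a pattern by its values at a fixed random subset of cells, so patterns agreeing on those cells collide. The sketch below is that generic construction with made-up pattern data; LSHSIM's exact hashing scheme may differ.

        import numpy as np

        rng = np.random.default_rng(1)

        def build_tables(patterns, n_tables=4, cells_per_key=6):
            """Bit-sampling LSH: hash each pattern by a random cell subset."""
            d = patterns.shape[1]
            tables = []
            for _ in range(n_tables):
                idx = rng.choice(d, size=cells_per_key, replace=False)
                buckets = {}
                for i, p in enumerate(patterns):
                    buckets.setdefault(tuple(p[idx]), []).append(i)
                tables.append((idx, buckets))
            return tables

        def query(tables, target):
            """Union of colliding patterns over all tables = candidates."""
            cand = set()
            for idx, buckets in tables:
                cand.update(buckets.get(tuple(target[idx]), []))
            return cand

        patterns = rng.integers(0, 3, size=(1000, 25))  # 5x5 categorical
        tables = build_tables(patterns)
        print(len(query(tables, patterns[42])), "candidates for pattern 42")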

  16. Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.

    PubMed

    Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda

    2014-05-01

    We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies, and we suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, here the domain anomaly detection methodology is applied to the problem of anomaly detection for a video annotation system.

  17. Multi-Dielectric Brownian Dynamics and Design-Space-Exploration Studies of Permeation in Ion Channels.

    PubMed

    Siksik, May; Krishnamurthy, Vikram

    2017-09-01

    This paper proposes a multi-dielectric Brownian dynamics simulation framework for design-space-exploration (DSE) studies of ion-channel permeation. The goal of such DSE studies is to estimate the channel modeling parameters that minimize the mean-squared error between the simulated and expected "permeation characteristics." To address this computational challenge, we use a methodology based on statistical inference that utilizes the knowledge of channel structure to prune the design space. We demonstrate the proposed framework and DSE methodology using a case study based on the KcsA ion channel, in which the design space is successfully reduced from a 6-D space to a 2-D space. Our results show that the channel dielectric map computed using the framework matches that computed directly using molecular dynamics with an error of 7%. Finally, the scalability and resolution of the model used are explored, and it is shown that the memory requirements needed for DSE remain constant as the number of parameters (degree of heterogeneity) increases.

  18. Methodological accuracy of image-based electron density assessment using dual-energy computed tomography.

    PubMed

    Möhler, Christian; Wohlfahrt, Patrick; Richter, Christian; Greilich, Steffen

    2017-06-01

    Electron density is the most important tissue property influencing photon and ion dose distributions in radiotherapy patients. Dual-energy computed tomography (DECT) enables the determination of electron density by combining the information on photon attenuation obtained at two different effective x-ray energy spectra. Most algorithms suggested so far use the CT numbers provided after image reconstruction as input parameters, i.e., are image-based. To explore the accuracy that can be achieved with these approaches, we quantify the intrinsic methodological and calibration uncertainty of the seemingly simplest approach. In the studied approach, electron density is calculated with a one-parametric linear superposition ('alpha blending') of the two DECT images, which is shown to be equivalent to an affine relation between the photon attenuation cross sections of the two x-ray energy spectra. We propose to use the latter relation for empirical calibration of the spectrum-dependent blending parameter. For a conclusive assessment of the electron density uncertainty, we chose to isolate the purely methodological uncertainty component from CT-related effects such as noise and beam hardening. Analyzing calculated spectrally weighted attenuation coefficients, we find universal applicability of the investigated approach to arbitrary mixtures of human tissue with an upper limit of the methodological uncertainty component of 0.2%, excluding high-Z elements such as iodine. The proposed calibration procedure is bias-free and straightforward to perform using standard equipment. Testing the calibration on five published data sets, we obtain very small differences in the calibration result in spite of different experimental setups and CT protocols used. Employing a general calibration per scanner type and voltage combination is thus conceivable. Given the high suitability for clinical application of the alpha-blending approach in combination with a very small methodological uncertainty, we conclude that further refinement of image-based DECT-algorithms for electron density assessment is not advisable. © 2017 American Association of Physicists in Medicine.
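
    A sketch of the alpha-blending idea under one common parameterization (the relative attenuation values and the least-squares calibration below are assumptions; the paper derives and analyzes the exact relation):

        import numpy as np

        def electron_density(x_low, x_high, alpha):
            """One-parametric blend of the two DECT images:
            rho_e ~ alpha * x_low + (1 - alpha) * x_high,
            with x_* the attenuation values relative to water."""
            return alpha * x_low + (1.0 - alpha) * x_high

        def calibrate_alpha(x_low, x_high, rho_true):
            """Least-squares fit of the spectrum-dependent blending
            parameter from phantom inserts of known electron density:
            rho - x_high = alpha * (x_low - x_high)."""
            a = (x_low - x_high).reshape(-1, 1)
            return np.linalg.lstsq(a, rho_true - x_high, rcond=None)[0][0]

        # Hypothetical phantom measurements (all relative to water)
        xl = np.array([0.95, 1.05, 1.40])
        xh = np.array([0.97, 1.03, 1.25])
        alpha = calibrate_alpha(xl, xh, np.array([0.96, 1.04, 1.30]))
        print(alpha, electron_density(xl, xh, alpha))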

  19. Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications.

    PubMed

    Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis

    2016-12-24

    The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies to perform identification is RFID (Radio Frequency Identification), which in recent years has gained considerable popularity in applications such as access control, payment cards and logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed and the methodology proposed is applied through one of them (Proxmark 3) to validate it. Thus, the methodology is tested in different scenarios where tags are commonly used for identification. In such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. Therefore, it is shown that the methodology proposed is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol.

  20. Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications

    PubMed Central

    Fernández-Caramés, Tiago M.; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis

    2016-01-01

    The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies to perform identification is RFID (Radio Frequency Identification), which in recent years has gained considerable popularity in applications such as access control, payment cards and logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed and the methodology proposed is applied through one of them (Proxmark 3) to validate it. Thus, the methodology is tested in different scenarios where tags are commonly used for identification. In such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. Therefore, it is shown that the methodology proposed is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol. PMID:28029119

  1. Teaching learning algorithm based optimization of kerf deviations in pulsed Nd:YAG laser cutting of Kevlar-29 composite laminates

    NASA Astrophysics Data System (ADS)

    Gautam, Girish Dutt; Pandey, Arun Kumar

    2018-03-01

    Kevlar is the most popular aramid fiber and is commonly used in technologically advanced industries for various applications. But the precise cutting of Kevlar composite laminates is a difficult task. Conventional cutting methods suffer from various defects such as delamination, burr formation and fiber pullout, with poor surface quality, and the mechanical performance of the laminates is greatly affected by these defects. Laser beam machining may be an alternative to conventional cutting processes due to its non-contact nature and low specific energy requirement with a higher production rate. But this process also faces some problems, which may be minimized by operating the machine at optimum parameter levels. This research paper examines the effective utilization of the Nd:YAG laser cutting system on difficult-to-cut Kevlar-29 composite laminates. The objective of the proposed work is to find the optimum process parameter settings for obtaining the minimum kerf deviations on both sides. The experiments have been conducted on Kevlar-29 composite laminates of thickness 1.25 mm using a Box-Behnken design with two center points. The experimental data have been used for optimization by the proposed methodology. For the optimization, a teaching-learning algorithm based approach has been employed to obtain the minimum kerf deviation at the bottom and top sides. A self-coded MATLAB program has been developed using the proposed methodology, and this program has been used for the optimization. Finally, confirmation tests have been performed to compare the experimental and optimum results obtained by the proposed methodology. The comparison shows that the machining performance in the laser beam cutting process has been remarkably improved through the proposed approach. Finally, the influence of different laser cutting parameters, such as lamp current, pulse frequency, pulse width, compressed air pressure and cutting speed, on the top and bottom kerf deviations during Nd:YAG laser cutting of Kevlar-29 laminates is discussed.
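
    A compact sketch of a teaching-learning-style optimizer (TLBO-like, with its teacher and learner phases) on a stand-in objective; the surrogate function, bounds and population settings are assumptions replacing the paper's experimentally fitted kerf-deviation models.

        import numpy as np

        rng = np.random.default_rng(2)

        def kerf_deviation(x):
            """Hypothetical surrogate of total kerf deviation versus the
            five normalized cutting parameters."""
            return np.sum((x - 0.4) ** 2) + 0.1 * np.abs(x).sum()

        pop = rng.random((20, 5))      # lamp current, frequency, width, ...
        for _ in range(100):
            f = np.array([kerf_deviation(x) for x in pop])
            teacher, mean = pop[f.argmin()], pop.mean(axis=0)
            # Teacher phase: pull learners toward the best solution
            tf = rng.integers(1, 3)    # teaching factor in {1, 2}
            cand = np.clip(pop + rng.random(pop.shape)
                           * (teacher - tf * mean), 0, 1)
            better = np.array([kerf_deviation(x) for x in cand]) < f
            pop[better] = cand[better]
            # Learner phase: each learner moves relative to a random peer
            f = np.array([kerf_deviation(x) for x in pop])
            j = rng.permutation(len(pop))
            step = np.where((f < f[j])[:, None], pop - pop[j], pop[j] - pop)
            cand = np.clip(pop + rng.random(pop.shape) * step, 0, 1)
            better = np.array([kerf_deviation(x) for x in cand]) < f
            pop[better] = cand[better]
        print("best:", pop[np.argmin([kerf_deviation(x) for x in pop])])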

  2. Sol-Gel Application for Consolidating Stone: An Example of Project-Based Learning in a Physical Chemistry Lab

    ERIC Educational Resources Information Center

    de los Santos, Desiree´ M.; Montes, Antonio; Sa´nchez-Coronilla, Antonio; Navas, Javier

    2014-01-01

    A Project Based Learning (PBL) methodology was used in the practical laboratories of the Advanced Physical Chemistry department. The project type proposed simulates "real research" focusing on sol-gel synthesis and the application of the obtained sol as a stone consolidant. Students were divided into small groups (2 to 3 students) to…

  3. Deconstructing the Tower of Babel: A Design Method to Improve Empathy and Teamwork Competences of Informatics Students

    ERIC Educational Resources Information Center

    Blanco, Teresa; López-Forniés, Ignacio; Zarazaga-Soria, Francisco Javier

    2017-01-01

    The competence-based education recently launched in Spanish universities presents a set of abilities and skills that are difficult to teach to students in higher and more technologically-oriented grades. In this paper, a teaching intervention that is based on design methodologies is proposed, to upgrade the competitive capacities of computer…

  4. Using the Knowledge Transfer Partnership Approach in Undergraduate Education and Practice-Based Training to Encourage Employer Engagement

    ERIC Educational Resources Information Center

    Harris, Margaret; Chisholm, Colin; Burns, George

    2013-01-01

    Purpose: The purpose of this paper is to provide a conceptual viewpoint which proposes the use of the post graduate Knowledge Transfer Partnership (KTP) approach to learning in undergraduate education and practice-based training. Design/methodology/approach: This is an examination of the KTP approach and how this could be used effectively in…

  5. Estimation of evaporation and sensible heat flux from open water using a large-aperture scintillometer

    NASA Astrophysics Data System (ADS)

    McJannet, D. L.; Cook, F. J.; McGloin, R. P.; McGowan, H. A.; Burn, S.

    2011-05-01

    The use of scintillometers to determine sensible and latent heat flux is becoming increasingly common because of their ability to quantify convective fluxes over distances of hundreds of meters to several kilometers. The majority of investigations using scintillometry have focused on processes above land surfaces, but here we propose a new methodology for obtaining sensible and latent heat fluxes from a scintillometer deployed over open water. This methodology has been tested by comparison with eddy covariance measurements and through comparison with alternative scintillometer calculation approaches that are commonly used in the literature. The methodology is based on linearization of the Bowen ratio, which is a common assumption in models such as Penman's model and its derivatives. Comparison of latent heat flux estimates from the eddy covariance system and the scintillometer showed excellent agreement across a range of weather conditions and flux rates, giving a high level of confidence in scintillometry-derived latent heat fluxes. The proposed approach produced better estimates than other scintillometry calculation methods because of the reliance of alternative methods on measurements of water temperature or water body heat storage, which are both notoriously hard to quantify. The proposed methodology requires less instrumentation than alternative scintillometer calculation approaches, and the spatial scales of required measurements are arguably more compatible. In addition to scintillometer measurements of the structure parameter of the refractive index of air, the only measurements required are atmospheric pressure, air temperature, humidity, and wind speed at one height over the water body.
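
    For orientation, the classical (non-linearized) Bowen-ratio partition that underlies the method can be sketched as follows; the numbers are hypothetical lake values, and the paper's contribution is precisely a linearization of this ratio rather than this textbook form.

        def bowen_partition(rn, g, t_surf, t_air, e_surf, e_air, gamma=0.066):
            """Classical Bowen-ratio partition of available energy:
            beta = gamma * dT / de (gamma in kPa/K),
            LE = (Rn - G) / (1 + beta), H = beta * LE."""
            beta = gamma * (t_surf - t_air) / (e_surf - e_air)
            le = (rn - g) / (1.0 + beta)
            return le, beta * le

        le, h = bowen_partition(rn=450.0, g=30.0, t_surf=22.0, t_air=20.0,
                                e_surf=2.64, e_air=1.90)
        print(f"LE = {le:.0f} W/m2, H = {h:.0f} W/m2")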

  6. Fusion of Visible and Thermal Descriptors Using Genetic Algorithms for Face Recognition Systems.

    PubMed

    Hermosilla, Gabriel; Gallardo, Francisco; Farias, Gonzalo; San Martin, Cesar

    2015-07-23

    The aim of this article is to present a new face recognition system based on the fusion of visible and thermal features obtained from the most current local matching descriptors by maximizing face recognition rates through the use of genetic algorithms. The article considers a comparison of the performance of the proposed fusion methodology against five current face recognition methods and classic fusion techniques used commonly in the literature. These were selected by considering their performance in face recognition. The five local matching methods and the proposed fusion methodology are evaluated using the standard visible/thermal database, the Equinox database, along with a new database, the PUCV-VTF, designed for visible-thermal studies in face recognition and described for the first time in this work. The latter is created considering visible and thermal image sensors with different real-world conditions, such as variations in illumination, facial expression, pose, occlusion, etc. The main conclusions of this article are that two variants of the proposed fusion methodology surpass current face recognition methods and the classic fusion techniques reported in the literature, attaining recognition rates of over 97% and 99% for the Equinox and PUCV-VTF databases, respectively. The fusion methodology is very robust to illumination and expression changes, as it combines thermal and visible information efficiently by using genetic algorithms, thus allowing it to choose optimal face areas where one spectrum is more representative than the other.
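
    A toy sketch of GA-driven score-level fusion: a population of fusion weights evolves toward the weight that maximizes recognition accuracy. The match-score matrices, the single-weight fusion rule and the GA operators are illustrative assumptions; the paper fuses descriptor-level information and optimizes richer weightings.

        import numpy as np

        rng = np.random.default_rng(3)

        def accuracy(w, vis, thr, labels):
            """Accuracy of weighted score fusion on hypothetical data."""
            fused = w * vis + (1 - w) * thr
            return np.mean(fused.argmax(axis=1) == labels)

        # Hypothetical match scores (probes x gallery) per spectrum
        labels = rng.integers(0, 10, size=200)
        vis = rng.random((200, 10)); thr = rng.random((200, 10))
        vis[np.arange(200), labels] += 0.3
        thr[np.arange(200), labels] += 0.2

        pop = rng.random(30)                       # population of weights
        for _ in range(40):                        # minimal GA loop
            fit = np.array([accuracy(w, vis, thr, labels) for w in pop])
            parents = pop[np.argsort(fit)[-10:]]   # truncation selection
            kids = (parents[rng.integers(0, 10, 20)]
                    + parents[rng.integers(0, 10, 20)]) / 2  # crossover
            kids = np.clip(kids + rng.normal(0, 0.05, 20), 0, 1)  # mutation
            pop = np.concatenate([parents, kids])
        best = pop[np.argmax([accuracy(w, vis, thr, labels) for w in pop])]
        print("best fusion weight:", best)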

  7. Fusion of Visible and Thermal Descriptors Using Genetic Algorithms for Face Recognition Systems

    PubMed Central

    Hermosilla, Gabriel; Gallardo, Francisco; Farias, Gonzalo; San Martin, Cesar

    2015-01-01

    The aim of this article is to present a new face recognition system based on the fusion of visible and thermal features obtained from the most current local matching descriptors by maximizing face recognition rates through the use of genetic algorithms. The article considers a comparison of the performance of the proposed fusion methodology against five current face recognition methods and classic fusion techniques used commonly in the literature. These were selected by considering their performance in face recognition. The five local matching methods and the proposed fusion methodology are evaluated using the standard visible/thermal database, the Equinox database, along with a new database, the PUCV-VTF, designed for visible-thermal studies in face recognition and described for the first time in this work. The latter is created considering visible and thermal image sensors with different real-world conditions, such as variations in illumination, facial expression, pose, occlusion, etc. The main conclusions of this article are that two variants of the proposed fusion methodology surpass current face recognition methods and the classic fusion techniques reported in the literature, attaining recognition rates of over 97% and 99% for the Equinox and PUCV-VTF databases, respectively. The fusion methodology is very robust to illumination and expression changes, as it combines thermal and visible information efficiently by using genetic algorithms, thus allowing it to choose optimal face areas where one spectrum is more representative than the other. PMID:26213932

  8. Towards a cognitive robotics methodology for reward-based decision-making: dynamical systems modelling of the Iowa Gambling Task

    NASA Astrophysics Data System (ADS)

    Lowe, Robert; Ziemke, Tom

    2010-09-01

    The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.

  9. Design of interpolation functions for subpixel-accuracy stereo-vision systems.

    PubMed

    Haller, Istvan; Nedevschi, Sergiu

    2012-02-01

    Traditionally, subpixel interpolation in stereo-vision systems was designed for the block-matching algorithm. During the evaluation of different interpolation strategies, a strong correlation was observed between the type of the stereo algorithm and the subpixel accuracy of the different solutions. Subpixel interpolation should be adapted to each stereo algorithm to achieve maximum accuracy. In consequence, it is more important to propose methodologies for interpolation function generation than specific function shapes. We propose two such methodologies based on data generated by the stereo algorithms. The first proposal uses a histogram to model the environment and applies histogram equalization to an existing solution, adapting it to the data. The second proposal employs synthetic images of a known environment and applies function fitting to the resulting data. The resulting function matches the algorithm and the data as well as possible. An extensive evaluation set is used to validate the findings. Both real and synthetic test cases were employed in different scenarios. The test results are consistent and show significant improvements compared with traditional solutions. © 2011 IEEE
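
    For contrast with the fitted functions the paper advocates, the traditional fixed-shape interpolator is the three-point parabola around the best integer disparity; a sketch:

        def parabolic_subpixel(c_minus, c0, c_plus):
            """Classic parabola fit through the matching costs at d-1, d, d+1:
            offset = (c[d-1] - c[d+1]) / (2 * (c[d-1] - 2*c[d] + c[d+1])),
            giving a sub-pixel offset in (-0.5, 0.5)."""
            return (c_minus - c_plus) / (2.0 * (c_minus - 2.0 * c0 + c_plus))

        print(parabolic_subpixel(10.0, 4.0, 7.0))   # 0.1667 pixel offset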

  10. Windowed Green function method for the Helmholtz equation in the presence of multiply layered media

    NASA Astrophysics Data System (ADS)

    Bruno, O. P.; Pérez-Arancibia, C.

    2017-06-01

    This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.
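
    One common choice of "slow-rise" window, equal to 1 on an inner interval and decaying to 0 with all derivatives vanishing at both ends, can be sketched as below. This specific functional form is an assumption for illustration; the paper's window may differ in detail.

        import numpy as np

        def slow_rise_window(x, x0, x1):
            """Smooth cutoff: 1 for |x| <= x0, 0 for |x| >= x1, with a
            C-infinity transition in between."""
            u = (np.abs(x) - x0) / (x1 - x0)
            w = np.zeros_like(x, dtype=float)
            w[np.abs(x) <= x0] = 1.0
            mid = (u > 0) & (u < 1)
            w[mid] = np.exp(2.0 * np.exp(-1.0 / u[mid]) / (u[mid] - 1.0))
            return w

        x = np.linspace(-20.0, 20.0, 9)
        print(slow_rise_window(x, x0=5.0, x1=15.0))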

  11. Windowed Green function method for the Helmholtz equation in the presence of multiply layered media.

    PubMed

    Bruno, O P; Pérez-Arancibia, C

    2017-06-01

    This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76 , 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz-Fellenz, Emily S.

    A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation and careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.

  13. Spectral Target Detection using Schroedinger Eigenmaps

    NASA Astrophysics Data System (ADS)

    Dorado-Munoz, Leidy P.

    Applications of optical remote sensing processes include environmental monitoring, military monitoring, meteorology, mapping, surveillance, etc. Many of these tasks include the detection of specific objects or materials, usually few or small, which are surrounded by other materials that clutter the scene and hide the relevant information. This target detection process has been boosted lately by the use of hyperspectral imagery (HSI), since its high spectral dimension provides more detailed spectral information that is desirable in data exploitation. Typical spectral target detectors rely on statistical or geometric models to characterize the spectral variability of the data. However, in many cases these parametric models do not fit HSI data well, which impacts the detection performance. On the other hand, non-linear transformation methods, mainly based on manifold learning algorithms, have shown potential in HSI transformation, dimensionality reduction and classification. In target detection, non-linear transformation algorithms are used as preprocessing techniques that transform the data to a more suitable lower dimensional space, where the statistical or geometric detectors are applied. One of these non-linear manifold methods is the Schroedinger Eigenmaps (SE) algorithm, which has been introduced as a technique for semi-supervised classification. The core tool of the SE algorithm is the Schroedinger operator, which includes a potential term that encodes prior information about the materials present in a scene and enables the embedding to be steered in convenient directions in order to cluster similar pixels together. A completely novel target detection methodology based on the SE algorithm is proposed for the first time in this thesis. The proposed methodology does not just include the transformation of the data to a lower dimensional space but also includes the definition of a detector that capitalizes on the theory behind SE. The fact that target pixels and similar pixels are clustered in a predictable region of the low-dimensional representation is used to define a decision rule that allows one to identify target pixels over the rest of the pixels in a given image. In addition, a knowledge propagation scheme is used to combine spectral and spatial information as a means to propagate the "potential constraints" to nearby points. The propagation scheme is introduced to reinforce weak connections and improve the separability between most of the target pixels and the background. Experiments using different HSI data sets are carried out in order to test the proposed methodology. The assessment is performed from a quantitative and qualitative point of view, by comparing the SE-based methodology against two other detection methodologies that use linear/non-linear algorithms as transformations, and against the well-known Adaptive Coherence/Cosine Estimator (ACE) detector. Overall results show that the SE-based detector outperforms the other two detection methodologies, which indicates the usefulness of the SE transformation in spectral target detection problems.
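
    The core construction can be sketched compactly: a graph Laplacian from the pixel similarity graph plus a diagonal potential encoding the target prior, with the embedding read off the smallest eigenvectors. The data, neighborhood size and barrier weight below are placeholders, and a diagonal potential is the simplest variant of the SE operator.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.csgraph import laplacian
        from scipy.sparse.linalg import eigsh
        from sklearn.neighbors import kneighbors_graph

        # Hypothetical HSI pixels (rows) x bands (columns); known targets
        X = np.random.default_rng(4).random((500, 50))
        target_idx = [0, 1, 2]

        W = kneighbors_graph(X, n_neighbors=10, mode='connectivity')
        W = 0.5 * (W + W.T)                  # symmetrize the kNN graph
        L = laplacian(W, normed=True)

        # Diagonal potential: nonzero entries encode the target prior and
        # steer those pixels together in the embedding
        v = np.zeros(X.shape[0]); v[target_idx] = 1.0
        E = L + 10.0 * diags(v)              # Schroedinger operator L + aV

        vals, vecs = eigsh(E, k=4, which='SM')   # smallest eigenpairs
        print(vecs[target_idx])              # target pixels cluster together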

  14. A false-alarm aware methodology to develop robust and efficient multi-scale infrared small target detection algorithm

    NASA Astrophysics Data System (ADS)

    Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan

    2018-03-01

    False alarm rate and detection rate are still two contradictory metrics for infrared small target detection in an infrared search and track (IRST) system, despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than detecting false items as true targets. Hence, considering background clutter and detector noise as the sources of false alarms in an IRST system, in this paper a false-alarm aware methodology is presented to reduce the false alarm rate while the detection rate remains undegraded. To this end, the advantages and disadvantages of each detection algorithm are investigated and the sources of the false alarms are determined. Two target detection algorithms having independent false alarm sources are chosen in such a way that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, multi-scale average absolute gray difference (AAGD) and Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm of the proposed methodology. After presenting a conceptual model for the desired algorithm, it is implemented through the most straightforward mechanism. The desired algorithm effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed through just four different scales, the desired algorithm has good capability for real-time implementation. Simulation results in terms of signal-to-clutter ratio and background suppression factor on real and simulated images prove the effectiveness and performance of the proposed methodology. Since the desired algorithm was developed based on independent false alarm sources, our proposed methodology is expandable to any pair of detection algorithms which have different false alarm sources.
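
    A simplified sketch of the AAGD ingredient: a center-versus-surround gray-difference map computed at several scales, so a bright point target stands out while smooth clutter cancels. Window sizes, the box-filter surround (rather than a true annulus) and the one-sided squaring are simplifying assumptions relative to the paper's formulation.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def aagd_map(img, inner, outer):
            """Inner-window mean minus local-background mean, one-sided
            and squared to emphasize bright small targets."""
            m_in = uniform_filter(img.astype(float), inner)
            m_out = uniform_filter(img.astype(float), outer)
            return np.maximum(m_in - m_out, 0.0) ** 2

        def multiscale_aagd(img, inners=(3, 5, 7, 9)):
            """Max response over four scales, as in a four-scale detector."""
            return np.max([aagd_map(img, k, 3 * k) for k in inners], axis=0)

        img = np.random.default_rng(5).normal(100.0, 5.0, (128, 128))
        img[64, 64] += 40.0                  # hypothetical point target
        print(np.unravel_index(multiscale_aagd(img).argmax(), img.shape))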

  15. Model identification and vision-based H∞ position control of 6-DoF cable-driven parallel robots

    NASA Astrophysics Data System (ADS)

    Chellal, R.; Cuvillon, L.; Laroche, E.

    2017-04-01

    This paper presents methodologies for the identification and control of 6-degrees-of-freedom (6-DoF) cable-driven parallel robots (CDPRs). First, a two-step identification methodology is proposed to accurately estimate the kinematic parameters independently of, and prior to, the dynamic parameters of a physics-based model of CDPRs. Second, an original control scheme is developed, including a vision-based position controller tuned with the H∞ methodology and a cable tension distribution algorithm. The position is controlled in the operational space, making use of the end-effector pose measured by a motion-tracking system. A four-block H∞ design scheme with adjusted weighting filters ensures good trajectory tracking and disturbance rejection properties for the CDPR system, which is a nonlinear coupled MIMO system with constrained states. The tension management algorithm generates control signals that maintain the cables under feasible tensions. The paper makes an extensive review of the available methods and presents an extension of one of them. The presented methodologies are evaluated in simulations and experimentally on a redundant 6-DoF INCA 6D CDPR with eight cables, equipped with a motion-tracking system.
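
    The tension distribution subproblem admits a compact standard formulation: find cable tensions within bounds that produce the demanded wrench. A sketch using a bounded least-squares solve (the 6x8 wrench matrix is random placeholder data, and the paper's algorithm may differ in detail):

        import numpy as np
        from scipy.optimize import lsq_linear

        def tension_distribution(W, wrench, t_min=5.0, t_max=500.0):
            """Solve W t = -wrench in the least-squares sense subject to
            box constraints keeping every cable taut and below rating."""
            return lsq_linear(W, -wrench, bounds=(t_min, t_max)).x

        rng = np.random.default_rng(6)
        W = rng.normal(size=(6, 8))          # hypothetical 6-DoF, 8 cables
        t = tension_distribution(W, np.array([0, 0, -200.0, 0, 0, 0]))
        print(t)                             # feasible cable tensions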

  16. An ergonomics based design research method for the arrangement of helicopter flight instrument panels.

    PubMed

    Alppay, Cem; Bayazit, Nigan

    2015-11-01

    In this paper, we study the arrangement of displays in flight instrument panels of multi-purpose civil helicopters following a user-centered design method based on ergonomics principles. Our methodology can also be described as a user-interface arrangement methodology based on user opinions and preferences. This study can be outlined as gathering user-centered data using two different research methods and then analyzing and integrating the collected data to come up with an optimal instrument panel design. An interview with helicopter pilots formed the first step of our research. In that interview, pilots were asked to provide a quantitative evaluation of basic interface arrangement principles. In the second phase of the research, a paper prototyping study was conducted with same pilots. The final phase of the study entailed synthesizing the findings from interviews and observational studies to formulate an optimal flight instrument arrangement methodology. The primary results that we present in our paper are the methodology that we developed and three new interface arrangement concepts, namely relationship of inseparability, integrated value and locational value. An optimum instrument panel arrangement is also proposed by the researchers. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  17. Design of Passive Power Filter for Hybrid Series Active Power Filter using Estimation, Detection and Classification Method

    NASA Astrophysics Data System (ADS)

    Swain, Sushree Diptimayee; Ray, Pravat Kumar; Mohanty, K. B.

    2016-06-01

    This research paper explores the design of a shunt Passive Power Filter (PPF) in a Hybrid Series Active Power Filter (HSAPF) that employs a novel analytic methodology superior to FFT analysis. This novel approach consists of the estimation, detection and classification of signals. The proposed method is applied to estimate, detect and classify power quality (PQ) disturbances such as harmonics. This work deals with three methods: harmonic detection through the wavelet transform method, harmonic estimation by the Kalman filter algorithm, and harmonic classification by the decision tree method. Among the mother wavelets available for the wavelet transform method, db8 is selected as the suitable mother wavelet because of its effectiveness on transient response and its well-contained oscillation in the frequency domain. In the harmonic compensation process, the detected harmonic is compensated through the Hybrid Series Active Power Filter (HSAPF) based on Instantaneous Reactive Power Theory (IRPT). The efficacy of the proposed method is verified in the MATLAB/SIMULINK domain as well as with an experimental set-up. The obtained results confirm the superiority of the proposed methodology over FFT analysis. The newly proposed PPF is used to make the conventional HSAPF more robust and stable.
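
    The wavelet-based detection step can be sketched with a db8 multilevel decomposition of a distorted current: harmonic energy concentrates in the detail bands covering the harmonic frequencies. Sampling rate, harmonic mix and decomposition depth are illustrative assumptions.

        import numpy as np
        import pywt

        fs = 6400.0                          # hypothetical sampling rate, Hz
        t = np.arange(0.0, 0.2, 1.0 / fs)
        # Fundamental plus 5th and 7th harmonics of a 50 Hz supply
        i = (np.sin(2*np.pi*50*t) + 0.20*np.sin(2*np.pi*250*t)
             + 0.14*np.sin(2*np.pi*350*t))

        coeffs = pywt.wavedec(i, 'db8', level=5)   # [cA5, cD5, ..., cD1]
        for d, lvl in zip(coeffs[1:], range(5, 0, -1)):
            lo, hi = fs / 2 ** (lvl + 1), fs / 2 ** lvl
            print(f"D{lvl}: {lo:.0f}-{hi:.0f} Hz, energy {np.sum(d**2):.2f}")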

  18. 76 FR 39876 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... Survey--Pretest of Proposed Questions and Methodology.'' In accordance with the Paperwork Reduction Act... Health Plan Survey-- Pretest of Proposed Questions and Methodology The Consumer Assessment of Healthcare... year to year. The CAHPS[supreg] program was designed to: Make it possible to compare survey results...

  19. 76 FR 57046 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-15

    ... Survey--Pretest of Proposed Questions and Methodology.'' In accordance with the Paperwork Reduction Act... Health Plan Survey-- Pretest of Proposed Questions and Methodology The Consumer Assessment of Healthcare... often changed from year to year. The CAHPS[reg] program was designed to: Make it possible to compare...

  20. 42 CFR 495.204 - Incentive payments to qualifying MA organizations for MA-EPs and MA-affiliated eligible hospitals.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (iii) Methodological proposals must be submitted to CMS by June of the payment year and must be... the payment year. (4) CMS requires the qualifying MA organization to develop a methodological proposal... MA organization in the payment year. The methodological proposal— (i) Must be approved by CMS; and...

  1. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for internet of things (IoT) based applications has inadvertently forced the move towards higher complexity of integrated circuits supporting SoC. Such a rapid increase in complexity calls for increasingly sophisticated validation strategies, and it has led researchers to come out with various methodologies to overcome this problem, bringing about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the verification process of an SoC in order to reduce time consumption and achieve a fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. On top of that, the verification method called Open Verification Methodology (OVM) offers an easier way of performing RTL validation, not as a replacement for the traditional method but as an effort towards a fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, to avert the disclosure of bottlenecks in the validation platform.

  2. Introducing a new methodology for the calculation of local philicity and multiphilic descriptor: an alternative to the finite difference approximation

    NASA Astrophysics Data System (ADS)

    Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel

    2018-07-01

    This work presents a new development based on the condensation scheme proposed by Chamorro and Pérez, in which new terms to correct the frozen molecular orbital approximation have been introduced (improved frontier molecular orbital approximation). The changes performed on the original development allow taking into account the orbital relaxation effects, providing results equivalent to those achieved by the finite difference approximation and leading also to a methodology with great advantages. Local reactivity indices based on this new development have been obtained for a sample set of molecules, and they have been compared with those indices based on the frontier molecular orbital and finite difference approximations. A new definition based on the improved frontier molecular orbital methodology for the dual descriptor index is also shown. In addition, taking advantage of the characteristics of the definitions obtained with the new condensation scheme, the local philicity descriptor is analysed by separating the components corresponding to the frontier molecular orbital approximation and orbital relaxation effects, analysing also the local multiphilic descriptor in the same way. Finally, the effect of the basis set is studied, and calculations using DFT, CI and Møller-Plesset methodologies are performed to analyse the consequences of different electronic-correlation levels.
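
    For reference, the finite-difference definitions against which the improved frontier-molecular-orbital results are benchmarked can be written compactly (standard forms, with p_k the electron population of atom k and \omega the global electrophilicity index):

        f_k^{+} = p_k(N+1) - p_k(N), \qquad f_k^{-} = p_k(N) - p_k(N-1)

        \Delta f_k = f_k^{+} - f_k^{-}, \qquad
        \omega_k^{\pm} = \omega\, f_k^{\pm}, \qquad
        \Delta\omega_k = \omega\, \Delta f_k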

  3. Optimal design of piezoelectric transformers: a rational approach based on an analytical model and a deterministic global optimization.

    PubMed

    Pigache, Francois; Messine, Frédéric; Nogarede, Bertrand

    2007-07-01

    This paper deals with a deterministic and rational way to design piezoelectric transformers in radial mode. The proposed approach is based on the study of the inverse design problem and on its reformulation as a mixed constrained global optimization problem. The methodology relies on the association of analytical models describing the corresponding optimization problem with exact global optimization software, named IBBA, developed by the second author to solve it. Numerical experiments are presented and compared in order to validate the proposed approach.

  4. Superiorization-based multi-energy CT image reconstruction

    PubMed Central

    Yang, Q; Cong, W; Wang, G

    2017-01-01

    The recently-developed superiorization approach is efficient and robust for solving various constrained optimization problems. This methodology can be applied to multi-energy CT image reconstruction with the regularization in terms of the prior rank, intensity and sparsity model (PRISM). In this paper, we propose a superiorized version of the simultaneous algebraic reconstruction technique (SART) based on the PRISM model. Then, we compare the proposed superiorized algorithm with the Split-Bregman algorithm in numerical experiments. The results show that both the Superiorized-SART and the Split-Bregman algorithms generate good results with weak noise and reduced artefacts. PMID:28983142
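
    A minimal numpy sketch of the superiorization pattern this record describes: basic SART sweeps interleaved with objective-reducing perturbations. The total-variation surrogate, the diminishing step sizes and the toy test system are illustrative assumptions, not the paper's PRISM-regularized multi-energy setup:

```python
import numpy as np

def sart_step(x, A, b, lam=1.0):
    """One SART sweep: row/column-sum weighted backprojection of the residual."""
    row_sums = A.sum(axis=1)          # assumes A >= 0 with no zero rows/cols
    col_sums = A.sum(axis=0)
    return x + lam * (A.T @ ((b - A @ x) / row_sums)) / col_sums

def tv(x):
    """Discrete total variation of a 1-D signal (stand-in for the PRISM prior)."""
    return np.abs(np.diff(x)).sum()

def superiorized_sart(A, b, n_iter=50, beta=1.0, gamma=0.9):
    x = np.zeros(A.shape[1])
    for k in range(n_iter):
        # Perturbation phase: step along a descent direction of the TV surrogate.
        g = np.zeros_like(x)
        g[:-1] += np.sign(x[:-1] - x[1:])
        g[1:] -= np.sign(x[:-1] - x[1:])
        norm = np.linalg.norm(g)
        if norm > 0:
            x = x - beta * (gamma ** k) * g / norm   # diminishing step sizes
        # Basic-algorithm phase: one SART sweep (perturbation resilience).
        x = sart_step(x, A, b)
    return x

# Toy problem: recover a piecewise-constant signal from noisy projections.
rng = np.random.default_rng(0)
x_true = np.repeat([1.0, 3.0, 2.0], 10)
A = rng.uniform(0.0, 1.0, size=(60, x_true.size))
b = A @ x_true + 0.01 * rng.standard_normal(60)
x_rec = superiorized_sart(A, b)
print("TV true/rec:", tv(x_true), round(tv(x_rec), 2))
```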

  5. The controlled growth method - A tool for structural optimization

    NASA Technical Reports Server (NTRS)

    Hajela, P.; Sobieszczanski-Sobieski, J.

    1981-01-01

    An adaptive design variable linking scheme in a NLP based optimization algorithm is proposed and evaluated for feasibility of application. The present scheme, based on an intuitive effectiveness measure for each variable, differs from existing methodology in that a single dominant variable controls the growth of all others in a prescribed optimization cycle. The proposed method is implemented for truss assemblies and a wing box structure for stress, displacement and frequency constraints. Substantial reduction in computational time, even more so for structures under multiple load conditions, coupled with a minimal accompanying loss in accuracy, vindicates the algorithm.

  6. Functional entropy variables: A new methodology for deriving thermodynamically consistent algorithms for complex fluids, with particular reference to the isothermal Navier–Stokes–Korteweg equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ju, E-mail: jliu@ices.utexas.edu; Gomez, Hector; Evans, John A.

    2013-09-01

    We propose a new methodology for the numerical solution of the isothermal Navier–Stokes–Korteweg equations. Our methodology is based on a semi-discrete Galerkin method invoking functional entropy variables, a generalization of classical entropy variables, and a new time integration scheme. We show that the resulting fully discrete scheme is unconditionally stable-in-energy, second-order time-accurate, and mass-conservative. We utilize isogeometric analysis for spatial discretization and verify the aforementioned properties by adopting the method of manufactured solutions and comparing coarse mesh solutions with overkill solutions. Various problems are simulated to show the capability of the method. Our methodology provides a means of constructing unconditionally stable numerical schemes for nonlinear non-convex hyperbolic systems of conservation laws.

  7. An evaluation of NASA's program in human factors research: Aircrew-vehicle system interaction

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Research in human factors in the aircraft cockpit and a proposed program augmentation were reviewed. The dramatic growth of microprocessor technology makes it entirely feasible to automate increasingly more functions in the aircraft cockpit; the promise of improved vehicle performance, efficiency, and safety through automation makes highly automated flight inevitable. An organized data base and validated methodology for predicting the effects of automation on human performance and thus on safety are lacking and without such a data base and validated methodology for analyzing human performance, increased automation may introduce new risks. Efforts should be concentrated on developing methods and techniques for analyzing man machine interactions, including human workload and prediction of performance.

  8. Model-based testing with UML applied to a roaming algorithm for bluetooth devices.

    PubMed

    Dai, Zhen Ru; Grabowski, Jens; Neukirchen, Helmut; Pals, Holger

    2004-11-01

    In late 2001, the Object Management Group issued a Request for Proposal to develop a testing profile for UML 2.0. In June 2003, the work on the UML 2.0 Testing Profile was finally adopted by the OMG. Since March 2004, it has become an official standard of the OMG. The UML 2.0 Testing Profile provides support for UML based model-driven testing. This paper introduces a methodology on how to use the testing profile in order to modify and extend an existing UML design model for test issues. The application of the methodology will be explained by applying it to an existing UML Model for a Bluetooth device.

  9. Cultural Intelligence to Overcome Educational Exclusion

    ERIC Educational Resources Information Center

    Oliver, Esther; de Botton, Lena; Soler, Marta; Merrill, Barbara

    2011-01-01

    The critical communicative methodology (CCM) is based on the premise that each person, regardless of their educational, cultural, or socioeconomic background, possesses cultural intelligence. That is, they can analyze their experiences of being excluded and propose ways to reverse their situation. In this article, we explore the fact that…

  10. Feedback Effects of Teaching Quality Assessment: Macro and Micro Evidence

    ERIC Educational Resources Information Center

    Bianchini, Stefano

    2014-01-01

    This study investigates the feedback effects of teaching quality assessment. Previous literature looked separately at the evolution of individual and aggregate scores to understand whether instructor and university performance depend on past evaluations. I propose a new quantitative-based methodology, combining statistical distributions and…

  11. Method for Analysis of Dyadic Communication in Novels.

    ERIC Educational Resources Information Center

    DeHart, Florence E.

    A systematic approach for analysis of dyadic communication in literary works is proposed which is based on a work by Watzlawick, Beavin, and Jackson. This interdisciplinary methodology using behavioral science approaches to analyze literature consists primarily in studying relationship aspects of dyadic communication, as differentiated from…

  12. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for the determination of persistent organic pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Whole-Volume Clustering of Time Series Data from Zebrafish Brain Calcium Images via Mixture Modeling.

    PubMed

    Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L

    2018-02-01

    Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified, and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
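
    The clustering step can be illustrated with scikit-learn's Gaussian mixture implementation. The synthetic "voxel time series" below are an assumption standing in for calcium imaging data; the paper's own estimation procedure is more specialized:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic stand-in for calcium imaging: 500 voxel time series of length 40,
# drawn from three latent temporal response profiles plus noise.
t = np.linspace(0, 1, 40)
profiles = np.stack([np.sin(2 * np.pi * t),
                     np.exp(-4 * t),
                     np.zeros_like(t)])
labels_true = rng.integers(0, 3, size=500)
X = profiles[labels_true] + 0.3 * rng.standard_normal((500, 40))

# Mixture-model clustering of whole time series; diagonal covariances keep
# the fit tractable, and BIC can be used to select the number of components.
gmm = GaussianMixture(n_components=3, covariance_type="diag",
                      random_state=0).fit(X)
labels_hat = gmm.predict(X)
print("BIC:", round(gmm.bic(X), 1))
```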

  14. An Imager Gaussian Process Machine Learning Methodology for Cloud Thermodynamic Phase classification

    NASA Astrophysics Data System (ADS)

    Marchant, B.; Platnick, S. E.; Meyer, K.

    2017-12-01

    The determination of cloud thermodynamic phase from MODIS and VIIRS instruments is an important first step in cloud optical retrievals, since ice and liquid clouds have different optical properties. To continue improving the cloud thermodynamic phase classification algorithm, a machine-learning approach, based on Gaussian processes, has been developed. The new proposed methodology provides cloud phase uncertainty quantification and improves the algorithm portability between MODIS and VIIRS. We will present new results, through comparisons between MODIS and CALIOP v4, and for VIIRS as well.
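
    A hedged sketch of the core idea, using scikit-learn's Gaussian process classifier on made-up two-band features; the actual algorithm is trained on far richer MODIS/VIIRS inputs. The posterior probability it returns is what provides the uncertainty quantification mentioned in the abstract:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
# Hypothetical two-band features standing in for imager radiances;
# class 0 = liquid, 1 = ice (purely synthetic, not MODIS/VIIRS data).
X_liquid = rng.normal([0.8, -1.0], 0.3, size=(200, 2))
X_ice = rng.normal([0.3, 1.5], 0.4, size=(200, 2))
X = np.vstack([X_liquid, X_ice])
y = np.array([0] * 200 + [1] * 200)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0).fit(X, y)
# The posterior class probability doubles as a per-pixel uncertainty estimate,
# the key advantage cited for the Gaussian-process approach.
proba = gpc.predict_proba(np.array([[0.5, 0.2]]))
print("P(liquid), P(ice):", proba.round(3))
```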

  15. A methodological survey identified eight proposed frameworks for the adaptation of health related guidelines.

    PubMed

    Darzi, Andrea; Abou-Jaoude, Elias A; Agarwal, Arnav; Lakis, Chantal; Wiercioch, Wojtek; Santesso, Nancy; Brax, Hneine; El-Jardali, Fadi; Schünemann, Holger J; Akl, Elie A

    2017-06-01

    Our objective was to identify and describe published frameworks for adaptation of clinical, public health, and health services guidelines. We included reports describing methods of adaptation of guidelines in sufficient detail to allow its reproducibility. We searched Medline and EMBASE databases. We also searched personal files, as well manuals and handbooks of organizations and professional societies that proposed methods of adaptation and adoption of guidelines. We followed standard systematic review methodology. Our search captured 12,021 citations, out of which we identified eight proposed methods of guidelines adaptation: ADAPTE, Adapted ADAPTE, Alberta Ambassador Program adaptation phase, GRADE-ADOLOPMENT, MAGIC, RAPADAPTE, Royal College of Nursing (RCN), and Systematic Guideline Review (SGR). The ADAPTE framework consists of a 24-step process to adapt guidelines to a local context taking into consideration the needs, priorities, legislation, policies, and resources. The Alexandria Center for Evidence-Based Clinical Practice Guidelines updated one of ADAPTE's tools, modified three tools, and added three new ones. In addition, they proposed optionally using three other tools. The Alberta Ambassador Program adaptation phase consists of 11 steps and focused on adapting good-quality guidelines for nonspecific low back pain into local context. GRADE-ADOLOPMENT is an eight-step process based on the GRADE Working Group's Evidence to Decision frameworks and applied in 22 guidelines in the context of national guideline development program. The MAGIC research program developed a five-step adaptation process, informed by ADAPTE and the GRADE approach in the context of adapting thrombosis guidelines. The RAPADAPTE framework consists of 12 steps based on ADAPTE and using synthesized evidence databases, retrospectively derived from the experience of producing a high-quality guideline for the treatment of breast cancer with limited resources in Costa Rica. The RCN outlines five key steps strategy for adaptation of guidelines to the local context. The SGR method consists of nine steps and takes into consideration both methodological gaps and context-specific normative issues in source guidelines. We identified through searching personal files two abandoned methods. We identified and described eight proposed frameworks for the adaptation of health-related guidelines. There is a need to evaluate these different frameworks to assess rigor, efficiency, and transparency of their proposed processes. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Ergonomics and design: traffic sign and street name sign.

    PubMed

    Moroni, Janaina Luisa da Silva; Aymone, José Luís Farinatti

    2012-01-01

    This work proposes a design methodology using ergonomics and anthropometry concepts applied to traffic sign and street name sign projects. Initially, a literature revision on cognitive ergonomics and anthropometry is performed. Several authors and their design methodologies are analyzed and the aspects to be considered in projects of traffic and street name signs are selected and other specific aspects are proposed for the design methodology. A case study of the signs of "Street of Antiques" in Porto Alegre city is presented. To do that, interviews with the population are made to evaluate the current situation of signs. After that, a new sign proposal with virtual prototyping is done using the developed methodology. The results obtained with new interviews about the proposal show the user satisfaction and the importance of cognitive ergonomics to development of this type of urban furniture.

  17. Engineering of the institutionalization of the circular economy at the level of casting production

    NASA Astrophysics Data System (ADS)

    Vescan, M. M.; Soporan, V. F.; Crișan, D. M.; Lehene, T. R.; Pădurețu, S.; Samuila, V.

    2017-06-01

    This paper is motivated by the necessity of introducing the principles of circular economy at the level of different social - economic activities, and from this point of view one of the fields with a special potential is that of the manufacture of castings. Objective: to connect to the organizing and application of the methodology of the circular economy principles. The proposed method is an innovating one, being connected to the use of institutionalization engineering. Formulating the subject: The subject formulated to be solved aims at the introduction of new approaches, defined through institutionalization engineering, which proposes to set the correlation of actions between the specifics of the circular economy and the specific elements of the manufacture of castings. Research method: An institutional structuring operation was imposed for the optimization of the research method, in which different versions interact at the following levels: the level of public policies, the level of the regulatory framework, the level of technical solutions and the level of financing solutions and financial instruments. The determination of the optimal solution established in a dynamic context, favorable for the requirements of the different actors present within the process, appeals to the elements of critical thinking, specific for the engineer’s actions. Achievement of the research activity: The research activity structures a methodology of quantifying the contributions of each stage of the manufacturing process for castings at the fulfilling of the specific conditions of the circular economy, indicating the critical areas of action for more efficient actions of the circular economy, according to the market economy requirements, where there is a potential of implementing the technical solutions by quantizing the financial solutions and the opportunity of using the financial instruments. The major contribution of the research: The proposed methodology, with examples at the level of castings manufacture, sets the bases of a new field of action of the engineering thinking, namely, that of circular economy institutionalization functioning. Conclusions of the research activity: The proposed methodology represents the bases of establishing a new instrument of action at the level of institutionalized functioning of the circular economy.

  18. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements

    PubMed Central

    do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-01-01

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions as the acrylic support used for positioning the film, but at a source-to-detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertical, that is, perpendicular to the phantom. To validate this procedure, a Monte Carlo simulation using the PENELOPE code was first performed to evaluate the differences between the dose distributions measured by the film at SDDs of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points passing the gamma analysis comparing the dose distributions acquired at the two SDDs was 99.92%±0.14%. In the simple dose distribution tests, the mean percentage of points passing the gamma analysis was 99.85%±0.26%, and the mean percentage difference in the normalization point doses was −1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS number: 87.55.Qr, 87.55.km, 87.55.N‐ PMID:26699306
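
    The pass rates quoted above come from gamma analysis, which can be sketched in a few lines. This is a generic 1-D global gamma implementation under assumed 3%/3 mm tolerances, not the authors' evaluation code:

```python
import numpy as np

def gamma_index(x, dose_eval, dose_ref, dd=0.03, dta=3.0):
    """1-D global gamma analysis: dd = dose tolerance (fraction of the max
    reference dose), dta = distance-to-agreement tolerance in mm."""
    d_norm = dd * dose_ref.max()
    gam = np.empty_like(dose_ref)
    for i, (xi, dri) in enumerate(zip(x, dose_ref)):
        dose_term = (dose_eval - dri) ** 2 / d_norm ** 2
        dist_term = (x - xi) ** 2 / dta ** 2
        gam[i] = np.sqrt(np.min(dose_term + dist_term))
    return gam

x = np.arange(0.0, 100.0, 1.0)                        # positions in mm
dose_ref = np.exp(-((x - 50.0) / 15.0) ** 2)          # planned profile
dose_eval = 1.02 * np.exp(-((x - 50.5) / 15.0) ** 2)  # measured profile
gam = gamma_index(x, dose_eval, dose_ref)
print("pass rate:", round(100.0 * np.mean(gam <= 1.0), 2), "%")
```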

  19. Representations of everyday life: a proposal for capturing social values from the Marxist perspective of knowledge production.

    PubMed

    Soares, Cássia Baldini; Santos, Vilmar Ezequiel Dos; Campos, Célia Maria Sivalli; Lachtim, Sheila Aparecida Ferreira; Campos, Fernanda Cristina

    2011-12-01

    We propose, from the Marxist perspective of the construction of knowledge, a theoretical and methodological framework for understanding social values by capturing everyday representations. We assume that scientific research brings together different dimensions (epistemological, theoretical and methodological) and, consistently with these, proposes a set of operating procedures and techniques for capturing and analyzing the reality under study in order to expose the investigated object. The study of values reveals their essential role in the formation of judgments and choices: there are values that reflect the dominant ideology, spanning all social classes, but there are also values that reflect class interests; these are not universal, being formed in relationships and social activities. Based on the Marxist theory of consciousness, representations are discursive formulations of everyday life (opinions or convictions) issued by subjects about their reality, providing a coherent way of understanding and exposing social values: focus groups are suitable for capturing opinions, while interviews show potential for exposing convictions.

  20. Speech Enhancement of Mobile Devices Based on the Integration of a Dual Microphone Array and a Background Noise Elimination Algorithm.

    PubMed

    Chen, Yung-Yue

    2018-05-08

    Mobile devices are often used in our daily lives for the purposes of speech and communication. The speech quality of mobile devices is always degraded by the environmental noises surrounding mobile device users. Unfortunately, an effective background noise reduction solution cannot easily be developed for this speech enhancement problem. For these reasons, a methodology is systematically proposed to eliminate the effects of background noises on the speech communication of mobile devices. This methodology integrates a dual microphone array with a background noise elimination algorithm. The proposed background noise elimination algorithm includes a whitening process, a speech modelling method and an H₂ estimator. Thanks to the adoption of the dual microphone array, a low-cost design can be obtained for the speech enhancement of mobile devices. Practical tests have proven that the proposed method is immune to random background noises, and that noiseless speech can be obtained after executing the denoising process.
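
    As a rough illustration of background noise elimination, the sketch below applies classic spectral subtraction with a noise spectrum estimated from a leading noise-only segment. It deliberately stands in for, and is much simpler than, the whitening plus speech-modelling plus H₂ pipeline the paper proposes:

```python
import numpy as np
from scipy.signal import stft, istft

def spectral_subtract(noisy, fs, noise_seconds=0.5):
    """Subtract an average noise magnitude spectrum estimated from the
    leading noise-only segment; a crude baseline, not the H2 estimator."""
    f, t, Z = stft(noisy, fs=fs, nperseg=512)
    n_frames = max(1, int(noise_seconds * fs / 256))   # 50% overlap, hop = 256
    noise_mag = np.abs(Z[:, :n_frames]).mean(axis=1, keepdims=True)
    mag = np.maximum(np.abs(Z) - noise_mag, 0.05 * np.abs(Z))  # spectral floor
    _, clean = istft(mag * np.exp(1j * np.angle(Z)), fs=fs, nperseg=512)
    return clean

fs = 16000
rng = np.random.default_rng(3)
t = np.arange(fs) / fs
speech = np.sin(2 * np.pi * 220 * t) * (t > 0.5)   # toy "speech" after 0.5 s
noisy = speech + 0.2 * rng.standard_normal(fs)
enhanced = spectral_subtract(noisy, fs)
```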

  1. Developing an ethical code for engineers: the discursive approach.

    PubMed

    Lozano, J Félix

    2006-04-01

    From the Hippocratic Oath on, deontological codes and other professional self-regulation mechanisms have been used to legitimize and identify professional groups. New technological challenges and, above all, changes in the socioeconomic environment require adaptable codes which can respond to new demands. We assume that ethical codes for professionals should not simply focus on regulative functions, but must also consider ideological and educative functions. Any adaptations should take into account both contents (values, norms and recommendations) and the drafting process itself. In this article we propose a process for developing a professional ethical code for an official professional association (Colegio Oficial de Ingenieros Industriales de Valencia, COIIV), starting from the philosophical assumptions of discursive ethics but adapting them to critical hermeneutics. Our proposal is based on the Integrity Approach rather than the Compliance Approach. A process aiming to achieve an effective ethical document that fulfils regulative and ideological functions requires a participative, dialogical and reflexive methodology. This process must respond to moral exigencies and to demands for efficiency and professional effectiveness. In addition to the methodological proposal, we present our experience of producing an ethical code for the industrial engineers' association in Valencia (Spain), where this methodology was applied, and we evaluate the problems detected and the future potential.

  2. An experimental procedure to determine heat transfer properties of turbochargers

    NASA Astrophysics Data System (ADS)

    Serrano, J. R.; Olmeda, P.; Páez, A.; Vidal, F.

    2010-03-01

    Heat transfer phenomena in turbochargers have been a subject of investigation because of their importance for the correct determination of the compressor's real work when modelling. The commonly stated assumption that turbochargers are adiabatic during normal engine operation has been re-evaluated, because many studies have reported important deviations from adiabatic behaviour, especially when the turbocharger runs at low rotational speeds and loads. These deviations prevent a proper assessment of turbine and compressor efficiencies, since purely aerodynamic effects cannot be separated from undesired heat transfer when both phenomena are present during turbocharger operation. Correcting for these effects is necessary to feed engine models with reliable information and thereby increase the quality of the results of any modelling process. The present work proposes a thermal characterization methodology, based on the physics of the turbocharger, that was successfully applied to a passenger-car turbocharger. Its application helps in understanding the thermal behaviour of the turbocharger, and the results constitute vital information for future modelling efforts that draw on the proposed methodology. The conductance values obtained have been used to correct a procedure for measuring the mechanical efficiency of the tested turbocharger.

  3. A reference case for economic evaluations in osteoarthritis: an expert consensus article from the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis (ESCEO).

    PubMed

    Hiligsmann, Mickaël; Cooper, Cyrus; Guillemin, Francis; Hochberg, Marc C; Tugwell, Peter; Arden, Nigel; Berenbaum, Francis; Boers, Maarten; Boonen, Annelies; Branco, Jaime C; Maria-Luisa, Brandi; Bruyère, Olivier; Gasparik, Andrea; Kanis, John A; Kvien, Tore K; Martel-Pelletier, Johanne; Pelletier, Jean-Pierre; Pinedo-Villanueva, Rafael; Pinto, Daniel; Reiter-Niesert, Susanne; Rizzoli, René; Rovati, Lucio C; Severens, Johan L; Silverman, Stuart; Reginster, Jean-Yves

    2014-12-01

    General recommendations for a reference case for economic studies in rheumatic diseases were published in 2002 in an initiative to improve the comparability of cost-effectiveness studies in the field. Since then, economic evaluations in osteoarthritis (OA) continue to show considerable heterogeneity in methodological approach. To develop a reference case specific for economic studies in OA, including the standard optimal care, with which to judge new pharmacologic and non-pharmacologic interventions. Four subgroups of an ESCEO expert working group on economic assessments (13 experts representing diverse aspects of clinical research and/or economic evaluations) were charged with producing lists of recommendations that would potentially improve the comparability of economic analyses in OA: outcome measures, comparators, costs and methodology. These proposals were discussed and refined during a face-to-face meeting in 2013. They are presented here in the format of the recommendations of the recently published Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement, so that an initiative on economic analysis methodology might be consolidated with an initiative on reporting standards. Overall, three distinct reference cases are proposed, one for each hand, knee and hip OA; with diagnostic variations in the first two, giving rise to different treatment options: interphalangeal or thumb-based disease for hand OA and the presence or absence of joint malalignment for knee OA. A set of management strategies is proposed, which should be further evaluated to help establish a consensus on the "standard optimal care" in each proposed reference case. The recommendations on outcome measures, cost itemisation and methodological approaches are also provided. The ESCEO group proposes a set of disease-specific recommendations on the conduct and reporting of economic evaluations in OA that could help the standardisation and comparability of studies that evaluate therapeutic strategies of OA in terms of costs and effectiveness. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps

    NASA Astrophysics Data System (ADS)

    Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.

    2012-05-01

    Typical ultrasonic methodology for nondestructive scanning evaluation uses systematic scanning paths. In many cases, this approach is time-inefficient and consumes energy and computational power. Here, a methodology for the scanning of defects using an ultrasonic echo-pulse scanning technique combined with chaotic trajectory generation is proposed. This is implemented in a Cartesian coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve detection probability, our proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than traditional ones used to localize and characterize hidden flaws.
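
    The trajectory-generation idea can be sketched with a logistic map; the specific chaotic function and mirror mapping used by the authors are not given in the abstract, so both below are assumptions for illustration:

```python
import numpy as np

def logistic_map(x, r=4.0):
    """Fully chaotic logistic map on (0, 1)."""
    return r * x * (1.0 - x)

def chaotic_scan_points(n, x0=0.123, y0=0.357):
    """Generate n chaotic (x, y) scan positions on the unit square.
    The mirror mapping here is a guess at the paper's idea: reflecting
    alternate points so trajectories also cover the square's far half."""
    pts, x, y = [], x0, y0
    for i in range(n):
        x, y = logistic_map(x), logistic_map(y)
        px, py = (x, y) if i % 2 == 0 else (1.0 - x, 1.0 - y)  # mirror map
        pts.append((px, py))
    return np.array(pts)

pts = chaotic_scan_points(2000)
# Quick uniformity check: fraction of 10x10 cells visited at least once.
hist, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=10,
                            range=[[0, 1], [0, 1]])
print("area coverage:", np.mean(hist > 0))
```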

  5. Information security system based on virtual-optics imaging methodology and public key infrastructure

    NASA Astrophysics Data System (ADS)

    Peng, Xiang; Zhang, Peng; Cai, Lilong

    In this paper, we present a virtual-optical based information security system model with the aid of public-key-infrastructure (PKI) techniques. The proposed model employs a hybrid architecture in which our previously published encryption algorithm based on virtual-optics imaging methodology (VOIM) can be used to encipher and decipher data while an asymmetric algorithm, for example RSA, is applied for enciphering and deciphering the session key(s). For an asymmetric system, given an encryption key, it is computationally infeasible to determine the decryption key and vice versa. The whole information security model is run under the framework of PKI, which is on basis of public-key cryptography and digital signatures. This PKI-based VOIM security approach has additional features like confidentiality, authentication, and integrity for the purpose of data encryption under the environment of network.
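
    The hybrid architecture described here (a symmetric cipher for the data, an asymmetric one for the session key) follows the standard envelope pattern. The sketch below uses Python's cryptography package, with Fernet standing in for the VOIM cipher, which is not publicly available as a library:

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Receiver's RSA key pair (a PKI would normally certify the public key).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: encrypt the bulk data with a fresh session key, then wrap the
# session key with RSA-OAEP (the asymmetric leg of the hybrid scheme).
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"optical image payload")
wrapped_key = public_key.encrypt(
    session_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))

# Receiver: unwrap the session key, then decrypt the data.
recovered_key = private_key.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))
plaintext = Fernet(recovered_key).decrypt(ciphertext)
assert plaintext == b"optical image payload"
```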

  6. Comparison of calibration strategies for optical 3D scanners based on structured light projection using a new evaluation methodology

    NASA Astrophysics Data System (ADS)

    Bräuer-Burchardt, Christian; Ölsner, Sandy; Kühmstedt, Peter; Notni, Gunther

    2017-06-01

    In this paper a new evaluation strategy for optical 3D scanners based on structured light projection is introduced. It can be used for the characterization of the expected measurement accuracy. Compared to the procedure proposed in the VDI/VDE guidelines for optical 3D measurement systems based on area scanning it requires less effort and provides more impartiality. The methodology is suitable for the evaluation of sets of calibration parameters, which mainly determine the quality of the measurement result. It was applied to several calibrations of a mobile stereo camera based optical 3D scanner. The performed calibrations followed different strategies regarding calibration bodies and arrangement of the observed scene. The results obtained by the different calibration strategies are discussed and suggestions concerning future work on this area are given.

  7. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    PubMed

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This scoring system may also be used to appraise the methodological quality of qualitative, quantitative and mixed methods components of mixed methods research.

  8. Moving object detection using dynamic motion modelling from UAV aerial images.

    PubMed

    Saif, A F M Saifuddin; Prabuwono, Anton Satria; Mahayuddin, Zainal Rasyid

    2014-01-01

    Motion analysis based moving object detection from UAV aerial images is still an unsolved issue, owing to the lack of proper motion estimation. Existing moving object detection approaches for UAV aerial images do not use motion-based pixel intensity measurements to detect moving objects robustly, and current research mostly depends on either frame differencing or segmentation alone. This research has two main purposes: first, to develop a new motion model called DMM (dynamic motion model); and second, to apply the proposed segmentation approach SUED (segmentation using edge-based dilation) with frame differencing embedded together with the DMM model. The proposed DMM model provides effective search windows, based on the highest pixel intensities, so that SUED segments only the specific areas likely to contain moving objects rather than searching the whole frame. At each stage of the proposed scheme, the experimental fusion of DMM and SUED extracts moving objects faithfully. Experimental results demonstrate the validity of the proposed methodology.
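
    A minimal sketch of the frame-differencing-plus-dilation idea, with a search window centred on the strongest intensity change as a crude stand-in for the DMM search windows. Thresholds and window sizes are arbitrary assumptions:

```python
import numpy as np
from scipy import ndimage

def detect_moving_objects(prev_frame, curr_frame, thresh=25, dilate_iter=2):
    """Frame differencing followed by dilation-based segmentation, with a
    search window around the highest-intensity difference (DMM-like idea)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    # Search window centred on the strongest change, mimicking a restricted
    # search area instead of scanning the whole frame.
    r, c = np.unravel_index(np.argmax(diff), diff.shape)
    window = np.zeros_like(diff, dtype=bool)
    window[max(0, r - 40):r + 40, max(0, c - 40):c + 40] = True
    mask = (diff > thresh) & window
    mask = ndimage.binary_dilation(mask, iterations=dilate_iter)
    labels, n = ndimage.label(mask)
    return ndimage.find_objects(labels), n

rng = np.random.default_rng(4)
prev_frame = rng.integers(0, 40, (240, 320), dtype=np.uint8)
curr_frame = prev_frame.copy()
curr_frame[100:120, 150:180] += 120           # simulated moving object
boxes, n = detect_moving_objects(prev_frame, curr_frame)
print("objects found:", n)
```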

  9. SNPs selection using support vector regression and genetic algorithms in GWAS

    PubMed Central

    2014-01-01

    Introduction: This paper proposes a new methodology to simultaneously select the most relevant SNP markers for the characterization of any measurable phenotype described by a continuous variable, using Support Vector Regression with the Pearson Universal kernel (PUK) as the fitness function of a binary genetic algorithm. The proposed methodology is multi-attribute, considering several markers simultaneously to explain the phenotype, and is based jointly on statistical tools, machine learning and computational intelligence. Results: The suggested method showed potential in simulated database 1, with additive effects only, and in the real database. In this simulated database, with a total of 1,000 markers, of which 7 have a major effect on the phenotype and the other 993 SNPs represent noise, the method identified 21 markers; of these, 5 were among the 7 relevant SNPs, while 16 were false positives. In the real database, initially containing 50,752 SNPs, we reduced the set to 3,073 markers, increasing the accuracy of the model. In simulated database 2, with additive effects and interactions (epistasis), the proposed method matched the methodology most commonly used in GWAS. Conclusions: The method suggested in this paper demonstrates its effectiveness in explaining the real phenotype (PTA for milk): through the application of the genetic algorithm wrapper with Support Vector Regression and the Pearson Universal kernel, many redundant markers were eliminated, increasing the prediction accuracy of the model on the real database without quality control filters. The PUK demonstrated that it can replicate the performance of linear and RBF kernels. PMID:25573332
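
    The wrapper idea can be sketched compactly: a binary genetic algorithm whose fitness is the cross-validated score of an SVR trained on the selected markers. scikit-learn has no Pearson VII (PUK) kernel, so the RBF kernel stands in for it here, and the toy genotype data are simulated:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(5)
n_animals, n_snps = 120, 60                 # tiny stand-in for a GWAS panel
X = rng.integers(0, 3, size=(n_animals, n_snps)).astype(float)  # 0/1/2 genotypes
beta = np.zeros(n_snps); beta[:4] = [1.5, -1.0, 0.8, 0.6]       # 4 causal SNPs
y = X @ beta + rng.standard_normal(n_animals)

def fitness(mask):
    """Wrapper fitness: cross-validated SVR score on the selected SNPs."""
    if mask.sum() == 0:
        return -np.inf
    return cross_val_score(SVR(kernel="rbf", C=10.0),
                           X[:, mask], y, cv=3).mean()

pop = rng.random((30, n_snps)) < 0.1        # initial population of SNP masks
for gen in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]             # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(0, 10, 2)]
        cut = rng.integers(1, n_snps)
        child = np.concatenate([a[:cut], b[cut:]])      # one-point crossover
        flip = rng.random(n_snps) < 0.01                # bit-flip mutation
        children.append(np.logical_xor(child, flip))
    pop = np.array(children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected SNPs:", np.flatnonzero(best))
```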

  10. Assessing secondary soil salinization risk based on the PSR sustainability framework.

    PubMed

    Zhou, De; Lin, Zhulu; Liu, Liming; Zimmermann, David

    2013-10-15

    Risk assessment of secondary soil salinization, which is caused in part by the way people manage the land, is an essential challenge to agricultural sustainability. The objective of our study was to develop a soil salinity risk assessment methodology by selecting a consistent set of risk factors based on the conceptual Pressure-State-Response (PSR) sustainability framework and incorporating the grey relational analysis and the Analytic Hierarchy Process methods. The proposed salinity risk assessment methodology was demonstrated through a case study of developing composite risk index maps for the Yinchuan Plain, a major irrigation agriculture district in northwest China. Fourteen risk factors were selected in terms of the three PSR criteria: pressure, state, and response. The results showed that the salinity risk in the Yinchuan Plain was strongly influenced by the subsoil and groundwater salinity, land use, distance to irrigation canals, and depth to groundwater. To maintain agricultural sustainability in the Yinchuan Plain, a suite of remedial and preventative actions were proposed to manage soil salinity risk in the regions that are affected by salinity at different levels and by different salinization processes. The weight sensitivity analysis results also showed that the overall salinity risk of the Yinchuan Plain would increase or decrease as the weights for pressure or response risk factors increased, signifying the importance of human activities on secondary soil salinization. Ideally, the proposed methodology will help us develop more consistent management tools for risk assessment and management and for control of secondary soil salinization. Copyright © 2013 Elsevier Ltd. All rights reserved.
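
    The AHP component of such a methodology reduces to an eigenvector computation. The sketch below derives criterion weights and Saaty's consistency ratio from a hypothetical pressure/state/response comparison matrix, not from the paper's data:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (vals.real[k] - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index
    return w, ci / ri

# Hypothetical comparison: pressure judged twice as important as state and
# three times as important as response (reciprocal matrix).
M = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])
w, cr = ahp_weights(M)
print("weights:", w.round(3), "consistency ratio:", round(cr, 3))
```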

  11. LMI design method for networked-based PID control

    NASA Astrophysics Data System (ADS)

    Souza, Fernando de Oliveira; Mozelli, Leonardo Amaral; de Oliveira, Maurício Carvalho; Palhares, Reinaldo Martinez

    2016-10-01

    In this paper, we propose a methodology for the design of networked PID controllers for second-order delayed processes using linear matrix inequalities. The proposed procedure takes into account time-varying delay on the plant, time-varying delays induced by the network, and packet dropouts. The design is carried out entirely using a continuous-time model of the closed-loop system, where time-varying delays are used to represent the sampling and holding occurring in a discrete-time digital PID controller.
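
    The LMI machinery itself is easy to demonstrate. The sketch below solves a plain Lyapunov-inequality feasibility problem with cvxpy for a toy state matrix; the paper's actual LMIs additionally encode the time-varying delays and packet dropouts:

```python
import cvxpy as cp
import numpy as np

# Closed-loop state matrix of a toy second-order process (illustrative only).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                  # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]   # Lyapunov inequality
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("status:", prob.status)   # 'optimal' => a stability certificate P exists
```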

  12. A deep learning approach for the analysis of masses in mammograms with minimal user intervention.

    PubMed

    Dhungel, Neeraj; Carneiro, Gustavo; Bradley, Andrew P

    2017-04-01

    We present an integrated methodology for detecting, segmenting and classifying breast masses from mammograms with minimal user intervention. This is a long standing problem due to low signal-to-noise ratio in the visualisation of breast masses, combined with their large variability in terms of shape, size, appearance and location. We break the problem down into three stages: mass detection, mass segmentation, and mass classification. For the detection, we propose a cascade of deep learning methods to select hypotheses that are refined based on Bayesian optimisation. For the segmentation, we propose the use of deep structured output learning that is subsequently refined by a level set method. Finally, for the classification, we propose the use of a deep learning classifier, which is pre-trained with a regression to hand-crafted feature values and fine-tuned based on the annotations of the breast mass classification dataset. We test our proposed system on the publicly available INbreast dataset and compare the results with the current state-of-the-art methodologies. This evaluation shows that our system detects 90% of masses at 1 false positive per image, has a segmentation accuracy of around 0.85 (Dice index) on the correctly detected masses, and overall classifies masses as malignant or benign with sensitivity (Se) of 0.98 and specificity (Sp) of 0.7. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Structural design of composite rotor blades with consideration of manufacturability, durability, and manufacturing uncertainties

    NASA Astrophysics Data System (ADS)

    Li, Leihong

    A modular structural design methodology for composite blades is developed. This design method can be used to design composite rotor blades with sophisticated geometric cross-sections. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain the equivalent one-dimensional beam properties. Compared with traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first adds manufacturing constraints to the design optimization. The introduction of manufacturing constraints complicates the optimization process, but a design that respects them benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines fatigue analysis with the optimization process; the durability or fatigue analysis employs a strength-based model, and the design is subject to stiffness, frequency, and durability constraints. Finally, the impact of manufacturing uncertainty on rotor blade aeroelastic behavior is investigated, and a probabilistic design method is proposed to control the impact of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.

  14. The methodology for modeling queuing systems using Petri nets

    NASA Astrophysics Data System (ADS)

    Kotyrba, Martin; Gaj, Jakub; Tvarůžka, Matouš

    2017-07-01

    This paper deals with the use of Petri nets in the modeling and simulation of queuing systems. The first part focuses on explaining the basic concepts and properties of Petri nets and queuing systems. The proposed methodology for modeling queuing systems using Petri nets is described in the practical part and is tested on specific cases.
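
    A place/transition net for a single-server queue can be simulated in a few lines; the net structure below is a generic textbook example, not one of the paper's case studies:

```python
import random

# A minimal place/transition net for a single-server queue:
# arrivals put tokens in "waiting"; "serve" needs a waiting customer
# and the free server; "finish" releases the server.
marking = {"waiting": 0, "server_free": 1, "busy": 0, "done": 0}
transitions = {
    "arrive": ({}, {"waiting": 1}),
    "serve":  ({"waiting": 1, "server_free": 1}, {"busy": 1}),
    "finish": ({"busy": 1}, {"server_free": 1, "done": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= k for p, k in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, k in pre.items():
        marking[p] -= k
    for p, k in post.items():
        marking[p] += k

random.seed(0)
for _ in range(50):            # random interleaving firing semantics
    choices = [t for t in transitions if enabled(t)]
    fire(random.choice(choices))
print(marking)
```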

  15. Tracking by Identification Using Computer Vision and Radio

    PubMed Central

    Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez

    2013-01-01

    We present a novel system for the detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds, the excellent localization of computer vision and the strong identity information provided by the radio system, and is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for the evaluation of systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion of the two systems significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time successfully preventing the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485

  16. An efficient and provable secure revocable identity-based encryption scheme.

    PubMed

    Wang, Changji; Li, Yuan; Xia, Xiaonan; Zheng, Kangjia

    2014-01-01

    Revocation functionality is necessary and crucial to identity-based cryptosystems. Revocable identity-based encryption (RIBE) has attracted a lot of attention in recent years; many RIBE schemes have been proposed in the literature but have been shown to be either insecure or inefficient. In this paper, we propose a new scalable RIBE scheme with decryption key exposure resilience by combining Lewko and Waters' identity-based encryption scheme with the complete subtree method, and we prove our RIBE scheme to be semantically secure using the dual system encryption methodology. Compared to existing scalable and semantically secure RIBE schemes, our proposed RIBE scheme is more efficient in terms of ciphertext size, public parameter size and decryption cost, at the price of a slightly looser security reduction. To the best of our knowledge, this is the first construction of a scalable and semantically secure RIBE scheme with constant-size public system parameters.

  17. Overview of systematic reviews of therapeutic ranges: methodologies and recommendations for practice.

    PubMed

    Cooney, Lewis; Loke, Yoon K; Golder, Su; Kirkham, Jamie; Jorgensen, Andrea; Sinha, Ian; Hawcutt, Daniel

    2017-06-02

    Many medicines are dosed to achieve a particular therapeutic range, and monitored using therapeutic drug monitoring (TDM). The evidence base for a therapeutic range can be evaluated using systematic reviews, to ensure it continues to reflect current indications, doses, routes and formulations, as well as updated adverse effect data. There is no consensus on the optimal methodology for systematic reviews of therapeutic ranges. An overview of systematic reviews of therapeutic ranges was undertaken. The following databases were used: Cochrane Database of Systematic Reviews (CDSR), Database of Abstracts and Reviews of Effects (DARE) and MEDLINE. The published methodologies used when systematically reviewing the therapeutic range of a drug were analyzed. Step by step recommendations to optimize such systematic reviews are proposed. Ten systematic reviews that investigated the correlation between serum concentrations and clinical outcomes encompassing a variety of medicines and indications were assessed. There were significant variations in the methodologies used (including the search terms used, data extraction methods, assessment of bias, and statistical analyses undertaken). Therapeutic ranges should be population and indication specific and based on clinically relevant outcomes. Recommendations for future systematic reviews based on these findings have been developed. Evidence based therapeutic ranges have the potential to improve TDM practice. Current systematic reviews investigating therapeutic ranges have highly variable methodologies and there is no consensus of best practice when undertaking systematic reviews in this field. These recommendations meet a need not addressed by standard protocols.

  18. Calibration of CORSIM models under saturated traffic flow conditions.

    DOT National Transportation Integrated Search

    2013-09-01

    This study proposes a methodology to calibrate microscopic traffic flow simulation models. The proposed methodology has the capability to calibrate simultaneously all the calibration parameters as well as demand patterns for any network topology....

  19. Using ontological inference and hierarchical matchmaking to overcome semantic heterogeneity in remote sensing-based biodiversity monitoring

    NASA Astrophysics Data System (ADS)

    Nieland, Simon; Kleinschmit, Birgit; Förster, Michael

    2015-05-01

    Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.

  20. Peaks Over Threshold (POT): A methodology for automatic threshold estimation using goodness of fit p-value

    NASA Astrophysics Data System (ADS)

    Solari, Sebastián.; Egüen, Marta; Polo, María. José; Losada, Miguel A.

    2017-04-01

    Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
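
    A sketch of the automatic threshold selection described here: for each candidate threshold, fit a generalized Pareto distribution to the exceedances, compute the Anderson-Darling statistic, and estimate its p-value by parametric bootstrap. The candidate grid, bootstrap size and 5% level are illustrative assumptions:

```python
import numpy as np
from scipy.stats import genpareto

def ad_statistic(y, c, scale):
    """Anderson-Darling statistic of exceedances y against a fitted GPD."""
    z = np.sort(genpareto.cdf(y, c, loc=0, scale=scale))
    z = np.clip(z, 1e-12, 1 - 1e-12)
    n = len(z)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(z) + np.log(1 - z[::-1])))

def ad_pvalue(y, n_boot=200, seed=0):
    """Parametric-bootstrap p-value for the GPD goodness of fit."""
    rng = np.random.default_rng(seed)
    c, _, scale = genpareto.fit(y, floc=0)
    a2_obs = ad_statistic(y, c, scale)
    a2_boot = []
    for _ in range(n_boot):
        yb = genpareto.rvs(c, loc=0, scale=scale, size=len(y),
                           random_state=rng)
        cb, _, sb = genpareto.fit(yb, floc=0)
        a2_boot.append(ad_statistic(yb, cb, sb))
    return np.mean(np.array(a2_boot) >= a2_obs)

def automatic_threshold(x, candidates, alpha=0.05, min_exc=30):
    """Lowest candidate threshold whose exceedances pass the AD test."""
    for u in sorted(candidates):
        y = x[x > u] - u
        if len(y) >= min_exc and ad_pvalue(y) >= alpha:
            return u
    return None

rng = np.random.default_rng(6)
x = rng.gumbel(loc=10.0, scale=2.0, size=2000)    # synthetic "daily" series
print("threshold:", automatic_threshold(x, np.percentile(x, [80, 85, 90, 95])))
```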

  1. Experimental validation of an integrated controls-structures design methodology for a class of flexible space structures

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.

    1994-01-01

    This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.

  2. Nonlinear Prediction Model for Hydrologic Time Series Based on Wavelet Decomposition

    NASA Astrophysics Data System (ADS)

    Kwon, H.; Khalil, A.; Brown, C.; Lall, U.; Ahn, H.; Moon, Y.

    2005-12-01

    Traditionally, forecasting and characterization of hydrologic systems have been performed using many techniques. Stochastic linear methods such as AR and ARIMA, and nonlinear ones such as tools based on statistical learning theory, have been used extensively. The difficulty common to all methods is determining the sufficient and necessary information and predictors for a successful prediction. Relationships between hydrologic variables are often highly nonlinear and interrelated across temporal scales. A new hybrid approach is proposed for the simulation of hydrologic time series that combines the wavelet transform and a nonlinear model, drawing on the merits of both. The wavelet transform is adopted to decompose a hydrologic nonlinear process into a set of mono-component signals, which are then simulated by the nonlinear model. The hybrid methodology is formulated to improve the accuracy of long-term forecasting. The proposed hybrid model yields much better results in terms of capturing and reproducing the time-frequency properties of the system at hand, and prediction results are promising when compared to traditional univariate time series models. An application demonstrating the plausibility of the proposed methodology is provided, and the results show that a wavelet-based time series model can simulate and forecast hydrologic variables reasonably well, ultimately serving the purpose of integrated water resources planning and management.
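
    One way to realize such a hybrid, sketched under assumptions: decompose the series into mono-component signals with PyWavelets, forecast each component separately, and sum the forecasts. A linear AR model stands in below for the paper's nonlinear component models:

```python
import numpy as np
import pywt
from statsmodels.tsa.ar_model import AutoReg

def wavelet_components(x, wavelet="db4", level=3):
    """Split x into mono-component signals by reconstructing each set of
    wavelet coefficients separately (the components sum back to x)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c)
                for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet)[:len(x)])
    return comps

def hybrid_forecast(x, horizon=10, lags=5):
    """Forecast each wavelet component with an AR model and sum the results
    (AR stands in for the nonlinear component models)."""
    total = np.zeros(horizon)
    for comp in wavelet_components(x):
        fit = AutoReg(comp, lags=lags).fit()
        total += fit.forecast(horizon)
    return total

rng = np.random.default_rng(7)
t = np.arange(512)
x = np.sin(2 * np.pi * t / 64) + 0.5 * np.sin(2 * np.pi * t / 12) \
    + 0.2 * rng.standard_normal(512)
print(hybrid_forecast(x).round(2))
```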

  3. Enhancing the temporal resolution of satellite-based flood extent generation using crowdsourced data for disaster monitoring.

    NASA Astrophysics Data System (ADS)

    Panteras, G.; Cervone, G.

    2016-12-01

    Satellite-based disaster monitoring has been used extensively and successfully for numerous crisis response and impact delineation tasks. Remote sensing satellites routinely provide data during disasters for damage assessment and to coordinate relief operations. Although a plethora of satellite sensors can provide actionable data about an event, their temporal resolution is limited by satellite revisit times and the presence of clouds. These limitations do not allow for uninterrupted and timely monitoring, which is crucial during disasters and emergencies. This research presents an approach that leverages the increased temporal resolution of crowdsourced data to partially overcome the limitations of satellite data. The proposed approach focuses on the geostatistical analysis of Twitter data to help delineate the flood extent on a daily basis. The crowdsourced data are used to augment satellite imagery from EO-1 ALI, Landsat 8, WorldView-2 and WorldView-3 by fusing them together to complement the satellite observations. The proposed methodology was applied to estimate the daily flood extents in Charleston, SC, caused by hurricane Joaquin in October 2015. The results indicate that user-generated data can adequately bridge the temporal gaps in satellite-based observations and also increase the spatial resolution of the flood extents.

  4. Universal Verification Methodology Based Register Test Automation Flow.

    PubMed

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task, so an efficient way to perform verification with less effort in a shorter time is needed. In this work, we propose a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. Many commercial tools can generate a register model from a register specification described in IP-XACT, but describing the register specification in IP-XACT format is itself time-consuming. For easy creation of register models, we propose a spreadsheet-based register template that is translated into an IP-XACT description, from which register models can be generated with commercial tools. We also automate all the steps involved in integrating the test-bench and generating test cases, so that designers can use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow covers generating and connecting test-bench components (e.g., driver, checker, bus adaptor) and writing the test sequence for each type of register test case. With the proposed flow, designers can save a considerable amount of time when verifying register functionality.
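
    The spreadsheet-to-IP-XACT translation step can be sketched as follows. This hedged Python example turns a CSV register table into a minimal IP-XACT-flavoured XML fragment; the column names and the XML subset are illustrative assumptions, and the real IP-XACT schema is namespaced and far richer than shown here.

      import csv
      import io
      import xml.etree.ElementTree as ET

      def registers_to_xml(csv_text, block_name="ctrl_block"):
          """Emit an addressBlock element with one register entry per CSV row."""
          root = ET.Element("addressBlock")
          ET.SubElement(root, "name").text = block_name
          for row in csv.DictReader(io.StringIO(csv_text)):
              reg = ET.SubElement(root, "register")
              ET.SubElement(reg, "name").text = row["name"]
              ET.SubElement(reg, "addressOffset").text = row["offset"]
              ET.SubElement(reg, "size").text = row["width"]
              ET.SubElement(reg, "access").text = row["access"]
          return ET.tostring(root, encoding="unicode")

      # Example spreadsheet export (columns are an assumed template, not the paper's):
      sheet = "name,offset,width,access\nCTRL,0x00,32,read-write\nSTATUS,0x04,32,read-only\n"
      print(registers_to_xml(sheet))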

  5. A Bayesian maximum entropy-based methodology for optimal spatiotemporal design of groundwater monitoring networks.

    PubMed

    Hosseini, Marjan; Kerachian, Reza

    2017-09-01

    This paper presents a new methodology for analyzing the spatiotemporal variability of water table levels and redesigning a groundwater level monitoring network (GLMN) using the Bayesian Maximum Entropy (BME) technique and a multi-criteria decision-making approach based on ordered weighted averaging (OWA). Spatial sampling is determined using a hexagonal gridding pattern together with a newly proposed method that assigns a removal priority number to each pre-existing station. To design the temporal sampling, a new approach is applied that accounts for the uncertainty caused by a lack of information: different time lags are tested against an additional source of information, namely the simulation results of a numerical groundwater flow model. Furthermore, to incorporate the uncertainties in the available monitoring data, the flexibility of the BME interpolation technique is exploited by applying soft data, improving the accuracy of the calculations. The methodology is applied to the Dehgolan plain in northwestern Iran. Based on the results, a configuration of 33 monitoring stations on a regular hexagonal grid with a side length of 3600 m is proposed, with a time lag of 5 weeks between samples. Since the variance estimation errors of the BME method are almost identical for the redesigned and existing networks, the redesigned monitoring network is more cost-effective and efficient than the existing network of 52 stations with monthly sampling.
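
    The OWA aggregation at the heart of the multi-criteria ranking is simple enough to sketch directly: each alternative's criterion scores are sorted in descending order and combined with position weights rather than criterion weights. The criteria, scores, and weights below are illustrative assumptions, not values from the paper.

      import numpy as np

      def owa(scores, weights):
          """OWA: sort each alternative's criterion scores in descending order,
          then take the weighted sum with position (not criterion) weights."""
          s = np.sort(np.asarray(scores, dtype=float), axis=-1)[..., ::-1]
          return s @ np.asarray(weights, dtype=float)

      # Three stations scored on three criteria (e.g., estimation error reduction,
      # redundancy, access cost), with weights emphasizing each station's best score.
      stations = np.array([[0.8, 0.4, 0.6],
                           [0.5, 0.9, 0.3],
                           [0.7, 0.7, 0.2]])
      removal_priority = owa(stations, weights=[0.5, 0.3, 0.2])
      # Higher aggregate score -> higher removal priority in this toy ranking.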

  6. REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING ...

    EPA Pesticide Factsheets

    In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainties in the numerical estimates. In 2006, the National Research Council of the National Academy of Sciences released a report on the health risks from exposure to low levels of ionizing radiation. Cosponsored by the EPA and several other Federal agencies, Health Risks from Exposure to Low Levels of Ionizing Radiation BEIR VII Phase 2 (BEIR VII) primarily addresses cancer and genetic risks from low doses of low-LET radiation. In the draft White Paper: Modifying EPA Radiation Risk Models Based on BEIR VII (White Paper), ORIA proposed changes in EPA’s methodology for estimating radiogenic cancers, based on the contents of BEIR VII and some ancillary information. For the most part, it proposed to adopt the models and methodology recommended in BEIR VII; however, certain modifications and expansions are considered to be desirable or necessary for EPA’s purposes. EPA sought advice from the Agency’s Science Advisory Board on the application of BEIR VII and on issues relating to these modifications and expansions in the Advisory on EPA’s Draft White Paper: Modifying EPA Radiation Risk Models Based on BEIR VII (record # 83044). The SAB issued its Advisory on Jan. 31, 2008 (EPA-SAB-08-

  7. Ecoliteracy and a Place-Based Pedagogy: Expanding Latin@ Students' Critical Understanding of the Reciprocity between Sociocultural Systems and Ecosystems in the US-Mexico Border Region

    ERIC Educational Resources Information Center

    Gutierrez, Kristina A.

    2012-01-01

    This dissertation proposes a place-based theoretical and methodological framework, informed by concepts of ecology, multimodality, and activity systems. I apply this framework of ecoliteracy as it is defined within the interdisciplinary contexts of rhetoric and composition, linguistics, and Chicana/o studies. Ecoliteracy refers to individuals'…

  8. Reading the World's Classics Critically: A Keyword-Based Approach to Literary Analysis in Foreign Language Studies

    ERIC Educational Resources Information Center

    García, Nuria Alonso; Caplan, Alison

    2014-01-01

    While there are a number of important critical pedagogies being proposed in the field of foreign language study, more attention should be given to providing concrete examples of how to apply these ideas in the classroom. This article offers a new approach to the textual analysis of literary classics through the keyword-based methodology originally…

  9. Single Event Burnout in DC-DC Converters for the LHC Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Claudio H. Rivetta et al.

    High voltage transistors in DC-DC converters are prone to catastrophic Single Event Burnout in the LHC radiation environment. This paper presents a systematic methodology to analyze single event effects sensitivity in converters and proposes solutions based on de-rating input voltage and output current or voltage.

  10. Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency

    ERIC Educational Resources Information Center

    Kim, Yong; Chung, Min Gyo

    2008-01-01

    Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…

  11. Demonstrating Cost-Effective Marker Assisted Selection for Biomass Yield in Red Clover (Trifolium pratense L.) – Part 1: Paternity Testing

    USDA-ARS?s Scientific Manuscript database

    Many methods have been proposed to incorporate molecular markers into breeding programs. Presented is a cost effective marker assisted selection (MAS) methodology that utilizes individual plant phenotypes, seed production-based knowledge of maternity, and molecular marker-determined paternity. Proge...

  12. Affordance Analysis--Matching Learning Tasks with Learning Technologies

    ERIC Educational Resources Information Center

    Bower, Matt

    2008-01-01

    This article presents a design methodology for matching learning tasks with learning technologies. First a working definition of "affordances" is provided based on the need to describe the action potentials of the technologies (utility). Categories of affordances are then proposed to provide a framework for analysis. Following this, a…

  13. Antecedents of Philanthropic Behavior of Health Care Volunteers

    ERIC Educational Resources Information Center

    Alias, Siti Noormi; Ismail, Maimunah

    2015-01-01

    Purpose: This paper aims to propose a conceptual model of philanthropic behavior of volunteers in the health care sector. Design/methodology/approach: This study is based on an extensive review of past research on philanthropic behavior. To conduct the literature review, keywords such as philanthropy, philanthropic behavior, giving, donating,…

  14. A Multivariate Descriptive Model of Motivation for Orthodontic Treatment.

    ERIC Educational Resources Information Center

    Hackett, Paul M. W.; And Others

    1993-01-01

    Motivation for receiving orthodontic treatment was studied among 109 young adults, and a multivariate model of the process is proposed. The combination of smallest space analysis and Partial Order Scalogram Analysis by base Coordinates (POSAC) illustrates an interesting methodology for health treatment studies and explores motivation for dental…

  15. New Ideas and Approaches

    ERIC Educational Resources Information Center

    Lukov, V. A.

    2014-01-01

    The article examines theories of youth that have been proposed in the past few years by Russian scientists, and presents the author's original version of a theory of youth that is based on the thesaurus methodological approach. It addresses the ways in which biosocial characteristics may be reflected in new theories of youth.

  16. A Design Methodology for Complex (E)-Learning. Innovative Session.

    ERIC Educational Resources Information Center

    Bastiaens, Theo; van Merrienboer, Jeroen; Hoogveld, Bert

    Human resource development (HRD) specialists are searching for instructional design models that accommodate e-learning platforms. Van Merrienboer proposed the four-component instructional design model (4C/ID model) for competency-based education. The model's basic message is that well-designed learning environments can always be described in terms…

  17. Self-Knowledge, Capacity and Sensitivity: Prerequisites to Authentic Leadership by School Principals

    ERIC Educational Resources Information Center

    Begley, Paul T.

    2006-01-01

    Purpose: The article proposes three prerequisites to authentic leadership by school principals: self-knowledge, a capacity for moral reasoning, and sensitivity to the orientations of others. Design/methodology/approach: A conceptual framework, based on research on the valuation processes of school principals and their strategic responses to…

  18. Students' Self-Evaluation and Reflection (Part 1): "Measurement"

    ERIC Educational Resources Information Center

    Cambra-Fierro, Jesus; Cambra-Berdun, Jesus

    2007-01-01

    Purpose: The objective of the paper is the development and validation of scales to assess reflective learning. Design/methodology/approach: The research is based on a literature review plus in-classroom experience. For the scale validation process, exploratory and confirmatory analyses were conducted, following proposals made by Anderson and…

  19. A "View from Nowhen" on Time Perception Experiments

    ERIC Educational Resources Information Center

    Riemer, Martin; Trojan, Jorg; Kleinbohl, Dieter; Holzl, Rupert

    2012-01-01

    Systematic errors in time reproduction tasks have been interpreted as a misperception of time and therefore seem to contradict basic assumptions of pacemaker-accumulator models. Here we propose an alternative explanation of this phenomenon based on methodological constraints regarding the direction of time, which cannot be manipulated in…

  20. Waiting Online: A Review and Research Agenda.

    ERIC Educational Resources Information Center

    Ryan, Gerard; Valverde, Mireia

    2003-01-01

    Reviews 21 papers based on 13 separate empirical studies on waiting on the Internet, drawn from the areas of marketing, system response time, and quality of service studies. The article proposes an agenda for future research, including extending the range of research methodologies, broadening the definition of waiting on the Internet, and…
