[Ethical considerations about research with women in situations of violence].
Rafael, Ricardo de Mattos Russo; Soares de Moura, Anna Tereza Miranda
2013-01-01
This essay aims at reflecting on the ethical and methodological principles involved in research with women in situations of violence. The text raises the discussion of the application of the principles of beneficence and non-maleficence during research involving this issue, pointing to recommendations towards privacy, autonomy and immediate contributions for volunteers. Then, taking as theoretical reference the principles of justice and equity, the authors propose a debate on methodological aspects involved in the protection of respondents, with a view to improving the quality of the data obtained and possible social contributions.
1985-11-26
etc.). ... Major decisions involving reliability studies, based on competing risk methodology, have been made in the past and will continue to be made ... censoring mechanism. In such instances, the methodology for estimating relevant reliability probabilities has received considerable attention (cf. David ...). A proposal for a discussion of the general methodology ...
Diffusion and decay chain of radioisotopes in stagnant water in saturated porous media.
Guzmán, Juan; Alvarez-Ramirez, Jose; Escarela-Pérez, Rafael; Vargas, Raúl Alejandro
2014-09-01
The analysis of the diffusion of radioisotopes in stagnant water in saturated porous media is important to validate the performance of barrier systems used in radioactive repositories. In this work a methodology is developed to determine the radioisotope concentration in a two-reservoir configuration: a saturated porous medium with stagnant water is surrounded by two reservoirs. The concentrations are obtained for all the radioisotopes of the decay chain using the concept of overvalued concentration. A methodology, based on the variable separation method, is proposed for the solution of the transport equation. The novelty of the proposed methodology involves the factorization of the overvalued concentration into two factors: one that describes the diffusion without decay and another one that describes the decay without diffusion. It is possible with the proposed methodology to determine the required time to obtain equal injective and diffusive concentrations in reservoirs. In fact, this time is inversely proportional to the diffusion coefficient. In addition, the proposed methodology allows finding the required time to get a linear and constant space distribution of the concentration in porous media. This time is inversely proportional to the diffusion coefficient. In order to validate the proposed methodology, the distributions in the radioisotope concentrations are compared with other experimental and numerical works. Copyright © 2014 Elsevier Ltd. All rights reserved.
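As an illustrative sketch of the factorization idea in this abstract (our notation, not the authors'): for the first member of the decay chain, with decay constant λ and diffusion coefficient D, the concentration separates exactly into a decay-free diffusion factor and a diffusion-free decay factor.

```latex
% Sketch (our notation): factorization for the first chain member.
% Substituting C = \phi e^{-\lambda t} into the transport equation
% removes the decay term, leaving pure diffusion for \phi.
\begin{align*}
  \frac{\partial C}{\partial t} &= D\,\frac{\partial^{2} C}{\partial x^{2}} - \lambda C,\\
  C(x,t) &= \underbrace{\phi(x,t)}_{\text{diffusion without decay}}\;
           \underbrace{e^{-\lambda t}}_{\text{decay without diffusion}},
  \qquad
  \frac{\partial \phi}{\partial t} = D\,\frac{\partial^{2} \phi}{\partial x^{2}}.
\end{align*}
```

The diffusive factor equilibrates on a time scale proportional to L²/D for a domain of length L, consistent with the abstract's observation that the characteristic times are inversely proportional to the diffusion coefficient; daughter isotopes require the full decay-chain generalization.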
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems ideally should spend the least possible amount of time in their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain and lack of linearity, as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. Method comparison was achieved using different numbers of calibration points, and several nonlinear levels of the input signal. This paper also shows that the proposed method turned out to have a better overall accuracy than the other two methods. Besides the experimentation results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over the classic autocalibration methodologies, because it impacts on the design process of intelligent sensors, autocalibration methodologies and their associated factors, like time and cost.
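A minimal sketch of the kind of comparison this abstract describes: an ANN inverse model fitted to a few calibration points of a nonlinear sensor, against a polynomial baseline. The sigmoid sensor response, network topology, and number of calibration points below are our illustrative assumptions, not the authors' setup.

```python
# Sketch: ANN-based autocalibration vs. polynomial linearization.
# The sigmoid (thermistor-like) response is an illustrative assumption.
import numpy as np
from sklearn.neural_network import MLPRegressor

true_temp = np.linspace(0.0, 100.0, 200)            # ground truth (degC)
raw = 1.0 / (1.0 + np.exp(-(true_temp - 50) / 20))  # nonlinear sensor output

n_cal = 9                                           # calibration points
idx = np.linspace(0, len(raw) - 1, n_cal).astype(int)

# ANN inverse model: raw reading -> calibrated temperature
ann = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                   max_iter=20000, random_state=0)
ann.fit(raw[idx, None], true_temp[idx])

# Polynomial linearization baseline fitted to the same calibration points
poly = np.polynomial.Polynomial.fit(raw[idx], true_temp[idx], deg=3)

err_ann = np.max(np.abs(ann.predict(raw[:, None]) - true_temp))
err_poly = np.max(np.abs(poly(raw) - true_temp))
print(f"max error  ANN: {err_ann:.3f} degC   poly: {err_poly:.3f} degC")
```

Which method wins depends strongly on the nonlinearity level and the number of calibration points, which is precisely the comparison the paper systematizes.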
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
A methodology for overall consequence modeling in chemical industry.
Arunraj, N S; Maiti, J
2009-09-30
Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve time-consuming complex mathematical models and simple assimilation of losses without considering all the consequence factors. This leads to a deterioration in the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses within an optimal time, to improve the decision value of the estimated risk. The losses can be broadly categorized into production loss, assets loss, human health and safety loss, and environment loss. In this paper, a conceptual framework is developed to assess the overall consequence considering all the important components of major losses. Secondly, a methodology is developed for the calculation of all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study result using the proposed consequence assessment scheme is compared with that from the existing methodologies.
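A toy sketch of the aggregation step described above: the four loss categories named in the abstract, normalized and combined into an overall consequence index. The min-max normalization, ranges, and equal weights are our assumptions, not the paper's scheme.

```python
# Sketch: normalize the four loss categories named in the abstract and
# combine them into an overall consequence score. Min-max scaling, the
# assumed ranges, and equal weights are our assumptions.
losses = {"production": 2.1e6, "assets": 8.5e5,
          "human_health_safety": 3.2e6, "environment": 1.4e6}  # e.g. USD
bounds = {k: (0.0, 5.0e6) for k in losses}   # assumed plausible ranges

def normalize(value, lo, hi):
    return (value - lo) / (hi - lo)

overall = sum(normalize(v, *bounds[k]) for k, v in losses.items()) / len(losses)
print(f"overall consequence index: {overall:.3f}")  # 0 = none, 1 = worst case
```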
Stakeholder analysis methodologies resource book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babiuch, W.M.; Farhar, B.C.
1994-03-01
Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.
Conjugate gradient based projection - A new explicit methodology for frictional contact
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Li, Maocheng; Sha, Desong
1993-01-01
With special attention towards the applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementary formulations involving a conjugate gradient based projection methodology is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected to a feasible region determined by the non-negative constraint condition; this direction is then applied to the Fletcher-Reeves conjugate gradient method resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
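A compact sketch of a projection-type conjugate gradient iteration for the linear complementarity problem (LCP), the class of method this abstract describes. This simplified version (symmetric positive definite M, fixed iteration count, no restarts) is an illustration, not the authors' exact algorithm.

```python
# Projected conjugate-gradient sketch for the LCP: find z >= 0 with
# w = M z + q >= 0 and z^T w = 0 (M symmetric positive definite), via
# Fletcher-Reeves directions projected onto the non-negative orthant.
import numpy as np

def projected_cg_lcp(M, q, iters=200):
    z = np.zeros_like(q)
    g_prev = d = None
    for _ in range(iters):
        g = M @ z + q                       # gradient of 1/2 z'Mz + q'z
        if d is None:
            d = -g
        else:                               # Fletcher-Reeves update
            beta = (g @ g) / (g_prev @ g_prev)
            d = -g + beta * d
        alpha = -(g @ d) / (d @ M @ d)      # exact step for the quadratic
        z = np.maximum(z + alpha * d, 0.0)  # project onto feasible region
        g_prev = g
    return z

M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, -2.0])
z = projected_cg_lcp(M, q)
print("z =", z, " w =", M @ z + q)          # complementarity: z*w ~ 0
```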
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-23
... the large break loss-of-coolant accident (LOCA) analysis methodology with a reference to WCAP-16009-P... required by 10 CFR 50.91(a), the licensee has provided its analysis of the issue of no significant hazards... Section 5.6.5 to incorporate a new large break LOCA analysis methodology. Specifically, the proposed...
A Test Method for Monitoring Modulus Changes during Durability Tests on Building Joint Sealants
Christopher C. White; Donald L. Hunston; Kar Tean Tan; Gregory T. Schueneman
2012-01-01
The durability of building joint sealants is generally assessed using a descriptive methodology involving visual inspection of exposed specimens for defects. It is widely known that this methodology has inherent limitations, including that the results are qualitative. A new test method is proposed that provides more fundamental and quantitative information about...
Space Transportation Operations: Assessment of Methodologies and Models
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla
2001-01-01
The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.
Molinos-Senante, María; Hernández-Sancho, Francesc; Sala-Garrido, Ramón
2012-01-01
The concept of sustainability involves the integration of economic, environmental, and social aspects and this also applies in the field of wastewater treatment. Economic feasibility studies are a key tool for selecting the most appropriate option from a set of technological proposals. Moreover, these studies are needed to assess the viability of transferring new technologies from pilot-scale to full-scale. In traditional economic feasibility studies, the benefits that have no market price, such as environmental benefits, are not considered and are therefore underestimated. To overcome this limitation, we propose a new methodology to assess the economic viability of wastewater treatment technologies that considers internal and external impacts. The estimation of the costs is based on the use of cost functions. To quantify the environmental benefits from wastewater treatment, the distance function methodology is proposed to estimate the shadow price of each pollutant removed in the wastewater treatment. The application of this methodological approach by decision makers enables the calculation of the true costs and benefits associated with each alternative technology. The proposed methodology is presented as a useful tool to support decision making.
[Conceptual and methodological issues involved in the research field of diagnostic reasoning].
Di Persia, Francisco N
2016-05-01
The psychopathological field is crossed by dilemmas that call into question its methodological, conceptual and philosophical filiations. From the early works of Ey and Jaspers to the recent work of Berrios, the position of psychopathology within medicine in general, and within psychiatry in particular, has been in question, especially whether it should follow the principles of natural science or hold an autonomous position among them. This debate has led to two opposing positions reflecting two different models of psychopathology: the biomedical model and the socio-constructionist model. This work proposes to review the scope and difficulties involved in each model along two central axes: diagnostic reasoning and the conceptual problem of mental illness. Then, as a synthesis of the proposed analysis, central concepts of each model are identified that could allow the development of a hybrid model in psychopathology; among them, the comprehensive framework employed in symptom recognition and the social component that characterizes it are highlighted. In conclusion, these concepts are proposed as central aspects for the conceptual and methodological clarification of the research field of diagnostic reasoning in psychopathology.
ERIC Educational Resources Information Center
Ghedotti, Michael J.; Fielitz, Christopher; Leonard, Daniel J.
2005-01-01
This paper presents a teaching methodology involving an independent research project component for use in undergraduate Comparative Vertebrate Anatomy laboratory courses. The proposed project introduces cooperative, active learning in a research context to comparative vertebrate anatomy. This project involves pairs or groups of three students…
78 FR 76151 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-16
... better allocate resources. Methodological tests will continue to be designed to examine the feasibility... reporting. Research will involve focus groups, cognitive laboratory testing, customer satisfaction surveys...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... not listed on the Web site, but should note that the NRC's E-Filing system does not support unlisted... (COLR), to update the methodology reference list to support the core design with the new AREVA fuel... methodologies listed in Technical Specification 5.7.1.5 has no impact on any plant configuration or system...
ERIC Educational Resources Information Center
Rodríguez, Juan C.
2015-01-01
This article is a work proposal that aims to describe the methodology proposed by the Management of Personnel Management from a university in Lima, to implement a management model based on competencies whose traceability involves various technical HR processes practiced in the organization and is aligned to institutional outcomes defined in the…
Subsize specimen testing of nuclear reactor pressure vessel material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, A.S.; Rosinski, S.T.; Cannon, N.S.
1991-01-01
A new methodology is proposed to correlate the upper shelf energy (USE) of full size and subsize Charpy specimens of a nuclear reactor pressure vessel plate material, A533B. The methodology appears to be more satisfactory than the methodologies proposed earlier. USE of a notched-only specimen is partitioned into macro-crack initiation and crack propagation energies. USE of a notched and precracked specimen provides the crack propagation energy. ΔUSE, the difference between the USEs of notched-only and precracked specimens, is an estimate of the crack initiation energy. ΔUSE was normalized by a factor involving the dimensions of the Charpy specimen and the stress concentration factor at the notch root. The normalized values of the ΔUSE were found to be invariant with specimen size.
Semantic integration of gene expression analysis tools and data sources using software connectors.
Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G
2013-10-25
The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
NASA Astrophysics Data System (ADS)
Twardoch, Marek; Messai, Youcef; Vileno, Bertrand; Hoarau, Yannick; Mekki, Djamel E.; Felix, Olivier; Turek, Philippe; Weiss, Jean; Decher, Gero; Martel, David
2018-06-01
An experimental approach involving electron paramagnetic resonance is proposed for studying photo-generated reactive species in semiconductor nano-particle-based films deposited on the internal wall of glass capillaries. This methodology is applied here to nano-TiO2 and allows a semi-quantitative analysis of the kinetic evolutions of radical production using a spin scavenger probe.
On multi-site damage identification using single-site training data
NASA Astrophysics Data System (ADS)
Barthorpe, R. J.; Manson, G.; Worden, K.
2017-11-01
This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology represents an advancement on the state of the art in the field of multi-site damage identification, where existing methods require either: (1) full damaged state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site model predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure - a Piper Tomahawk trainer aircraft wing - with its performance compared to classifiers trained using the full damaged-state dataset.
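A toy sketch of the core idea: one binary Support Vector Classifier per damage site, trained only on single-site data, fused into a multi-site locator. The synthetic features and the argmax fusion rule are our assumptions; the paper discusses more careful ways of combining binary classifiers.

```python
# Sketch: one binary SVC per damage site, trained on single-site data only,
# fused by argmax of decision scores. Synthetic features stand in for the
# guided-wave features used on the real structure.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_sites, n_feat = 3, 4
centers = rng.normal(size=(n_sites, n_feat))    # per-site damage signatures

# Single-site training data: healthy (label 0) vs damage at site i (label 1)
healthy = rng.normal(scale=0.3, size=(60, n_feat))
clfs = []
for i in range(n_sites):
    damaged = centers[i] + rng.normal(scale=0.3, size=(60, n_feat))
    X = np.vstack([healthy, damaged])
    y = np.r_[np.zeros(60), np.ones(60)]
    clfs.append(SVC().fit(X, y))

def locate(x):
    scores = [clf.decision_function(x[None])[0] for clf in clfs]
    return int(np.argmax(scores))               # most confident site wins

test = centers[2] + rng.normal(scale=0.3, size=n_feat)
print("predicted damage site:", locate(test))   # expect 2
```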
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Deassuncao, G. V.; Moreira, M. A.; Novaes, R. A.
1984-01-01
The development of a methodology for annual estimates of irrigated rice crop in the State of Rio Grande do Sul, Brazil, using remote sensing techniques is proposed. The project involves interpretation, digital analysis, and sampling techniques of LANDSAT imagery. Results are discussed from a preliminary phase for identifying and evaluating irrigated rice crop areas in four counties of the State, for the crop year 1982/1983. This first phase involved just visual interpretation techniques of MSS/LANDSAT images.
A new zero-inflated negative binomial methodology for latent category identification.
Blanchard, Simon J; DeSarbo, Wayne S
2013-04-01
We introduce a new statistical procedure for the identification of unobserved categories that vary between individuals and in which objects may span multiple categories. This procedure can be used to analyze data from a proposed sorting task in which individuals may simultaneously assign objects to multiple piles. The results of a synthetic example and a consumer psychology study involving categories of restaurant brands illustrate how the application of the proposed methodology to the new sorting task can account for a variety of categorization phenomena including multiple category memberships and for heterogeneity through individual differences in the saliency of latent category structures.
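For reference, a sketch of the zero-inflated negative binomial probability mass underlying the method; the parameterization (mixing weight pi, NB parameters r and p) is ours.

```python
# Zero-inflated negative binomial pmf: with probability pi the count is a
# structural zero; otherwise it follows NB(r, p). Parameterization is ours.
from scipy.stats import nbinom

def zinb_pmf(k, pi, r, p):
    base = (1.0 - pi) * nbinom.pmf(k, r, p)
    return pi + base if k == 0 else base

# e.g. probability a respondent assigns an object to k piles
for k in range(4):
    print(k, round(zinb_pmf(k, pi=0.3, r=2.0, p=0.5), 4))
```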
In silico simulations of experimental protocols for cardiac modeling.
Carro, Jesus; Rodriguez, Jose Felix; Pueyo, Esther
2014-01-01
A mathematical model of the action potential (AP) involves the sum of different transmembrane ionic currents and the balance of intracellular ionic concentrations. To each ionic current corresponds an equation involving several effects. There are a number of model parameters that must be identified using specific experimental protocols in which the effects are considered as independent. However, when the model complexity grows, the interaction between effects becomes increasingly important. Therefore, model parameters identified by considering the different effects as independent might be misleading. In this work, a novel methodology consisting of performing in silico simulations of the experimental protocol and then comparing experimental and simulated outcomes is proposed for model parameter identification and validation. The potential of the methodology is demonstrated by validating voltage-dependent L-type calcium current (ICaL) inactivation in recently proposed human ventricular AP models with different formulations. Our results show large differences between ICaL inactivation as calculated from the model equation and ICaL inactivation from the in silico simulations due to the interaction between effects and/or to the experimental protocol. Our results suggest that, when proposing any new model formulation, consistency between such formulation and the corresponding experimental data that it aims to reproduce needs to be verified first, considering all involved factors.
Krauter, Paula; Edwards, Donna; Yang, Lynn; Tucker, Mark
2011-09-01
Decontamination and recovery of a facility or outdoor area after a wide-area biological incident involving a highly persistent agent (eg, Bacillus anthracis spores) is a complex process that requires extensive information and significant resources, which are likely to be limited, particularly if multiple facilities or areas are affected. This article proposes a systematic methodology for evaluating information to select the decontamination or alternative treatments that optimize use of resources if decontamination is required for the facility or area. The methodology covers a wide range of approaches, including volumetric and surface decontamination, monitored natural attenuation, and seal and abandon strategies. A proposed trade-off analysis can help decision makers understand the relative appropriateness, efficacy, and labor, skill, and cost requirements of the various decontamination methods for the particular facility or area needing treatment--whether alone or as part of a larger decontamination effort. Because the state of decontamination knowledge and technology continues to evolve rapidly, the methodology presented here is designed to accommodate new strategies and materials and changing information.
NASA Astrophysics Data System (ADS)
Çakır, Süleyman
2017-10-01
In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select input and output variables to be used in the data envelopment analysis (DEA) application. In the second step, an interval inverse DEA model is executed for resource allocation in a short run. In an effort to exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed in Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure to handle input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
Methodological issues in the design of a rheumatoid arthritis activity score and its cut-offs.
Collignon, Olivier
2014-01-01
Activity of rheumatoid arthritis (RA) can be evaluated using several scoring scales based on clinical features. The most widely used one is the Disease Activity Score involving 28 joint counts (DAS28) for which cut-offs were proposed to help physicians classify patients. However, inaccurate scoring can lead to inappropriate medical decisions. In this article some methodological issues in the design of such a score and its cut-offs are highlighted in order to further propose a strategy to overcome them. As long as the issues reviewed in this article are not addressed, results of studies based on standard disease activity scores such as DAS28 should be considered with caution.
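For context, a sketch using the commonly cited four-component DAS28 formula and cut-offs discussed in articles of this kind; the coefficients and thresholds below are as usually reported in the rheumatology literature and should be checked against the original definitions before any clinical use.

```python
# DAS28 with four components, as commonly reported in the literature:
# tender joint count (TJC28), swollen joint count (SJC28), erythrocyte
# sedimentation rate (ESR, mm/h), patient global health (GH, 0-100 VAS).
import math

def das28(tjc28, sjc28, esr, gh):
    return (0.56 * math.sqrt(tjc28) + 0.28 * math.sqrt(sjc28)
            + 0.70 * math.log(esr) + 0.014 * gh)

def classify(score):                      # commonly cited cut-offs
    if score < 2.6:  return "remission"
    if score <= 3.2: return "low activity"
    if score <= 5.1: return "moderate activity"
    return "high activity"

s = das28(tjc28=6, sjc28=4, esr=28, gh=45)
print(f"DAS28 = {s:.2f} -> {classify(s)}")   # ~4.9 -> moderate activity
```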
Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K
2015-01-01
Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (eg, variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708
Web Health Monitoring Survey: A New Approach to Enhance the Effectiveness of Telemedicine Systems.
Romano, Maria Francesca; Sardella, Maria Vittoria; Alboni, Fabrizio
2016-06-06
Aging of the European population and interest in a healthy population in western countries have contributed to an increase in the number of health surveys, where the role of survey design, data collection, and data analysis methodology is clear and recognized by the whole scientific community. Survey methodology has had to cope with the challenges deriving from data collection through information and communications technology (ICT). Telemedicine systems have not used patients as a source of information, often limiting them to collecting only biometric data. A more effective telemonitoring system would be able to collect objective and subjective data (biometric parameters and symptoms reported by the patients themselves), and to control the quality of the subjective data collected: this goal can be achieved only by using and merging competencies from both survey methodology and health research. The objective of our study was to propose new metrics to control the quality of data, along with the well-known indicators of survey methodology. Web questionnaires administered daily to a group of patients for an extended length of time are a Web health monitoring survey (WHMS) in a telemedicine system. We calculated indicators based on paradata collected during a WHMS study involving 12 patients, who signed in to the website daily for 2 months. The patients' involvement was very high: the patients' response rate ranged between 1.00 and 0.82, with an outlier of 0.65. The item nonresponse rate was very low, ranging between 0.0% and 7.4%. We propose adherence to the chosen time to connect to the website as a measure of involvement and cooperation by the patients: the difference from the median time ranged between 11 and 24 minutes, demonstrating very good cooperation and involvement from all patients. To measure habituation to the questionnaire, we also compared nonresponse rates to the items between the first and the second month of the study, and found no significant difference. We computed the time to complete the questionnaire both as a measure of possible burden for the patient, and to detect the risk of automatic responses. Neither of these hypotheses was confirmed, and differences in time to completion seemed to depend on health conditions. Focus groups with patients confirmed their appreciation for this "new" active role in a telemonitoring system. The main and innovative aspect of our proposal is the use of a Web questionnaire to virtually recreate a checkup visit, integrating subjective (patient's information) with objective data (biometric information). Our results, although preliminary and in need of further study, appear promising in proposing more effective telemedicine systems. Survey methodology could have an effective role in this growing field of research and applications.
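A minimal sketch of the proposed paradata indicators — response rate, item nonresponse rate, and adherence to the chosen connection time; the toy data layout and the use of a median absolute deviation are our assumptions.

```python
# Sketch of WHMS paradata indicators for one patient over a study period.
# The toy data layout is our assumption.
from datetime import time

days_in_study = 60
days_responded = 55
items_asked, items_skipped = 55 * 12, 18
chosen = time(9, 0)                        # patient's chosen connect time
connect_minutes = [9 * 60 + d for d in (5, -12, 20, 3, -8)]  # sample days

response_rate = days_responded / days_in_study
item_nonresponse = items_skipped / items_asked
deviations = sorted(abs(m - (chosen.hour * 60 + chosen.minute))
                    for m in connect_minutes)
median_dev = deviations[len(deviations) // 2]

print(f"response rate:     {response_rate:.2f}")
print(f"item nonresponse:  {item_nonresponse:.1%}")
print(f"median |dt| (min): {median_dev}")
```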
A RLS-SVM Aided Fusion Methodology for INS during GPS Outages
Yao, Yiqing; Xu, Xiaosu
2017-01-01
In order to maintain a relatively high accuracy of navigation performance during global positioning system (GPS) outages, a novel robust least squares support vector machine (LS-SVM)-aided fusion methodology is explored to provide the pseudo-GPS position information for the inertial navigation system (INS). The relationship between the yaw, specific force, velocity, and the position increment is modeled. Rather than share the same weight in the traditional LS-SVM, the proposed algorithm allocates various weights for different data, which makes the system immune to the outliers. Field test data was collected to evaluate the proposed algorithm. The comparison results indicate that the proposed algorithm can effectively provide position corrections for standalone INS during the 300 s GPS outage, which outperforms the traditional LS-SVM method. Historical information is also involved to better represent the vehicle dynamics. PMID:28245549
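A compact sketch of the weighted LS-SVM regression idea this abstract builds on: the standard LS-SVM dual linear system, with per-sample weights v_i on the regularization term so that suspected outliers are down-weighted. The RBF kernel and parameter values are illustrative assumptions.

```python
# Weighted LS-SVM regression sketch: solve the standard LS-SVM dual system
#   [ 0        1^T              ] [b]   [0]
#   [ 1   K + diag(1/(g * v_i)) ] [a] = [y]
# where small weights v_i de-emphasize outliers. Parameters illustrative.
import numpy as np

def rbf(X, Y, s=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s * s))

rng = np.random.default_rng(2)
X = np.linspace(0, 6, 40)[:, None]
y = np.sin(X[:, 0]) + rng.normal(scale=0.05, size=40)
y[10] += 2.0                                 # inject an outlier
v = np.ones(40); v[10] = 1e-3                # ...and down-weight it

g = 100.0
A = np.zeros((41, 41))
A[0, 1:] = A[1:, 0] = 1.0
A[1:, 1:] = rbf(X, X) + np.diag(1.0 / (g * v))
sol = np.linalg.solve(A, np.r_[0.0, y])
b, a = sol[0], sol[1:]

x_test = np.array([[1.5]])
pred = (rbf(x_test, X) @ a + b)[0]
print(f"f(1.5) = {pred:.3f}   sin(1.5) = {np.sin(1.5):.3f}")
```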
77 FR 26736 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-07
... an Internet Push methodology, in an effort to obtain early response rate indicators for the 2020... contact strategies involving optimizing the Internet push strategy are proposed, such as implementing... reducing and/or eliminating back-end processing. Affected Public: Individuals or households. Frequency: One...
VARIABLE SELECTION FOR REGRESSION MODELS WITH MISSING DATA
Garcia, Ramon I.; Ibrahim, Joseph G.; Zhu, Hongtu
2009-01-01
We consider the variable selection problem for a class of statistical models with missing data, including missing covariate and/or response data. We investigate the smoothly clipped absolute deviation penalty (SCAD) and adaptive LASSO and propose a unified model selection and estimation procedure for use in the presence of missing data. We develop a computationally attractive algorithm for simultaneously optimizing the penalized likelihood function and estimating the penalty parameters. Particularly, we propose to use a model selection criterion, called the ICQ statistic, for selecting the penalty parameters. We show that the variable selection procedure based on ICQ automatically and consistently selects the important covariates and leads to efficient estimates with oracle properties. The methodology is very general and can be applied to numerous situations involving missing data, from covariates missing at random in arbitrary regression models to nonignorably missing longitudinal responses and/or covariates. Simulations are given to demonstrate the methodology and examine the finite sample performance of the variable selection procedures. Melanoma data from a cancer clinical trial is presented to illustrate the proposed methodology. PMID:20336190
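A sketch of the adaptive LASSO step referenced above, using the standard column-rescaling trick (a weighted L1 penalty becomes a plain Lasso on rescaled covariates). The SCAD penalty and the ICQ-based choice of penalty parameters from the paper are not reproduced; the weights and tuning values below are our assumptions.

```python
# Adaptive LASSO via column rescaling: weights w_j = 1/|beta_ridge_j| turn a
# weighted L1 penalty into a plain Lasso on X_j / w_j. (The paper's SCAD
# penalty and ICQ-based tuning are not reproduced here.)
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(3)
n, p = 100, 8
X = rng.normal(size=(n, p))
beta = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

w = 1.0 / np.abs(Ridge(alpha=1.0).fit(X, y).coef_)   # adaptive weights
lasso = Lasso(alpha=0.05).fit(X / w, y)
beta_hat = lasso.coef_ / w                           # undo the rescaling
print(np.round(beta_hat, 2))                         # zeros on noise covariates
```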
Knowledge Building in Asynchronous Discussion Groups: Going Beyond Quantitative Analysis
ERIC Educational Resources Information Center
Schrire, Sarah
2006-01-01
This contribution examines the methodological challenges involved in defining the collaborative knowledge-building processes occurring in asynchronous discussion and proposes an approach that could advance understanding of these processes. The written protocols that are available to the analyst provide an exact record of the instructional…
Training Tools for Translators and Interpreters
ERIC Educational Resources Information Center
Al-Qinai, Jamal
2010-01-01
The present paper reviews the traditional methodologies of translator training and proposes an eclectic multi-componential approach that involves a set of interdisciplinary skills with the ultimate objective of meeting market demand. Courses on translation for specific purposes (TSP) and think-aloud protocols (TAP) along with self-monitoring and…
Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta
2006-01-01
Background Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019
Integrating Design and Manufacturing for a High Speed Civil Transport Wing
NASA Technical Reports Server (NTRS)
Marx, William J.; Mavris, Dimitri N.; Schrage, Daniel P.
1994-01-01
The aerospace industry is currently addressing the problem of integrating design and manufacturing. Because of the difficulties associated with using conventional, procedural techniques and algorithms, it is the authors' belief that the only feasible way to integrate the two concepts is with the development of an appropriate Knowledge-Based System (KBS). The authors propose a methodology for an aircraft producibility assessment, including a KBS, that addresses both procedural and heuristic aspects of integrating design and manufacturing of a High Speed Civil Transport (HSCT) wing. The HSCT was chosen as the focus of this investigation since it is a current NASA/aerospace industry initiative full of technological challenges involving many disciplines. The paper gives a brief background of selected previous supersonic transport studies followed by descriptions of key relevant design and manufacturing methodologies. Georgia Tech's Concurrent Engineering/Integrated Product and Process Development methodology is discussed with reference to this proposed conceptual producibility assessment. Evaluation criteria are presented that relate pertinent product and process parameters to overall product producibility. In addition, the authors' integration methodology and reasons for selecting a KBS to integrate design and manufacturing are presented in this paper. Finally, a proposed KBS is given, as well as statements of future work and overall investigation objectives.
Chai, Xun; Gao, Feng; Pan, Yang; Qi, Chenkun; Xu, Yilin
2015-04-22
Coordinate identification between vision systems and robots is quite a challenging issue in the field of intelligent robotic applications, involving steps such as perceiving the immediate environment, building the terrain map and planning the locomotion automatically. It is now well established that current identification methods have non-negligible limitations such as difficult feature matching, the requirement of external tools and the intervention of multiple people. In this paper, we propose a novel methodology to identify the geometric parameters of 3D vision systems mounted on robots without involving other people or additional equipment. In particular, our method focuses on legged robots, which have complex body structures and excellent locomotion ability compared to their wheeled/tracked counterparts. The parameters can be identified only by moving robots on a relatively flat ground. Concretely, an estimation approach is provided to calculate the ground plane. In addition, the relationship between the robot and the ground is modeled. The parameters are obtained by formulating the identification problem as an optimization problem. The methodology is integrated on a legged robot called "Octopus", which can traverse rough terrain with high stability after obtaining the identification parameters of its mounted vision system using the proposed method. Diverse experiments in different environments demonstrate our novel method is accurate and robust.
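A sketch of the ground-plane estimation step mentioned in this abstract: a least-squares plane fit to 3D points via SVD. The synthetic point cloud stands in for data from the onboard vision system.

```python
# Least-squares ground-plane fit via SVD: the plane normal is the right
# singular vector with the smallest singular value of the centered cloud.
import numpy as np

rng = np.random.default_rng(4)
# Synthetic "ground" points as seen by the onboard 3D vision system
pts = np.c_[rng.uniform(-1, 1, 300), rng.uniform(-1, 1, 300),
            rng.normal(scale=0.01, size=300)]        # nearly flat in z

centroid = pts.mean(axis=0)
_, _, vt = np.linalg.svd(pts - centroid)
normal = vt[-1]                                       # plane normal
d = -normal @ centroid                                # plane: n.x + d = 0
print("normal ~", np.round(normal, 3), " offset d =", round(d, 4))
```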
A comprehensive methodology for the multidimensional and synchronic data collecting in soundscape.
Kogan, Pablo; Turra, Bruno; Arenas, Jorge P; Hinalaf, María
2017-02-15
The soundscape paradigm is comprised of complex living systems where individuals interact moment-by-moment among one another and with the physical environment. The real environments provide promising conditions to reveal deep soundscape behavior, including the multiple components involved and their interrelations as a whole. However, measuring and analyzing the numerous simultaneous variables of soundscape represents a challenge that is not completely understood. This work proposes and applies a comprehensive methodology for multidimensional and synchronic data collection in soundscape. The soundscape variables were organized into three main entities: experienced environment, acoustic environment, and extra-acoustic environment, containing, in turn, subgroups of variables called components. The variables contained in these components were acquired through synchronic field techniques that include surveys, acoustic measurements, audio recordings, photography, and video. The proposed methodology was tested, optimized, and applied in diverse open environments, including squares, parks, fountains, university campuses, streets, and pedestrian areas. The systematization of this comprehensive methodology provides a framework for soundscape research, a support for urban and environment management, and a preliminary procedure for standardization in soundscape data collecting. Copyright © 2016 Elsevier B.V. All rights reserved.
Personalised Information Services Using a Hybrid Recommendation Method Based on Usage Frequency
ERIC Educational Resources Information Center
Kim, Yong; Chung, Min Gyo
2008-01-01
Purpose: This paper seeks to describe a personal recommendation service (PRS) involving an innovative hybrid recommendation method suitable for deployment in a large-scale multimedia user environment. Design/methodology/approach: The proposed hybrid method partitions content and user into segments and executes association rule mining,…
Sharing a Multimodal Corpus to Study Webcam-Mediated Language Teaching
ERIC Educational Resources Information Center
Guichon, Nicolas
2017-01-01
This article proposes a methodology to create a multimodal corpus that can be shared with a group of researchers in order to analyze synchronous online pedagogical interactions. Epistemological aspects involved in studying online interactions from a multimodal and semiotic perspective are addressed. Then, issues and challenges raised by corpus…
Exact Tests for the Rasch Model via Sequential Importance Sampling
ERIC Educational Resources Information Center
Chen, Yuguo; Small, Dylan
2005-01-01
Rasch proposed an exact conditional inference approach to testing his model but never implemented it because it involves the calculation of a complicated probability. This paper furthers Rasch's approach by (1) providing an efficient Monte Carlo methodology for accurately approximating the required probability and (2) illustrating the usefulness…
Composite Indices of Development and Poverty: An Application to MDGs
ERIC Educational Resources Information Center
De Muro, Pasquale; Mazziotta, Matteo; Pareto, Adriano
2011-01-01
The measurement of development or poverty as multidimensional phenomena is very difficult because there are several theoretical, methodological and empirical problems involved. The literature of composite indicators offers a wide variety of aggregation methods, all with their pros and cons. In this paper, we propose a new, alternative composite…
Atwood's Machine as a Tool to Introduce Variable Mass Systems
ERIC Educational Resources Information Center
de Sousa, Celia A.
2012-01-01
This article discusses an instructional strategy which explores eventual similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the…
[Cooperative learning for improving healthy housing conditions in Bogota: a case study].
Torres-Parra, Camilo A; García-Ubaque, Juan C; García-Ubaque, César A
2014-01-01
This was a community-based effort at constructing an educational proposal oriented towards self-empowerment, aimed at improving the target population's sanitary, housing and living conditions through cooperative learning. A constructivist approach was adopted based on a programme called "Habitat community manager". The project involved working with fifteen families living in the Mochuelo Bajo barrio in Ciudad Bolívar in Bogotá, Colombia, identifying the most relevant sanitary aspects for improving their homes and proposing a methodology and organisation for an educational proposal. Twenty-one poor housing-related epidemiological indicators were identified, which formed the basis for defining specific problems and establishing a methodology for designing an educational proposal. The course which emerged from the cooperative learning experience was designed to promote the community's skills and education regarding health, aimed at improving households' living conditions and ensuring a healthy environment which would allow them to develop an immediate habitat ensuring their own welfare and dignity.
NASA Astrophysics Data System (ADS)
Morse, Llewellyn; Sharif Khodaei, Zahra; Aliabadi, M. H.
2018-01-01
In this work, a reliability based impact detection strategy for a sensorized composite structure is proposed. Impacts are localized using Artificial Neural Networks (ANNs) with recorded guided waves due to impacts used as inputs. To account for variability in the recorded data under operational conditions, Bayesian updating and Kalman filter techniques are applied to improve the reliability of the detection algorithm. The possibility of having one or more faulty sensors is considered, and a decision fusion algorithm based on sub-networks of sensors is proposed to improve the application of the methodology to real structures. A strategy for reliably categorizing impacts into high energy impacts, which are probable to cause damage in the structure (true impacts), and low energy non-damaging impacts (false impacts), has also been proposed to reduce the false alarm rate. The proposed strategy involves employing classification ANNs with different features extracted from captured signals used as inputs. The proposed methodologies are validated by experimental results on a quasi-isotropic composite coupon impacted with a range of impact energies.
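A minimal sketch of the Kalman-filter ingredient mentioned above: fusing repeated noisy impact-coordinate estimates (e.g., ANN outputs) under a static-state model with no process noise. All numbers are assumptions.

```python
# Minimal scalar Kalman filter: fuse repeated noisy estimates of a fixed
# impact coordinate. Measurement noise level is an assumption.
measurements = [102.0, 97.5, 101.2, 99.1, 100.6]  # ANN outputs (mm)
R = 4.0                                  # measurement noise variance (assumed)
x, P = measurements[0], R                # initial state and variance

for z in measurements[1:]:
    K = P / (P + R)          # Kalman gain (static state, no process noise)
    x = x + K * (z - x)      # update estimate
    P = (1 - K) * P          # update variance
print(f"fused impact location: {x:.2f} mm (var {P:.2f})")
```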
Probability genotype imputation method and integrated weighted lasso for QTL identification.
Demetrashvili, Nino; Van den Heuvel, Edwin R; Wit, Ernst C
2013-12-30
Many QTL studies have two common features: (1) often there is missing marker information, (2) among many markers involved in the biological process only a few are causal. In statistics, the second issue falls under the headings "sparsity" and "causal inference". The goal of this work is to develop a two-step statistical methodology for QTL mapping for markers with binary genotypes. The first step introduces a novel imputation method for missing genotypes. Outcomes of the proposed imputation method are probabilities which serve as weights in the second step, namely the weighted lasso. The sparse phenotype inference is employed to select a set of predictive markers for the trait of interest. Simulation studies validate the proposed methodology under a wide range of realistic settings. Furthermore, the methodology outperforms alternative imputation and variable selection methods in such studies. The methodology was applied to an Arabidopsis experiment, containing 69 markers for 165 recombinant inbred lines of a F8 generation. The results confirm previously identified regions; however, several new markers are also found. On the basis of the inferred ROC behavior these markers show good potential for being real, especially for the germination trait Gmax. Our imputation method shows higher accuracy in terms of sensitivity and specificity compared to an alternative imputation method. Also, the proposed weighted lasso outperforms commonly practiced multiple regression as well as the traditional lasso and adaptive lasso with three weighting schemes. This means that under realistic missing data settings this methodology can be used for QTL identification.
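A toy sketch of probability-based imputation for a missing binary genotype: estimate P(marker = 1 | flanking markers) from the complete lines and pass that probability downstream as a weight. This conditional-frequency rule is a simplification of the paper's imputation method.

```python
# Toy probabilistic imputation of a missing binary genotype: estimate
# P(marker m = 1 | flanking markers) from complete lines, then use that
# probability as a weight downstream (e.g., in a weighted lasso).
import numpy as np

rng = np.random.default_rng(5)
G = (rng.random((165, 5)) < 0.5).astype(float)  # 165 lines x 5 markers
G[:, 2] = G[:, 1]                               # marker 2 tracks marker 1
missing = 7                                     # line with missing marker 2
left, right = G[missing, 1], G[missing, 3]      # its flanking genotypes

complete = np.delete(G, missing, axis=0)
match = (complete[:, 1] == left) & (complete[:, 3] == right)
p_one = complete[match, 2].mean() if match.any() else complete[:, 2].mean()
print(f"P(genotype = 1 | flanks) = {p_one:.2f}  -> downstream weight")
```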
A Design Methodology for Medical Processes.
Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara
2016-01-01
Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in the patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.
THERP and HEART integrated methodology for human error assessment
NASA Astrophysics Data System (ADS)
Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio
2015-11-01
An integrated THERP and HEART methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The approach has been modified on the basis of fuzzy set concepts with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposure. The results allow for the identification of human errors, which is necessary to achieve a better understanding of health hazards in the radiotherapy treatment process, so that the process can be properly monitored and appropriately managed.
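For context, the HEART quantification step referred to here conventionally multiplies a generic-task nominal human error probability by one factor per error-producing condition (EPC). The sketch below uses invented numbers and does not reproduce the paper's fuzzy prioritization layer.

```python
def heart_hep(nominal_hep, epcs):
    """Standard HEART quantification: the nominal human error probability
    of the generic task is multiplied, for each error-producing condition
    (EPC), by ((max_effect - 1) * assessed_proportion + 1)."""
    hep = nominal_hep
    for max_effect, proportion in epcs:
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # cap at 1, as probabilities cannot exceed it

# Hypothetical HDR task: nominal HEP 0.003 with two invented EPCs,
# e.g. time shortage (max effect x11, proportion 0.4) and
# distraction (max effect x8, proportion 0.2).
print(heart_hep(0.003, [(11, 0.4), (8, 0.2)]))  # -> 0.036
```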
Radiological Characterization Methodology of INEEL Stored RH-TRU Waste from ANL-E
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajiv N. Bhatt
2003-02-01
An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor and physics principles are used to achieve these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using this methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.
Google+ as a Tool for Use in Cooperative Laboratory Activities between Universities
ERIC Educational Resources Information Center
Puig-Ortiz, Joan; Pàmies-Vilà, Rosa; Martinez Miralles, Jordi Ramon
2015-01-01
The following is a proposal for collaboration between universities with the aim of improving curricula that require laboratory activities. A methodology is suggested for implementing an innovative educational project involving the exchange of laboratory activities. The exchange of laboratory activities can be carried out on different levels of…
An Alternative Approach for Nonlinear Latent Variable Models
ERIC Educational Resources Information Center
Mooijaart, Ab; Bentler, Peter M.
2010-01-01
In recent decades there has been increasing interest in nonlinear latent variable models. Since the seminal paper of Kenny and Judd, several methods have been proposed for dealing with these kinds of models. This article introduces an alternative approach. The methodology involves fitting some third-order moments in addition to the means and…
Possibilities of Particle Finite Element Methods in Industrial Forming Processes
NASA Astrophysics Data System (ADS)
Oliver, J.; Cante, J. C.; Weyler, R.; Hernandez, J.
2007-04-01
The work investigates the possibilities offered by the particle finite element method (PFEM) in the simulation of forming problems involving large deformations, multiple contacts, and the generation of new boundaries. A description of the most distinctive aspects of the PFEM, and its application to the simulation of representative forming processes, illustrate the proposed methodology.
21 CFR 514.8 - Supplements and other changes to an approved application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the drug as manufactured without the change; (C) Changes that may affect drug substance or drug... the proposed change; (C) The drug(s) involved; (D) The manufacturing site(s) or area(s) affected; (E...) Replacement of equipment with that of a different design that does not affect the process methodology or...
21 CFR 514.8 - Supplements and other changes to an approved application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the drug as manufactured without the change; (C) Changes that may affect drug substance or drug... the proposed change; (C) The drug(s) involved; (D) The manufacturing site(s) or area(s) affected; (E...) Replacement of equipment with that of a different design that does not affect the process methodology or...
ERIC Educational Resources Information Center
Robinson, Viviane M. J.
2010-01-01
While there is considerable evidence about the impact of instructional leadership on student outcomes, there is far less known about the leadership capabilities that are required to confidently engage in the practices involved. This article uses the limited available evidence, combined with relevant theoretical analyses, to propose a tentative…
Research Knowledge Transfer through Business-Driven Student Assignment
ERIC Educational Resources Information Center
Sas, Corina
2009-01-01
Purpose: The purpose of this paper is to present a knowledge transfer method that capitalizes on both research and teaching dimensions of academic work. It also aims to propose a framework for evaluating the impact of such a method on the involved stakeholders. Design/methodology/approach: The case study outlines and evaluates the six-stage…
Exploring the Partnership between Line Managers and HRM in Greece
ERIC Educational Resources Information Center
Papalexandris, Nancy; Panayotopoulou, Leda
2005-01-01
Purpose: This article seeks to discuss the role that line managers take up concerning human resource management issues among Greek firms and to propose ways for enhancing the synergistic relationship between human resource (HR) and line managers. Design/methodology/approach: It presents the trends of line management involvement in Greek firms,…
Methodological pluralism in the teaching of Astronomy
NASA Astrophysics Data System (ADS)
de Macedo, Josué Antunes; Voelzke, Marcos Rincon
2015-04-01
This paper discusses the feasibility of a teaching strategy called methodological pluralism, consisting of the use of various methodological resources in order to provide meaningful learning. It is part of a doctoral thesis that aims to investigate the contributions of traditional resources combined with digital technologies to fostering autonomy in future teachers of Natural Sciences and Mathematics with respect to themes in Astronomy. An extension course on themes of Astronomy was offered at the "Federal Institution of Education, Science and Technology" in the North of Minas Gerais (FINMG), Campus Januaria, to thirty-two students of licentiate courses in Physics, Mathematics and Biological Sciences, in order to contribute to improving the training of future teachers. A mixed methodology with a pre-experimental design was used, combined with content analysis. The results indicate that the students' prior knowledge of Astronomy was low; that there are indications of meaningful learning of concepts related to Astronomy; and that it is feasible to use digital technological resources articulated with traditional materials in the teaching of Astronomy. This research sought to contribute to initial teacher training, especially in relation to Astronomy teaching, proposing new alternatives to promote the teaching of this area of knowledge and extending the methodological options of future teachers.
Chai, Xun; Gao, Feng; Pan, Yang; Qi, Chenkun; Xu, Yilin
2015-01-01
Coordinate identification between vision systems and robots is quite a challenging issue in the field of intelligent robotic applications, involving steps such as perceiving the immediate environment, building the terrain map and planning the locomotion automatically. It is now well established that current identification methods have non-negligible limitations, such as difficult feature matching, the requirement of external tools and the intervention of multiple people. In this paper, we propose a novel methodology to identify the geometric parameters of 3D vision systems mounted on robots without involving other people or additional equipment. In particular, our method focuses on legged robots, which have complex body structures and excellent locomotion ability compared to their wheeled/tracked counterparts. The parameters can be identified simply by moving the robot on relatively flat ground. Concretely, an estimation approach is provided to calculate the ground plane. In addition, the relationship between the robot and the ground is modeled. The parameters are obtained by formulating the identification problem as an optimization problem. The methodology is integrated on a legged robot called "Octopus", which can traverse rough terrains with high stability after obtaining the identification parameters of its mounted vision system using the proposed method. Diverse experiments in different environments demonstrate that our novel method is accurate and robust.
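The ground-plane estimation step can be illustrated with a generic least-squares plane fit; this is a sketch under our own assumptions, not the authors' algorithm, and the point cloud below is synthetic.

```python
import numpy as np

def fit_ground_plane(points):
    """Least-squares plane fit: returns (normal, d) with normal . p + d = 0.

    The normal is the right singular vector of the centered cloud with
    the smallest singular value.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    d = -normal @ centroid
    return normal, d

# Synthetic noisy ground points as seen by a robot-mounted 3D sensor
rng = np.random.default_rng(1)
xy = rng.uniform(-1, 1, size=(500, 2))
z = 0.02 * xy[:, 0] + 0.01 * xy[:, 1] + rng.normal(0, 0.005, 500)
normal, d = fit_ground_plane(np.column_stack([xy, z]))
print(normal, d)
```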
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analyses, systems are usually modeled via fault trees as discrete and Boolean variables with constant failure rates. Nevertheless, in many cases it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distribution. This makes the methodology particularly useful in cases where the available data for quantifying the probabilities of hazardous events are scarce or nonexistent, there is dependence among events, or nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating storage and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. This natural gas is therefore liquefied for shipping, and storage and regasification usually occur at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
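The article's hybrid BNs handle continuous variables with arbitrary distributions; as a purely discrete, simplified sketch only (invented events and probabilities, and the pgmpy library is our choice rather than the authors'), a toy leak-and-ignition network looks like this:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy leak-and-ignition network; all probabilities are invented.
model = BayesianNetwork([("Leak", "Fire"), ("Ignition", "Fire")])
cpd_leak = TabularCPD("Leak", 2, [[0.999], [0.001]])   # P(no), P(yes)
cpd_ign = TabularCPD("Ignition", 2, [[0.9], [0.1]])
cpd_fire = TabularCPD(
    "Fire", 2,
    [[1.00, 0.95, 0.99, 0.20],   # P(Fire=no | Leak, Ignition)
     [0.00, 0.05, 0.01, 0.80]],  # P(Fire=yes | Leak, Ignition)
    evidence=["Leak", "Ignition"], evidence_card=[2, 2])
model.add_cpds(cpd_leak, cpd_ign, cpd_fire)
assert model.check_model()
print(VariableElimination(model).query(["Fire"]).values)
```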
A proposal on teaching methodology: cooperative learning by peer tutoring based on the case method
NASA Astrophysics Data System (ADS)
Pozo, Antonio M.; Durbán, Juan J.; Salas, Carlos; del Mar Lázaro, M.
2014-07-01
The European Higher Education Area (EHEA) proposes substantial changes in the teaching-learning model, moving from a model based mainly on the activity of teachers to a model in which the true protagonist is the student. This new framework requires that students develop new abilities and acquire specific skills. This also implies that the teacher should incorporate new methodologies in class. In this work, we present a proposal on teaching methodology based on cooperative learning and peer tutoring by case study. A noteworthy aspect of the case-study method is that it presents situations that can occur in real life. Therefore, students can acquire certain skills that will be useful in their future professional practice. An innovative aspect in the teaching methodology that we propose is to form work groups consisting of students from different levels in the same major. In our case, the teaching of four subjects would be involved: one subject of the 4th year, one subject of the 3rd year, and two subjects of the 2nd year of the Degree in Optics and Optometry of the University of Granada, Spain. Each work group would consist of a professor and a student of the 4th year, a professor and a student of the 3rd year, and two professors and two students of the 2nd year. Each work group would have a tutoring process from each professor for the corresponding student, and a 4th-year student providing peer tutoring for the students of the 2nd and 3rd year.
Design methodology of Dutch banknotes
NASA Astrophysics Data System (ADS)
de Heij, Hans A. M.
2000-04-01
Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.
Towards an Approach for an Accessible and Inclusive Virtual Education Using ESVI-AL Project Results
ERIC Educational Resources Information Center
Amado-Salvatierra, Hector R.; Hilera, Jose R.
2015-01-01
Purpose: This paper aims to present an approach to achieve accessible and inclusive Virtual Education for all, but especially intended for students with disabilities. This work proposes the main steps to be taken into consideration by stakeholders involved in the educational process related to inclusive e-Learning. Design/methodology/approach: The…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-04
..., but should note that the NRC's E-Filing system does not support unlisted software, and the NRC Meta... support the physical fuel change. These methodologies do not use the total planar radial peaking factor (F... systems performance, operating mode and equipment out of service. The proposed change is supported by GEH...
What Does a Transformative Lens Bring to Credible Evidence in Mixed Methods Evaluations?
ERIC Educational Resources Information Center
Mertens, Donna M.
2013-01-01
Credibility in evaluation is a multifaceted concept that involves consideration of diverse stakeholders' perspectives and purposes. The use of a transformative lens is proposed as a means to bringing issues of social justice and human rights to the foreground in decisions about methodology, credibility of evidence, and use of evaluation…
Coalescence computations for large samples drawn from populations of time-varying sizes
Polanski, Andrzej; Szczesna, Agnieszka; Garbulowski, Mateusz; Kimmel, Marek
2017-01-01
We present new results concerning probability distributions of times in the coalescence tree and expected allele frequencies for the coalescent with large sample sizes. The obtained results are based on computational methodologies which involve combining coalescence time scale changes with techniques of integral transformations and using analytical formulae for infinite products. We show applications of the proposed methodologies for computing probability distributions of times in the coalescence tree and their limits, for evaluating the accuracy of approximate expressions for times in the coalescence tree and expected allele frequencies, and for the analysis of a large human mitochondrial DNA dataset.
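For the constant-size coalescent only (the paper's contribution concerns time-varying population sizes and very large samples, which this sketch does not implement), the expected times between coalescence events have a closed form:

```python
from fractions import Fraction

def expected_coalescence_times(n):
    """E[T_k], the expected time while k lineages remain, for the
    constant-size coalescent: T_k ~ Exp(k(k-1)/2) in coalescent units,
    so E[T_k] = 2 / (k(k-1))."""
    return {k: Fraction(2, k * (k - 1)) for k in range(2, n + 1)}

times = expected_coalescence_times(10)
print(sum(times.values()))  # E[TMRCA] = 2(1 - 1/n) = 9/5 for n = 10
```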
Real-time control systems: feedback, scheduling and robustness
NASA Astrophysics Data System (ADS)
Simon, Daniel; Seuret, Alexandre; Sename, Olivier
2017-08-01
The efficient control of real-time distributed systems, where continuous components are governed through digital devices and communication networks, needs a careful examination, inside co-design approaches, of the constraints arising from the different domains involved. Thanks to the robustness of feedback control, both new control methodologies and slackened real-time scheduling schemes are proposed beyond the frontiers between these traditionally separated fields. A methodology to design robust aperiodic controllers is provided, where the sampling interval is considered as a control variable of the system. Promising experimental results are provided to show the feasibility and robustness of the approach.
Object Transportation by Two Mobile Robots with Hand Carts.
Sakuyama, Takuya; Figueroa Heredia, Jorge David; Ogata, Taiki; Hara, Tatsunori; Ota, Jun
2014-01-01
This paper proposes a methodology by which two small mobile robots can grasp, lift, and transport large objects using hand carts. The specific problems involve generating robot actions and determining the hand cart positions to achieve the stable loading of objects onto the carts. These problems are solved using nonlinear optimization, and we propose an algorithm for generating robot actions. The proposed method was verified through simulations and experiments using actual devices in a real environment. The proposed method could reduce the number of robots required to transport large objects by 50-60%. In addition, we demonstrated the efficacy of this task in real environments where errors occur in robot sensing and movement.
NASA Astrophysics Data System (ADS)
Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.
2017-10-01
In recent years, many efforts have been made to exploit full-field optical measurement techniques for modal identification. Three-dimensional digital image correlation (DIC) using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically based on displacements, and such relationships are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped edge joint. Full-field transmissibility functions were obtained along the beam and converted into FRF for proper identification, considering a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large number of data points involved in the DIC measurement during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparing them with those obtained using traditional accelerometers, analytical models and finite element method analyses. The comparison was performed using the quantitative modal assurance criterion indicator. The results showed a high level of correspondence, consolidating the proposed experimental methodology.
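The single degree-of-freedom conversion mentioned here rests on the textbook transmissibility of a base-excited oscillator. A minimal sketch follows; the 35 Hz mode and damping value are invented, not the paper's beam.

```python
import numpy as np

def sdof_transmissibility(freq, fn, zeta):
    """Complex absolute transmissibility X/Y of a base-excited SDOF system:
    T(r) = (1 + 2i*zeta*r) / (1 - r^2 + 2i*zeta*r), with r = f/fn."""
    r = np.asarray(freq) / fn
    return (1 + 2j * zeta * r) / (1 - r**2 + 2j * zeta * r)

freq = np.linspace(1, 100, 1000)
T = sdof_transmissibility(freq, fn=35.0, zeta=0.01)  # invented beam mode
print(freq[np.argmax(np.abs(T))])  # peak close to fn for light damping
```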
Angerville, Ruth; Perrodin, Yves; Bazin, Christine; Emmanuel, Evens
2013-01-01
Discharges of Combined Sewer Overflows (CSOs) into periurban rivers present risks for the aquatic ecosystems concerned. In this work, a specific ecotoxicological risk assessment methodology has been developed as a management tool for municipalities equipped with CSOs. This methodology comprises a detailed description of the spatio-temporal system involved, the choice of ecological targets to be preserved, and the carrying out of bioassays adapted to each compartment of the river receiving CSOs. Once formulated, this methodology was applied to a river flowing through the outskirts of the city of Lyon in France. The results obtained for the scenario studied showed a moderate risk for organisms of the water column and a major risk for organisms of the benthic and hyporheic zones of the river. The methodology enabled the identification of the critical points of the spatio-temporal systems studied, leading to proposals for improving the management of CSOs.
Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities
NASA Astrophysics Data System (ADS)
Shivanand M., Handigund; Shweta, Bhat
The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are appropriately ordered, for both project development and ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram, so as to achieve the mission of the concerned process. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.
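The critical-path determination can be illustrated with a generic longest-path computation on a task DAG; this is a sketch of the standard algorithm, not the paper's methodology, and the tasks and durations are hypothetical.

```python
from graphlib import TopologicalSorter

def critical_path(tasks):
    """Longest path through a task DAG.

    tasks: {name: (duration, [predecessors])}; returns (length, path).
    """
    order = TopologicalSorter(
        {name: deps for name, (_, deps) in tasks.items()}).static_order()
    finish, prev = {}, {}
    for name in order:
        dur, deps = tasks[name]
        start = max((finish[d] for d in deps), default=0)
        finish[name] = start + dur
        prev[name] = max(deps, key=finish.get) if deps else None
    end = max(finish, key=finish.get)
    path = []
    while end is not None:
        path.append(end)
        end = prev[end]
    return finish[path[0]], path[::-1]

# Hypothetical activity chart
tasks = {"spec": (3, []), "design": (5, ["spec"]),
         "code": (7, ["design"]), "test": (4, ["design", "code"])}
print(critical_path(tasks))  # (19, ['spec', 'design', 'code', 'test'])
```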
Mobile mapping of sporting event spectators using Bluetooth sensors: Tour of Flanders 2011.
Versichele, Mathias; Neutens, Tijs; Goudeseune, Stephanie; van Bossche, Frederik; van de Weghe, Nico
2012-10-22
Accurate spatiotemporal information on crowds is a necessity for better management in general and for the mitigation of potential security risks. The large numbers of individuals involved and their mobility, however, make the generation of this information non-trivial. This paper proposes a novel methodology to estimate and map crowd sizes using mobile Bluetooth sensors and examines to what extent this methodology represents a valuable alternative to existing traditional crowd density estimation methods. The proposed methodology is applied in a unique case study that uses Bluetooth technology for the mobile mapping of spectators of the Tour of Flanders 2011 road cycling race. The locations of nearly 16,000 cell phones of spectators along the race course were registered and detailed views of the spatiotemporal distribution of the crowd were generated. Comparison with visual head counts from camera footage delivered a detection ratio of 13.0 ± 2.3%, making it possible to estimate the crowd size. To our knowledge, this is the first study that uses mobile Bluetooth sensors to count and map a crowd over space and time.
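The reported detection ratio is what turns raw Bluetooth counts into crowd estimates. A minimal sketch follows; the one-sigma band propagation is our addition, not the authors' procedure.

```python
def crowd_size_from_detections(n_detected, ratio=0.130, ratio_sd=0.023):
    """Scale Bluetooth detections to a crowd estimate using the
    detection ratio reported in the abstract (13.0 +/- 2.3%).
    Returns (point estimate, low, high) as a rough 1-sigma band."""
    est = n_detected / ratio
    return est, n_detected / (ratio + ratio_sd), n_detected / (ratio - ratio_sd)

print(crowd_size_from_detections(16000))  # ~123k spectators, ~105k-150k
```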
NASA Astrophysics Data System (ADS)
Neuman, Yair; Cohen, Yochai; Israeli, Navot; Tamir, Boaz
2018-02-01
The availability of historical textual corpora has led to the study of word frequencies along the historical timeline, as representing the public's focus of attention over time. However, the study of the dynamics of words' meaning is still in its infancy. In this paper, we propose a methodology for studying the historical trajectory of a word's meaning through Tsallis entropy. First, we present the idea that the meaning of a word may be studied through the entropy of its embedding. Using two historical case studies, we show that this entropy measure is correlated with the intensity with which a word is used. More importantly, we show that using Tsallis entropy with a superadditive entropy index may provide a better estimation of a word's frequency of use than using Shannon entropy. We explain this finding as resulting from an increasing redundancy between the words that comprise the semantic field of the target word, and develop a new measure of redundancy between words. Using this measure, which relies on the Tsallis version of the Kullback-Leibler divergence, we show that the evolving meaning of a word involves the dynamics of increasing redundancy between components of its semantic field. The proposed methodology may enrich the toolkit of researchers who study the dynamics of word senses.
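The Tsallis entropy itself is standard; a minimal sketch follows, where the probability vector, standing in for a distribution over a word-embedding neighborhood, is invented.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); as q -> 1 it
    recovers the Shannon entropy (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Toy distribution over neighbors of a word embedding (invented numbers)
p = [0.5, 0.25, 0.15, 0.1]
print(tsallis_entropy(p, q=1.0), tsallis_entropy(p, q=2.0))  # 1.15..., 0.655
```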
An experimental procedure to determine heat transfer properties of turbochargers
NASA Astrophysics Data System (ADS)
Serrano, J. R.; Olmeda, P.; Páez, A.; Vidal, F.
2010-03-01
Heat transfer phenomena in turbochargers have been a subject of investigation due to their importance for the correct determination of real compressor work when modelling. The commonly stated condition of adiabaticity for turbochargers during normal engine operation has been re-evaluated because important deviations from adiabatic behaviour have been reported in many studies on this issue, especially when the turbocharger is running at low rotational speeds/loads. These deviations do not permit a proper assessment of the turbine and compressor efficiencies, since the pure aerodynamic effects cannot be separated from the undesired heat transfer when both phenomena are present during turbocharger operation. Correcting for these facts is necessary to feed engine models with reliable information and in this way increase the quality of the results in any modelling process. The present work proposes a thermal characterization methodology, based on the physics of the turbocharger, that has been successfully applied to a turbocharger for a passenger car. Its application helps in understanding the thermal behaviour of the turbocharger, and the results obtained constitute vital information for future modelling efforts. The conductance values obtained from the proposed methodology have been applied to correct a procedure for measuring the mechanical efficiency of the tested turbocharger.
The Development of a "Neighborhood in Solidarity" in Switzerland.
Zwygart, Marion; Plattet, Alain; Ammor, Sarah
2017-01-01
This article presents a case study based on the "Neighborhood in Solidarity" (NS) methodology to illustrate its application in a locality of 8,000 inhabitants in Switzerland. This specific project is proposed to exemplify the global aim of the NS methodology: to increase the integration of elderly persons in society in order to improve their quality of life. The case study demonstrates the enhancement of the capacity of older people to remain actively engaged in their neighborhood. The article focuses on the creation of an autonomous community of empowered older people who can resolve their own problems after a 5-year project. The construction of the local community is presented through the six steps of the methodology: (1) preliminary analysis, (2) diagnostic, (3) construction, (4) project design, (5) project implementation, and (6) empowerment, with three degrees of involvement (community, participative, and integrative involvement). Performance and output indicators, quality indicators, and social determinants of health assess the development of the local project. The impacts of the project illustrated in this specific example motivated this publication, which aims to inspire practitioners from other countries.
Development of a Design Methodology for Reconfigurable Flight Control Systems
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; McLean, C.
2000-01-01
A methodology is presented for the design of flight control systems that exhibit stability and performance-robustness in the presence of actuator failures. The design is based upon two elements. The first element consists of a control law that will ensure at least stability in the presence of a class of actuator failures. This law is created by inner-loop, reduced-order, linear dynamic inversion, and outer-loop compensation based upon Quantitative Feedback Theory. The second element consists of adaptive compensators obtained from simple and approximate time-domain identification of the dynamics of the 'effective vehicle' with failed actuator(s). An example involving the lateral-directional control of a fighter aircraft is employed both to introduce the proposed methodology and to demonstrate its effectiveness and limitations.
Lugão, Suzana S M; Ricart, Simone L S I; Pinheiro, Renata M S; Gonçalves, Waldney M
2012-01-01
This article presents the description and discussion of a pilot project in an ergonomic action developed in a public health institution. The project involves the implementation of an Ergonomics Program (PROERGO) in a department of this institution, guided by a methodology structured in six stages and referenced in the ergonomics literature. The methodology includes the training of workers and the formation of facilitators and multipliers of the ergonomics actions, aiming at the implementation of a cyclical process of actions and the consolidation of an ergonomics culture in the organization. Starting from the results of this experiment, we intend to replicate this program model in other departments of the institution and to propose the applied methodology as an intervention strategy for the Occupational Health area.
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate models in a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated into an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use a register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
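The spreadsheet-to-IP-XACT translation can be sketched as follows. This is a minimal, schema-incomplete subset of IP-XACT 2014 register elements, not the paper's tool; the register rows and the column layout of the spreadsheet are invented.

```python
import csv, io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet row format: name, offset, size, access
SHEET = """name,offset,size,access
CTRL,0x00,32,read-write
STATUS,0x04,32,read-only
"""

def registers_to_ipxact(sheet_csv):
    """Translate spreadsheet rows into abbreviated IP-XACT-style
    <ipxact:register> elements (minimal subset, not schema-complete)."""
    ns = "http://www.accellera.org/XMLSchema/IPXACT/1685-2014"
    ET.register_namespace("ipxact", ns)
    block = ET.Element(f"{{{ns}}}addressBlock")
    for row in csv.DictReader(io.StringIO(sheet_csv)):
        reg = ET.SubElement(block, f"{{{ns}}}register")
        for tag in ("name", "size", "access"):
            ET.SubElement(reg, f"{{{ns}}}{tag}").text = row[tag]
        ET.SubElement(reg, f"{{{ns}}}addressOffset").text = row["offset"]
    return ET.tostring(block, encoding="unicode")

print(registers_to_ipxact(SHEET))
```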
CNN based approach for activity recognition using a wrist-worn accelerometer.
Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R
2017-07-01
In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, conventional methods have been dominated by feature engineering, involving the difficult process of optimal feature selection. This problem has been mitigated by using a novel methodology based on a deep learning framework which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for the recognition of three fundamental movements of the human forearm performed in daily life, where data is collected from four different subjects using a single wrist-worn accelerometer sensor. The proposed model is validated under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8%, as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machines.
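A model in the spirit described, though not the authors' architecture, might look like the sketch below; the window length, layer sizes, and the Keras framework choice are our assumptions.

```python
import numpy as np
from tensorflow import keras

# Sketch of a 1D CNN for 3-class movement recognition from a single
# wrist-worn 3-axis accelerometer (window length and sizes are invented).
model = keras.Sequential([
    keras.layers.Input(shape=(128, 3)),           # 128 samples x 3 axes
    keras.layers.Conv1D(16, 5, activation="relu"),
    keras.layers.MaxPooling1D(2),
    keras.layers.Conv1D(32, 5, activation="relu"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(3, activation="softmax"),  # three forearm movements
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in data just to show the training call
x = np.random.randn(32, 128, 3).astype("float32")
y = np.random.randint(0, 3, size=32)
model.fit(x, y, epochs=1, verbose=0)
```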
What lies behind crop decisions? Coming to terms with revealing farmers' preferences
NASA Astrophysics Data System (ADS)
Gomez, C.; Gutierrez, C.; Pulido-Velazquez, M.; López Nicolás, A.
2016-12-01
The paper offers a fully fledged applied revealed-preference methodology to screen and represent farmers' choices as the solution of an optimal program involving trade-offs among the alternative welfare outcomes of crop decisions, such as profits, income security and ease of management. The recursive two-stage method is proposed as an alternative to cope with the methodological problems inherent in common-practice positive mathematical programming (PMP) methodologies. Differently from PMP, in the model proposed in this paper the non-linear costs that are required for both calibration and smooth adjustment are not at odds with the assumptions of linear Leontief technologies and fixed crop prices and input costs. The method frees the model from ad hoc assumptions about costs and thus recovers the potential of economic analysis as a means to understand the rationale behind observed and forecasted farmers' decisions, enhancing the potential of the model to support policy making in relevant domains such as agricultural policy, water management, risk management and climate change adaptation. After the introduction, where the methodological drawbacks and challenges are set out, section two presents the theoretical model, section three develops its empirical application and presents its implementation for a Spanish irrigation district, and finally section four concludes and makes suggestions for further research.
NASA Astrophysics Data System (ADS)
Zolfaghari, Mohammad R.
2009-07-01
Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, and catastrophe loss estimation and risk management. Due to the lack of proper knowledge with regard to the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical facility for better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently being used for the assessment and mapping of regional seismic hazard, makes the best use of recent advancements in computer software and hardware, and is well structured to be implemented using conventional GIS tools.
ERIC Educational Resources Information Center
Spais, George S.
2005-01-01
The major objective of this study is to identify a methodology that will help educators in marketing to efficiently manage the design, impact, and cost of case studies. My intention is to examine the impact of case study characteristics in relation to the degree of learner involvement in the learning process. The author proposes that…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-03
... the LSCS, Cycle 15, operation. Cycle 15 will be the first cycle of operation with a mixed core... methodologies. The analyses for LSCS, Unit 1, Cycle 15 have concluded that a two-loop MCPR SL of >= 1.13, based... accident from any accident previously evaluated? Response: No. The GNF2 fuel to be used in Cycle 15 is of a...
A Proposed Methodology to Classify Frontier Capital Markets
2011-07-31
This project involves basic … machine learning. The algorithm consists of a unique binary classifier mechanism that combines three methods: k-Nearest Neighbors (kNN), ensemble … (Excerpt truncated; recoverable section headings include "kNN Ensemble Classification Techniques" and "Capital Market Classification Based on Capital Flows and Trading Architecture".)
Combined use of terrestrial laser scanning and IR thermography applied to a historical building.
Costanzo, Antonio; Minasi, Mario; Casula, Giuseppe; Musacchio, Massimo; Buongiorno, Maria Fabrizia
2014-12-24
The conservation of architectural heritage usually requires a multidisciplinary approach involving a variety of specialist expertise and techniques. Nevertheless, destructive techniques should be avoided wherever possible in order to preserve the integrity of historical buildings; therefore, the development of non-destructive and non-contact techniques is extremely important. In this framework, a methodology for combining terrestrial laser scanning and infrared thermal images is proposed, in order to obtain a reconnaissance of the conservation state of a historical building. The case study is the St. Augustine Monumental Compound, located in the historical centre of the town of Cosenza (Calabria, South Italy). Adopting the proposed methodology, the paper illustrates the main results obtained for the test building by overlaying and comparing the data collected with both techniques, in order to outline their capability both to detect anomalies and to improve knowledge of the health state of the masonry building. The 3D model also provides a reference model, laying the groundwork for the implementation of a multisensor monitoring system based on the use of non-destructive techniques.
Exploratory High-Fidelity Aerostructural Optimization Using an Efficient Monolithic Solution Method
NASA Astrophysics Data System (ADS)
Zhang, Jenmy Zimi
This thesis is motivated by the desire to discover fuel efficient aircraft concepts through exploratory design. An optimization methodology based on tightly integrated high-fidelity aerostructural analysis is proposed, which has the flexibility, robustness, and efficiency to contribute to this goal. The present aerostructural optimization methodology uses an integrated geometry parameterization and mesh movement strategy, which was initially proposed for aerodynamic shape optimization. This integrated approach provides the optimizer with a large amount of geometric freedom for conducting exploratory design, while allowing for efficient and robust mesh movement in the presence of substantial shape changes. In extending this approach to aerostructural optimization, this thesis has addressed a number of important challenges. A structural mesh deformation strategy has been introduced to translate consistently the shape changes described by the geometry parameterization to the structural model. A three-field formulation of the discrete steady aerostructural residual couples the mesh movement equations with the three-dimensional Euler equations and a linear structural analysis. Gradients needed for optimization are computed with a three-field coupled adjoint approach. A number of investigations have been conducted to demonstrate the suitability and accuracy of the present methodology for use in aerostructural optimization involving substantial shape changes. Robustness and efficiency in the coupled solution algorithms is crucial to the success of an exploratory optimization. This thesis therefore also focuses on the design of an effective monolithic solution algorithm for the proposed methodology. This involves using a Newton-Krylov method for the aerostructural analysis and a preconditioned Krylov subspace method for the coupled adjoint solution. Several aspects of the monolithic solution method have been investigated. These include appropriate strategies for scaling and matrix-vector product evaluation, as well as block preconditioning techniques that preserve the modularity between subproblems. The monolithic solution method is applied to problems with varying degrees of fluid-structural coupling, as well as a wing span optimization study. The monolithic solution algorithm typically requires 20%-70% less computing time than its partitioned counterpart. This advantage increases with increasing wing flexibility. The performance of the monolithic solution method is also much less sensitive to the choice of the solution parameter.
Ising Processing Units: Potential and Challenges for Discrete Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coffrin, Carleton James; Nagarajan, Harsha; Bent, Russell Whitford
The recent emergence of novel computational devices, such as adiabatic quantum computers, CMOS annealers, and optical parametric oscillators, presents new opportunities for hybrid-optimization algorithms that leverage these kinds of specialized hardware. In this work, we propose the idea of an Ising processing unit as a computational abstraction for these emerging tools. Challenges involved in using and benchmarking these devices are presented, and open-source software tools are proposed to address some of these challenges. The proposed benchmarking tools and methodology are demonstrated by conducting a baseline study of established solution methods applied to a D-Wave 2X adiabatic quantum computer, one example of a commercially available Ising processing unit.
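An Ising processing unit minimizes an Ising energy. Below is a minimal sketch of the objective and a brute-force ground-state search on an invented three-spin instance; real devices handle thousands of spins heuristically.

```python
import numpy as np

def ising_energy(spins, h, J):
    """Energy of an Ising configuration: E(s) = sum_i h_i s_i
    + sum_{i<j} J_ij s_i s_j, with s_i in {-1, +1}."""
    s = np.asarray(spins)
    return float(h @ s + s @ np.triu(J, k=1) @ s)

# Tiny invented instance with three spins
h = np.array([0.5, -0.2, 0.0])
J = np.zeros((3, 3)); J[0, 1], J[1, 2] = -1.0, 0.7
best = min(
    ([int(b) * 2 - 1 for b in np.binary_repr(n, 3)] for n in range(8)),
    key=lambda s: ising_energy(s, h, J))
print(best, ising_energy(best, h, J))
```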
When is good, good enough? Methodological pragmatism for sustainable guideline development.
Browman, George P; Somerfield, Mark R; Lyman, Gary H; Brouwers, Melissa C
2015-03-06
Continuous escalation in methodological and procedural rigor for evidence-based processes in guideline development is associated with increasing costs and production delays that threaten sustainability. While health research methodologists are appropriately responsible for promoting increasing rigor in guideline development, guideline sponsors are responsible for funding such processes. This paper acknowledges that other stakeholders in addition to methodologists should be more involved in negotiating trade-offs between methodological procedures and efficiency in guideline production to produce guidelines that are 'good enough' to be trustworthy and affordable under specific circumstances. The argument for reasonable methodological compromise to meet practical circumstances is consistent with current implicit methodological practice. This paper proposes a conceptual tool as a framework to be used by different stakeholders in negotiating, and explicitly reporting, reasonable compromises for trustworthy as well as cost-worthy guidelines. The framework helps fill a transparency gap in how methodological choices in guideline development are made. The principle, 'when good is good enough' can serve as a basis for this approach. The conceptual tool 'Efficiency-Validity Methodological Continuum' acknowledges trade-offs between validity and efficiency in evidence-based guideline development and allows for negotiation, guided by methodologists, of reasonable methodological compromises among stakeholders. Collaboration among guideline stakeholders in the development process is necessary if evidence-based guideline development is to be sustainable.
Rodríguez-González, Alejandro; Torres-Niño, Javier; Valencia-Garcia, Rafael; Mayer, Miguel A; Alor-Hernandez, Giner
2013-09-01
This paper proposes a new methodology for assessing the efficiency of medical diagnostic systems and clinical decision support systems by using the feedback/opinions of medical experts. The methodology is based on a comparison between the expert feedback that has helped solve different clinical cases and the expert system that has evaluated these same cases. Once the results are returned, an arbitration process is carried out in order to ensure the correctness of the results provided by both methods. Once this process has been completed, the results are analyzed using Precision, Recall, Accuracy, Specificity and Matthews Correlation Coefficient (MCC) (PRAS-M) metrics. When the methodology is applied, the results obtained from a real diagnostic system allow researchers to establish the accuracy of the system based on objective facts. The methodology returns enough information to analyze the system's behavior for each disease in the knowledge base or across the entire knowledge base. It also returns data on the efficiency of the different assessors involved in the evaluation process, analyzing their behavior in the diagnostic process. The proposed work facilitates the evaluation of medical diagnostic systems, providing a reliable process based on objective facts. The methodology presented in this research makes it possible to identify the main characteristics that define a medical diagnostic system and their values, allowing for system improvement. A good example of the results provided by the application of the methodology is shown in this paper: a diagnosis system was evaluated by means of this methodology, yielding statistically significant positive results when comparing the system with the assessors that participated in its evaluation process, through metrics such as recall (+27.54%) and MCC (+32.19%). These results demonstrate the real applicability of the methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
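The PRAS-M metrics reduce to arithmetic on a per-disease confusion matrix. A minimal sketch follows; the counts are invented, not from the paper's evaluation.

```python
import math

def pras_m(tp, fp, tn, fn):
    """Precision, Recall, Accuracy, Specificity and MCC from a 2x2
    confusion matrix (per-disease counts in a diagnostic evaluation)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return precision, recall, accuracy, specificity, mcc

print(pras_m(tp=40, fp=10, tn=45, fn=5))  # invented counts
```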
NASA Astrophysics Data System (ADS)
Porello, Daniele
The aim of this paper is to propose a methodology for evaluating the quality of collective decisions in sociotechnical systems (STS). We propose using a foundational ontology for conceptualizing the complex hierarchy of information involved in decisions in STS (e.g., normative, conceptual, factual, perceptual). Moreover, we introduce the concept of transparency of decisions as a necessary condition in order to assess the quality of decision-making in STS. We further view transparency as an entitlement of the agent affected by the decision: i.e., the collective decision should be justified.
The economics of project analysis: Optimal investment criteria and methods of study
NASA Technical Reports Server (NTRS)
Scriven, M. C.
1979-01-01
Insight is provided toward the development of an optimal program for the investment analysis of project proposals offering commercial potential, and of its components. This involves a critique of economic investment criteria viewed in relation to the requirements of engineering economy analysis. An outline for a systems approach to project analysis is given. Application of the Leontief input-output methodology to the analysis of projects involving multiple processes and products is investigated. Effective application of elements of neoclassical economic theory to the investment analysis of project components is demonstrated. Patterns of both static and dynamic activity levels are incorporated.
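The Leontief input-output computation reduces to solving x = Ax + d for the gross output x. A minimal sketch with an invented two-sector technology matrix:

```python
import numpy as np

# Leontief input-output: gross output x satisfying x = A x + d,
# i.e. x = (I - A)^{-1} d. The 2-sector matrix below is invented.
A = np.array([[0.2, 0.3],    # inputs of sector 1 per unit of output
              [0.1, 0.4]])   # inputs of sector 2 per unit of output
d = np.array([100.0, 50.0])  # final demand per sector
x = np.linalg.solve(np.eye(2) - A, d)
print(x)  # gross outputs required to meet final demand
```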
Casaseca-de-la-Higuera, Pablo; Simmross-Wattenberg, Federico; Martín-Fernández, Marcos; Alberola-López, Carlos
2009-07-01
Discontinuation of mechanical ventilation is a challenging task that involves a number of subtle clinical issues. The gradual removal of respiratory support (referred to as weaning) should be performed as soon as autonomous respiration can be sustained. However, based on previous studies, the rate of successful extubation prediction is still below 25%. The construction of an automatic system that provides information on extubation readiness is thus desirable. Recent works have demonstrated that breathing pattern variability is a useful extubation readiness indicator, with performance improving when multiple respiratory signals are jointly processed. However, the existing methods for predictor extraction present several drawbacks when length-limited time series are to be processed in heterogeneous groups of patients. In this paper, we propose a model-based methodology for automatic readiness prediction, intended to deal with multichannel, nonstationary, short records of the breathing pattern. Results on experimental data yield 87.27% successful readiness prediction, which is in line with the best figures reported in the literature. A comparative analysis shows that our methodology overcomes the shortcomings of previously proposed methods when applied to length-limited records in heterogeneous groups of patients.
Groundwater vulnerability to climate change: A review of the assessment methodology.
Aslam, Rana Ammar; Shrestha, Sangam; Pandey, Vishnu Prasad
2018-01-15
Impacts of climate change on water resources, especially groundwater, can no longer be hidden. These impacts are further exacerbated under the integrated influence of climate variability, climate change and anthropogenic activities. The degree of impact varies according to geographical location and other factors leading systems and regions towards different levels of vulnerability. In the recent past, several attempts have been made in various regions across the globe to quantify the impacts and consequences of climate and non-climate factors in terms of vulnerability to groundwater resources. Firstly, this paper provides a structured review of the available literature, aiming to critically analyse and highlight the limitations and knowledge gaps involved in vulnerability (of groundwater to climate change) assessment methodologies. The effects of indicator choice and the importance of including composite indicators are then emphasised. A new integrated approach for the assessment of groundwater vulnerability to climate change is proposed to successfully address those limitations. This review concludes that the choice of indicator has a significant role in defining the reliability of computed results. The effect of an individual indicator is also apparent but the consideration of a combination (variety) of indicators may give more realistic results. Therefore, in future, depending upon the local conditions and scale of the study, indicators from various groups should be chosen. Furthermore, there are various assumptions involved in previous methodologies, which limit their scope by introducing uncertainty in the calculated results. These limitations can be overcome by implementing the proposed approach. Copyright © 2017 Elsevier B.V. All rights reserved.
Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck
2015-02-01
Practical vibroacoustic systems involve passive acoustic treatments consisting of highly dissipative media such as poroelastic materials. The numerical modeling of such systems at low to mid frequencies typically relies on substructuring methodologies based on finite element models. Namely, the master subsystems (i.e., structural and acoustic domains) are described by a finite set of uncoupled modes, whereas condensation procedures are typically preferred for the acoustic treatments. However, although accurate, such a methodology is computationally expensive when real-life applications are considered. A potential reduction of the computational burden could be obtained by approximating the effect of the acoustic treatment on the master subsystems without introducing physical degrees of freedom. To do that, the treatment has to be assumed homogeneous, flat, and of infinite lateral extent. Under these hypotheses, simple analytical tools like the transfer matrix method can be employed. In this paper, a hybrid finite element-transfer matrix methodology is proposed. The impact of the limiting assumptions inherent in the analytical framework is assessed for the case of plate-cavity systems involving flat and homogeneous acoustic treatments. The results prove that the hybrid model can capture the qualitative behavior of the vibroacoustic system while reducing the computational effort.
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
The following study proposes and tests an integrated methodology involving Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific aspects of robotic surgery involving safety, process and technology. The integrated methodology consists of the application of specific techniques from HTA joined with the most typical models from reliability engineering, such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 for urology and 16 for general surgery. The main outcomes refer to the comparative evaluation between robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions come from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm the better trend of robotic surgical times in comparison with the open technique, as well as confirming the clinical benefits of robotics in urology. A more complex situation is observed for general surgery, where the only directly measured clinical benefit of robotics is a lower blood transfusion rate.
Classification of physical activities based on body-segments coordination.
Fradet, Laetitia; Marin, Frederic
2016-09-01
Numerous innovations based on connected objects and physical activity (PA) monitoring have been proposed. However, recognition of PAs requires a robust algorithm and methodology. The current study presents an innovative approach for PA recognition. It is based on the heuristic definition of postures and the use of body-segments coordination obtained through external sensors. The first part of this study presents the methodology required to define the set of accelerations that is most appropriate to represent the particular body-segments coordination involved in the chosen PAs (here walking, running, and cycling). For that purpose, subjects of different ages and heterogeneous physical conditions walked, ran, cycled, and performed daily activities at different paces. From the 3D motion capture, vertical and horizontal accelerations of 8 anatomical landmarks representative of the body were computed. Then, the 680 combinations of up to 3 accelerations were compared to identify the set of accelerations most appropriate to discriminate the PAs in terms of body-segment coordination. The discrimination was based on the maximal Hausdorff distance obtained between the different sets of accelerations. The vertical accelerations of both knees demonstrated the best PA discrimination. The second step was the proof of concept, implementing the proposed algorithm to classify PAs of a new group of subjects. The originality of the proposed algorithm is the possibility of using the subject's specific measures as reference data. With the proposed algorithm, 94% of the trials were correctly classified. In conclusion, our study proposed a flexible and extendable methodology. At the current stage, the algorithm has been shown to be valid for heterogeneous subjects, which suggests that it could be deployed in clinical or health-related applications regardless of the subjects' physical abilities or characteristics. Copyright © 2016 Elsevier Ltd. All rights reserved.
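The discrimination criterion is a Hausdorff distance between acceleration patterns. The sketch below shows the basic computation with SciPy's directed Hausdorff distance on synthetic stand-in signals; the study's actual preprocessing and landmark signals are not reproduced.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Hypothetical vertical knee accelerations for two activities, embedded
# as 2-D point sets (time, acceleration) so the Hausdorff distance is
# defined between trajectories.
t = np.linspace(0.0, 1.0, 100)
walk = np.column_stack([t, np.sin(2 * np.pi * 2.0 * t)])        # stand-in signal
run = np.column_stack([t, 2.0 * np.sin(2 * np.pi * 3.0 * t)])   # stand-in signal

# Symmetric Hausdorff distance: the larger of the two directed distances.
d = max(directed_hausdorff(walk, run)[0],
        directed_hausdorff(run, walk)[0])
print(f"Hausdorff distance between patterns: {d:.3f}")
```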
BIM Methodology Approach to Infrastructure Design: Case Study of Paniga Tunnel
NASA Astrophysics Data System (ADS)
Osello, Anna; Rapetti, Niccolò; Semeraro, Francesco
2017-10-01
Nowadays, the implementation of Building Information Modelling (BIM) in civil design represents a new challenge for the AECO (Architecture, Engineering, Construction, Owner and Operator) world, one that will attract the interest of many researchers in the coming years. This is driven by incentives from public administrations and European directives that aim to improve efficiency and to better manage the complexity of infrastructure projects. For these reasons, the goal of this research is to propose a methodology for the use of BIM in a tunnel project, analysing the definition of a correct level of detail (LOD) and the possibility of sharing information via interoperability for FEM analysis.
Reliability analysis of repairable systems using Petri nets and vague Lambda-Tau methodology.
Garg, Harish
2013-01-01
The main objective of the paper is to develop a methodology, named vague Lambda-Tau, for reliability analysis of repairable systems. A Petri net tool is applied to represent the asynchronous and concurrent processing of the system instead of fault tree analysis. To enhance the relevance of the reliability study, vague set theory is used for representing the failure rates and repair times instead of classical (crisp) or fuzzy set theory, because vague sets are characterized by a truth membership function and a false membership function (non-membership function) such that the sum of both values is less than 1. The proposed methodology involves qualitative modeling using PN and quantitative analysis using the Lambda-Tau method of solution, with the basic events represented by intuitionistic fuzzy numbers with triangular membership functions. Sensitivity analysis has also been performed and the effects on system MTBF are addressed. The methodology improves on the shortcomings of the existing probabilistic approaches and gives a better understanding of the system behavior through its graphical representation. The washing unit of a paper mill situated in a northern part of India, producing approximately 200 tons of paper per day, has been considered to demonstrate the proposed approach. The results may be helpful for plant personnel in analyzing the system's behavior and improving performance by adopting suitable maintenance strategies. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
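For orientation, the sketch below implements the commonly cited crisp Lambda-Tau gate expressions (OR/series and AND/parallel) that the vague-set version generalizes; the component rates and repair times are invented, and the intuitionistic-fuzzy arithmetic of the paper is deliberately omitted.

```python
import math

# Crisp (non-fuzzy) forms of the commonly cited Lambda-Tau gate expressions;
# the paper replaces the crisp lambda and tau with vague/intuitionistic
# fuzzy numbers, which this sketch does not attempt.

def or_gate(lams, taus):
    """OR gate (series from a reliability viewpoint): any failure fails the system."""
    lam = sum(lams)
    tau = sum(l * t for l, t in zip(lams, taus)) / lam
    return lam, tau

def and_gate(lams, taus):
    """AND gate (parallel): the system fails only when all components are failed."""
    prods = [math.prod(t for j, t in enumerate(taus) if j != i)
             for i in range(len(taus))]
    lam = math.prod(lams) * sum(prods)
    tau = math.prod(taus) / sum(prods)
    return lam, tau

# Invented example: two pumps in parallel, in series with a valve.
lam_p, tau_p = and_gate([1e-3, 2e-3], [5.0, 4.0])
lam_sys, tau_sys = or_gate([lam_p, 1e-4], [tau_p, 8.0])
mtbf = 1.0 / lam_sys + tau_sys   # MTBF = MTTF + MTTR
print(lam_sys, tau_sys, mtbf)
```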
NASA Technical Reports Server (NTRS)
Zaychik, Kirill; Cardullo, Frank; George, Gary; Kelly, Lon C.
2009-01-01
In order to use the Hess Structural Model to predict the need for certain cueing systems, George and Cardullo significantly expanded it by adding motion feedback to the model and incorporating models of the motion system dynamics, the motion cueing algorithm and a vestibular system. This paper proposes a methodology to evaluate the effectiveness of these innovations by performing a comparison analysis of the model performance with and without the expanded motion feedback. The proposed methodology is composed of two stages. The first stage involves fine-tuning parameters of the original Hess structural model in order to match the actual control behavior recorded during the experiments at the NASA Visual Motion Simulator (VMS) facility. The parameter tuning procedure utilizes a new automated parameter identification technique, which was developed at the Man-Machine Systems Lab at SUNY Binghamton. In the second stage of the proposed methodology, the expanded motion feedback is added to the structural model. The resulting performance of the model is then compared to that of the original one. As proposed by Hess, metrics to evaluate the performance of the models include comparison against the crossover-model standards imposed on the crossover frequency and phase margin of the overall man-machine system. Preliminary results indicate the advantage of having the model of the motion system and motion cueing incorporated into the model of the human operator. It is also demonstrated that the crossover frequency and the phase margin of the expanded model are well within the limits imposed by the crossover model.
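The crossover-model check can be illustrated with the python-control package: near crossover, the open-loop man-machine transfer function is approximated as ω_c e^(-τs)/s, and the gain-crossover frequency and phase margin are read off. The numbers below are illustrative assumptions, not the paper's identified parameters.

```python
import control

omega_c, tau = 2.0, 0.3           # assumed crossover frequency (rad/s) and delay (s)
delay = control.pade(tau, 5)      # rational approximation of e^(-tau*s)
Y_ol = control.tf([omega_c], [1, 0]) * control.tf(*delay)

# margin() returns gain margin, phase margin, and the associated frequencies;
# wp is the gain-crossover frequency tied to the phase margin.
gm, pm, wg, wp = control.margin(Y_ol)
print(f"crossover frequency = {wp:.2f} rad/s, phase margin = {pm:.1f} deg")
```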
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...
Methodological Challenges to Economic Evaluations of Vaccines: Is a Common Approach Still Possible?
Jit, Mark; Hutubessy, Raymond
2016-06-01
Economic evaluation of vaccination is a key tool to inform effective spending on vaccines. However, many evaluations have been criticised for failing to capture features of vaccines which are relevant to decision makers. These include broader societal benefits (such as improved educational achievement, economic growth and political stability), reduced health disparities, medical innovation, reduced pressure on hospital beds, greater peace of mind and synergies in economic benefits with non-vaccine interventions. Also, the fiscal implications of vaccination programmes are not always made explicit. Alternative methodological frameworks have been proposed to better capture these benefits. However, any broadening of the methodology for economic evaluation must also involve evaluations of non-vaccine interventions, and hence may not always benefit vaccines given a fixed health-care budget. The scope of an economic evaluation must consider the budget from which vaccines are funded, and the decision-maker's stated aims for that spending.
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
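As context for the consensus-based PSO stage, here is a minimal single-swarm PSO loop on a standard test function; the consensus mechanism and the Trust-Tech refinement stages of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                        # stand-in test function, minimum at 0
    return np.sum(x * x, axis=-1)

n_particles, dim, iters = 30, 10, 200
w, c1, c2 = 0.7, 1.5, 1.5             # inertia and acceleration weights

x = rng.uniform(-5, 5, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), sphere(x)
gbest = pbest[np.argmin(pbest_f)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = sphere(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print(sphere(gbest))   # near 0 for the sphere function
```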
The Innate Immunity in Alzheimer Disease- Relevance to Pathogenesis and Therapy.
Blach-Olszewska, Zofia; Zaczynska, Ewa; Gustaw-Rothenberg, Kasia; Avila-Rodrigues, Marco; Barreto, George E; Leszek, Jerzy; Aliev, Gjumrakch
2015-01-01
The genetic, cellular, and molecular changes associated with Alzheimer disease provide evidence of the involvement of immune and inflammatory processes in its pathogenesis. These are supported by epidemiological studies, which show some benefit of long-term use of NSAIDs. The hypothesis that AD is in fact an immunologically mediated, even inflammatory, pathological process is scientifically intriguing. There are several obstacles that suggest the need for a more complex view in the process of targeting inflammation and immunity in AD. In our previous studies we proposed a reliable methodology to assess innate immunity in Alzheimer patients and controls. The methodology is based on the phenomenon of human leukocytes being resistant to viral infection. The unspecific character of the resistance, dependent on interferons and tumor necrosis factor, and its occurrence in cells ex vivo indicate that an in vivo mechanism of innate immunity may be involved. The above-mentioned resistance can be estimated in a test based on peripheral blood leukocyte infection by vesicular stomatitis virus.
African Primary Care Research: Participatory action research
2014-01-01
This article is part of the series on African primary care research and focuses on participatory action research. The article gives an overview of the emancipatory-critical research paradigm, the key characteristics and different types of participatory action research. Following this, it describes in detail the methodological issues involved in professional participatory action research and running a cooperative inquiry group. The article is intended to help students with writing their research proposal. PMID:26245439
Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min
2009-01-01
Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase in pursuit of sustainable development. However, Taiwan's system offered only some sketchy steps focused on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on theoretical considerations from systems thinking and the regulative requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of the golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remediated lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better based on the environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed, but the implementation phase of this policy was not begun because the related legislative procedure could not be arranged due to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001, and then 27 future courses could be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can effectively and efficiently assist the related authorities with SEA.
Capitation pricing: Adjusting for prior utilization and physician discretion
Anderson, Gerard F.; Cantor, Joel C.; Steinberg, Earl P.; Holloway, James
1986-01-01
As the number of Medicare beneficiaries receiving care under at-risk capitation arrangements increases, the method for setting payment rates will come under increasing scrutiny. A number of modifications to the current adjusted average per capita cost (AAPCC) methodology have been proposed, including an adjustment for prior utilization. In this article, we propose the use of a utilization adjustment that includes only hospitalizations involving low or moderate physician discretion in the decision to hospitalize. This modification avoids discrimination against capitated systems that prevent certain discretionary admissions. The model also explains more of the variance in per capita expenditures than does the current AAPCC. PMID:10312010
Bourhis, Cathy; Tual, Florence
2013-01-01
Health education among children and adolescents tends to be more effective if the objectives are shared, supported and promoted by parents. Professionals and policy-makers are therefore keen to promote the active involvement of parents. However, they face the same challenge: how to get parents involved. To address this issue, we need to examine parents' concerns and expectations directly. Professionals will need to adapt the proposed responses to the identified needs. This approach is a basic methodological and ethical principle in health education and requires the ability to change perceptions and practices while taking into account public expectations.
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
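The key primitive is the dominance filter applied to the pooled k-best solutions of each single-objective problem. A minimal sketch follows, with invented cost/time pairs; the ripple-spreading route optimizer itself is not shown.

```python
def dominates(a, b):
    """True if solution a is at least as good as b in every (minimised)
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Filter the non-dominated points from a candidate pool, e.g. the
    union of the k best solutions of each single-objective problem."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

# Hypothetical (cost, time) pairs pooled from two k-best searches.
pool = [(4, 9), (5, 7), (6, 6), (7, 7), (8, 4), (9, 5)]
print(pareto_front(pool))   # -> [(4, 9), (5, 7), (6, 6), (8, 4)]
```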
2014-01-01
Background: We propose a mathematical model for multichannel assessment of the trial-to-trial variability of auditory evoked brain responses in magnetoencephalography (MEG). Methods: Following the work of de Munck et al., our approach is based on maximum likelihood estimation and involves an approximation of the spatio-temporal covariance of the contaminating background noise by means of the Kronecker product of its spatial and temporal covariance matrices. Extending the work of de Munck et al., where the trial-to-trial variability of the responses was considered identical across all channels, we evaluate it for each individual channel. Results: Simulations with two equivalent current dipoles (ECDs) with different trial-to-trial variability, one seeded in each of the auditory cortices, were used to study the applicability of the proposed methodology on the sensor level and revealed spatial selectivity of the trial-to-trial estimates. In addition, we simulated a scenario with neighboring ECDs to show the limitations of the method. We also present an illustrative example of the application of this methodology to real MEG data taken from an auditory experimental paradigm, where we found hemispheric lateralization of the habituation effect to multiple stimulus presentation. Conclusions: The proposed algorithm is capable of reconstructing lateralization effects of the trial-to-trial variability of evoked responses, i.e. when an ECD of only one hemisphere habituates, whereas the activity of the other hemisphere is not subject to habituation. Hence, it may be a useful tool in paradigms that assume lateralization effects, like, e.g., those involving language processing. PMID:24939398
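The Kronecker approximation C ≈ C_space ⊗ C_time can be estimated with a flip-flop (alternating maximum likelihood) iteration. The sketch below runs it on synthetic noise; dimensions and data are stand-ins, not the MEG recordings of the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_chan, n_time, n_trials = 4, 50, 200

# Synthetic noise trials (channels x time x trials), stand-ins for MEG data.
noise = rng.standard_normal((n_chan, n_time, n_trials))

# Flip-flop estimation of the factors of C = kron(C_space, C_time):
# alternate the matrix-normal ML updates of the spatial and temporal factors.
C_time = np.eye(n_time)
for _ in range(5):
    Wt = np.linalg.inv(C_time)
    C_space = np.einsum('itk,ts,jsk->ij', noise, Wt, noise) / (n_time * n_trials)
    Ws = np.linalg.inv(C_space)
    C_time = np.einsum('itk,ij,jsk->ts', noise, Ws, noise) / (n_chan * n_trials)

# The full spatio-temporal covariance is then the Kronecker product.
C_full = np.kron(C_space, C_time)
print(C_full.shape)   # -> (200, 200)
```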
Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching
NASA Astrophysics Data System (ADS)
Shen, Kaiming; Yu, Wei
2018-05-01
This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application to continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP; but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
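Part I's quadratic transform replaces a ratio A(x)/B(x) with the surrogate 2y√A(x) − y²B(x), alternating a closed-form y-update with a maximization over x. A toy scalar sketch follows; the objective is invented, whereas the paper applies this to multi-cell scheduling.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy ratio maximisation f(x) = A(x)/B(x) with A(x) = x, B(x) = x^2 + 1
# (illustrative functions, not from the paper).
A = lambda x: x
B = lambda x: x ** 2 + 1.0

x = 0.1                                    # feasible starting point
for _ in range(20):
    y = np.sqrt(A(x)) / B(x)               # closed-form y-update
    # Surrogate maximisation over x for fixed y (numeric for generality).
    res = minimize_scalar(lambda x: -(2 * y * np.sqrt(A(x)) - y ** 2 * B(x)),
                          bounds=(1e-6, 10.0), method='bounded')
    x = res.x

print(x, A(x) / B(x))    # converges to x = 1, ratio = 0.5
```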
Cerqueira, Marcos Rodrigues Facchini; Grasseschi, Daniel; Matos, Renato Camargo; Angnes, Lucio
2014-08-01
Different materials like glass, silicon and poly(methyl methacrylate) (PMMA) are being used to immobilise enzymes in microchannels. PMMA shows advantages such as its low price, biocompatibility and attractive mechanical and chemical properties. Despite this, the introduction of reactive functional groups on PMMA is still problematic, either because of the complex chemistry or the extended reaction times involved. In this paper, a new methodology was developed to immobilise glucose oxidase (GOx) in PMMA microchannels, with the benefit of a rapid immobilisation process and a very simple route. The new procedure involves only two steps, based on the reaction of 5.0% (w/w) polyethyleneimine (PEI) with PMMA in a dimethyl sulphoxide medium, followed by the immobilisation of glucose oxidase using a solution containing 100 U of enzyme and 1.0% (v/v) glutaraldehyde. The reactors prepared in this way were evaluated in a flow system with amperometric detection (+0.60 V) based on the oxidation of the H2O2 produced by the reactor. The microreactor proposed here was able to work with high bioconversion and a throughput of 60 samples h(-1), with detection and quantification limits of 0.50 and 1.66 µmol L(-1), respectively. Michaelis-Menten parameters (Vmax and KM) were calculated as 449±47.7 nmol min(-1) and 7.79±0.98 mmol. Statistical evaluations were done to validate the proposed methodology. The content of glucose in natural and commercial coconut water samples was evaluated using the developed method. Comparison with spectrophotometric measurements showed that both methodologies have a very good correlation (calculated t(0.05, 4) = 1.35).
2011-01-01
When applying echo-Doppler imaging for either clinical or research purposes it is very important to select the most adequate modality/technology and to choose the most reliable and reproducible measurements. Quality control is a mainstay for reducing variability among institutions and operators and must be achieved by using appropriate procedures for the acquisition, storage and interpretation of echo-Doppler data. This goal can be achieved by employing an echo core laboratory (ECL), with the responsibility for standardizing image acquisition processes (performed at the peripheral echo-labs) and analysis (by monitoring and optimizing the internal intra- and inter-reader variability of measurements). Accordingly, the Working Group of Echocardiography of the Italian Society of Cardiology decided to design standardized procedures for imaging acquisition in peripheral laboratories and for reading procedures, and to propose a methodological approach to assess the reproducibility of echo-Doppler parameters of cardiac structure and function using both standard and advanced technologies. A number of cardiologists experienced in cardiac ultrasound were involved in setting up an ECL available for future studies involving complex imaging or including echo-Doppler measures as primary or secondary efficacy or safety end-points. The present manuscript describes the methodology of the procedures (imaging acquisition and measurement reading) and provides the documentation of the work done so far to test the reproducibility of the different echo-Doppler modalities (standard and advanced). These procedures can also be suggested for use in non-referral echocardiographic laboratories as an "inside" quality check, with the aim of optimizing the clinical consistency of echo-Doppler data. PMID:21943283
The SIMRAND methodology - Simulation of Research and Development Projects
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1984-01-01
In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
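In the SIMRAND spirit, each candidate set of task alternatives can be scored by Monte Carlo sampling of its uncertain outcomes. The sketch below invents a two-task solar-cell-price example to show the shape of such a simulation; it is not the actual SIMRAND network.

```python
import random

random.seed(42)

# Hypothetical task alternatives: each alternative is (low, high) bounds on
# the achievable cell price ($/W) for one R&D path; a candidate system is
# a choice of one alternative per task.
alternatives = {
    "ingot_growth": [(0.9, 1.6), (0.7, 2.0)],
    "wafering":     [(0.5, 0.9), (0.3, 1.2)],
}
target = 2.3    # the system succeeds if the summed price is under the target

def success_prob(choice, n=10_000):
    """Monte Carlo estimate of P(total price < target) for one candidate."""
    hits = 0
    for _ in range(n):
        total = sum(random.uniform(*alternatives[task][i])
                    for task, i in choice.items())
        hits += total < target
    return hits / n

candidates = [{"ingot_growth": i, "wafering": j} for i in range(2) for j in range(2)]
best = max(candidates, key=success_prob)
print(best, success_prob(best))
```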
Optical tweezers force measurements to study parasites chemotaxis
NASA Astrophysics Data System (ADS)
de Thomaz, A. A.; Pozzo, L. Y.; Fontes, A.; Almeida, D. B.; Stahl, C. V.; Santos-Mallet, J. R.; Gomes, S. A. O.; Feder, D.; Ayres, D. C.; Giorgio, S.; Cesar, C. L.
2009-07-01
In this work, we propose a methodology to study microorganism chemotaxis in real time using an optical tweezers system. Optical tweezers allow real-time measurement of the force vectors, strength and direction, of living parasites under chemical or other kinds of gradients. This seems to be the ideal tool for observing the taxis response of cells and microorganisms, with high sensitivity for capturing instantaneous responses to a given stimulus. Forces involved in the movement of unicellular parasites are very small, in the femto- to pico-newton range, about the same order of magnitude as the forces generated in optical tweezers. We applied this methodology to investigate Leishmania amazonensis (L. amazonensis) and Trypanosoma cruzi (T. cruzi) under distinct situations.
Composite Dry Structure Cost Improvement Approach
NASA Technical Reports Server (NTRS)
Nettles, Alan; Nettles, Mindy
2015-01-01
This effort demonstrates that, by focusing only on properties of relevance, composite interstage and shroud structures that simultaneously reduce cost, improve reliability, and maximize performance can be placed on the Space Launch System vehicle, thus providing the Advanced Development Group with a new methodology for utilizing composites to reduce the weight of composite structures on launch vehicles. Interstage and shroud structures were chosen since both of these structures are simple in configuration and do not experience extreme environments (such as cryogenic or hot-gas temperatures), and so should represent a good starting point for flying composites on a 'man-rated' vehicle. They are used as an example only. The project involves using polymer matrix composites for launch vehicle structures, and presents the logic and rationale behind the proposed new methodology.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
.... The text of the proposed rule change is set forth below. Proposed new language is italicized; proposed... methodology approved by FINRA as announced in a Regulatory Notice (``approved margin methodology''). The... an Approved Margin Methodology. Members shall require as a minimum for computing customer or broker...
Novel thermal management system design methodology for power lithium-ion battery
NASA Astrophysics Data System (ADS)
Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro
2014-12-01
Battery packs composed of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use energy more efficiently and achieve better environmental performance. Safety and cycle life are two of the main concerns regarding this technology, which are closely related to the cells' operating behavior and temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation and their coupling and integration into the battery pack product design methodology in order to improve overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
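The heat generation/dissipation balance in such models reduces, in its simplest lumped form, to m·cp·dT/dt = Q_gen − hA(T − T_amb). A sketch with assumed parameters (not the IK4-IKERLAN module values):

```python
# Illustrative parameters (assumed, not the values from the paper).
m, cp = 1.0, 1000.0        # cell mass (kg), specific heat (J/(kg K))
h, A = 15.0, 0.05          # convective coefficient (W/(m^2 K)), area (m^2)
T_amb, Q_gen = 25.0, 6.0   # ambient temperature (degC), heat generation (W)

# Explicit Euler integration of  m*cp*dT/dt = Q_gen - h*A*(T - T_amb).
dt, t_end = 1.0, 3600.0
T = T_amb
for _ in range(int(t_end / dt)):
    T += dt * (Q_gen - h * A * (T - T_amb)) / (m * cp)

print(f"cell temperature after 1 h = {T:.1f} degC "
      f"(steady state {T_amb + Q_gen / (h * A):.1f} degC)")
```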
De Ambrogi, Francesco; Ratti, Elisabetta Ceppi
2011-01-01
Today the Italian national debate over the Work-Related Stress Risk Assessment methodology is rather heated. Several methodological proposals and guidelines have been published in recent months, not least those by the "Commissione Consultiva". But despite this wide range of proposals, it appears that there is still a lack of attention to some of the basic methodological issues that must be taken into account in order to correctly implement the above-mentioned guidelines. The aim of this paper is to outline these methodological issues. In order to achieve this, the most authoritative methodological proposals and guidelines have been reviewed. The study focuses in particular on the methodological issues that could lead to important biases if not considered properly. The study leads to some considerations about the methodological validity of a Work-Related Stress Risk Assessment based exclusively on the literal interpretation of the considered proposals. Furthermore, the study provides some hints and working hypotheses on how to overcome these methodological limits. This study should be considered as a starting point for further investigations and debate on the Work-Related Stress Risk Assessment methodology on a national level.
Experience factors in performing periodic physical evaluations
NASA Technical Reports Server (NTRS)
Hoffman, A. A.
1969-01-01
The lack of scientific basis for the so-called periodic health examinations of military personnel, including the Executive Health Program, is outlined. The latter program may well serve as a management tool for the organization involved, in addition to being a status symbol. A multiphasic screening technique is proposed, in conjunction with an automated medical history questionnaire, as a methodology for preventive occupational medicine. The need to collate early sickness consultation or clinic visit histories with screening techniques is emphasized.
Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S; Thwaites, David I; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar
2016-07-01
The International Atomic Energy Agency (IAEA) has a long tradition of supporting development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995 assisting national external audit groups developing national audit programs. The CRP 'Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques' was conducted in 2009-2012 as an extension of previously developed audit programs. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments, such as IMRT. The project involved development of a new solid slab phantom with heterogeneities containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. The results of multi-center testing of methodology for two steps of dosimetry audit show that the design of audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% TLD results in heterogeneity situations obtained in the study were within 3% and all results within 5% agreement with the TPS predicted doses. In contrast, only 64% small beam profiles were within 3 mm agreement between the TPS calculated and film measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved. Using the experience of these studies, the participants could incorporate the auditing procedures in their national programs.
Towards a Unified Approach to Information Integration - A review paper on data/information fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitney, Paul D.; Posse, Christian; Lei, Xingye C.
2005-10-14
Information or data fusion of data from different sources is ubiquitous in many applications, from epidemiology, medical, biological, political, and intelligence to military applications. Data fusion involves integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-ray, CT), clinical examination, and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical-biological sensors, acoustic sensors, optical warning and many other sources. Many methodologies are used in the data integration process, ranging from classical and Bayesian methods to evidence-based expert systems. The implementation of data integration ranges from pure software designs to mixtures of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.
NASA Technical Reports Server (NTRS)
Lee, Allan Y.; Tsuha, Walter S.
1993-01-01
A two-stage model reduction methodology, combining the classical Component Mode Synthesis (CMS) method and the newly developed Enhanced Projection and Assembly (EP&A) method, is proposed in this research. The first stage of this methodology, called the COmponent Modes Projection and Assembly model REduction (COMPARE) method, involves the generation of CMS mode sets, such as the MacNeal-Rubin mode sets. These mode sets are then used to reduce the order of each component model in the Rayleigh-Ritz sense. The resultant component models are then combined to generate reduced-order system models at various system configurations. A composite mode set which retains important system modes at all system configurations is then selected from these reduced-order system models. In the second stage, the EP&A model reduction method is employed to reduce further the order of the system model generated in the first stage. The effectiveness of the COMPARE methodology has been successfully demonstrated on a high-order, finite-element model of the cruise-configured Galileo spacecraft.
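The Rayleigh-Ritz reduction at the heart of the first stage projects the full mass and stiffness matrices onto a retained mode basis. A toy sketch on a 5-DOF spring-mass chain, standing in for a component finite-element model:

```python
import numpy as np
from scipy.linalg import eigh

# Toy 5-DOF spring-mass chain standing in for a component FE model.
n = 5
M = np.eye(n)
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Keep the r lowest component modes as the Ritz basis T.
r = 2
w2, Phi = eigh(K, M)          # generalized eigenproblem K v = w^2 M v
T = Phi[:, :r]

# Rayleigh-Ritz (Galerkin) projection onto the retained modes.
M_red = T.T @ M @ T
K_red = T.T @ K @ T
print(np.sqrt(eigh(K_red, M_red)[0]))   # reduced-model natural frequencies
print(np.sqrt(w2[:r]))                  # match the lowest full-model ones
```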
ERIC Educational Resources Information Center
Afzal, Waseem
2017-01-01
Introduction: The purpose of this paper is to propose a methodology to conceptualize, operationalize, and empirically validate the concept of information need. Method: The proposed methodology makes use of both qualitative and quantitative perspectives, and includes a broad array of approaches such as literature reviews, expert opinions, focus…
NASA Technical Reports Server (NTRS)
Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers a new development perspective and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent
2015-01-01
The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity (“residence times”) of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales. PMID:26261985
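The starting object of the method is the distribution of continuous residence times (CRTs) at a receiver; its empirical survival curve exposes the distinct timescales. A minimal sketch with hypothetical durations:

```python
import numpy as np

# Hypothetical continuous residence times (minutes) recorded at one receiver.
crt = np.array([2.0, 3.5, 4.1, 7.8, 15.0, 16.2, 45.0, 52.3, 130.0, 300.0])

# Empirical survival curve S(t) = P(residence time > t); changes of slope
# on a log scale suggest distinct timescales (instrumental noise, fine-scale
# behaviour, diel excursions) that set the aggregation scale for the data.
t = np.sort(crt)
S = 1.0 - np.arange(1, len(t) + 1) / len(t)
for ti, si in zip(t, S):
    print(f"t > {ti:6.1f} min : S = {si:.2f}")
```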
NASA Astrophysics Data System (ADS)
Vieceli, Nathália; Nogueira, Carlos A.; Pereira, Manuel F. C.; Durão, Fernando O.; Guimarães, Carlos; Margarido, Fernanda
2018-01-01
The recovery of lithium from hard rock minerals has received increased attention given the high demand for this element. Therefore, this study optimized an innovative process, which does not require a high-temperature calcination step, for lithium extraction from lepidolite. Mechanical activation and acid digestion were suggested as crucial process parameters, and experimental design and response-surface methodology were applied to model and optimize the proposed lithium extraction process. The promoting effect of amorphization and the formation of lithium sulfate hydrate on lithium extraction yield were assessed. Several factor combinations led to extraction yields that exceeded 90%, indicating that the proposed process is an effective approach for lithium recovery.
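Response-surface methodology fits a quadratic model to the designed experiments and locates the optimum from the fitted coefficients. A sketch of the fitting step with invented two-factor data (not the paper's measurements):

```python
import numpy as np

# Hypothetical two-factor design: x1 = milling time, x2 = acid/ore ratio
# (coded -1..+1), y = lithium extraction yield (%). Values are invented
# to illustrate the quadratic response-surface fit, not the paper's data.
x1 = np.array([-1, 1, -1, 1, 0, 0, 0, -1, 1], dtype=float)
x2 = np.array([-1, -1, 1, 1, 0, -1, 1, 0, 0], dtype=float)
y = np.array([62, 78, 74, 91, 88, 80, 86, 75, 90], dtype=float)

# Full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # fitted coefficients; the stationary point gives a candidate optimum
```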
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Education and Labor.
This hearing addressed the issue of whether the delays in producing a proposed National Institute for Occupational and Safety Health (NIOSH) study on the possible health hazards associated with video display terminals (VDTs) are due to concerns about scientific methodology or unwarranted interference by the Office of Management and Budget (OMB).…
An LMI approach for the Integral Sliding Mode and H∞ State Feedback Control Problem
NASA Astrophysics Data System (ADS)
Bezzaoucha, Souad; Henry, David
2015-11-01
This paper deals with the state feedback control problem for linear uncertain systems subject to both matched and unmatched perturbations. The proposed control law is based on the Integral Sliding Mode Control (ISMC) approach to tackle matched perturbations, as well as the H∞ paradigm for robustness against unmatched perturbations. The proposed method parallels the work presented in [1], which addressed the same problem and proposed a solution involving an Algebraic Riccati Equation (ARE)-based formulation. The contribution of this paper is the establishment of a Linear Matrix Inequality (LMI)-based solution, which offers the possibility to consider other types of constraints such as 𝓓-stability constraints (pole assignment-like constraints). The proposed methodology is applied to a pilot three-tank system, and experimental results illustrate its feasibility. Note that real experiments with SMC have rarely been considered in the past, due to the highly energetic behaviour of the control signal. It is important to point out that the paper does not aim at proposing an LMI formulation of an ARE. That has been done since 1971 [2] and is further discussed in [3], where the link between AREs and ARIs (algebraic Riccati inequalities) is established for the H∞ control problem. The main contribution of this paper is to establish the adequate LMI-based methodology (changes of matrix variables) so that the ARE corresponding to the particular structure of the mixed ISMC/H∞ design proposed by [1] can be reformulated within the LMI paradigm.
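The flavor of the LMI formulation can be shown with CVXPY: with the standard change of variables Q = P⁻¹ and Y = KQ, stabilizing state feedback reduces to matrix inequalities linear in (Q, Y). The plant below is illustrative, not the three-tank model, and the 𝓓-stability and H∞ terms of the paper are omitted.

```python
import cvxpy as cp
import numpy as np

# Illustrative 2-state unstable plant (not the three-tank model of the paper).
A = np.array([[0.0, 1.0], [2.0, -1.0]])
B = np.array([[0.0], [1.0]])

# Change of variables: with Q = P^{-1} > 0 and Y = K Q, stability of A + B K
# is equivalent to A Q + Q A' + B Y + Y' B' < 0, which is linear in (Q, Y).
n, m = A.shape[0], B.shape[1]
Q = cp.Variable((n, n), symmetric=True)
Y = cp.Variable((m, n))
eps = 1e-6
lmi = A @ Q + Q @ A.T + B @ Y + Y.T @ B.T   # symmetric by construction
prob = cp.Problem(cp.Minimize(0),
                  [Q >> eps * np.eye(n), lmi << -eps * np.eye(n)])
prob.solve(solver=cp.SCS)

K = Y.value @ np.linalg.inv(Q.value)
print(K, np.linalg.eigvals(A + B @ K))   # closed-loop eigenvalues in the LHP
```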
NextGen Future Safety Assessment Game
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Gheorghe, Adrian; Jones, Sharon Monica
2011-01-01
The successful implementation of the next generation infrastructure systems requires solid understanding of their technical, social, political and economic aspects along with their interactions. The lack of historical data that relate to the long-term planning of complex systems introduces unique challenges for decision makers and involved stakeholders which in turn result in unsustainable systems. Also, the need to understand the infrastructure at the societal level and capture the interaction between multiple stakeholders becomes important. This paper proposes a methodology in order to develop a holistic approach aiming to provide an alternative subject-matter expert (SME) elicitation and data collection method for future sociotechnical systems. The methodology is adapted to Next Generation Air Transportation System (NextGen) decision making environment in order to demonstrate the benefits of this holistic approach.
Self-adaptive MOEA feature selection for classification of bankruptcy prediction data.
Gaspar-Cunha, A; Recio, G; Costa, L; Estébanez, C
2014-01-01
Bankruptcy prediction is a vast area of finance and accounting whose importance lies in its relevance for creditors and investors in evaluating the likelihood of a company going bankrupt. As companies become complex, they develop sophisticated schemes to hide their real situation. In turn, estimating the credit risks associated with counterparts or predicting bankruptcy becomes harder. Evolutionary algorithms have been shown to be an excellent tool for dealing with complex problems in finance and economics where a large number of irrelevant features are involved. This paper provides a methodology for feature selection in the classification of bankruptcy data sets using an evolutionary multiobjective approach that simultaneously minimises the number of features and maximises the classifier quality measure (e.g., accuracy). The proposed methodology makes use of self-adaptation by applying the feature selection algorithm while simultaneously optimising the parameters of the classifier used. The methodology was applied to four different sets of data. The obtained results showed the utility of using the self-adaptation of the classifier.
NASA Astrophysics Data System (ADS)
Dodick, Jeff; Argamon, Shlomo; Chase, Paul
2009-08-01
A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.
A stochastic conflict resolution model for trading pollutant discharge permits in river systems.
Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram
2009-07-01
This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems, considering the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful and recently developed multi-objective genetic algorithm technique known as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is selected using the Young conflict resolution theory, which considers the utility functions of the decision makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the trading discharge permit policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in the northern part of Iran.
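The fuzzy risk component can be illustrated by Monte Carlo: sample the uncertain water-quality outcome and average the membership degree of "standard violated". All numbers below are invented for illustration; the paper couples this with NSGA-II and a river water-quality model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Trapezoidal membership of "violated" for dissolved oxygen (DO, mg/L):
# fully violated below 4 mg/L, not violated above 5 mg/L, linear in between
# (illustrative thresholds, not from the paper).
def violation_membership(do):
    return np.clip((5.0 - do) / (5.0 - 4.0), 0.0, 1.0)

# Uncertain dissolved oxygen downstream of the dischargers (assumed distribution).
do_samples = rng.normal(loc=5.2, scale=0.6, size=100_000)

fuzzy_risk = violation_membership(do_samples).mean()
print(f"fuzzy risk of standard violation = {fuzzy_risk:.3f}")
```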
NASA Astrophysics Data System (ADS)
Fredouille, Corinne; Pouchoulin, Gilles; Ghio, Alain; Revis, Joana; Bonastre, Jean-François; Giovanni, Antoine
2009-12-01
This paper addresses voice disorder assessment. It proposes an original back-and-forth methodology involving an automatic classification system as well as the knowledge of human experts (machine learning experts, phoneticians, and pathologists). The goal of this methodology is to bring a better understanding of the acoustic phenomena related to dysphonia. The automatic system was validated on a dysphonic corpus (80 female voices), rated according to the GRBAS perceptual scale by an expert jury. Firstly, focusing on the frequency domain, the classification system showed the relevance of the 0-3000 Hz frequency band for the classification task based on the GRBAS scale. Later, an automatic phonemic analysis underlined the significance of consonants, and more surprisingly of unvoiced consonants, for the same classification task. Submitted to the human experts, these observations led to a manual analysis of unvoiced plosives, which highlighted a lengthening of voice onset time (VOT) with dysphonia severity, validated by a preliminary statistical analysis.
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the traces that were detected (blood, instruments and clothes), and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution in how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation, and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Pareto frontier analyses based decision making tool for transportation of hazardous waste.
Das, Arup; Mazumder, T N; Gupta, A K
2012-08-15
Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to the population exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional Hazardous Waste Management scheme should incorporate a comprehensive framework for hazardous waste transportation that accounts for the various stakeholders involved in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for routing hazardous wastes between the generating units and the disposal facilities through a capacity-constrained network. The proposed methodology uses an a posteriori multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to elucidate the methodology. Copyright © 2012 Elsevier B.V. All rights reserved.
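A minimal sketch of the a posteriori idea, assuming routes have already been scored on two objectives (invented numbers): enumerate candidates, then keep only the non-dominated ones for presentation to the stakeholders.

```python
import numpy as np

def non_dominated(objectives):
    """Return the indices of non-dominated rows (all objectives minimized)."""
    n = objectives.shape[0]
    keep = []
    for i in range(n):
        dominated = False
        for j in range(n):
            if j != i and np.all(objectives[j] <= objectives[i]) \
                    and np.any(objectives[j] < objectives[i]):
                dominated = True
                break
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical candidate routes scored on (population risk, travel cost).
routes = np.array([[0.82, 140.0], [0.55, 210.0], [0.90, 120.0], [0.85, 150.0]])
print(non_dominated(routes))   # -> indices of the Pareto-optimal routes
```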
Self-Adaptive MOEA Feature Selection for Classification of Bankruptcy Prediction Data
Gaspar-Cunha, A.; Recio, G.; Costa, L.; Estébanez, C.
2014-01-01
Bankruptcy prediction is a vast area of finance and accounting whose importance lies in its relevance for creditors and investors in evaluating the likelihood of bankruptcy. As companies become more complex, they develop sophisticated schemes to hide their real situation. In turn, estimating the credit risk associated with counterparts or predicting bankruptcy becomes harder. Evolutionary algorithms have been shown to be an excellent tool for dealing with complex problems in finance and economics where a large number of irrelevant features are involved. This paper provides a methodology for feature selection in the classification of bankruptcy data sets using an evolutionary multiobjective approach that simultaneously minimises the number of features and maximises a classifier quality measure (e.g., accuracy). The proposed methodology makes use of self-adaptation by applying the feature selection algorithm while simultaneously optimising the parameters of the classifier used. The methodology was applied to four different data sets. The results obtained show the utility of using self-adaptation of the classifier. PMID:24707201
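The sketch below illustrates the kind of two-objective fitness evaluation such a MOEA needs, using scikit-learn on synthetic data with a k-NN classifier as a stand-in; the actual classifiers, parameters and data sets are those of the paper and are not reproduced here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=30, n_informative=5,
                           random_state=0)

def evaluate(mask, n_neighbors=5):
    """Two objectives for the MOEA: (number of features, 1 - accuracy).
    n_neighbors stands in for the self-adapted classifier parameters."""
    if not mask.any():
        return (0, 1.0)          # an empty subset cannot classify anything
    acc = cross_val_score(KNeighborsClassifier(n_neighbors),
                          X[:, mask], y, cv=5).mean()
    return (int(mask.sum()), 1.0 - acc)

rng = np.random.default_rng(0)
mask = rng.random(30) < 0.3      # one candidate individual (feature mask)
print(evaluate(mask))            # objectives to be minimised jointly
```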
A unified approach for development of Urdu Corpus for OCR and demographic purpose
NASA Astrophysics Data System (ADS)
Choudhary, Prakash; Nain, Neeta; Ahmed, Mushtaq
2015-02-01
This paper presents a methodology for the development of an Urdu handwritten text image corpus and the application of corpus linguistics in the field of OCR and information retrieval from handwritten documents. Compared to other language scripts, Urdu script is rather complicated for data entry: entering a single character can require a combination of multiple keystrokes. Here, a mixed approach is proposed and demonstrated for building an Urdu corpus for OCR and demographic data collection. The demographic part of the database could be used to train a system to fetch data automatically, which will help simplify the manual data-processing tasks involved in data collection from input forms such as passports, ration cards, voting cards, AADHAR, driving licences, Indian Railway reservations, census data, etc. This would increase the participation of the Urdu language community in understanding and benefiting from Government schemes. To make the database available and applicable across a broad area of corpus linguistics, we propose a methodology for data collection, mark-up, digital transcription, and XML metadata information for benchmarking.
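A minimal sketch of what one XML metadata record for a handwritten sample might look like, built with Python's standard library; the field names here are hypothetical, since the actual corpus schema is defined by the authors.

```python
import xml.etree.ElementTree as ET

# Hypothetical metadata fields for one handwritten form image; the real
# corpus schema may differ.
sample = ET.Element("sample", id="urdu-0001")
ET.SubElement(sample, "writer", age="34", gender="F", handedness="right")
ET.SubElement(sample, "image", file="forms/0001.png", dpi="300")
ET.SubElement(sample, "transcription").text = "..."  # ground-truth Urdu text

print(ET.tostring(sample, encoding="unicode"))
```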
Projecting adverse event incidence rates using empirical Bayes methodology.
Ma, Guoguang Julie; Ganju, Jitendra; Huang, Jing
2016-08-01
Although there is considerable interest in adverse events observed in clinical trials, projecting adverse event incidence rates in an extended period can be of interest when the trial duration is limited compared to clinical practice. A naïve method for making projections might involve modeling the observed rates into the future for each adverse event. However, such an approach overlooks the information that can be borrowed across all the adverse event data. We propose a method that weights each projection using a shrinkage factor; the adverse event-specific shrinkage is a probability, based on empirical Bayes methodology, estimated from all the adverse event data, reflecting evidence in support of the null or non-null hypotheses. Also proposed is a technique to estimate the proportion of true nulls, called the common area under the density curves, which is a critical step in arriving at the shrinkage factor. The performance of the method is evaluated by projecting from interim data and then comparing the projected results with observed results. The method is illustrated on two data sets. © The Author(s) 2013.
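A minimal sketch of the shrinkage idea, assuming the event-specific weights w (the empirical-Bayes probabilities of the null) have already been estimated from the full adverse event data; all counts and weights are invented.

```python
import numpy as np

# Hypothetical observed counts and exposure (patient-years) for each
# adverse event; w is the evidence in favour of the null for each AE.
counts   = np.array([12, 3, 40, 7])
exposure = np.array([500.0, 480.0, 510.0, 495.0])
w        = np.array([0.9, 0.7, 0.1, 0.8])

rates     = counts / exposure
pooled    = counts.sum() / exposure.sum()     # background ("null") rate
projected = w * pooled + (1.0 - w) * rates    # shrunken projection per AE

extended_years  = 2.0 * exposure              # project to longer follow-up
expected_counts = projected * extended_years
print(expected_counts.round(1))
```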
78 FR 4369 - Rates for Interstate Inmate Calling Services
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-22
.... Marginal Location Methodology. In 2008, ICS providers submitted the ICS Provider Proposal for ICS rates. The ICS Provider Proposal uses the ``marginal location'' methodology, previously adopted by the... ``marginal location'' methodology provides a ``basis for rates that represent `fair compensation' as set...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB No. 1121-NEW] Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological Research To Support the National Crime... related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program: Methodological...
Huscher, Dörte; Saketkoo, Lesley Ann; Pittrow, David; Khanna, Dinesh
2010-05-01
This review article discusses the proposed methodology that will be utilized to develop core set items for connective tissue disease-associated interstitial lung disease (CTD-ILD). CTD-ILD remains an important enigma in clinical medicine. No consensus exists on the measurement of disease activity or what constitutes a significant response to therapeutic interventions. The lack of appropriate measures inhibits effective drug development and hampers regulatory evaluation of candidate therapies. An interdisciplinary and international Steering Committee (SC) will oversee the execution of a 3-tier Delphi exercise involving experts in CTD and ILD. In parallel to the Delphi, qualitative information will be gathered from patients with ILD using focus groups. These data will subsequently be used to construct surveys to collect quantitative responses from patients with ILD. The final Delphi and Patient Perspective results are to be scrutinized by the SC and specialty sub-groups (including patient advocates) for truth, discrimination and feasibility - the OMERACT filters. Through application of the Nominal Group technique, a core set of outcome measures will be proposed. Subsequent exercises will evaluate the applicability of the proposed core set to the unique issues posed by individual CTDs, in addition to guidelines on screening, prognostication and damage scoring.
Nakamura, Shinichiro; Kondo, Yasushi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya
2011-02-01
Identification of the flow of materials and substances associated with a product system provides useful information for Life Cycle Analysis (LCA), and contributes to extending the scope of complementarity between LCA and Materials Flow Analysis/Substances Flow Analysis (MFA/SFA), the two major tools of industrial ecology. This paper proposes a new methodology based on input-output analysis for identifying the physical input-output flow of individual materials associated with the production of a unit of a given product, the unit physical input-output by materials (UPIOM). While the Sankey diagram has been a standard tool for the visualization of MFA/SFA, it may become too complex for effective visualization as the flows under consideration grow in complexity, which will be the case when economy-wide intersectoral flows of materials are involved. An alternative way to visually represent material flows is proposed, which makes use of triangulation of the flow matrix based on degrees of fabrication. The proposed methodology is applied to the flows of pig iron and iron and steel scrap associated with the production of a passenger car in Japan. Its usefulness in identifying a specific MFA pattern from the original IO table is demonstrated.
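A toy numerical sketch of the input-output step that underlies such unit flow tables: solve the Leontief system for one unit of final demand and recover the intersectoral flows it induces. The three-sector matrix is invented, and a real UPIOM would further restrict these flows to the material of interest.

```python
import numpy as np

# Toy 3-sector economy. A[i, j] = input from sector i per unit output of j.
A = np.array([[0.10, 0.20, 0.00],
              [0.05, 0.10, 0.30],
              [0.00, 0.15, 0.05]])
f = np.array([0.0, 0.0, 1.0])     # one unit of final demand, e.g. one car

# Total (direct + indirect) sector outputs needed for this final demand.
x = np.linalg.solve(np.eye(3) - A, f)

# Intersectoral flows associated with producing exactly this unit of
# product: Z[i, j] = A[i, j] * x[j].
Z = A * x
print(x.round(3))
print(Z.round(3))
```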
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-16
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research to Support the National...: Methodological Research to Support the National Crime Victimization Survey: Self-Report Data on Rape and Sexual...
Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren
2013-01-01
Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph) representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools that can be used for discussion and exploration of complex issues, as well as for sense-checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions that may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out in a second session. Participants agreed on updates to the original map and described two alternative potential causal structures. In a novel analysis, all map structures were tested using two standard methodologies usually used independently, linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
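The two mappings compared in the analysis can be sketched directly. Below is a minimal FCM iteration with an invented three-concept map, applying a clipped linear update and a logistic (sigmoidal) update to the same weight matrix.

```python
import numpy as np

W = np.array([[0.0, 0.6, 0.0],    # W[i, j]: signed influence of concept i on j
              [0.0, 0.0, -0.4],
              [0.5, 0.0, 0.0]])
a0 = np.array([1.0, 0.2, 0.0])    # initial concept activations

def step_linear(a):
    return np.clip(W.T @ a, 0.0, 1.0)               # linear map, kept in [0, 1]

def step_sigmoid(a, lam=2.0):
    return 1.0 / (1.0 + np.exp(-lam * (W.T @ a)))   # logistic squashing

a_lin, a_sig = a0.copy(), a0.copy()
for _ in range(50):               # iterate both mappings toward a fixed point
    a_lin, a_sig = step_linear(a_lin), step_sigmoid(a_sig)
print(a_lin.round(3), a_sig.round(3))   # the two mappings can settle differently
```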
Patrizi, Alfredo; Pennestrì, Ettore; Valentini, Pier Paolo
2016-01-01
The paper deals with the comparison between a high-end marker-based acquisition system and a low-cost marker-less methodology for the assessment of human posture during working tasks. The low-cost methodology is based on the use of a single Microsoft Kinect V1 device. The high-end acquisition system is the BTS SMART, which requires reflective markers to be placed on the subject's body. Three practical working activities involving object lifting and displacement have been investigated. The operational risk has been evaluated according to the lifting equation proposed by the American National Institute for Occupational Safety and Health. The results of the study show that the risk multipliers computed from the two acquisition methodologies are very close for all the analysed activities. In agreement with this outcome, the marker-less methodology based on the Microsoft Kinect V1 device seems very promising for promoting the dissemination of computer-aided assessment of ergonomics while maintaining good accuracy and affordable costs. PRACTITIONER’S SUMMARY: The study is motivated by the increasing interest in on-site working ergonomics assessment. We compared a low-cost marker-less methodology with a high-end marker-based system. We tested them on three different working tasks, assessing the working risk of lifting loads. The two methodologies showed comparable precision in all the investigations.
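For reference, the lifting equation referred to is commonly written as a product of multipliers; a sketch in its metric form follows, with the frequency and coupling multipliers passed in from the NIOSH tables and all task dimensions invented.

```python
def niosh_rwl(H, V, D, A, FM=1.0, CM=1.0):
    """Recommended Weight Limit (kg), revised NIOSH equation, metric form.

    H: horizontal hand-ankle distance (cm), V: vertical hand height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees).
    FM and CM (frequency and coupling multipliers) come from NIOSH tables.
    """
    LC = 23.0                        # load constant (kg)
    HM = min(1.0, 25.0 / H)          # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75)   # vertical multiplier
    DM = 0.82 + 4.5 / D              # distance multiplier
    AM = 1.0 - 0.0032 * A            # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

rwl = niosh_rwl(H=40, V=60, D=50, A=30)   # invented task geometry
lifting_index = 12.0 / rwl                # actual load weight / RWL
print(round(rwl, 1), round(lifting_index, 2))
```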
Angelis, Aris; Kanavos, Panos
2016-05-01
In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
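As a concrete anchor for the MAVT approach, the sketch below scores options with the standard additive value model, overall value as the weighted sum of partial values; the performance scores and weights are invented, and a real application would elicit both from stakeholders.

```python
import numpy as np

# Hypothetical performance matrix: rows = treatments, columns = criteria
# already scored on partial value functions scaled to [0, 1].
scores = np.array([[0.8, 0.4, 0.9],
                   [0.5, 0.9, 0.6],
                   [0.7, 0.6, 0.3]])
weights = np.array([0.5, 0.3, 0.2])   # elicited relative importance (sum to 1)

overall = scores @ weights            # additive MAVT value of each option
ranking = overall.argsort()[::-1]
print(overall.round(2), ranking)
```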
NASA Astrophysics Data System (ADS)
Bazilevs, Y.; Kamran, K.; Moutsanidis, G.; Benson, D. J.; Oñate, E.
2017-07-01
In this two-part paper we begin the development of a new class of methods for modeling fluid-structure interaction (FSI) phenomena for air blast. We aim to develop accurate, robust, and practical computational methodology capable of modeling the dynamics of air blast coupled with the structure response, where the latter involves large, inelastic deformations and disintegration into fragments. An immersed approach is adopted, which leads to an a priori monolithic FSI formulation with intrinsic contact detection between solid objects and without formal restrictions on the solid motions. In Part I of this paper, the core air-blast FSI methodology suitable for a variety of discretizations is presented and tested using standard finite elements. Part II of this paper focuses on a particular instantiation of the proposed framework, which couples isogeometric analysis (IGA) based on non-uniform rational B-splines and a reproducing-kernel particle method (RKPM), which is a meshfree technique. The combination of IGA and RKPM is felt to be particularly attractive for the problem class of interest due to the higher-order accuracy and smoothness of both discretizations, and the relative simplicity of RKPM in handling fragmentation scenarios. A collection of mostly 2D numerical examples is presented in each of the parts to illustrate the good performance of the proposed air-blast FSI framework.
Diagnostic methodology for incipient system disturbance based on a neural wavelet approach
NASA Astrophysics Data System (ADS)
Won, In-Ho
Since incipient system disturbances are easily confused with other events or noise sources, the signal from a system disturbance can be neglected or misidentified as noise. Because the knowledge and information available from the measurements is incomplete or inexact, the use of artificial intelligence (AI) tools to overcome these uncertainties and limitations was explored. A methodology integrating the feature extraction efficiency of the wavelet transform with the classification capabilities of neural networks is developed for signal classification in the context of detecting incipient system disturbances. The combination of wavelets and neural networks offers more strengths and fewer weaknesses than either technique taken alone. A wavelet feature extractor is developed to form concise feature vectors for neural network inputs. The feature vectors are calculated from wavelet coefficients to reduce redundancy and computational expense. In this procedure, statistical features based on applying the fractal concept to the wavelet coefficients play a crucial role in the wavelet feature extractor. To verify the proposed methodology, two applications were investigated and successfully tested. The first involves pump cavitation detection using a dynamic pressure sensor. The second pertains to incipient pump cavitation detection using signals obtained from a current sensor. Comparisons among the three proposed feature vectors, and with statistical techniques, show that the variance feature extractor provides the best approach in the performed applications.
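A minimal sketch of a variance-based wavelet feature extractor of the kind described above, assuming PyWavelets is available; the signals are synthetic, and the fractal-based statistics of the dissertation are replaced here by plain per-scale variances.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_variance_features(signal, wavelet="db4", level=5):
    """Concise feature vector: variance of coefficients at each scale."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([c.var() for c in coeffs])

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4096)
healthy = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
faulty = healthy + 0.3 * rng.standard_normal(t.size)  # extra broadband noise

print(wavelet_variance_features(healthy).round(4))
print(wavelet_variance_features(faulty).round(4))     # inputs for the ANN
```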
A CWT-based methodology for piston slap experimental characterization
NASA Astrophysics Data System (ADS)
Buzzoni, M.; Mucchi, E.; Dalpiaz, G.
2017-03-01
Noise and vibration control in mechanical systems has become ever more significant for the automotive industry, where the comfort of the passenger compartment represents a challenging issue for car manufacturers. The reduction of piston slap noise is pivotal for a good design of IC engines. In this scenario, a methodology has been developed for the vibro-acoustic assessment of IC diesel engines by means of design changes in the piston-to-cylinder-bore clearance. Vibration signals have been analysed by means of advanced signal processing techniques taking advantage of cyclostationarity theory. The procedure starts from an analysis of the Continuous Wavelet Transform (CWT) in order to identify a frequency band representative of the piston slap phenomenon. This frequency band is then used as the input for further signal processing involving envelope analysis of the second-order cyclostationary component of the signal. The second-order harmonic component has been used as the benchmark parameter of piston slap noise. An experimental procedure of vibrational benchmarking is proposed and verified at different operating conditions in real IC engines actually mounted on cars. This study clearly underlines the crucial role of transducer positioning when differences among real piston-to-cylinder clearances are considered. In particular, the proposed methodology is effective for sensors placed on the outer cylinder wall in all the tested conditions.
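A toy sketch of the envelope analysis step, assuming SciPy: band-pass the signal in a band suggested by the CWT (the 2-4 kHz band and all signal parameters here are invented), then take the Hilbert envelope and read the slap repetition frequency from the envelope spectrum.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(0)
fs = 20000.0                          # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)

# Toy vibration: short 3 kHz bursts repeating 25 times per second.
burst = np.sin(2 * np.pi * 3000 * t[:200])
impacts = (np.sin(2 * np.pi * 25 * t) > 0.99).astype(float)
x = np.convolve(impacts, burst, mode="same") + 0.05 * rng.standard_normal(t.size)

# Band selected from the CWT analysis (hypothetical 2-4 kHz band here).
b, a = butter(4, [2000 / (fs / 2), 4000 / (fs / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, x)))

# The envelope spectrum concentrates energy at the slap repetition
# frequency and its harmonics; expect a dominant peak near 25 Hz.
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print(freqs[spectrum.argmax()])
```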
Temperature - Emissivity Separation Assessment in a Sub-Urban Scenario
NASA Astrophysics Data System (ADS)
Moscadelli, M.; Diani, M.; Corsini, G.
2017-10-01
In this paper, a methodology is presented that aims at evaluating the effectiveness of different TES strategies. The methodology takes into account the specific material of interest in the monitored scenario, sensor characteristics, and errors in the atmospheric compensation step. The methodology is proposed in order to predict and analyse algorithm performance during the planning of a remote sensing mission aimed at discovering specific materials of interest in the monitored scenario. As a case study, the proposed methodology is applied to a real airborne data set of a suburban scenario. To address the TES problem, three state-of-the-art algorithms and a recently proposed one are investigated: the Temperature-Emissivity Separation '98 (TES-98) algorithm, the Stepwise Refining TES (SRTES) algorithm, the Linear Piecewise TES (LTES) algorithm, and the Optimized Smoothing TES (OSTES) algorithm. Finally, the accuracies obtained with real data and those predicted by means of the proposed methodology are compared and discussed.
Node-making process in network meta-analysis of nonpharmacological treatments is poorly reported.
James, Arthur; Yavchitz, Amélie; Ravaud, Philippe; Boutron, Isabelle
2018-05-01
To identify methods to support the node-making process in network meta-analyses (NMAs) of nonpharmacological treatments. We proceeded in two stages. First, we conducted a literature review of guidelines and methodological articles about NMAs to identify methods proposed to lump interventions into nodes. Second, we conducted a systematic review of NMAs of nonpharmacological treatments to extract methods used by authors to support their node-making process. MEDLINE and Google Scholar were searched to identify articles assessing NMA guidelines or methodology intended for NMA authors. MEDLINE, CENTRAL, and EMBASE were searched to identify reports of NMAs including at least one nonpharmacological treatment. Both searches involved articles available from database inception to March 2016. From the methodological review, we identified and extracted methods proposed to lump interventions into nodes. From the systematic review, the reporting of the network was assessed as long as the method described supported the node-making process. Among the 116 articles retrieved in the literature review, 12 (10%) discussed the concept of lumping or splitting interventions in NMAs. No consensual method was identified during the methodological review, and expert consensus was the only method proposed to support the node-making process. Among 5187 references for the systematic review, we included 110 reports of NMAs published between 2007 and 2016. The nodes were described in the introduction section of 88 reports (80%), which suggested that the node content might have been a priori decided before the systematic review. Nine reports (8.1%) described a specific process or justification to build nodes for the network. Two methods were identified: (1) fit a previously published classification and (2) expert consensus. Despite the importance of NMA in the delivery of evidence when several interventions are available for a single indication, recommendations on the reporting of the node-making process in NMAs are lacking, and reporting of the node-making process in NMAs seems insufficient. Copyright © 2017 Elsevier Inc. All rights reserved.
Environmental care in agricultural catchments: Toward the communicative catchment
NASA Astrophysics Data System (ADS)
Martin, Peter
1991-11-01
Substantial land degradation of agricultural catchments in Australia has resulted from the importation of European farming methods and the large-scale clearing of land. Rural communities are now being encouraged by government to take responsibility for environmental care. The importance of community involvement is supported by the view that environmental problems are a function of interactions between people and their environment. It is suggested that the commonly held view that community groups cannot care for their resources is due to inappropriate social institutions rather than any inherent disability in people. The communicative catchment is developed as a vision for environmental care into the future. This concept emerges from a critique of resource management through the catchment metaphors of the reduced, mechanical catchment and the complex, evolving catchment, which reflect the development of systemic and people-centered approaches to environmental care. The communicative catchment is one where both community and resource managers participate collaboratively in environmental care. A methodology based on action research and systemic thinking (systemic action research) is proposed as a way of moving towards the communicative catchment of the future. Action research is a way of taking action in organizations and communities that is participative and informed by theory, while systemic thinking takes into account the interconnections and relationships between social and natural worlds. The proposed vision, methodology, and practical operating principles stem from involvement in an action research project looking at extension strategies for the implementation of total catchment management in the Hunter Valley, New South Wales.
Shukla, Nagesh; Keast, John E; Ceglarek, Darek
2014-10-01
The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-24
... Information Collection: The purpose of the proposed methodological study is to evaluate the feasibility... the NCS, the multiple methodological studies conducted during the Vanguard phase will inform the... methodological study is identification of recruitment strategies and components of recruitment strategies that...
48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.
Code of Federal Regulations, 2011 CFR
2011-10-01
... used. If escalation is included, state the degree (percent) and methodology. The methodology shall.... If so, state the number required, the professional or technical level and the methodology used to... for which the salary is applicable; (C) List of other research Projects or proposals for which...
NASA Astrophysics Data System (ADS)
Tabibzadeh, Maryam
According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) to accident causation. Yet the results of a comprehensive study of more than 600 well-documented major failures in offshore structures between 1988 and 2005 show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method of ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and its negative pressure test is discussed. The risk analysis methodology consists of three different approaches whose integration constitutes the overall picture of the methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the conceptual framework and analyze the impact of different decision making biases on negative pressure test results. Along with the corroborating findings of previous studies, analysis of the developed conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. It is noteworthy that the organizational factors captured in the conceptual framework are not specific to the scope of the NPT: most of them have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations, as well as high-risk operations in other industries. In addition, the proposed rational decision making model introduces a quantitative structure for analyzing the results of a conducted NPT. This model provides a structure and parametric derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test.
Moreover, it enables analysts to assess the different decision making biases involved in interpreting a conducted negative pressure test, as well as the root organizational factors of those biases. In general, although the proposed integrated research methodology in this dissertation was developed for the risk assessment of the contribution of human and organizational factors to negative pressure test misinterpretation, it can be generalized and is potentially useful for other well control situations, both offshore and onshore, e.g., fracking. In addition, this methodology can be applied to the analysis of any high-risk operation, not only in the oil and gas industry but also in other industries such as nuclear power plants, aviation, and transportation.
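The cut-off idea can be sketched with the standard two-action expected-cost rule; this is a simplified stand-in, not the dissertation's parametric formulas, and the costs are invented.

```python
def npt_cutoff(cost_false_accept, cost_false_reject):
    """Probability threshold above which accepting the test minimizes
    expected cost (standard two-action decision rule)."""
    return cost_false_accept / (cost_false_accept + cost_false_reject)

# Hypothetical costs: wrongly accepting a failed well barrier is far more
# costly than wrongly rejecting a sound one (rig time for a re-test).
p_star = npt_cutoff(cost_false_accept=500.0, cost_false_reject=1.0)

p_integrity = 0.997   # analyst's probability that the well has integrity
decision = "accept test" if p_integrity >= p_star else "reject / re-test"
print(round(p_star, 4), decision)
```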
The microwave-assisted ionic-liquid method: a promising methodology in nanomaterials.
Ma, Ming-Guo; Zhu, Jie-Fang; Zhu, Ying-Jie; Sun, Run-Cang
2014-09-01
In recent years, the microwave-assisted ionic-liquid method has been accepted as a promising methodology for the preparation of nanomaterials and cellulose-based nanocomposites. Applications of this method in the preparation of cellulose-based nanocomposites comply with the major principles of green chemistry, that is, they use an environmentally friendly method in environmentally preferable solvents to make use of renewable materials. This minireview focuses on the recent development of the synthesis of nanomaterials and cellulose-based nanocomposites by means of the microwave-assisted ionic-liquid method. We first discuss the preparation of nanomaterials including noble metals, metal oxides, complex metal oxides, metal sulfides, and other nanomaterials by means of this method. Then we provide an overview of the synthesis of cellulose-based nanocomposites by using this method. The emphasis is on the synthesis, microstructure, and properties of nanostructured materials obtained through this methodology. Our recent research on nanomaterials and cellulose-based nanocomposites by this rapid method is summarized. In addition, the formation mechanisms involved in the microwave-assisted ionic-liquid synthesis of nanostructured materials are discussed briefly. Finally, the future perspectives of this methodology in the synthesis of nanostructured materials are proposed. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
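The Monte Carlo benchmark mentioned above can be sketched for a simple stress-strength limit state, together with a finite-difference sensitivity of the failure probability with respect to a distribution parameter; the paper's perturbation/response-surface/Edgeworth machinery is not reproduced here, and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def failure_prob(mu_s, sigma_s, mu_r, sigma_r, n=200_000):
    """P(g < 0) for limit state g = resistance - stress (both normal)."""
    stress = rng.normal(mu_s, sigma_s, n)
    resistance = rng.normal(mu_r, sigma_r, n)
    return np.mean(resistance - stress < 0.0)

pf = failure_prob(mu_s=300.0, sigma_s=30.0, mu_r=450.0, sigma_r=40.0)

# Finite-difference sensitivity of the failure probability with respect
# to the mean resistance (one of the distribution parameters).
h = 1.0
dpf_dmu = (failure_prob(300.0, 30.0, 450.0 + h, 40.0) - pf) / h
print(pf, dpf_dmu)
```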
77 FR 67363 - Sunshine Act Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... 20571. Open Agenda Items: Item No. 1: Proposed Economic Impact Procedures and Methodological Guidelines. Documentation including the proposed Economic Impact Procedures and Methodological Guidelines as well as the...
Modelling and simulating a crisis management system: an organisational perspective
NASA Astrophysics Data System (ADS)
Chaawa, Mohamed; Thabet, Inès; Hanachi, Chihab; Ben Said, Lamjed
2017-04-01
Crises are complex situations due to the dynamism of the environment, its unpredictability, and the complexity of the interactions among the several different, autonomous organisations involved. In such a context, establishing an organisational view as well as structuring the organisations' communications and their functioning is a crucial requirement. In this article, we propose a multi-agent organisational model (OM) to abstract, simulate and analyse a crisis management system (CMS). The objective is to evaluate the CMS from an organisational view, to assess its strengths as well as its weaknesses, and to provide decision makers with recommendations for a more flexible and reactive CMS. The proposed OM is illustrated through a real case study: a snowstorm in a Tunisian region. More precisely, we make the following contributions: first, we provide an environmental model that identifies the concepts involved in the crisis. Then, we define a role model that describes the actors involved. In addition, we specify the organisational structure and the interaction model that rule communications and structure the actors' functioning. These models, built following the GAIA methodology, abstract the CMS from an organisational perspective. Finally, we implemented a customisable multi-agent simulator based on the Janus platform to analyse the organisational model through several simulations.
A Study on Optimal Sizing of Pipeline Transporting Equi-sized Particulate Solid-Liquid Mixture
NASA Astrophysics Data System (ADS)
Asim, Taimoor; Mishra, Rakesh; Pradhan, Suman; Ubbi, Kuldip
2012-05-01
Pipelines transporting solid-liquid mixtures are of practical interest to the oil and pipeline industries throughout the world. Such pipelines are known as slurry pipelines, where the solid medium of the flow is commonly known as slurry. The optimal design of such pipelines is of commercial interest for their widespread acceptance. A methodology has been developed for the optimal sizing of a pipeline transporting a solid-liquid mixture. The least-cost principle has been used in sizing such pipelines, which involves determining the pipe diameter corresponding to the minimum cost for a given solid throughput. A detailed analysis of the transportation of slurry containing uniformly graded solid particles is included. The proposed methodology can be used to design a pipeline for transporting any solid material at different solid throughputs.
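A toy sketch of the least-cost principle: sweep candidate diameters, weigh Darcy-Weisbach pumping energy against an assumed amortized capital cost model, and take the minimizing diameter. Every coefficient below is an assumption for illustration only.

```python
import numpy as np

D = np.linspace(0.10, 0.60, 200)          # candidate pipe diameters (m)
Q = 0.05                                  # slurry flow for the required
                                          # solid throughput (m^3/s)
v = Q / (np.pi * D**2 / 4)                # mean velocity (m/s)
f, rho, L = 0.02, 1300.0, 10_000.0        # friction factor, slurry density
                                          # (kg/m^3), pipeline length (m)
head_loss = f * (L / D) * v**2 / (2 * 9.81)          # Darcy-Weisbach (m)
pump_kWh = rho * 9.81 * Q * head_loss / 1000 * 8000  # energy per year (8000 h)

capital = 900.0 * D**1.5 * L              # assumed amortized cost ($/year)
energy = 0.08 * pump_kWh                  # assumed tariff ($/kWh)
total = capital + energy
print(f"optimal diameter: {D[total.argmin()]:.3f} m")
```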
Similarity-Based Recommendation of New Concepts to a Terminology
Chandar, Praveen; Yaman, Anil; Hoxha, Julia; He, Zhe; Weng, Chunhua
2015-01-01
Terminologies can suffer from poor concept coverage due to delays in addition of new concepts. This study tests a similarity-based approach to recommending concepts from a text corpus to a terminology. Our approach involves extraction of candidate concepts from a given text corpus, which are represented using a set of features. The model learns the important features to characterize a concept and recommends new concepts to a terminology. Further, we propose a cost-effective evaluation methodology to estimate the effectiveness of terminology enrichment methods. To test our methodology, we use the clinical trial eligibility criteria free-text as an example text corpus to recommend concepts for SNOMED CT. We computed precision at various rank intervals to measure the performance of the methods. Results indicate that our automated algorithm is an effective method for concept recommendation. PMID:26958170
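The evaluation reduces to precision at rank cut-offs; a minimal sketch follows, with invented ranked candidates and relevance judgments.

```python
def precision_at_k(recommended, relevant, ks=(10, 20, 50)):
    """Precision at several rank cut-offs for a ranked recommendation list."""
    relevant = set(relevant)
    return {k: sum(c in relevant for c in recommended[:k]) / k for k in ks}

# Hypothetical ranked candidate concepts and expert-judged relevant ones.
ranked = [f"c{i}" for i in range(100)]
judged_relevant = {"c0", "c2", "c5", "c11", "c40", "c77"}
print(precision_at_k(ranked, judged_relevant))
```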
Collagen morphology and texture analysis: from statistics to classification
Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.
2013-01-01
In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerosis arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and muscular-skeletal diseases affecting ligaments and cartilage. PMID:23846580
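A minimal sketch of the FOS plus GLCM feature extraction, assuming scikit-image (the functions were spelled greycomatrix/greycoprops before release 0.19) and using random pixels as a stand-in for an SHG image.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
img = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # stand-in SHG image

glcm = graycomatrix(img, distances=[1, 2], angles=[0, np.pi / 2],
                    levels=64, symmetric=True, normed=True)

features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "correlation", "energy")}
fos = {"mean": img.mean(), "std": img.std()}        # first-order statistics
print({**fos, **features})   # combined feature vector for the classifier
```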
Large scale nonlinear programming for the optimization of spacecraft trajectories
NASA Astrophysics Data System (ADS)
Arrieta-Camacho, Juan Jose
Despite the availability of high fidelity mathematical models, the computation of accurate optimal spacecraft trajectories has never been an easy task. While simplified models of spacecraft motion can provide useful estimates of energy requirements, sizing, and cost, the actual launch window and maneuver scheduling must rely on more accurate representations. We propose an alternative for the computation of optimal transfers that uses an accurate representation of the spacecraft dynamics. Like other methodologies for trajectory optimization, this alternative is able to consider all major disturbances. In contrast, it can explicitly handle equality and inequality constraints throughout the trajectory, and it requires neither the derivation of costate equations nor the identification of the constrained arcs. The alternative consists of two steps: (1) discretizing the dynamic model using high-order collocation at Radau points, which displays numerical advantages, and (2) solving the resulting Nonlinear Programming (NLP) problem using an interior point method, which does not suffer from the performance bottleneck associated with identifying the active set that affects sequential quadratic programming methods; in this way the methodology exploits the availability of sound numerical methods and next-generation NLP solvers. In practice the methodology is versatile: it can be applied to a variety of aerospace problems such as homing, guidance, and aircraft collision avoidance, and it is particularly well suited for low-thrust spacecraft trajectory optimization. Examples are presented that consider the optimization of a low-thrust orbit transfer subject to the main disturbances due to Earth's gravity field together with lunar and solar attraction. Another example considers the optimization of a multiple asteroid rendezvous problem. In both cases, the ability of the proposed methodology to consider non-standard objective functions and constraints is illustrated. Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such research pursuits. The collocation scheme and nonlinear programming algorithm presented in this work complement other existing methodologies by providing reliable and efficient numerical methods able to handle large-scale, nonlinear dynamic models.
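The collocation idea can be shown on a toy problem, a minimum-effort rest-to-rest double-integrator transfer, using trapezoidal defects and SciPy's SLSQP rather than the Radau points and interior-point solver of the dissertation; it is a sketch of the transcription step only.

```python
import numpy as np
from scipy.optimize import minimize

N, T = 20, 1.0                        # collocation nodes, time horizon
h = T / (N - 1)

def unpack(z):
    return z[:N], z[N:2*N], z[2*N:]   # position, velocity, control

def objective(z):
    _, _, u = unpack(z)
    return h * np.sum(u**2)           # minimize control effort

def defects(z):                       # trapezoidal collocation constraints
    x, v, u = unpack(z)
    dx = x[1:] - x[:-1] - 0.5 * h * (v[1:] + v[:-1])
    dv = v[1:] - v[:-1] - 0.5 * h * (u[1:] + u[:-1])
    return np.concatenate([dx, dv])

def boundary(z):
    x, v, _ = unpack(z)
    return np.array([x[0], v[0], x[-1] - 1.0, v[-1]])  # rest-to-rest transfer

res = minimize(objective, np.zeros(3 * N), method="SLSQP",
               constraints=[{"type": "eq", "fun": defects},
                            {"type": "eq", "fun": boundary}])
print(res.success, res.fun)           # cost approaches the analytic value 12
```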
Scoping reviews: time for clarity in definition, methods, and reporting.
Colquhoun, Heather L; Levac, Danielle; O'Brien, Kelly K; Straus, Sharon; Tricco, Andrea C; Perrier, Laure; Kastner, Monika; Moher, David
2014-12-01
The scoping review has become increasingly popular as a form of knowledge synthesis. However, a lack of consensus on scoping review terminology, definition, methodology, and reporting limits the potential of this form of synthesis. In this article, we propose recommendations to further advance the field of scoping review methodology. We summarize current understanding of scoping review publication rates, terms, definitions, and methods. We propose three recommendations for clarity in term, definition and methodology. We recommend adopting the terms "scoping review" or "scoping study" and the use of a proposed definition. Until such time as further guidance is developed, we recommend the use of the methodological steps outlined in the Arksey and O'Malley framework and further enhanced by Levac et al. The development of reporting guidance for the conduct and reporting of scoping reviews is underway. Consistency in the proposed domains and methodologies of scoping reviews, along with the development of reporting guidance, will facilitate methodological advancement, reduce confusion, facilitate collaboration and improve knowledge translation of scoping review findings. Copyright © 2014 Elsevier Inc. All rights reserved.
A proposed framework for assessing risk from less-than-lifetime exposures to carcinogens.
Felter, Susan P; Conolly, Rory B; Bercu, Joel P; Bolger, P Michael; Boobis, Alan R; Bos, Peter M J; Carthew, Philip; Doerrer, Nancy G; Goodman, Jay I; Harrouk, Wafa A; Kirkland, David J; Lau, Serrine S; Llewellyn, G Craig; Preston, R Julian; Schoeny, Rita; Schnatter, A Robert; Tritscher, Angelika; van Velsen, Frans; Williams, Gary M
2011-07-01
Quantitative methods for estimation of cancer risk have been developed for daily, lifetime human exposures. There are a variety of studies or methodologies available to address less-than-lifetime exposures. However, a common framework for evaluating risk from less-than-lifetime exposures (including short-term and/or intermittent exposures) does not exist, which could result in inconsistencies in risk assessment practice. To address this risk assessment need, a committee of the International Life Sciences Institute (ILSI) Health and Environmental Sciences Institute conducted a multisector workshop in late 2009 to discuss available literature, different methodologies, and a proposed framework. The proposed framework provides a decision tree and guidance for cancer risk assessments for less-than-lifetime exposures based on current knowledge of mode of action and dose-response. Available data from rodent studies and epidemiological studies involving less-than-lifetime exposures are considered, in addition to statistical approaches described in the literature for evaluating the impact of changing the dose rate and exposure duration for exposure to carcinogens. The decision tree also provides for scenarios in which an assumption of potential carcinogenicity is appropriate (e.g., based on structural alerts or genotoxicity data), but bioassay or other data are lacking from which a chemical-specific cancer potency can be determined. This paper presents an overview of the rationale for the workshop, reviews historical background, describes the proposed framework for assessing less-than-lifetime exposures to potential human carcinogens, and suggests next steps.
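For orientation, less-than-lifetime screening calculations often rest on a lifetime average daily dose; a sketch of that standard computation follows, with an invented exposure scenario and slope factor.

```python
def ladd(conc_mg_per_kg, intake_kg_day, exp_freq_days_yr, exp_dur_yr,
         body_weight_kg=70.0, averaging_time_yr=70.0):
    """Lifetime average daily dose (mg/kg-day) for a less-than-lifetime
    exposure, averaged over a full lifetime as in standard practice."""
    return (conc_mg_per_kg * intake_kg_day * exp_freq_days_yr * exp_dur_yr) \
           / (body_weight_kg * averaging_time_yr * 365.0)

# Hypothetical 5-year intermittent exposure scenario.
dose = ladd(conc_mg_per_kg=2.0, intake_kg_day=0.1,
            exp_freq_days_yr=100, exp_dur_yr=5)
cancer_slope_factor = 0.05                 # (mg/kg-day)^-1, assumed
print(dose, dose * cancer_slope_factor)    # linear low-dose risk estimate
```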
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
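The rollback computation at the heart of such a decision tree can be sketched recursively; the structure and all probabilities and utilities below are invented for illustration.

```python
# Minimal expected-value rollback for a sequential decision tree.
# Leaves carry utilities (contamination gets a large penalty); chance
# nodes carry probability-weighted branches; all numbers are assumed.

def rollback(node):
    kind = node["type"]
    if kind == "leaf":
        return node["value"]
    if kind == "chance":
        return sum(p * rollback(child) for p, child in node["branches"])
    # decision node: pick the alternative with the highest expected value
    return max(rollback(child) for child in node["options"])

mission = {"type": "decision", "options": [
    {"type": "chance", "branches": [            # fly with extra sterilization
        (0.9999, {"type": "leaf", "value": 100.0}),
        (0.0001, {"type": "leaf", "value": -1e6}),   # contamination outcome
    ]},
    {"type": "chance", "branches": [            # fly without it (cheaper)
        (0.999, {"type": "leaf", "value": 120.0}),
        (0.001, {"type": "leaf", "value": -1e6}),
    ]},
]}
print(rollback(mission))   # the sterilized option wins despite lower payoff
```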
Comprehensible knowledge model creation for cancer treatment decision making.
Afzal, Muhammad; Hussain, Maqbool; Ali Khan, Wajahat; Ali, Taqdir; Lee, Sungyoung; Huh, Eui-Nam; Farooq Ahmad, Hafiz; Jamshed, Arif; Iqbal, Hassan; Irfan, Muhammad; Abbas Hydari, Manzar
2017-03-01
A wealth of clinical data exists in clinical documents in the form of electronic health records (EHRs). This data can be used for developing knowledge-based recommendation systems that can assist clinicians in clinical decision making and education. One of the big hurdles in developing such systems is the lack of automated mechanisms for knowledge acquisition to enable and educate clinicians in informed decision making. An automated knowledge acquisition methodology with a comprehensible knowledge model for cancer treatment (CKM-CT) is proposed. With the CKM-CT, clinical data are acquired automatically from documents. Quality of data is ensured by correcting errors and transforming various formats into a standard data format. Data preprocessing involves dimensionality reduction and missing value imputation. Predictive algorithm selection is performed on the basis of the ranking score of the weighted sum model. The knowledge builder prepares knowledge for knowledge-based services: clinical decisions and education support. Data is acquired from 13,788 head and neck cancer (HNC) documents for 3447 patients, including 1526 patients of the oral cavity site. In the data quality task, 160 staging values are corrected. In the preprocessing task, 20 attributes and 106 records are eliminated from the dataset. The Classification and Regression Trees (CRT) algorithm is selected and provides 69.0% classification accuracy in predicting HNC treatment plans, consisting of 11 decision paths that yield 11 decision rules. Our proposed methodology, CKM-CT, helps uncover hidden knowledge in clinical documents. In CKM-CT, the prediction models are developed to assist and educate clinicians for informed decision making. The proposed methodology is generalizable to data of other domains, such as breast cancer, with a similar objective of assisting clinicians in decision making and education. Copyright © 2017 Elsevier Ltd. All rights reserved.
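The algorithm selection step can be sketched with a weighted sum model: score each candidate on normalised criteria, weight the criteria, and rank by the weighted sum. The scores and weights below are invented; only the selected algorithm (CRT) mirrors the abstract.

```python
import numpy as np

# Hypothetical candidate classifiers scored on accuracy, interpretability
# and training speed, each normalised to [0, 1]; weights are assumptions.
algorithms = ["CRT", "SVM", "NaiveBayes", "kNN"]
scores = np.array([[0.69, 0.9, 0.8],
                   [0.72, 0.3, 0.4],
                   [0.64, 0.7, 0.9],
                   [0.66, 0.5, 0.5]])
weights = np.array([0.6, 0.3, 0.1])

wsm = scores @ weights                   # weighted sum model ranking score
print(algorithms[int(np.argmax(wsm))])   # algorithm selected for the model
```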
Rico-Contreras, José Octavio; Aguilar-Lasserre, Alberto Alfonso; Méndez-Contreras, Juan Manuel; López-Andrés, Jhony Josué; Cid-Chama, Gabriela
2017-11-01
The objective of this study is to determine the economic return of poultry litter combustion in boilers to produce bioenergy (thermal and electrical), as this biomass has a high energy potential due to its component elements; fuzzy logic is used to predict moisture and identify the high-impact variables. This is carried out using a proposed 7-stage methodology, which includes a statistical analysis of agricultural systems and practices to identify activities contributing to moisture in poultry litter (for example, broiler chicken management, number of air extractors, and avian population density), and thereby reduce moisture to increase the yield of the combustion process. Estimates of poultry litter production and heating value are made for 4 different moisture content percentages (scenarios of 25%, 30%, 35%, and 40%), and a risk analysis using Monte Carlo simulation is then proposed to select the best investment alternative and to estimate the environmental impact for greenhouse gas mitigation. The results show that dry poultry litter (25%) is slightly better for combustion, generating 3.20% more energy. Reducing moisture from 40% to 25% involves considerable economic investment due to the purchase of equipment to reduce moisture; thus, when calculating financial indicators, the 40% scenario is the most attractive, as it is the current scenario. This methodology therefore proposes a technology approach based on advanced tools to predict moisture and to represent the system (Monte Carlo simulation), in which the variability and uncertainty of the system are accurately represented. The methodology is considered generic for any bioenergy generation system, not just the poultry sector, whether combustion or another type of technology is used. Copyright © 2017 Elsevier Ltd. All rights reserved.
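A minimal sketch of the Monte Carlo risk analysis stage, assuming triangular distributions for the uncertain inputs; every figure is invented and serves only to show how a probability of negative return is obtained for one scenario.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Assumed uncertain inputs for one moisture scenario (triangular dists).
energy_MWh_yr = rng.triangular(8000, 9500, 11000, n)   # recovered energy
price = rng.triangular(45.0, 55.0, 70.0, n)            # $/MWh
opex = rng.triangular(0.15e6, 0.20e6, 0.30e6, n)       # $/yr
capex, years, rate = 1.5e6, 10, 0.10

annual_cash = energy_MWh_yr * price - opex
annuity = (1 - (1 + rate) ** -years) / rate            # present-value factor
npv = annual_cash * annuity - capex

print(f"mean NPV: {npv.mean():,.0f}  P(NPV < 0): {(npv < 0).mean():.2%}")
```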
Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G
2013-02-26
The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data.
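A stand-in sketch of the coverage-based detection idea: fit an overdispersed (negative binomial, i.e. Poisson-Gamma) model to per-window coverage by the method of moments and flag windows in the extreme tails. The hierarchical fitting of the paper is not reproduced; the data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated per-window read coverage: overdispersed baseline (mean ~60)
# plus one amplified region at double copy number.
baseline = rng.negative_binomial(n=20, p=20 / (20 + 60), size=1000)
baseline[400:420] *= 2                                # amplification

# Method-of-moments negative binomial fit to the observed coverage.
m, v = baseline.mean(), baseline.var()
size = m**2 / (v - m)                                 # dispersion parameter
p = size / (size + m)

# Flag windows in the extreme tails of the fitted distribution.
upper = stats.nbinom.ppf(0.9995, size, p)
lower = stats.nbinom.ppf(0.0005, size, p)
print(np.where((baseline > upper) | (baseline < lower))[0])
```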
Tracking by Identification Using Computer Vision and Radio
Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez
2013-01-01
We present a novel system for the detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds, the excellent localization of computer vision and the strong identity information provided by the radio system, and is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for the evaluation of systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion of both systems significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time it successfully prevents the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485
Preliminary Design of Aerial Spraying System for Microlight Aircraft
NASA Astrophysics Data System (ADS)
Omar, Zamri; Idris, Nurfazliawati; Rahim, M. Zulafif
2017-10-01
Undoubtedly, agriculture is an important sector because it provides essential nutrients for humans and is consequently among the biggest sectors for economic growth worldwide. It is crucial to ensure that crop production is protected from plant diseases and pests. Aerial spraying systems have thus been developed to help farmers control crop pests, and aerial spraying is a very effective method, especially for large and hilly crop areas. However, the use of large aircraft for aerial spraying has a relatively high operational cost. Therefore, the microlight aircraft is proposed for crop spraying work for several good reasons. In this paper, a preliminary design of an aerial spraying system for a microlight aircraft is proposed. An engineering design methodology is adopted in the development of the aerial sprayer, and the steps involved in the design are discussed thoroughly. A preliminary design for the aerial spraying system to be attached to the microlight is proposed.
Discontinuity Detection in the Shield Metal Arc Welding Process
Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros
2017-01-01
This work proposes a new methodology for the detection of discontinuities in the weld bead applied in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors, a microphone and a piezoelectric transducer, that acquire acoustic emissions generated during welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify with high accuracy the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn-through discontinuities. Experimental results illustrate the system's high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make the use of this system feasible in industrial environments. This approach presented 96.6% overall accuracy. Given the simplicity of the equipment involved, this system can be applied in the metal transformation industries. PMID:28489045
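As an illustration of the classification stage described above, the following sketch trains an SVM on acoustic-emission feature vectors; the synthetic features, class labels and hyperparameters are placeholders, not the authors' dataset or tuned model.

```python
# Sketch: SVM classification of weld-bead classes from per-segment features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
y = rng.integers(0, 3, size=300)   # 0: desirable bead, 1: shrinkage cavity, 2: burn-through
X = rng.normal(loc=y[:, None] * 0.8, scale=1.0, size=(300, 12))  # class-shifted stand-in features

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```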
A planar nano-positioner driven by shear piezoelectric actuators
NASA Astrophysics Data System (ADS)
Dong, W.; Li, H.; Du, Z.
2016-08-01
A planar nano-positioner driven by shear piezoelectric actuators is proposed in this paper based on inertial sliding theory. The performance of the nano-positioner actuated by different driving signals is analyzed and discussed, e.g. the resolution and the average velocity, which depend on the frequency, amplitude and waveform of the driving curves. Based on the proposed design, a prototype system of the nano-positioner is developed using a capacitive sensor as the measurement device. The experimental results show that the proposed nano-positioner is capable of producing two-dimensional motions within an area of 10 mm × 10 mm at a maximum speed of 0.25 mm/s. The corresponding resolution can be as small as 21 nm. The methodology outlined in this paper can be employed and extended to shear piezoelectric actuators involved in high-precision positioning systems.
Cegłowski, Michał; Kurczewska, Joanna; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz
2015-09-07
In this paper, a procedure for the preconcentration and transport of mixtures of acids, bases, and drug components to a mass spectrometer using magnetic scavengers is presented. Flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS) was used as an analytical method for identification of the compounds by thermal desorption from the scavengers. The proposed procedure is fast and cheap, and does not involve time-consuming purification steps. The developed methodology can be applied for trapping harmful substances in minute quantities, to transport them to specialized, remotely located laboratories.
Fluid moments of the nonlinear Landau collision operator
Hirvijoki, E.; Lingam, M.; Pfefferle, D.; ...
2016-08-09
An important problem in plasma physics is the lack of an accurate and complete description of Coulomb collisions in associated fluid models. To shed light on the problem, this Letter introduces an integral identity involving the multivariate Hermite tensor polynomials and presents a method for computing exact expressions for the fluid moments of the nonlinear Landau collision operator. In conclusion, the proposed methodology provides a systematic and rigorous means of extending the validity of fluid models that have an underlying inverse-square force particle dynamics to arbitrary collisionality and flow.
Atwood's machine as a tool to introduce variable mass systems
NASA Astrophysics Data System (ADS)
de Sousa, Célia A.
2012-03-01
This article discusses an instructional strategy which explores possible similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The proposed methodology helps students to develop the ability needed to apply relevant concepts in situations not previously encountered. The pedagogical advantages are relevant for both secondary and high school students, showing that, through adequate examples, the question of the validity of Newton's second law may even be introduced to introductory-level students.
NASA Astrophysics Data System (ADS)
Zendejas, Gerardo; Chiasson, Mike
This paper will propose and explore a method to enhance focal actors' abilities to enroll and control the many social and technical components interacting during the initiation, production, and diffusion of innovations. The reassembling and stabilizing of such components is the challenging goal of the focal actors involved in these processes. To address this possibility, a healthcare project involving the initiation, production, and diffusion of an IT-based innovation will be influenced by the researcher, using concepts from actor network theory (ANT), within an action research methodology (ARM). The experiences using this method, and the nature of enrolment and translation during its use, will highlight if and how ANT can provide a problem-solving method to help assemble the social and technical actants involved in the diffusion of an innovation. Finally, the paper will discuss the challenges and benefits of implementing such methods to attain widespread diffusion.
76 FR 39876 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... Survey--Pretest of Proposed Questions and Methodology.'' In accordance with the Paperwork Reduction Act... Health Plan Survey-- Pretest of Proposed Questions and Methodology The Consumer Assessment of Healthcare... year to year. The CAHPS® program was designed to: Make it possible to compare survey results...
76 FR 57046 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-15
... Survey--Pretest of Proposed Questions and Methodology.'' In accordance with the Paperwork Reduction Act... Health Plan Survey-- Pretest of Proposed Questions and Methodology The Consumer Assessment of Healthcare... often changed from year to year. The CAHPS® program was designed to: Make it possible to compare...
Code of Federal Regulations, 2012 CFR
2012-10-01
.... (iii) Methodological proposals must be submitted to CMS by June of the payment year and must be... the payment year. (4) CMS requires the qualifying MA organization to develop a methodological proposal... MA organization in the payment year. The methodological proposal— (i) Must be approved by CMS; and...
ERIC Educational Resources Information Center
Cochrane, Todd; Davis, Niki; Morrow, Donna
2013-01-01
A methodology for design-based research (DBR) into the effective development and use of Multi-User Virtual Environments (MUVE) in vocational education is proposed. It blends software development with DBR, with two theories selected to inform the methodology. Legitimate peripheral participation (LPP; Lave & Wenger, 1991) provides a filter when…
Expanding Simulations as a Means of Tactical Training with Multinational Partners
2017-06-09
gap through DOTMLPF in combination with an assessment of two case studies involving higher echelon use of simulations. Through this methodology, the findings...
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified using realistic aeroelastic systems.
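The nonlinear block Gauss-Seidel fixed point mentioned above can be sketched in a few lines: alternate a fluid solve (loads from the current shape) with a structural solve (shape from the current loads) until the coupled state stops changing. Both single-physics solvers below are abstract stand-ins, not flow or structural codes.

```python
# Sketch: block Gauss-Seidel fixed-point iteration for a quasi-static
# fluid-structure coupling, with placeholder single-physics solvers.
import numpy as np

def solve_fluid(displacement):
    # placeholder: aerodynamic loads as a nonlinear function of the shape
    return np.tanh(displacement) + 0.5

def solve_structure(loads, stiffness=4.0):
    # placeholder: linear structure, K u = f
    return loads / stiffness

u = np.zeros(10)                      # structural displacement field
for it in range(100):
    f = solve_fluid(u)                # fluid block with the structure frozen
    u_new = solve_structure(f)        # structure block with the loads frozen
    if np.linalg.norm(u_new - u) < 1e-10:
        break                         # converged to the coupled fixed point
    u = u_new
```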
Mistry, Pankaj; Dunn, Janet A; Marshall, Andrea
2017-07-18
The application of adaptive design methodology within a clinical trial setting is becoming increasingly popular. However, these methods are often not reported as adaptive designs, making it more difficult to capture their emerging use. Within this review, we aim to understand how adaptive design methodology is being reported, whether these methods are explicitly stated as an 'adaptive design' or whether this has to be inferred, and to identify whether these methods are applied prospectively or concurrently. Three databases, Embase, Ovid and PubMed, were chosen to conduct the literature search. The inclusion criteria for the review were phase II, phase III and phase II/III randomised controlled trials within the field of oncology that published trial results in 2015. A variety of search terms related to adaptive designs were used. A total of 734 results were identified, of which 54 were eligible after screening. Adaptive designs were more commonly applied in phase III confirmatory trials. The majority of the papers performed an interim analysis, which included some form of stopping criteria. Additionally, only two papers explicitly stated the term 'adaptive design'; for the remaining papers, it had to be inferred that adaptive methods were applied. Sixty-five applications of adaptive design methods were identified, of which the most common was adaptation using group sequential methods. This review indicates that the reporting of adaptive design methodology within clinical trials needs improving. The proposed extension to the current CONSORT 2010 guidelines could help capture adaptive design methods and provide an essential aid to those involved with clinical trials.
Ergonomics and design: traffic sign and street name sign.
Moroni, Janaina Luisa da Silva; Aymone, José Luís Farinatti
2012-01-01
This work proposes a design methodology using ergonomics and anthropometry concepts applied to traffic sign and street name sign projects. Initially, a literature review on cognitive ergonomics and anthropometry is performed. Several authors and their design methodologies are analyzed, the aspects to be considered in projects of traffic and street name signs are selected, and other specific aspects are proposed for the design methodology. A case study of the signs of the "Street of Antiques" in Porto Alegre city is presented. To this end, interviews with the population are conducted to evaluate the current state of the signs. After that, a new sign proposal with virtual prototyping is developed using the proposed methodology. The results obtained with new interviews about the proposal show user satisfaction and the importance of cognitive ergonomics to the development of this type of urban furniture.
Colombini, Daniela; Occhipinti, E; Di Leone, G
2011-01-01
During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded with the task of developing a "toolkit for MSD prevention" under the IEA and in collaboration with the World Health Organization. The possible users of the toolkits are: members of health and safety committees; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers providing basic occupational health services; and occupational health and safety specialists. In accordance with the ISO 11228 standard series and the new Draft CD ISO 12259-2009 (Application document guides for the potential user), our group developed a preliminary "mapping" methodology for occupational hazards in the craft industry, supported by software (Excel). The proposed methodology, using specific key enters and quick assessment criteria, allows simple ergonomic hazard identification and risk estimation. It is thus possible to decide for which occupational hazards a more exhaustive risk assessment is necessary and which occupational consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).
A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems
Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio
2013-01-01
Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and, finally, adapt the functionality of the system to that situation; however, the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces aimed at programmers, which complicates the involvement of domain experts in the development life-cycle. The participation of users who do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents, as its main contributions, the implementation and evaluation of a web platform and a methodology for the collaborative development of context-aware systems by programmers and domain experts. PMID:23666131
[Core competencies in public health: a regional framework for the Americas].
Conejero, Juana Suárez; Godue, Charles; Gutiérrez, José Francisco García; Valladares, Laura Magaña; Rabionet, Silvia; Concha, José; Valdés, Manuel Vázquez; Gómez, Rubén Darío; Mujica, Oscar J; Cabezas, César; Lucano, Lindaura Liendo; Castellanos, Jorge
2013-07-01
The response is described to the 2010 call from the Pan American Health Organization to develop a Regional Framework on Core Competencies in Public Health, with a view to supporting the efforts of the countries in the Americas to build public health systems capacity as a strategy for optimal performance of the Essential Public Health Functions. The methodological process for the response was divided into four phases. In the first, a team of experts was convened who defined the methodology to be used during a workshop at the National Institute of Public Health of Mexico in 2010. The second phase involved formation of the working groups, using two criteria: experience and multidisciplinary membership, which resulted in a regional team with 225 members from 12 countries. This team prepared an initial proposal with 88 competencies. In the third phase, the competencies were cross-validated and their number reduced to 64. During the fourth phase, which included two workshops, in March 2011 (Medellín, Colombia) and June 2011 (Lima, Peru), discussions centered on analyzing the association between the results and the methodology.
Between hype and hope: What is really at stake with personalized medicine?
Abettan, Camille
2016-09-01
Over the last decade, personalized medicine has become a buzzword covering a broad spectrum of meanings, and it generates many different opinions. The purpose of this article is to achieve a better understanding of the reasons why personalized medicine gives rise to such conflicting opinions. We show that a major issue of personalized medicine is the gap existing between its claims and its reality. We then present and analyze different possible reasons for this gap. We propose a hypothesis inspired by Windelband's distinction between nomothetic and idiographic methodology. We argue that the fuzzy situation of personalized medicine results from a mix between idiographic claims and nomothetic methodological procedures. Hence we suggest that the current quandary about personalized medicine cannot be solved without getting involved in a discussion about the complex epistemological and methodological status of medicine. To conclude, we show that Gadamer's view of medicine as a dialogical process can be fruitfully used and reveals that personalization is not a theoretical task but a practical one, which takes place within the clinical encounter.
Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps
NASA Astrophysics Data System (ADS)
Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.
2012-05-01
Typical ultrasonic methodology for nondestructive scanning evaluation uses systematic scanning paths. In many cases, this approach is time-inefficient and consumes considerable energy and computational power. Here, a methodology for the scanning of defects using an ultrasonic pulse-echo scanning technique combined with chaotic trajectory generation is proposed. This is implemented in a Cartesian-coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve detection probability, our proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than the traditional ones used to localize and characterize hidden flaws.
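A minimal sketch of the trajectory-generation idea, assuming a logistic map as the chaotic function and a tent-style fold as the mirror mapping (the authors' exact map and mirror rule are not specified here):

```python
# Sketch: chaotic scan-point generation over a rectangular work area.
import numpy as np

def logistic(x, r=3.99):              # logistic map in its chaotic regime
    return r * x * (1.0 - x)

def mirror(v, size):
    # fold a value into [0, size] with reflection at the edges (tent fold)
    v = abs(v) % 2.0
    return size * (2.0 - v if v > 1.0 else v)

x, y = 0.123, 0.567                   # seeds; sensitive dependence spreads coverage
points = []
for _ in range(2000):
    x, y = logistic(x), logistic(y)
    points.append((mirror(2.0 * x, 100.0), mirror(2.0 * y, 100.0)))  # 100 mm x 100 mm area
```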
Antia, B E; Omotara, B A; Rabasa, A I; Addy, E O; Tomfafi, O A A; Anaso, C C
2003-06-01
The aim of this study was to propose an alternative approach to traditional knowledge, attitude and practice (KAP) studies in order to enhance the quality of data on which educational health programmes are based. The methodology proposed and illustrated involved a triangulation of approaches derived from linguistics, cognitive science, and the medical laboratory sciences. Three diarrhoeal health talks (educational messages), as given to mothers in three primary-care facilities in Borno State (Northeast Nigeria), were subjected to a linguistic analysis. Relationships were then sought between the ontology of knowledge in the health talks, as revealed by the text analysis, and two other kinds of data, namely: (a) mothers' answers to a set of ecologically sensitive reasoning questions that test how much relevant inferential knowledge the health talks allow for, and (b) results of microbiological and biochemical analyses of salt-sugar rehydration solutions prepared by mothers participating in the study. The findings of the study show a relationship between the contents/formatting of the health talks and the extent to which relevant inferential competence was supported or demonstrated by mothers. It was also evident that the laboratory analyses could be related either directly to the health talks or indirectly in terms of what the health talks need to emphasize. The conclusion shows how the proposed methodology addresses shortcomings of traditional KAP studies with respect to the gap between health knowledge and practice.
Espié, Stéphane; Boubezoul, Abderrahmane; Aupetit, Samuel; Bouaziz, Samir
2013-09-01
Instrumented vehicles are key tools for the in-depth understanding of drivers' behaviours, and thus for the design of scientifically based countermeasures to reduce fatalities and injuries. The instrumentation of Powered Two-Wheelers (PTW) has been less widely implemented than that of cars, in part due to the technical challenges involved. The last decade has seen the development in Europe of several tools and methodologies to study motorcycle riders' behaviours and motorcycle dynamics for a range of situations, including crash events involving falls. Thanks to these tools, a broad-ranging research programme has been conducted, from the design and tuning of real-time fall detection to the study of rider training systems, as well as studies focusing on naturalistic riding situations such as filtering and lane splitting. The methodology designed for the in-depth study of riders' behaviours in naturalistic situations can be based upon the combination of several sources of data, such as: PTW sensors, a context-based video retrieval system, the Global Positioning System (GPS) and verbal data on the riders' decision-making processes. The goals of this paper are: (1) to present the methodological tools developed and used by INRETS-MSIS (now Ifsttar-TS2/Simu) in the last decade for the study of riders' behaviours in real-world environments as well as on track for situations up to falls, (2) to illustrate the kind of results that can be gained from the conducted studies, (3) to identify the advantages and limitations of the proposed methodology for conducting large-scale naturalistic riding studies, and (4) to highlight how the knowledge gained from this approach will fill many of the knowledge gaps about PTW riders' behaviours and risk factors. Copyright © 2013 Elsevier Ltd. All rights reserved.
Comparison of two drug safety signals in a pharmacovigilance data mining framework.
Tubert-Bitter, Pascale; Bégaud, Bernard; Ahmed, Ismaïl
2016-04-01
Since adverse drug reactions are a major public health concern, early detection of drug safety signals has become a top priority for regulatory agencies and the pharmaceutical industry. Quantitative methods for analyzing spontaneous reporting material recorded in pharmacovigilance databases through data mining have been proposed in the last decades and are increasingly used to flag potential safety problems. While automated data mining is motivated by the usually huge size of pharmacovigilance databases, it does not systematically produce relevant alerts. Moreover, each detected signal requires appropriate assessment that may involve investigation of the whole therapeutic class. The goal of this article is to provide a methodology for comparing two detected signals. It is nested within the automated surveillance framework as (1) no extra information is required and (2) no simple inference on the actual risks can be extrapolated from spontaneous reporting data. We designed our methodology on the basis of two classical methods used for automated signal detection: the Bayesian Gamma Poisson Shrinker and the frequentist Proportional Reporting Ratio. A simulation study was conducted to assess the performances of both proposed methods. The latter were used to compare cardiovascular signals for two HIV treatments from the French pharmacovigilance database. © The Author(s) 2012.
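For concreteness, the frequentist statistic named above, the Proportional Reporting Ratio, can be computed from a 2x2 report table as follows; the counts are illustrative, not drawn from the French database.

```python
# Sketch: Proportional Reporting Ratio (PRR) with a delta-method 95% CI.
import math

a, b = 40, 960      # target drug: reports with / without the event
c, d = 200, 48800   # all other drugs: reports with / without the event

prr = (a / (a + b)) / (c / (c + d))
se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))   # SE of log(PRR)
lo, hi = prr * math.exp(-1.96 * se), prr * math.exp(1.96 * se)
print(f"PRR = {prr:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```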
A new approach to road accident rescue.
Morales, Alejandro; González-Aguilera, Diego; López, Alfonso I; Gutiérrez, Miguel A
2016-01-01
This article develops and validates a new methodology and tool for rescue assistance in traffic accidents, with the aim of improving efficiency and safety in the evacuation of people and reducing the number of victims in road accidents. Different tests supported by professionals and experts have been designed under different circumstances and with different categories of damaged vehicles coming from real accidents and simulated trapped victims, in order to calibrate and refine the proposed methodology and tool. To validate this new approach, a tool called App_Rescue has been developed. This tool is based on the use of a computer system that allows efficient access to the technical information of the vehicle and the health information of the usual passengers. The time spent during rescue using the standard protocol and the proposed method was compared. This rescue assistance system makes vital information accessible to post-trauma care services, improving the effectiveness of interventions by the emergency services, reducing the rescue time and therefore minimizing the consequences involved and the number of victims. This could often mean saving lives. In the different simulated rescue operations, the rescue time was reduced by an average of 14%.
A feasibility study of damage detection in beams using high-speed camera (Conference Presentation)
NASA Astrophysics Data System (ADS)
Wan, Chao; Yuan, Fuh-Gwo
2017-04-01
In this paper a method for damage detection in beam structures using a high-speed camera is presented. Traditional methods of damage detection in structures typically involve contact sensors (i.e., piezoelectric sensors or accelerometers) or non-contact sensors (i.e., laser vibrometers), which can be costly and time-consuming when inspecting an entire structure. With the popularity of digital cameras and the development of computer vision technology, video cameras offer a viable measurement capability, including higher spatial resolution, remote sensing and low cost. In this study, a damage detection method based on a high-speed camera is proposed. The system setup comprises a high-speed camera and a line laser, which can capture the out-of-plane displacement of a cantilever beam. A cantilever beam with an artificial crack was excited and the vibration process was recorded by the camera. A methodology called motion magnification, which can amplify subtle motions in a video, is used for modal identification of the beam. A finite element model was used for validation of the proposed method. Suggestions for applications of this methodology and challenges in future work will be discussed.
Artistic image analysis using graph-based learning approaches.
Carneiro, Gustavo
2013-08-01
We introduce a new methodology for the problem of artistic image analysis, which, among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation, which is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to more efficient inference and training procedures. The experiments are run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.
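A generic label-propagation step on a similarity graph (Zhou-style label spreading) gives a feel for the machinery, as a simplified stand-in for the paper's inverted label propagation formulation; the toy graph and parameters are assumptions.

```python
# Sketch: label spreading on a similarity graph; W is any symmetric
# similarity matrix, Y holds one-hot seed labels.
import numpy as np

def label_spreading(W, Y, alpha=0.9, iters=200):
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))            # normalization D^-1/2 W D^-1/2
    F = Y.astype(float).copy()
    for _ in range(iters):
        F = alpha * S @ F + (1.0 - alpha) * Y  # diffuse labels, keep seeds anchored
    return F.argmax(axis=1)

# toy example: 5 nodes, 2 classes, nodes 0 and 4 labelled
W = np.array([[0,1,1,0,0],[1,0,1,0,0],[1,1,0,1,0],[0,0,1,0,1],[0,0,0,1,0]], float)
Y = np.zeros((5, 2)); Y[0, 0] = 1; Y[4, 1] = 1
print(label_spreading(W, Y))
```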
Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo
2013-05-01
Because of the ethical and medico-legal aspects involved in the training of cutaneous surgical skills on living patients, human cadavers and living animals, the search for alternative and effective forms of training simulation is necessary. The aim is to propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for the training simulation. A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions and with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. The proposed teaching methodology for the principles of cutaneous surgery using a chicken-skin bench model, which is versatile, portable, easy to assemble, and inexpensive, is an alternative and complementary option to the armamentarium of methods based on other bench models described.
Writing a Research Proposal to The Research Council of Oman.
Al-Shukaili, Ahmed; Al-Maniri, Abdullah
2017-05-01
Writing a research proposal can be a challenging task for young researchers. This article explains how to write a strong research proposal to apply for funding, specifically a proposal for The Research Council (TRC) of Oman. Three different research proposal application forms are currently used by TRC: the Open Research Grant (ORG), the Graduate Research Support Program (GRSP), and the Faculty-mentored Undergraduate Research Award Program (FURAP). The application forms are filled in and submitted electronically on the TRC website. Each of the proposals submitted to TRC is selected through a rigorous reviewing and screening process. Novelty and originality of the research idea is the most crucial element in writing a research proposal. Performing an in-depth review of the literature will help you compose a good researchable question and generate a strong hypothesis. The development of a good hypothesis will offer insight into the specific objectives of a study. Research objectives should be focused, measurable, and achievable by a specific time using the most appropriate methodology. Moreover, it is essential to select a proper study design in line with the purpose of the study and the hypothesis. Furthermore, the social/economic impact and a reasonable budget of the proposed research are important criteria in research proposal evaluation by TRC. Finally, ethical principles should be observed before writing a research proposal involving human or animal subjects.
Vargas, E; Ruiz, M A; Campuzano, S; Reviejo, A J; Pingarrón, J M
2016-03-31
A non-destructive, rapid and simple-to-use sensing method for the direct determination of glucose in non-processed fruits is described. The strategy involves on-line microdialysis sampling coupled with a continuous flow system with amperometric detection at an enzymatic biosensor. Apart from the direct determination of glucose in fruit juices and blended fruits, this work describes for the first time the successful application of an enzymatic biosensor-based electrochemical approach to the non-invasive determination of glucose in raw fruits. The methodology correlates, through a previous calibration set-up, the amperometric signal generated from glucose in non-processed fruits with its content in % (w/w). The comparison of the results obtained using the proposed approach in different fruits with those provided by another method, involving the same commercial biosensor as amperometric detector in stirred solutions, pointed out that there were no significant differences. Moreover, in comparison with other available methodologies, this microdialysis-coupled continuous-flow amperometric biosensor-based procedure features straightforward sample preparation, low cost, reduced assay time (sampling rate of 7 h(-1)) and ease of automation. Copyright © 2016 Elsevier B.V. All rights reserved.
Macroeconomic effects on mortality revealed by panel analysis with nonlinear trends.
Ionides, Edward L; Wang, Zhen; Tapia Granados, José A
2013-10-03
Many investigations have used panel methods to study the relationships between fluctuations in economic activity and mortality. A broad consensus has emerged on the overall procyclical nature of mortality: perhaps counter-intuitively, mortality typically rises above its trend during expansions. This consensus has been tarnished by inconsistent reports on the specific age groups and mortality causes involved. We show that these inconsistencies result, in part, from the trend specifications used in previous panel models. Standard econometric panel analysis involves fitting regression models using ordinary least squares, employing standard errors which are robust to temporal autocorrelation. The model specifications include a fixed effect, and possibly a linear trend, for each time series in the panel. We propose alternative methodology based on nonlinear detrending. Applying our methodology on data for the 50 US states from 1980 to 2006, we obtain more precise and consistent results than previous studies. We find procyclical mortality in all age groups. We find clear procyclical mortality due to respiratory disease and traffic injuries. Predominantly procyclical cardiovascular disease mortality and countercyclical suicide are subject to substantial state-to-state variation. Neither cancer nor homicide have significant macroeconomic association.
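A minimal sketch of the detrending idea, assuming a Hodrick-Prescott filter as the nonlinear trend extractor (the authors' exact trend specification may differ): remove a smooth trend from each state's series and carry the cyclical residuals into the panel regression.

```python
# Sketch: nonlinear detrending of one state's mortality series; the cyclical
# component would then be regressed on detrended economic indicators with
# state fixed effects. Data are placeholders.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

years = np.arange(1980, 2007)
mortality = 900 - 2.5 * (years - 1980) + np.random.normal(0, 5, years.size)  # stand-in series

cycle, trend = hpfilter(mortality, lamb=100)   # lamb=100 is a common annual-data choice
# `cycle` (deviation from trend) enters the panel model in place of the raw series.
```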
NASA Astrophysics Data System (ADS)
Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.
2011-08-01
This paper proposes a novel optimization approach for the least cost design of looped water distribution systems (WDSs). Three distinct steps are involved in the proposed optimization approach. In the first step, the shortest-distance tree within the looped network is identified using the Dijkstra graph theory algorithm, for which an extension is proposed to find the shortest-distance tree for multisource WDSs. In the second step, a nonlinear programming (NLP) solver is employed to optimize the pipe diameters for the shortest-distance tree (chords of the shortest-distance tree are allocated the minimum allowable pipe sizes). Finally, in the third step, the original looped water network is optimized using a differential evolution (DE) algorithm seeded with diameters in the proximity of the continuous pipe sizes obtained in step two. As such, the proposed optimization approach combines the traditional deterministic optimization technique of NLP with the emerging evolutionary algorithm DE via the proposed network decomposition. The proposed methodology has been tested on four looped WDSs with the number of decision variables ranging from 21 to 454. Results obtained show the proposed approach is able to find optimal solutions with significantly less computational effort than other optimization techniques.
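Step one of the decomposition can be sketched with networkx: extract the shortest-distance tree via Dijkstra and identify the chords that will be fixed at the minimum pipe size before the NLP stage. The toy graph below is a placeholder, not one of the four benchmark networks.

```python
# Sketch: shortest-distance tree of a looped network via Dijkstra.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([("src", "a", 100), ("src", "b", 120), ("a", "b", 50),
                           ("a", "c", 80), ("b", "c", 60)])  # weights = pipe lengths

paths = nx.single_source_dijkstra_path(G, "src")   # shortest paths from the source
tree_edges = {tuple(sorted(p[i:i+2])) for p in paths.values() for i in range(len(p) - 1)}
chords = [e for e in G.edges if tuple(sorted(e)) not in tree_edges]  # get minimum sizes
print("tree:", sorted(tree_edges), "chords:", chords)
```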
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Yuhua
2012-11-02
Since current technologies for capturing CO2 to fight global climate change are still too energy intensive, there is a critical need for the development of new materials that can capture CO2 reversibly with acceptable energy costs. Accordingly, solid sorbents have been proposed for CO2 capture applications through a reversible chemical transformation. By combining thermodynamic database mining with first-principles density functional theory and phonon lattice dynamics calculations, a theoretical screening methodology to identify the most promising CO2 sorbent candidates from the vast array of possible solid materials has been proposed and validated. The calculated thermodynamic properties of different classes of solid materials versus temperature and pressure changes were further used to evaluate the equilibrium properties for the CO2 adsorption/desorption cycles. According to the requirements imposed by the pre- and post-combustion technologies and based on our calculated thermodynamic properties for the CO2 capture reactions by the solids of interest, we were able to screen only those solid materials for which lower capture energy costs are expected at the desired pressure and temperature conditions. Only the selected CO2 sorbent candidates were further considered for experimental validation. The ab initio thermodynamic technique has the advantage of identifying thermodynamic properties of CO2 capture reactions without any experimental input beyond crystallographic structural information of the solid phases involved. Such a methodology not only can be used to search for good candidates from existing databases of solid materials, but also can provide guidelines for synthesizing new materials. In this presentation, we first introduce our screening methodology and the results on a testing set of solids with known thermodynamic properties to validate our methodology. Then, by applying our computational method to several different kinds of solid systems, we demonstrate that our methodology can predict useful information to help develop CO2 capture technologies.
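The screening criterion itself reduces to a free-energy condition per capture reaction; a rough sketch, with ballpark constant-property values for CaO carbonation standing in for the DFT/phonon results:

```python
# Sketch: for MO + CO2 -> MCO3, capture is favourable where
# dG(T, P) = dH - T*dS + R*T*ln(P0 / P_CO2) <= 0.
# dH, dS below are rough literature-scale values, not computed results.
import numpy as np

R = 8.314e-3                 # kJ/mol/K
dH, dS = -178.0, -0.160      # kJ/mol and kJ/mol/K, CaO carbonation ballpark

def dG(T, p_co2_bar):
    return dH - T * dS + R * T * np.log(1.0 / p_co2_bar)

T = np.linspace(300, 1500, 7)
print(np.column_stack([T, dG(T, 0.1)]))   # sorbent usable where dG < 0
```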
The methodology for modeling queuing systems using Petri nets
NASA Astrophysics Data System (ADS)
Kotyrba, Martin; Gaj, Jakub; Tvarůžka, Matouš
2017-07-01
This paper deals with the use of Petri nets in the modeling and simulation of queuing systems. The first part focuses on the explanation of basic concepts and properties of Petri nets and queuing systems. The proposed methodology for the modeling of queuing systems using Petri nets is described in the practical part and tested on specific cases.
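A minimal token-game sketch of the idea, modelling a single-server queue as places and transitions (a generic illustration, not the nets developed in the paper):

```python
# Sketch: a tiny Petri net; transitions fire when all input places hold tokens.
class PetriNet:
    def __init__(self, marking):
        self.m = dict(marking)                 # place -> token count
        self.transitions = {}                  # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        return all(self.m[p] > 0 for p in self.transitions[name][0])

    def fire(self, name):
        ins, outs = self.transitions[name]
        if not self.enabled(name):
            raise RuntimeError(f"{name} not enabled")
        for p in ins:  self.m[p] -= 1          # consume input tokens
        for p in outs: self.m[p] += 1          # produce output tokens

net = PetriNet({"queue": 3, "server_free": 1, "in_service": 0, "served": 0})
net.add_transition("start_service", ["queue", "server_free"], ["in_service"])
net.add_transition("end_service", ["in_service"], ["served", "server_free"])
net.fire("start_service"); net.fire("end_service")
print(net.m)   # one customer served, server free again
```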
Neurobiology of suicidal behaviour.
Pjevac, Milica; Pregelj, Peter
2012-10-01
It is known that suicidal behaviour has multiple causes. While triggers can mainly be attributed to environmental factors, predisposition can be associated with early stressors, such as childhood adversities, on one side and with genetic predisposition on the other. No convincing animal model of suicide has been produced to date. The study of endophenotypes has been proposed as a good strategy to overcome the methodological difficulties. However, research on suicidal behaviours using endophenotypes entails important methodological problems. Furthermore, the serotonergic system has been studied in patients with suicidal behaviour primarily because of the involvement of serotonin in impulsive-aggressive behaviour, which has been shown to be a major risk factor for suicidal behaviour. Not only neurotransmitter levels but also the regulation of neurotrophic factors could be impaired in suicide victims. Multiple lines of evidence, including studies of BDNF levels in blood cells and plasma of suicidal patients, postmortem brain studies in suicidal subjects with or without depression, and genetic association studies linking BDNF to suicide, suggest that suicidal behaviour may be associated with a decrease in BDNF functioning. It seems that especially specific gene variants regulating the serotonergic system and other neuronal systems involved in the stress response are associated with suicidal behaviour. Most genetic studies on suicidal behaviour have considered a small set of functional polymorphisms relevant mostly to monoaminergic neurotransmission. However, genes and epigenetic mechanisms involved in the regulation of other factors such as BDNF seem to be even more relevant for further research.
Calibration of CORSIM models under saturated traffic flow conditions.
DOT National Transportation Integrated Search
2013-09-01
This study proposes a methodology to calibrate microscopic traffic flow simulation models. The proposed methodology has the capability to calibrate simultaneously all the calibration parameters as well as demand patterns for any network topology....
NASA Technical Reports Server (NTRS)
Stoughton, R. M.
1990-01-01
A proposed methodology applicable to the design of manipulator systems is described. The current design process is especially weak in the preliminary design phase, since there is no accepted measure to be used in trading off the different options available for the various subsystems. The design process described uses Cartesian End-Effector Impedance as a measure of performance for the system. Having this measure of performance, it is shown how it may be used to determine the trade-offs necessary in the preliminary design phase. The design process involves three main parts: (1) determination of the desired system performance in terms of End-Effector Impedance; (2) trade-off of design options to achieve this desired performance; and (3) verification of system performance through laboratory testing. The design process is developed using numerous examples and experiments to demonstrate the feasibility of this approach to manipulator design.
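For reference, end-effector impedance is commonly formalized as a second-order mechanical impedance at the tool tip; a minimal statement of that common textbook form (not necessarily the report's exact parameterization) is:

```latex
% Cartesian end-effector impedance: the force response to a tool-tip
% displacement x(t), written in the Laplace domain as F(s) = Z(s) X(s) with
\[
  Z(s) = M s^{2} + B s + K ,
\]
% where M, B and K are the apparent mass, damping and stiffness matrices
% felt at the end-effector; subsystem trade-offs can then be cast as their
% effect on M, B and K.
```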
An integrative fuzzy Kansei engineering and Kano model for logistics services
NASA Astrophysics Data System (ADS)
Hartono, M.; Chuan, T. K.; Prayogo, D. N.; Santoso, A.
2017-11-01
Nowadays, customer emotional needs (known as Kansei) have become a major concern in products and especially in services. Logistics is one such emerging service. To obtain a global competitive advantage, logistics services should understand and satisfy their customers' affective impressions (Kansei). How to capture, model and analyze customer emotions has been well structured by Kansei Engineering, equipped with the Kano model to strengthen its methodology. However, this methodology fails to capture the dynamics of customer perception. More specifically, there is a criticism of perceived scores on user preferences, in both perceived service quality and Kansei response, as to whether they represent an exact numerical value. Thus, this paper discusses an approach using fuzzy Kansei in logistics service experiences. A case study in IT-based logistics services involving 100 subjects has been conducted. Its findings, including the service gaps accompanied by prioritized improvement initiatives, are discussed.
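The fuzzy step argued for above can be sketched as mapping Likert ratings to triangular fuzzy numbers, averaging them, and defuzzifying by centroid; the membership functions below are illustrative assumptions.

```python
# Sketch: fuzzy aggregation of Likert-scale Kansei ratings.
import numpy as np

# triangular fuzzy numbers (low, mode, high) for a 1-5 Likert scale
TFN = {1: (0, 0, 2.5), 2: (0, 2.5, 5), 3: (2.5, 5, 7.5), 4: (5, 7.5, 10), 5: (7.5, 10, 10)}

def fuzzy_mean(ratings):
    tfns = np.array([TFN[r] for r in ratings], float)
    return tuple(tfns.mean(axis=0))            # component-wise average TFN

def centroid(tfn):
    lo, mode, hi = tfn
    return (lo + mode + hi) / 3.0              # centroid of a triangular number

ratings = [4, 5, 3, 4, 4]                      # one Kansei word, five respondents
print(centroid(fuzzy_mean(ratings)))           # crisp score for further analysis
```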
Massive parallelization of serial inference algorithms for a complex generalized linear model
Suchard, Marc A.; Simpson, Shawn E.; Zorych, Ivan; Ryan, Patrick; Madigan, David
2014-01-01
Following a series of high-profile drug safety disasters in recent years, many countries are redoubling their efforts to ensure the safety of licensed medical products. Large-scale observational databases such as claims databases or electronic health record systems are attracting particular attention in this regard, but present significant methodological and computational concerns. In this paper we show how high-performance statistical computation, including graphics processing units, relatively inexpensive highly parallel computing devices, can enable complex methods in large databases. We focus on optimization and massive parallelization of cyclic coordinate descent approaches to fit a conditioned generalized linear model involving tens of millions of observations and thousands of predictors in a Bayesian context. We find orders-of-magnitude improvement in overall run-time. Coordinate descent approaches are ubiquitous in high-dimensional statistics and the algorithms we propose open up exciting new methodological possibilities with the potential to significantly improve drug safety. PMID:25328363
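The serial core being parallelized is cyclic coordinate descent; a compact sketch for an L2-penalized logistic model, where each coordinate gets a Newton step while the others are held fixed (the paper's GPU kernels accelerate exactly these per-coordinate aggregations):

```python
# Sketch: serial cyclic coordinate descent for L2-penalized logistic regression.
import numpy as np

def ccd_logistic(X, y, lam=1.0, sweeps=50):
    n, p = X.shape
    beta = np.zeros(p)
    eta = X @ beta                             # linear predictor, kept in sync
    for _ in range(sweeps):
        for j in range(p):
            mu = 1.0 / (1.0 + np.exp(-eta))    # current predicted probabilities
            g = X[:, j] @ (mu - y) + lam * beta[j]       # gradient wrt beta_j
            h = X[:, j]**2 @ (mu * (1 - mu)) + lam       # curvature wrt beta_j
            step = g / h                       # one-coordinate Newton step
            beta[j] -= step
            eta -= step * X[:, j]
    return beta

X = np.random.normal(size=(500, 10))
y = (X[:, 0] + np.random.normal(size=500) > 0).astype(float)
print(ccd_logistic(X, y)[:3])
```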
Wetzel, Hermann
2006-01-01
In a large number of mostly retrospective association studies, a statistical relationship between volume and quality of health care has been reported. However, the relevance of these results is frequently limited by methodological shortcomings. In this article, criteria for the evidence and definition of thresholds for volume-outcome relations are proposed, e.g. the specification of relevant outcomes for quality indicators, analysis of volume as a continuous variable with an adequate case-mix and risk adjustment, accounting for cluster effects and considering mathematical models for the derivation of cut-off values. Moreover, volume thresholds are regarded as surrogate parameters for the indirect classification of the quality of care, whose diagnostic validity and effectiveness in improving health care quality need to be evaluated in prospective studies.
Probabilistic self-organizing maps for continuous data.
Lopez-Rubio, Ezequiel
2010-10-01
The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
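For orientation, the classical (non-probabilistic) SOM training loop that these models generalize looks like this; map size and schedules are arbitrary choices:

```python
# Sketch: classical SOM training; find the best-matching unit, then pull its
# grid neighbourhood toward the sample with a shrinking neighbourhood.
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=(1000, 3))
grid = np.array([(i, j) for i in range(10) for j in range(10)], float)  # 10x10 map
W = rng.normal(size=(100, 3))                                           # codebook vectors

for t in range(5000):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))          # best-matching unit
    sigma = 3.0 * np.exp(-t / 2000)                      # shrinking neighbourhood width
    lr = 0.5 * np.exp(-t / 2000)                         # decaying learning rate
    h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma**2))
    W += lr * h[:, None] * (x - W)                       # pull neighbourhood toward x
```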
Gregory, Katherine
2018-06-01
In the last 20 years, qualitative research scholars have begun to interrogate methodological and analytic issues concerning online research settings as both data sources and instruments for digital methods. This article examines the adaptation of parts of a qualitative research curriculum for understanding online communication settings. I propose methodological best practices for researchers and educators that I developed while teaching research methods to undergraduate and graduate students across disciplinary departments and discuss obstacles faced during my own research while gathering data from online sources. This article confronts issues concerning the disembodied aspects of applying what in practice should be rooted in a humanistic inquiry. Furthermore, as some approaches to online qualitative research as a digital method grow increasingly problematic with the development of new data mining technologies, I will also briefly touch upon borderline ethical practices involving data-scraping-based qualitative research.
Optimal allocation of testing resources for statistical simulations
NASA Astrophysics Data System (ADS)
Quintana, Carolina; Millwater, Harry R.; Singh, Gulshan; Golden, Patrick
2015-07-01
Statistical estimates from simulation involve uncertainty caused by the variability in the input random variables due to limited data. Allocating resources to obtain more experimental data of the input variables to better characterize their probability distributions can reduce the variance of statistical estimates. The methodology proposed determines the optimal number of additional experiments required to minimize the variance of the output moments given single or multiple constraints. The method uses multivariate t-distribution and Wishart distribution to generate realizations of the population mean and covariance of the input variables, respectively, given an amount of available data. This method handles independent and correlated random variables. A particle swarm method is used for the optimization. The optimal number of additional experiments per variable depends on the number and variance of the initial data, the influence of the variable in the output function and the cost of each additional experiment. The methodology is demonstrated using a fretting fatigue example.
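A simplified sketch of the uncertainty-propagation step: draw plausible population means and covariances consistent with n observed samples (here via a basic normal-inverse-Wishart scheme) and track the spread of an output moment; the distributional details in the paper may differ.

```python
# Sketch: variance of an output statistic induced by limited input data.
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(0)
data = rng.normal([10.0, 5.0], [1.0, 0.5], size=(15, 2))   # limited input data
n = len(data)
xbar, S = data.mean(axis=0), np.cov(data, rowvar=False)

out = []
for _ in range(2000):
    cov = invwishart.rvs(df=n - 1, scale=(n - 1) * S, random_state=rng)
    mu = rng.multivariate_normal(xbar, cov / n)            # plausible population mean
    x = rng.multivariate_normal(mu, cov, size=1000)        # propagate through the model
    out.append((x[:, 0] * x[:, 1]).mean())                 # output function g(X) = X1*X2
print("variance of output mean due to limited data:", np.var(out))
```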
Peffer, Melanie; Renken, Maggie
2016-01-01
Rather than pursue questions related to learning in biology from separate camps, recent calls highlight the necessity of interdisciplinary research agendas. Interdisciplinary collaborations allow for a complicated and expanded approach to questions about learning within specific science domains, such as biology. Despite its benefits, interdisciplinary work inevitably involves challenges. Some such challenges originate from differences in theoretical and methodological approaches across lines of work. Thus, aims at developing successful interdisciplinary research programs raise important considerations regarding methodologies for studying biology learning, strategies for approaching collaborations, and training of early-career scientists. Our goal here is to describe two fields important to understanding learning in biology, discipline-based education research and the learning sciences. We discuss differences between each discipline’s approach to biology education research and the benefits and challenges associated with incorporating these perspectives in a single research program. We then propose strategies for building productive interdisciplinary collaboration. PMID:27881446
Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene
2017-01-01
In the field of applied research in heritage science, the use of multivariate approaches is still quite limited and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper is aimed at disseminating the use of suitable multivariate methodologies and proposes a procedural workflow applied to a representative group of case studies, of considerable importance for conservation purposes, as a sort of guideline on the processing and interpretation of FTIR data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162
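The mapping step can be sketched in a few lines: unfold the FTIR image cube so pixels become rows, run PCA, and refold each score vector into a chemical map. Cube dimensions and data below are placeholders.

```python
# Sketch: PCA score maps from a hyperspectral FTIR image cube.
import numpy as np
from sklearn.decomposition import PCA

h, w, n_wavenumbers = 64, 64, 400
cube = np.random.rand(h, w, n_wavenumbers)        # stand-in hyperspectral image

X = cube.reshape(h * w, n_wavenumbers)            # unfold: pixels become rows
scores = PCA(n_components=3).fit_transform(X)
pc1_map = scores[:, 0].reshape(h, w)              # refold PC1 scores into a chemical map
# Brushing then links pixel regions of pc1_map back to score-space clusters
# and to the loading bands that drive them.
```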
Daza-Caicedo, Sandra; Maldonado, Oscar; Arboleda-Castrillón, Tania; Falla, Sigrid; Moreno, Pablo; Tafur-Sequera, Mayali; Papagayo, Diana
2017-01-01
We propose a set of qualitative indicators for monitoring practices of social appropriation of science and technology. The design of this set is based on the Maloka case, but it can be of use to multiple actors involved in the social appropriation of science and technology (referred to by its Spanish acronym, ASCyT). The introduction discusses the concept of ASCyT. The first section provides a review of the literature about measuring activities that link science and society. The second section explains why it is important to develop this type of measurement. The third section lays out the methodology used in designing the indicators. The fourth section explains the set of indicators and the fifth reflects on that process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Hongxing; Fang, Hengrui; Miller, Mitchell D.
2016-07-15
An iterative transform algorithm is proposed to improve the conventional molecular-replacement method for solving the phase problem in X-ray crystallography. Several examples of successful trial calculations carried out with real diffraction data are presented. An iterative transform method proposed previously for direct phasing of high-solvent-content protein crystals is employed for enhancing the molecular-replacement (MR) algorithm in protein crystallography. Target structures that are resistant to conventional MR due to insufficient similarity between the template and target structures might be tractable with this modified phasing method. Trial calculations involving three different structures are described to test and illustrate the methodology. The relationship of the approach to PHENIX Phaser-MR and MR-Rosetta is discussed.
Applying the compound Poisson process model to the reporting of injury-related mortality rates.
Kegler, Scott R
2007-02-16
Injury-related mortality rate estimates are often analyzed under the assumption that case counts follow a Poisson distribution. Certain types of injury incidents occasionally involve multiple fatalities, however, resulting in dependencies between cases that are not reflected in the simple Poisson model and which can affect even basic statistical analyses. This paper explores the compound Poisson process model as an alternative, emphasizing adjustments to some commonly used interval estimators for population-based rates and rate ratios. The adjusted estimators involve relatively simple closed-form computations, which in the absence of multiple-case incidents reduce to familiar estimators based on the simpler Poisson model. Summary data from the National Violent Death Reporting System are referenced in several examples demonstrating application of the proposed methodology.
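The key adjustment can be shown directly: under a compound Poisson model the variance of the case count C = sum of incident sizes X_i is lambda*E[X^2], estimated by the sum of squared incident sizes rather than by the case count itself. Incident data below are illustrative.

```python
# Sketch: compound Poisson variance adjustment for a population-based rate.
import math

incident_sizes = [1]*180 + [2]*8 + [4]*2   # mostly single-fatality incidents
population = 2_500_000
cases = sum(incident_sizes)

rate = 1e5 * cases / population                  # deaths per 100,000
var_cases = sum(x * x for x in incident_sizes)   # compound Poisson variance estimate
se_rate = 1e5 * math.sqrt(var_cases) / population
print(f"rate {rate:.2f} +/- {1.96*se_rate:.2f}  (a simple Poisson SE would use {cases})")
```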
Discrete transparent boundary conditions for the mixed KDV-BBM equation
NASA Astrophysics Data System (ADS)
Besse, Christophe; Noble, Pascal; Sanchez, David
2017-09-01
In this paper, we consider artificial boundary conditions for the linearized mixed Korteweg-de Vries (KdV) and Benjamin-Bona-Mahony (BBM) equation, which models water waves in the small amplitude, large wavelength regime. Continuous (respectively discrete) artificial boundary conditions involve non-local operators in time, which in turn requires computing time convolutions and inverting the Laplace transform of an analytic function (respectively the Z-transform of a holomorphic function). In this paper, we propose a new, stable and fairly general strategy to carry out this crucial step in the design of transparent boundary conditions. For large-time simulations, we also introduce a methodology based on the asymptotic expansion of the coefficients involved in exact discrete transparent boundary conditions. We illustrate the accuracy of our methods for Gaussian and wave packet initial data.
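The Z-transform inversion at the heart of the discrete conditions can be sketched numerically: sample the symbol on a circle |z| = r and apply an inverse FFT (a trapezoidal rule for the Cauchy integral). The toy symbol below is not the KdV-BBM kernel.

```python
# Sketch: recover coefficients a_n of f(z) = sum_n a_n z^{-n} numerically.
import numpy as np

def inverse_z(f, n_coeffs, r=1.2, N=1024):
    k = np.arange(N)
    z = r * np.exp(2j * np.pi * k / N)        # sample points on |z| = r
    c = np.fft.ifft(f(z))                     # (1/N) sum_k f(z_k) e^{+2*pi*i*k*n/N}
    n = np.arange(n_coeffs)
    return (c[:n_coeffs] * r**n).real         # a_n ~= r^n * ifft(f)[n]

f = lambda z: 1.0 / (1.0 - 0.5 / z)           # Z-transform of a_n = 0.5**n
print(inverse_z(f, 5))                        # ~ [1, 0.5, 0.25, 0.125, 0.0625]
```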
76 FR 62068 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-06
... methodological objectives. The first objective is to test the feasibility of the proposed sampling frame and to... minutes. Results of the methodological component of the feasibility study will be used to assess the...
NASA Astrophysics Data System (ADS)
Vazquez Rascon, Maria de Lourdes
This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods, multicriteria decision aid (MCDA) and participatory geographic information systems (GIS), making it possible to represent these value systems by criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of environmental, economic and socio-cultural aspects of wind energy projects and, thus, a vision of sustainable wind project development. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were set to ensure a logical progression of the work and the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology in a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective yielded a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of a co-constructed decision, with all stakeholders at the focus of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows for the analysis of different wind turbine implementation scenarios in order to choose the best one through a participatory and transparent decision-making process that takes stakeholders' concerns into account. The second objective enabled the testing of TIMED in an ex-post study of a wind farm in operation since 2006. In this test, 11 people participated, representing four stakeholder categories: the private sector, the public sector, experts and civil society. The test allowed us to analyze the current situation in which wind projects are developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through the third objective, which involved simulations taking into account strategic-level assumptions, such as the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes socio-cultural, environmental and economic variables into account; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; using the proposed methodology for renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy project development, renewable energy, social participation, robustness analysis, SWOT analysis.
Niaksu, Olegas; Zaptorius, Jonas
2014-01-01
This paper presents a methodology suitable for the creation of a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased performance criteria weighting, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluating the outcomes of the motivational system was proposed. The described methodology for calculating performance-related payment needs practical validation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step would be validation of the methodology in a healthcare facility.
D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco
2016-02-01
Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
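A minimal sketch of the standard-cost-times-actual-quantity idea and the resulting variance analysis; the cost figures, quantities and currency below are invented for illustration:

```python
# Full collection cost from standard unit costs and actual quantities,
# plus a simple variance against actual recorded costs (illustrative figures).
standard_cost_per_tonne = {"separate": 95.0, "undifferentiated": 60.0}  # EUR/t, assumed
actual_tonnes = {"separate": 1200.0, "undifferentiated": 3400.0}
actual_recorded_cost = {"separate": 123_500.0, "undifferentiated": 198_000.0}

for waste_type, qty in actual_tonnes.items():
    standard_cost = standard_cost_per_tonne[waste_type] * qty
    variance = actual_recorded_cost[waste_type] - standard_cost
    flag = "over standard" if variance > 0 else "under standard"
    print(f"{waste_type}: standard {standard_cost:,.0f} EUR, "
          f"variance {variance:+,.0f} EUR ({flag})")
```

Because the benchmark uses standard rather than firm-specific costs, the variance isolates operational performance from accounting and purchasing choices, which is the point made in the abstract.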
García-Madruga, Juan A.; Gómez-Veiga, Isabel; Vila, José Ó.
2016-01-01
In this paper, we propose a preliminary theory of executive functions that addresses in a specific way their relationship with working memory (WM) and higher-level cognition. It includes: (a) four core on-line WM executive functions that are involved in every novel and complex cognitive task; (b) two higher-order off-line executive functions, planning and revision, that are required for solving the most complex intellectual tasks; and (c) emotional control, which is involved in any complex, novel and difficult task. The main assumption is that efficiency in thinking abilities may be improved by specific instruction or training on the executive functions necessary for solving the novel and complex tasks involved in these abilities. Evidence for the impact of our training proposal on the WM executive functions involved in higher-level cognitive abilities comes from three studies applying an adaptive program designed to improve reading comprehension in primary school students by boosting the core WM executive functions involved in it: focusing on relevant information, switching (or shifting) between representations or tasks, connecting incoming information from text with long-term representations, updating the semantic representation of the text in WM, and inhibiting irrelevant information. The results are consistent with the assumption that cognitive enhancements from the training intervention may have affected not only a specific but also a more domain-general mechanism involved in various executive functions. We discuss some methodological issues in the studies of the effects of WM training on reading comprehension. The perspectives and limitations of our approach are finally discussed. PMID:26869961
Extending the Reach of Evidence-Based Medicine: A Proposed Categorization of Lower-Level Evidence.
Detterbeck, Frank C; Gould, Michael K; Lewis, Sandra Zelman; Patel, Sheena
2018-02-01
Clinical practice involves making many treatment decisions for which only limited formal evidence exists. While the methodology of evidence-based medicine (EBM) has evolved tremendously, there is a need to better characterize lower-level evidence. This should enhance the ability to appropriately weigh the evidence against other considerations, and counter the temptation to think it is more robust than it actually is. A framework to categorize lower-level evidence is proposed, consisting of nonrandomized comparisons, extrapolation using indirect evidence, rationale, and clinical experience (ie, an accumulated general impression). Subtypes are recognized within these categories, based on the degree of confounding in nonrandomized comparisons, the uncertainty involved in extrapolation from indirect evidence, and the plausibility of a rationale. Categorizing the available evidence in this way can promote a better understanding of the strengths and limitations of using such evidence as the basis for treatment decisions in clinically relevant areas that are devoid of higher-level evidence. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation
NASA Astrophysics Data System (ADS)
Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah
2016-06-01
Bayesian Networks (BNs) have been regarded as a successful representation of the inter-relationships among factors affecting human behaviour during an emergency. This paper extends earlier work on quantifying the variables in a BN model of human behaviour during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the experts' burden in providing precise probability values, a new approach to elicitation is required. This study proposes a new fuzzy BN approach for quantifying human behaviour during an evacuation. The methodology involves three major phases: 1) development of a qualitative model representing human factors during an evacuation, 2) quantification of the BN model using fuzzy probabilities, and 3) inference and interpretation of the BN results. A case study involving three inter-dependent human evacuation factors, namely danger assessment ability, information about the threat and stressful conditions, illustrates the application of the proposed method. This approach can serve as an alternative to the conventional probability elicitation technique for understanding human behaviour during an evacuation.
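A minimal sketch of one common way to turn expert judgments expressed as triangular fuzzy probabilities into crisp values usable in a standard BN engine (centroid defuzzification). The node names follow the case study, but the numbers and the specific fuzzy arithmetic are assumptions for illustration, not the paper's:

```python
# Expert judgments as triangular fuzzy probabilities (low, mode, high) for
# P(stress = high | danger_assessment, threat_information). Illustrative values.
fuzzy_cpt = {
    ("good", "available"):   (0.05, 0.10, 0.20),
    ("good", "unavailable"): (0.20, 0.35, 0.50),
    ("poor", "available"):   (0.30, 0.45, 0.60),
    ("poor", "unavailable"): (0.60, 0.75, 0.90),
}

def centroid(tri):
    """Defuzzify a triangular fuzzy number by its centroid."""
    low, mode, high = tri
    return (low + mode + high) / 3.0

# Crisp conditional probability table usable in any standard BN engine.
for parents, tri in fuzzy_cpt.items():
    p = centroid(tri)
    print(parents, f"P(stress=high)={p:.3f}", f"P(stress=low)={1 - p:.3f}")
```

Letting experts state a (low, mode, high) range instead of a single number is what reduces the elicitation burden and judgment bias that the abstract highlights.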
Proposed Objective Odor Control Test Methodology for Waste Containment
NASA Technical Reports Server (NTRS)
Vos, Gordon
2010-01-01
The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations. IAA is a clear, colorless liquid with a banana-like odor, a documented detectable smell threshold for humans of 0.025 PPM, and a 15 PPB limit of quantitation.
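A small sketch of the quantitative pass/fail logic suggested by the respirator-fit analogy, using the two documented IAA figures from the abstract; the containment-factor definition itself is an assumption for illustration:

```python
# Illustrative pass/fail logic built on the respirator-fit analogy: release a
# known IAA concentration inside the bag, measure outside, and compare the
# leak concentration with the documented human odor threshold.
ODOR_THRESHOLD_PPM = 0.025  # documented IAA smell threshold for humans
LOQ_PPM = 0.015             # 15 PPB limit of quantitation

def containment_ok(inside_ppm, outside_ppm):
    """A bag 'contains' odor if the leak concentration stays below the human
    odor threshold; readings below the LOQ are treated as non-detects. The
    containment factor mirrors a respirator fit factor (higher is better)."""
    measured = max(outside_ppm, LOQ_PPM)   # cannot quantify below the LOQ
    containment_factor = inside_ppm / measured
    return outside_ppm < ODOR_THRESHOLD_PPM, containment_factor

print(containment_ok(inside_ppm=100.0, outside_ppm=0.012))  # (True, ~6667)
```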
NASA Astrophysics Data System (ADS)
Rana, Sachin; Ertekin, Turgay; King, Gregory R.
2018-05-01
Reservoir history matching is frequently viewed as an optimization problem which involves minimizing the misfit between simulated and observed data. Many gradient and evolutionary-strategy based optimization algorithms have been proposed to solve this problem, typically requiring a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study, which uses forward and inverse Gaussian process (GP) based proxy models combined with a novel application of variogram analysis of response surface (VARS) based sensitivity analysis to efficiently solve high-dimensional history matching problems. An empirical Bayes approach is proposed to optimally train GP proxy models for any given data. History matching solutions are found iteratively, via Bayesian optimization (BO) on the forward GP models and via predictions of the inverse GP model. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology to the PUNQ-S3 reservoir is presented, in which it is shown that GP-VARS finds history-match solutions using approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true values for the PUNQ-S3 reservoir.
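A compact sketch of the forward-proxy half of such a workflow, assuming scikit-learn: fit a GP to observed misfits and pick the next simulation by expected improvement. The toy misfit function stands in for the reservoir simulator; the inverse-GP and VARS components of the paper are omitted:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def misfit(x):
    """Stand-in for an expensive reservoir simulation run."""
    return np.sum((x - 0.3) ** 2, axis=-1)

X = rng.uniform(0, 1, size=(10, 3))   # initial design ("simulations")
y = misfit(X)

for _ in range(20):                   # Bayesian optimization loop on the proxy
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    cand = rng.uniform(0, 1, size=(2000, 3))
    mu, sd = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sd, 1e-12)
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, misfit(x_next))

print("best misfit after BO:", y.min())
```

Each loop iteration costs one "simulation", which is where the roughly fourfold saving over population-based methods such as DE comes from.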
Bardy, Fabrice; Dillon, Harvey; Van Dun, Bram
2014-04-01
Rapid presentation of stimuli in an evoked response paradigm can lead to overlap of multiple responses and consequently to difficulties interpreting waveform morphology. This paper presents a deconvolution method allowing overlapping multiple responses to be disentangled. The deconvolution technique uses a least-squares error approach. A methodology is proposed to optimize the stimulus sequence associated with the deconvolution technique under low-jitter conditions; it controls the condition number of the matrices involved in recovering the responses. Simulations were performed using the proposed deconvolution technique. Multiple overlapping responses can be recovered perfectly in noiseless conditions. In the presence of noise, the amount of error introduced by the technique can be controlled a priori through the condition number of the matrix associated with the stimulus sequence used. The simulation results indicate the need for a minimum amount of jitter, as well as a sufficient number of overlap combinations, to obtain optimum results. An aperiodic model is recommended to improve reconstruction. We propose a deconvolution technique allowing multiple overlapping responses to be extracted and a method of choosing the stimulus sequence optimal for response recovery. This technique may allow audiologists, psychologists, and electrophysiologists to optimize their experimental designs involving rapidly presented stimuli, and to recover overlapping evoked responses. Copyright © 2013 International Federation of Clinical Neurophysiology. All rights reserved.
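The conditioning idea can be made concrete with a small sketch: build the convolution (design) matrix implied by a jittered stimulus sequence, inspect its condition number, and recover the overlapping response by least squares. The onset values are illustrative, not the paper's optimized sequences:

```python
import numpy as np

L = 60                               # evoked-response length (samples)
onsets = [0, 17, 35, 52, 71, 90]     # jittered stimulus onsets (illustrative)
T = onsets[-1] + L                   # recording length

# Design matrix of the overlapped recording: y = A @ r, where r is the
# single-trial response. Its conditioning depends on the onset sequence.
A = np.zeros((T, L))
for t0 in onsets:
    for k in range(min(L, T - t0)):
        A[t0 + k, k] += 1.0

print("condition number:", np.linalg.cond(A))

# Simulate an overlapped, noisy recording and recover the response.
true_r = np.hanning(L)
y = A @ true_r + 0.01 * np.random.default_rng(1).standard_normal(T)
r_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print("max recovery error:", np.abs(r_hat - true_r).max())
```

With strictly periodic onsets the columns of A become nearly dependent and the condition number explodes, which is why the paper requires a minimum amount of jitter.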
NASA Astrophysics Data System (ADS)
Subagadis, Yohannes Hagos; Schütze, Niels; Grundmann, Jens
2014-05-01
The growing interconnectedness of hydro-environmental and socio-economic systems poses profound challenges for water management decision making. In this contribution, we present a fuzzy stochastic approach to solve a set of decision-making problems involving hydrologically, environmentally, and socio-economically motivated criteria subject to uncertainty and ambiguity. The proposed methodological framework combines objective and subjective criteria in a decision-making procedure for obtaining an acceptable ranking of water resources management alternatives under different types of uncertainty (subjective/objective) and heterogeneous information (quantitative/qualitative) simultaneously. The first step of the proposed approach involves evaluating the performance of alternatives with respect to the different types of criteria. The ratings of alternatives with respect to objective and subjective criteria are evaluated by simulation-based optimization and fuzzy linguistic quantifiers, respectively. Subjective and objective uncertainties related to the input information are handled by linking fuzziness and randomness together: fuzzy decision making captures the linguistic uncertainty, and a Monte Carlo simulation process is used to map the stochastic uncertainty. Within this framework, the overall performance of each alternative is calculated using an Ordered Weighted Averaging (OWA) aggregation operator accounting for decision makers' experience and opinions. Finally, ranking is achieved by pair-wise comparison of management alternatives, on the basis of the risk defined by the probability of obtaining an acceptable ranking and the mean difference in total performance for each pair of management alternatives. The proposed methodology is tested on a real-world hydrosystem, to find effective and robust intervention strategies for the management of a coastal aquifer system affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. The results show that the approach gives useful support for robust decision-making and is sensitive to the decision makers' degree of optimism.
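The OWA aggregation step is easy to state concretely. A minimal sketch with invented criterion scores; the weight vectors encode an optimistic versus a pessimistic decision maker, which is exactly the "degree of optimism" the results are sensitive to:

```python
import numpy as np

def owa(scores, weights):
    """Ordered Weighted Averaging: weights apply to the *sorted* scores
    (descending), not to fixed criteria, encoding the decision maker's
    degree of optimism."""
    s = np.sort(np.asarray(scores, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    assert len(w) == len(s) and np.isclose(w.sum(), 1.0)
    return float(s @ w)

# Criteria performance of one management alternative (illustrative, 0-1 scale).
scores = [0.82, 0.40, 0.65, 0.55]
print(owa(scores, [0.4, 0.3, 0.2, 0.1]))   # optimistic: emphasizes best scores
print(owa(scores, [0.1, 0.2, 0.3, 0.4]))   # pessimistic: emphasizes worst scores
```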
A quality evaluation methodology of health web-pages for non-professionals.
Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro
2004-06-01
This paper proposes an evaluation methodology for determining the quality of healthcare web sites aimed at disseminating medical information to non-professionals. Three (macro) factors are considered in the quality evaluation: medical content, accountability of the authors, and usability of the web site. Starting from two results in the literature, the problem of whether or not to introduce a weighting function was investigated. The methodology was validated on a specialized information content, sore throats, owing to the large interest such a topic enjoys with target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub-factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions derived from the literature allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score to medical web sites oriented to non-professionals.
Nanosatellite and Plug-and-Play Architecture 2 (NAPA 2)
2017-02-28
potentially other militarily relevant roles. The "i-Missions" focus area studies the kinetics of rapid mission development. The methodology involves...the US and Sweden in the Nanosatellite and Plug-and-play Architecture or "NAPA" program) is to pioneer a methodology for creating mission capable 6U...spacecraft. The methodology involves interchangeable blackbox (self-describing) components, software (middleware and applications), advanced
[The process of organizational culture formation in a philanthropic hospital].
Machado, Valéria Bertonha; Kurcgant, Paulina
2004-09-01
This study was carried out in a medium-sized philanthropic hospital in São Paulo, Brazil, with the aim of disclosing the cultural features of the institution. The methodology adopted was a qualitative study following the steps proposed by Thévenet: document analysis, interviews and observation. The analysis showed that when a new professional group starts working in an institution, it considers that some values must be changed. This change means restructuring the management of the organization and of the people involved in it, facing the conflict posed by changing or preserving the old system.
Teaching ethical analysis in occupational therapy.
Haddad, A M
1988-05-01
Ethical decision making is a cognitive skill requiring education in ethical principles and an understanding of specific ethical issues. It is also a psychodynamic process involving personalities, values, opinions, and perceptions. This article proposes the use of case studies and role-playing techniques in teaching ethics in occupational therapy to supplement conventional methods of presenting ethical theories and principles. These two approaches invite students to discuss and analyze crucial issues in occupational therapy from a variety of viewpoints. Methodologies for developing case studies and role-playing exercises are discussed. The techniques are evaluated and their application to the teaching of ethics is examined.
The Design and Development of BMI Calc Android Application
NASA Astrophysics Data System (ADS)
Mohd Ali, Iliana; Samsudin, Nooraida
2016-11-01
Body mass index (BMI) is a familiar term for those who are weight conscious; it lets users gauge overall body composition in terms of fat. The body mass index calculators currently available, whether online or on the Play Store, do not provide Malaysian meal suggestions. Hence, this paper proposes a body mass index calculator application that includes Malaysian meal suggestions. The objectives of the study are to design and develop the BMI Calc Android application for calculating body mass index while embedding a meal suggestion module. The design and methodology involved in the process are also presented.
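The underlying calculation is simple enough to sketch. The app itself is Android-based, so the Python fragment below only illustrates the formula and the WHO adult category cut-offs; the category-to-meal mapping is hypothetical, since the abstract does not describe the app's actual meal database:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    """WHO adult cut-offs."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

# Hypothetical category-to-meal mapping standing in for the app's module.
MEALS = {
    "underweight": "nasi lemak with extra protein",
    "normal": "nasi goreng, moderate portion",
    "overweight": "grilled ikan bakar with vegetables",
    "obese": "clear sup sayur with reduced-carbohydrate sides",
}

v = bmi(weight_kg=70.0, height_m=1.65)
print(f"BMI = {v:.1f} ({bmi_category(v)}); suggestion: {MEALS[bmi_category(v)]}")
```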
Wavelet transform analysis of dynamic speckle patterns texture
NASA Astrophysics Data System (ADS)
Limia, Margarita Fernandez; Nunez, Adriana Mavilio; Rabal, Hector; Trivi, Marcelo
2002-11-01
We propose the use of the wavelet transform to characterize the time evolution of dynamic speckle patterns, illustrating it with a method used to assess the drying of paint. Optimal texture features are determined, and the time evolution is described in terms of the Mahalanobis distance to the final (dry) state. From the behavior of this distance function, two parameters are defined that characterize the evolution. Because detailed knowledge of the dynamics involved is not required, the methodology could be applied to other complex or poorly understood dynamic phenomena.
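A minimal sketch of the distance computation at the heart of the method, assuming the texture features have already been extracted; synthetic numbers stand in for wavelet-subband features:

```python
import numpy as np

rng = np.random.default_rng(0)

# Texture feature vectors (e.g., wavelet-subband energies) for the reference
# "dry" state; frames during drying drift toward this cluster (synthetic data).
dry = rng.normal(0.0, 1.0, size=(200, 4))
mu = dry.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(dry, rowvar=False))

def mahalanobis(x):
    """Distance of a frame's feature vector to the dry-state distribution."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# As the paint dries, the distance to the final state decays.
for t, offset in enumerate([4.0, 2.0, 1.0, 0.5, 0.1]):
    frame_features = mu + offset            # stand-in for a wet-frame vector
    print(f"t={t}: distance to dry state = {mahalanobis(frame_features):.2f}")
```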
2013-01-01
Background Understanding the function of a particular gene under various stresses is important for engineering plants for broad-spectrum stress tolerance. Although virus-induced gene silencing (VIGS) has been used to characterize genes involved in abiotic stress tolerance, currently available gene silencing and stress imposition methodology at the whole plant level is not suitable for high-throughput functional analyses of genes. This demands a robust and reliable methodology for characterizing genes involved in abiotic and multi-stress tolerance. Results Our methodology employs VIGS-based gene silencing in leaf disks combined with simple stress imposition and effect quantification methodologies for easy and faster characterization of genes involved in abiotic and multi-stress tolerance. By subjecting leaf disks from gene-silenced plants to various abiotic stresses and inoculating silenced plants with various pathogens, we show the involvement of several genes for multi-stress tolerance. In addition, we demonstrate that VIGS can be used to characterize genes involved in thermotolerance. Our results also showed the functional relevance of NtEDS1 in abiotic stress, NbRBX1 and NbCTR1 in oxidative stress; NtRAR1 and NtNPR1 in salinity stress; NbSOS1 and NbHSP101 in biotic stress; and NtEDS1, NbETR1, NbWRKY2 and NbMYC2 in thermotolerance. Conclusions In addition to widening the application of VIGS, we developed a robust, easy and high-throughput methodology for functional characterization of genes involved in multi-stress tolerance. PMID:24289810
Porras, Mauricio A; Villar, Marcelo A; Cubitto, María A
2018-05-01
The presence of intracellular polyhydroxyalkanoates (PHAs) is usually studied using a Sudan black dye solution (SB). In a previous work it was shown that PHA can be directly quantified using the absorbance of SB fixed by PHA granules in wet cell samples. In the present paper, the optimum SB amount and the optimum conditions for SB assays were determined following an experimental design by hybrid response surface methodology and a desirability function. In addition, a new methodology was developed in which it is shown that the amount of SB fixed by PHA granules can also be determined indirectly, through the absorbance of the supernatant obtained from the stained cell samples. This alternative methodology allows a faster determination of the PHA content (23 min for the indirect versus 42 min for the direct determination) and can be undertaken with basic laboratory equipment and reagents. The correlation between PHA content in wet cell samples and the spectra of the SB-stained supernatant was determined by means of multivariate and linear regression analysis. The good calibration adjustment (R² = 0.91, RSE = 1.56%) and the good PHA prediction obtained (RSE = 1.81%) show that the proposed methodology constitutes a reasonably precise way to determine PHA content. Thus, this methodology can anticipate the probable results of the direct PHA determination mentioned above. Compared with the techniques most used in the scientific literature, the combined implementation of these two methodologies appears to be one of the most economical and environmentally friendly, suitable for rapid monitoring of the intracellular PHA content. Copyright © 2018 Elsevier B.V. All rights reserved.
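A sketch of the calibration step, assuming a simple linear model between supernatant absorbance and PHA content; the numbers are illustrative, not the paper's data:

```python
import numpy as np

# Supernatant absorbance of unfixed Sudan black vs. known PHA content
# (% of cell dry weight): the more dye the granules fix, the less remains
# in the supernatant. Illustrative calibration points.
absorbance = np.array([0.92, 0.81, 0.67, 0.55, 0.41, 0.30])
pha_percent = np.array([5.0, 12.0, 21.0, 30.0, 41.0, 50.0])

slope, intercept = np.polyfit(absorbance, pha_percent, 1)

def predict_pha(a_supernatant):
    """Predict PHA content from a supernatant absorbance reading."""
    return slope * a_supernatant + intercept

r = np.corrcoef(absorbance, pha_percent)[0, 1]
print(f"R^2 = {r**2:.3f}; predicted PHA at A=0.60: {predict_pha(0.60):.1f}%")
```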
Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site
NASA Astrophysics Data System (ADS)
Albarello, D.; Mucciarelli, M.
A new approach is proposed for seismic hazard estimation based on documentary data concerning the local history of seismic effects. The adopted methodology allows for the use of "poor" data, such as macroseismic observations, within a formally coherent approach that overcomes a number of problems connected with forcing the available information into the frame of "standard" methodologies calibrated on instrumental data. The proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.
A new approach to subjectively assess quality of plenoptic content
NASA Astrophysics Data System (ADS)
Viola, Irene; Řeřábek, Martin; Ebrahimi, Touradj
2016-09-01
Plenoptic content is becoming increasingly popular thanks to the availability of acquisition and display devices. Image-based rendering techniques allow plenoptic content to be rendered in real time in an interactive manner, enabling virtual navigation through the captured scenes. This way of consuming content enables new experiences, and therefore introduces several challenges in terms of plenoptic data processing, transmission and, consequently, visual quality evaluation. In this paper, we propose a new methodology to subjectively assess the visual quality of plenoptic content, and we introduce a prototype software to perform subjective quality assessment according to the proposed methodology. The proposed methodology is further applied to assess the visual quality of a light field compression algorithm. Results show that this methodology can be successfully used to assess the visual quality of plenoptic content.
Approximate furrow infiltration model for time-variable ponding depth
USDA-ARS?s Scientific Manuscript database
A methodology is proposed for estimating furrow infiltration under time-variable ponding depth conditions. The methodology approximates the solution to the two-dimensional Richards equation, and is a modification of a procedure that was originally proposed for computing infiltration under constant ...
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
Debaveye, Sam; De Soete, Wouter; De Meester, Steven; Vandijck, Dominique; Heirman, Bert; Kavanagh, Shane; Dewulf, Jo
2016-01-01
The effects of a pharmaceutical treatment have until now been evaluated by the field of Health Economics in terms of the patient health benefits, expressed in Quality-Adjusted Life Years (QALYs), versus the monetary costs. However, there is also a Human Health burden associated with this process, resulting from emissions that originate from the pharmaceutical production processes, Use Phase and End of Life (EoL) disposal of the medicine. This Human Health burden is evaluated by the research field of Life Cycle Assessment (LCA) and expressed in Disability-Adjusted Life Years (DALYs), a metric similar to the QALY. The need therefore arises for a new framework in which both the positive and negative health effects of a pharmaceutical treatment are integrated into a net Human Health effect. To this end, this article reviews the methodologies of both Health Economics and the area of protection Human Health of the LCA methodology, and proposes a conceptual framework on which to base an integration of both health effects. Methodological issues such as the inclusion of future costs and benefits, discounting and age weighting are discussed. It is suggested to use the structure of an LCA as a backbone to cover all methodological challenges involved in the integration. The possibility of monetizing both Human Health benefits and burdens is explored. The suggested approach covers the main methodological aspects that should be considered in an integrated assessment of the health effects of a pharmaceutical treatment. Copyright © 2015 Elsevier Inc. All rights reserved.
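The integration can be sketched as simple arithmetic once one assumes that QALYs and DALYs are commensurable one-to-one and that a common discount rate applies, which are precisely the methodological choices the framework must justify; all figures below are invented:

```python
def discounted(values, rate=0.03):
    """Present value of a stream of yearly health effects (3% assumed rate)."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values))

# Illustrative treatment: health gained by the patient (QALYs/year) versus
# population health burden from production/use/disposal emissions (DALYs/year).
qaly_gain_per_year = [0.20, 0.20, 0.15, 0.10]
daly_burden_per_year = [0.03, 0.03, 0.02, 0.02]

net_health = discounted(qaly_gain_per_year) - discounted(daly_burden_per_year)
print(f"net human health effect ~ {net_health:.2f} QALY-equivalents")
```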
Modeling Single-Event Transient Propagation in a SiGe BiCMOS Direct-Conversion Receiver
NASA Astrophysics Data System (ADS)
Ildefonso, Adrian; Song, Ickhyun; Tzintzarov, George N.; Fleetwood, Zachary E.; Lourenco, Nelson E.; Wachter, Mason T.; Cressler, John D.
2017-08-01
The propagation of single-event transient (SET) signals in a silicon-germanium direct-conversion receiver carrying modulated data is explored. A theoretical analysis of transient propagation, verified by simulation, is presented. A new methodology to characterize and quantify the impact of SETs in communication systems carrying modulated data is proposed. The proposed methodology uses a pulsed radiation source to induce distortions in the signal constellation. The error vector magnitude due to SETs can then be calculated to quantify errors. Two different modulation schemes were simulated: QPSK and 16-QAM. The distortions in the constellation diagram agree with the presented circuit theory. Furthermore, the proposed methodology was applied to evaluate the improvements in the SET response due to a known radiation-hardening-by-design (RHBD) technique, where the common-base device of the low-noise amplifier was operated in inverse mode. The proposed methodology can be a valid technique to determine the most sensitive parts of a system carrying modulated data.
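The error-vector-magnitude computation at the core of the proposed methodology is standard and can be sketched directly; the QPSK symbols, noise level and transient shape below are illustrative:

```python
import numpy as np

def evm_percent(received, reference):
    """Error vector magnitude: RMS error between received and ideal symbols,
    normalized by the RMS power of the reference constellation."""
    err = received - reference
    return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) /
                           np.mean(np.abs(reference) ** 2))

rng = np.random.default_rng(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
ideal = qpsk[rng.integers(0, 4, size=5000)]

noise = 0.05 * (rng.standard_normal(5000) + 1j * rng.standard_normal(5000))
transient = np.zeros(5000, complex)
transient[2500:2520] = 0.6 * np.exp(1j * 0.8)   # short SET-induced distortion

print(f"EVM without SET: {evm_percent(ideal + noise, ideal):.2f}%")
print(f"EVM with SET:    {evm_percent(ideal + noise + transient, ideal):.2f}%")
```

Comparing the two figures quantifies the constellation distortion contributed by the transient alone, which is how the methodology separates SET effects from ordinary noise.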
NASA Astrophysics Data System (ADS)
Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay
2018-01-01
Successful applications of Diffusion Maps (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus yields more knowledge about the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to accommodate further dimension reduction methods. In this work, the incorporation of Laplacian Eigenmaps and Principal Component Analysis (PCA) is explored; the latter algorithm is named PCA-Dev and is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated. Improved results are reported by benchmarking against other machine learning algorithms.
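A bare-bones sketch of a diffusion-map embedding as commonly constructed (Gaussian affinities, row-normalized Markov matrix, leading non-trivial eigenvectors); the paper's DM-EVD adds degradation assessment on top of such an embedding, and normalization details vary across the literature:

```python
import numpy as np

def diffusion_map(X, eps=None, n_components=2, t=1):
    """Basic diffusion-map embedding (one common construction)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    if eps is None:
        eps = np.median(d2)                 # simple kernel-scale heuristic
    K = np.exp(-d2 / eps)                   # Gaussian affinity
    P = K / K.sum(axis=1, keepdims=True)    # row-normalized Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    idx = order[1:n_components + 1]         # skip the trivial eigenvalue 1
    return (vals.real[idx] ** t) * vecs.real[:, idx]

# Toy machine data: a healthy cluster plus a degraded cluster in 6-D.
rng = np.random.default_rng(3)
healthy = rng.normal(0.0, 0.1, size=(100, 6))
degraded = rng.normal(0.8, 0.1, size=(20, 6))
emb = diffusion_map(np.vstack([healthy, degraded]))
print(emb.shape)   # (120, 2): low-dimensional coordinates for monitoring
```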
Energy-efficient container handling using hybrid model predictive control
NASA Astrophysics Data System (ADS)
Xin, Jianbin; Negenborn, Rudy R.; Lodewijks, Gabriel
2015-11-01
The performance of container terminals needs to be improved to accommodate the growth in container volumes while maintaining sustainability. This paper provides a methodology for determining the trajectories of three key interacting machines carrying out the so-called bay handling task, which involves transporting containers between a vessel and the stacking area in an automated container terminal. The behaviours of the interacting machines are modelled as a collection of interconnected hybrid systems. Hybrid model predictive control (MPC) is proposed to achieve optimal performance, balancing handling capacity and energy consumption. The underlying control problem is hereby formulated as a mixed-integer linear programming problem. Simulation studies illustrate that a higher penalty on energy consumption indeed leads to improved sustainability through lower energy use, and show how the proposed energy-efficient hybrid MPC controller performs under different types of uncertainties.
Computational modeling of drug-resistant bacteria. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacDougall, Preston
2015-03-12
Initial proposal summary: The evolution of antibiotic-resistant mutants among bacteria (superbugs) is a persistent and growing threat to public health. In many ways, we are engaged in a war with these microorganisms, where the corresponding arms race involves chemical weapons and biological targets. Just as advances in microelectronics, imaging technology and feature recognition software have turned conventional munitions into smart bombs, the long-term objectives of this proposal are to develop highly effective antibiotics using next-generation biomolecular modeling capabilities in tandem with novel subatomic feature detection software. Using model compounds and targets, our design methodology will be validated with correspondingly ultra-high resolution structure-determination methods at premier DOE facilities (single-crystal X-ray diffraction at Argonne National Laboratory, and neutron diffraction at Oak Ridge National Laboratory). The objectives and accomplishments are summarized.
Porosity estimation of aged mortar using a micromechanical model.
Hernández, M G; Anaya, J J; Sanchez, T; Segura, I
2006-12-22
Degradation of concrete structures located in high-humidity atmospheres or under flowing water is a very important problem. In this study, a method for the ultrasonic non-destructive characterization of aged mortar is presented. The proposed method predicts the behaviour of aged mortar by means of a three-phase micromechanical model fed with ultrasonic measurements. Aging of the mortar was accelerated by immersing the samples in an ammonium nitrate solution. Both destructive and non-destructive characterization of the mortar was performed: destructive tests of porosity used a vacuum saturation method, and non-destructive characterization used ultrasonic velocities. The aging experiments show that mortar degradation involves not only a porosity increase, but also microstructural changes in the cement matrix. Experimental results show that the porosity estimated with the proposed non-destructive methodology performed comparably to classical destructive techniques.
ERIC Educational Resources Information Center
Lacey, John H.; Kelley-Baker, Tara; Voas, Robert B.; Romano, Eduardo; Furr-Holden, C. Debra; Torres, Pedro; Berning, Amy
2011-01-01
This article describes the methodology used in the 2007 U.S. National Roadside Survey to estimate the prevalence of alcohol- and drug-impaired driving and alcohol- and drug-involved driving. This study involved randomly stopping drivers at 300 locations across the 48 continental U.S. states at sites selected through a stratified random sampling…
Local deformation for soft tissue simulation
Omar, Nadzeri; Zhong, Yongmin; Smith, Julian; Gu, Chengfan
2016-01-01
This paper presents a new methodology to localize the deformation range in order to improve the computational efficiency of soft tissue simulation. The methodology identifies the local deformation range from the stress distribution in soft tissues due to an external force. A stress estimation method based on elastic theory is used to estimate the stress in soft tissues according to the depth from the contact surface. The proposed methodology can be used with both mass-spring and finite element modeling approaches for soft tissue deformation. Experimental results show that the proposed methodology can improve the computational efficiency while maintaining modeling realism. PMID:27286482
Writing a Research Proposal to The Research Council of Oman
Al-Shukaili, Ahmed; Al-Maniri, Abdullah
2017-01-01
Writing a research proposal can be a challenging task for young researchers. This article explains how to write a strong research proposal to apply for funding, specifically a proposal for The Research Council (TRC) of Oman. Three different research proposal application forms are currently used by TRC: the Open Research Grant (ORG), the Graduate Research Support Program (GRSP), and the Faculty-mentored Undergraduate Research Award Program (FURAP). The application forms are filled in and submitted electronically on the TRC website. Each proposal submitted to TRC goes through a rigorous reviewing and screening process. Novelty and originality of the research idea is the most crucial element in writing a research proposal. Performing an in-depth review of the literature will help you compose a good researchable question and generate a strong hypothesis. The development of a good hypothesis will offer insight into the specific objectives of a study. Research objectives should be focused, measurable, and achievable by a specific time using the most appropriate methodology. Moreover, it is essential to select a proper study design in line with the purpose of the study and the hypothesis. Furthermore, the social/economic impact and a reasonable budget for the proposed research are important criteria in TRC's evaluation of research proposals. Finally, ethical principles should be observed before writing a research proposal involving human or animal subjects. PMID:28584597
Automatic food intake detection based on swallowing sounds.
Makeyev, Oleksandr; Lopez-Meyer, Paulo; Schuckers, Stephanie; Besio, Walter; Sazonov, Edward
2012-11-01
This paper presents a novel fully automatic food intake detection methodology, an important step toward objective monitoring of ingestive behavior. The aim of such monitoring is to improve our understanding of eating behaviors associated with obesity and eating disorders. The proposed methodology consists of two stages. First, acoustic detection of swallowing instances based on mel-scale Fourier spectrum features and classification using support vector machines is performed. Principal component analysis and a smoothing algorithm are used to improve swallowing detection accuracy. Second, the frequency of swallowing is used as a predictor for detection of food intake episodes. The proposed methodology was tested on data collected from 12 subjects with various degrees of adiposity. Average accuracies of >80% and >75% were obtained for intra-subject and inter-subject models correspondingly with a temporal resolution of 30s. Results obtained on 44.1 hours of data with a total of 7305 swallows show that detection accuracies are comparable for obese and lean subjects. They also suggest feasibility of food intake detection based on swallowing sounds and potential of the proposed methodology for automatic monitoring of ingestive behavior. Based on a wearable non-invasive acoustic sensor the proposed methodology may potentially be used in free-living conditions.
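A compact sketch of the two classification ingredients named in the abstract, assuming scikit-learn and synthetic stand-in features: PCA plus an SVM for per-epoch swallowing detection, followed by a simple smoothing pass standing in for the paper's smoothing algorithm:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)

# Stand-in data: one 40-dimensional mel-spectrum feature vector per epoch,
# label 1 = swallow, 0 = non-swallow (real features come from the sensor).
X = rng.standard_normal((2000, 40))
y = (X[:, :5].mean(axis=1) > 0.3).astype(int)

clf = make_pipeline(StandardScaler(), PCA(n_components=15), SVC(kernel="rbf"))
clf.fit(X[:1500], y[:1500])
pred = clf.predict(X[1500:])
print("epoch-level accuracy:", (pred == y[1500:]).mean())

# Simple post-processing: majority vote over a 5-epoch sliding window of
# the binary decisions, to suppress isolated misclassifications.
smoothed = (np.convolve(pred, np.ones(5) / 5.0, mode="same") > 0.5).astype(int)
print("smoothed accuracy:", (smoothed == y[1500:]).mean())
```

In the real system the smoothed swallowing frequency, not the raw epoch decisions, is the predictor used for detecting food intake episodes.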
Transparency and public involvement in animal research.
Pound, Pandora; Blaug, Ricardo
2016-05-01
To be legitimate, research needs to be ethical, methodologically sound, of sufficient value to justify public expenditure and be transparent. Animal research has always been contested on ethical grounds, but there is now mounting evidence of poor scientific method, and growing doubts about its clinical value. So what of transparency? Here we examine the increasing focus on openness within animal research in the UK, analysing recent developments within the Home Office and within the main group representing the interests of the sector, Understanding Animal Research. We argue that, while important steps are being taken toward greater transparency, the legitimacy of animal research continues to be undermined by selective openness. We propose that openness could be increased through public involvement, and that this would bring about much needed improvements in animal research, as it has done in clinical research. 2016 FRAME.
Design Requirements for Communication-Intensive Interactive Applications
NASA Astrophysics Data System (ADS)
Bolchini, Davide; Garzotto, Franca; Paolini, Paolo
Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in profound coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement, to call for action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons-learned from direct experience, we draw on the concepts of brand, value, communication goals, information and persuasion requirements to systematically guide analysts to master the multifaceted connections of these elements as drivers to inform successful communication designs.
NASA Astrophysics Data System (ADS)
Zheng, Mingfang; He, Cunfu; Lu, Yan; Wu, Bin
2018-01-01
We present a numerical method to solve the phase dispersion curves of general anisotropic plates. The approach involves an exact solution to the problem in the form of Legendre polynomial expansions of multiple integrals, which we substitute into the state-vector formalism. To improve the efficiency of the proposed method, we take particular care in laying out the analytical methodology, and we analyze the algebraic symmetries of the matrices in the state-vector formalism for anisotropic plates. The basic feature of the proposed method is the expansion of the field quantities in Legendre polynomials, which avoids solving the transcendental dispersion equation, tractable only numerically. The state-vector formalism combined with the Legendre polynomial expansion distinguishes adjacent dispersion modes clearly, even when the modes are very close. We illustrate the theoretical dispersion-curve solutions obtained with this method for isotropic and anisotropic plates, and compare the proposed method with the global matrix method (GMM), finding excellent agreement.
Balakumar, Pitchai; Inamdar, Mohammed Naseeruddin; Jagadeesh, Gowraganahalli
2013-04-01
An interactive workshop on 'The Critical Steps for Successful Research: The Research Proposal and Scientific Writing' was conducted in conjunction with the 64th Annual Conference of the Indian Pharmaceutical Congress-2012 at Chennai, India. In essence, research is performed to enlighten our understanding of a contemporary issue relevant to the needs of society. To accomplish this, a researcher begins the search for a novel topic based on purpose, creativity, critical thinking, and logic. This leads to the fundamental pieces of the research endeavor: question, objective, hypothesis, experimental tools to test the hypothesis, methodology, and data analysis. When correctly performed, research should produce new knowledge. The four cornerstones of good research are a well-formulated protocol or proposal that is well executed, analyzed, discussed and concluded. This recent workshop educated researchers in the critical steps involved in the development of a scientific idea through to its successful execution and eventual publication.
Tucker, Conrad S; Behoora, Ishan; Nembhard, Harriet Black; Lewis, Mechelle; Sterling, Nicholas W; Huang, Xuemei
2015-11-01
Medication non-adherence is a major concern in the healthcare industry and has led to increases in health risks and medical costs. For many neurological diseases, adherence to medication regimens can be assessed by observing movement patterns. However, physician observations are typically assessed based on visual inspection of movement and are limited to clinical testing procedures. Consequently, medication adherence is difficult to measure when patients are away from the clinical setting. The authors propose a data mining driven methodology that uses low cost, non-wearable multimodal sensors to model and predict patients' adherence to medication protocols, based on variations in their gait. The authors conduct a study involving Parkinson's disease patients that are "on" and "off" their medication in order to determine the statistical validity of the methodology. The data acquired can then be used to quantify patients' adherence while away from the clinic. Accordingly, this data-driven system may allow for early warnings regarding patient safety. Using whole-body movement data readings from the patients, the authors were able to discriminate between PD patients on and off medication, with accuracies greater than 97% for some patients using an individually customized model and accuracies of 78% for a generalized model containing multiple patient gait data. The proposed methodology and study demonstrate the potential and effectiveness of using low cost, non-wearable hardware and data mining models to monitor medication adherence outside of the traditional healthcare facility. These innovations may allow for cost effective, remote monitoring of treatment of neurological diseases. Copyright © 2015 Elsevier Ltd. All rights reserved.
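A small sketch of the individualized-versus-generalized comparison described above, with synthetic gait features; the classifier choice (random forest) is an assumption, since the abstract does not name the specific data mining model:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)

def patient_data(shift):
    """Stand-in gait features per patient; label 1 = 'on', 0 = 'off' medication."""
    X_on = rng.normal(shift + 0.8, 1.0, size=(60, 12))
    X_off = rng.normal(shift, 1.0, size=(60, 12))
    return np.vstack([X_on, X_off]), np.r_[np.ones(60), np.zeros(60)]

patients = [patient_data(shift) for shift in rng.normal(0, 2.0, size=5)]

# Individually customized models: one classifier per patient.
for i, (X, y) in enumerate(patients):
    acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()
    print(f"patient {i}: individualized accuracy {acc:.2f}")

# Generalized model: pooled gait data from all patients.
Xg = np.vstack([X for X, _ in patients])
yg = np.concatenate([y for _, y in patients])
acc = cross_val_score(RandomForestClassifier(random_state=0), Xg, yg, cv=5).mean()
print(f"generalized accuracy {acc:.2f}")
```

The per-patient models typically score higher because between-patient gait variability is removed, mirroring the 97% individualized versus 78% generalized accuracies reported.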
Teaching methodologies to promote creativity in the professional skills related to optics knowledge
NASA Astrophysics Data System (ADS)
Fernández-Oliveras, Alicia; Fernandez, Paz; Peña-García, Antonio; Oliveras, Maria L.
2014-07-01
We present the methodologies proposed and applied in the context of a teaching-innovation project developed at the University of Granada, Spain. The main objective of the project is the implementation of teaching methodologies that promote creativity in the learning process and, subsequently, in the acquisition of professional skills. The project involves two subjects related to optics knowledge taught to undergraduate students: "Illumination Engineering" (Bachelor's degree in Civil Engineering) and "Optical and Optometric Instrumentation" (Bachelor's degree in Optics and Optometry). For the first subject, the activities of our project were carried out in the theoretical classes; for the second, the activities were designed for the laboratory sessions. In "Illumination Engineering" we applied the maieutic technique: the students were encouraged to establish relationships between the main applications of the subject and concepts apparently unrelated to the subject framework. By means of several examples, the students became aware of the importance of cross-curricular and lateral thinking. In "Optical and Optometric Instrumentation" we used a technique based on protocols of control and change. The modus operandi focused on prompting the students to adopt the role of professionals and to pose questions to themselves concerning the practical content of the subject from that professional role. This mechanism boosted the critical capacity and the independent-learning ability of the students. In this work, we describe both subject proposals in detail, together with the results of their application in the 2011-2012 academic year.
Mokel, Melissa Jennifer; Shellman, Juliette M
2013-01-01
Many instruments that measure religious involvement (a) contain unclear, poorly developed constructs; (b) lack methodological rigor in scale development; or (c) contain language and content culturally incongruent with the religious experiences of diverse ethnic groups. The primary aims of this review were to (a) synthesize the research on instruments designed to measure religious involvement, (b) evaluate the methodological quality of instruments that measure religious involvement, and (c) examine these instruments for conceptual congruency with African American religious involvement. An updated integrative research review method guided the process (Whittemore & Knafl, 2005). 152 articles were reviewed and 23 articles retrieved. Only 3 retained instruments were developed under methodologically rigorous conditions. All 3 instruments were congruent with a conceptual model of African American religious involvement. The Fetzer Multidimensional Measure of Religious Involvement and Spirituality (FMMRS; Idler et al., 2003) was found to have favorable characteristics. Further examination and psychometric testing is warranted to determine its acceptability, readability, and cultural sensitivity in an African American population.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-31
... Fishery Management Council (Council); Work Session To Review Proposed Salmon Methodology Changes AGENCY.... ACTION: Notice of a public meeting. SUMMARY: The Pacific Fishery Management Council's Salmon Technical Team (STT), Scientific and Statistical Committee (SSC) Salmon Subcommittee, and Model Evaluation...
Le Jeunne, C; Plétan, Y; Boissel, J P
2002-01-01
The Marketing Authorization (MA) granted to a new molecular entity does not allow for proper anticipation of its future positioning within the therapeutic strategy. A specific methodology should be devised as early as during the pre-MA development phase that could result in an initial positioning that should be subjected to further reappraisal with regard to scientific advances, the arrival of new treatments and further developments with this molecule. A methodology is thus proposed, based on early optimisation of the development plan, the granting of subsequent MAs, and reappraisal of the positioning within the strategy, based on analysis of all available data. It should be possible to take into account the economic context, within an agreed system with pre-defined medico-economic criteria. This may in turn raise the issue of the role of the various parties involved in this assessment, as well as how to understand the respective opinions of stakeholders: authorities, sponsors, prescribers and patients, each of whom has a specific view of the definition of the strategic objective that should apply to the disease concerned.
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Cellular neural network-based hybrid approach toward automatic image registration
NASA Astrophysics Data System (ADS)
Arun, Pattathal VijayaKumar; Katiyar, Sunil Kumar
2013-01-01
Image registration is a key component of various image processing operations that involve the analysis of different image data sets. Automatic image registration domains have witnessed the application of many intelligent methodologies over the past decade; however, the inability to properly model object shape as well as contextual information has limited the attainable accuracy. A framework for accurate feature shape modeling and adaptive resampling using advanced techniques such as vector machines, cellular neural networks (CNN), the scale invariant feature transform (SIFT), coresets, and cellular automata is proposed. The CNN has been found to be effective in improving the feature matching and resampling stages of registration, and the complexity of the approach has been considerably reduced using coreset optimization. The salient features of this work are CNN-based SIFT feature point optimization, adaptive resampling, and intelligent object modelling. The developed methodology has been compared with contemporary methods using different statistical measures. Investigations over various satellite images revealed that considerable success was achieved with the approach. The system dynamically uses spectral and spatial information to represent contextual knowledge via a CNN-Prolog approach. The methodology is also shown to be effective in providing intelligent interpretation and adaptive resampling.
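As a point of reference for the matching stage, the following minimal sketch implements the classical SIFT baseline that pipelines like this one build on: keypoint detection, ratio-test matching, and RANSAC homography estimation with OpenCV. It is not the authors' CNN/coreset method, and the file paths, ratio threshold, and RANSAC tolerance are illustrative assumptions.

```python
# Baseline feature-based registration: SIFT keypoints, Lowe ratio-test
# matching, RANSAC homography. Sketches only the classical stage that
# intelligent pipelines refine; not the CNN/coreset method of the paper.
import cv2
import numpy as np

def register(reference_path: str, sensed_path: str) -> np.ndarray:
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    mov = cv2.imread(sensed_path, cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(ref, None)
    kp_mov, des_mov = sift.detectAndCompute(mov, None)

    # Lowe's ratio test keeps only distinctive correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des_mov, des_ref, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])

    src = np.float32([kp_mov[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # RANSAC rejects outlier matches before estimating the global mapping.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # apply with cv2.warpPerspective(mov, H, ref.shape[::-1])
```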
Diagnostic radiograph based 3D bone reconstruction framework: application to the femur.
Gamage, P; Xie, S Q; Delmas, P; Xu, W L
2011-09-01
Three dimensional (3D) visualization of anatomy plays an important role in image guided orthopedic surgery and ultimately motivates minimally invasive procedures. However, direct 3D imaging modalities such as Computed Tomography (CT) are restricted to a minority of complex orthopedic procedures. Thus the diagnostics and planning of many interventions still rely on two dimensional (2D) radiographic images, where the surgeon has to mentally visualize the anatomy of interest. The purpose of this paper is to apply and validate a bi-planar 3D reconstruction methodology driven by prominent bony anatomy edges and contours identified on orthogonal radiographs. The results obtained through the proposed methodology are benchmarked against 3D CT scan data to assess the accuracy of reconstruction. The human femur has been used as the anatomy of interest throughout the paper. The novelty of this methodology is that it not only involves the outer contours of the bony anatomy in the reconstruction but also several key interior edges identifiable on radiographic images. Hence, this framework is not simply limited to long bones, but is generally applicable to a multitude of other bony anatomies as illustrated in the results section. Copyright © 2010 Elsevier Ltd. All rights reserved.
Yusuf, Afiqah; Elsabbagh, Mayada
2015-12-15
Identifying biomarkers for autism can improve outcomes for those affected by autism. Engaging the diverse stakeholders in the research process using community-based participatory research (CBPR) can accelerate biomarker discovery into clinical applications. However, there are limited examples of stakeholder involvement in autism research, possibly due to conceptual and practical concerns. We evaluate the applicability of CBPR principles to biomarker discovery in autism and critically review empirical studies adopting these principles. Using a scoping review methodology, we identified and evaluated seven studies using CBPR principles in biomarker discovery. The limited number of studies in biomarker discovery adopting CBPR principles coupled with their methodological limitations suggests that such applications are feasible but challenging. These studies illustrate three CBPR themes: community assessment, setting global priorities, and collaboration in research design. We propose that further research using participatory principles would be useful in accelerating the pace of discovery and the development of clinically meaningful biomarkers. For this goal to be successful we advocate for increased attention to previously identified conceptual and methodological challenges to participatory approaches in health research, including improving scientific rigor and developing long-term partnerships among stakeholders.
Mateo, Estibaliz; Sevillano, Elena
2018-07-01
In recent years, there has been a decrease in the number of medical professionals dedicated to a research career. There is evidence that students with research experience during their training acquire knowledge and skills that increase the probability of getting involved in research more successfully. In the Degree of Medicine (University of the Basque Country) the annual core subject 'Research Project' introduces students to research. The aim of this work was to implement a project-based learning methodology, with the students working on microbiology, and to analyse its results over time. Given an initial scenario, the students had to come up with a research idea related to medical microbiology and to carry out a research project, including writing a funding proposal, developing the experimental assays, and analysing and presenting their results at a congress organized by the University. Summative assessment was performed by both students and teachers. A satisfaction survey was carried out to gather the students' opinions. The overall results regarding classroom dynamics, learning outcomes and motivation after the implementation were favourable. Students reported a greater interest in research than they had before, and they would choose the project-based methodology over the traditional one.
Crespi, Francesco
2010-06-02
A dual probing methodology was implemented so that combined in vivo voltammetric (electrochemical) and in vivo electrophysiological analysis could be carried out concomitantly in two distinct brain regions of the same anaesthetized animal, i.e., a cell-body region such as the dorsal raphe nucleus (DRN) and a related terminal region such as the hippocampus, the frontal cortex, or the amygdala. In particular, this methodology allowed cell firing and neurotransmitter levels to be monitored concomitantly in the two regions. In addition, the dual probing methodology has been applied to verify the original proposal that a combined treatment with a potassium (SK) channel blocker such as apamin and an SSRI (i.e., fluoxetine) could overcome the slow onset of the SSRI upon central 5-HT activity, which may be related to the slow onset of its therapeutic action. Briefly, the effect of apamin, either alone or followed by fluoxetine, upon cell firing in the DRN (in vivo electrophysiology) and concomitantly upon 5-HT levels (in vivo voltammetry) in the amygdala (a forebrain structure involved in mood regulation and innervated by ascending 5-HT projections from the DRN) was studied. Copyright 2010 Elsevier B.V. All rights reserved.
Duarte, Elisabeth Carmen; Garcia, Leila Posenato; de Araújo, Wildo Navegantes; Velez, Maria P
2017-12-02
Zika infection during pregnancy (ZIKVP) is known to be associated with adverse outcomes. Studies on this matter involve both rare outcomes and rare exposures, and methodological choices are not straightforward. Cohort studies will surely offer more robust evidence, but their efficiency must be enhanced. We aim to contribute to the debate on sample selection strategies in cohort studies to assess outcomes associated with ZIKVP. A study can be statistically more efficient than another if its estimates are more accurate (precise and valid), even if the studies involve the same number of subjects. Sample size and specific design strategies can enhance or impair the statistical efficiency of a study, depending on how the subjects are distributed in subgroups pertinent to the analysis. In most ZIKVP cohort studies to date there is an a priori identification of the source population (pregnant women, regardless of their exposure status), which is then sampled or included in its entirety (census). Subsequently, the group of pregnant women is classified according to exposure (presence or absence of ZIKVP), respecting the exposed:unexposed ratio in the source population. We propose that the sample selection be done from the a priori identification of groups of pregnant women exposed and unexposed to ZIKVP. This method will allow for an oversampling (even 100%) of the pregnant women with ZIKVP and an optimized sampling from the general population of pregnant women unexposed to ZIKVP, saving resources in the unexposed group and improving the expected number of incident cases (outcomes) overall. We hope that this proposal will broaden the methodological debate on the improvement of statistical power and protocol harmonization of cohort studies that aim to evaluate the association between Zika infection during pregnancy and outcomes for the offspring, as well as those with similar objectives.
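The efficiency argument can be illustrated with a small Monte Carlo sketch. All numbers below (cohort size, exposure prevalence, outcome risks, sampling ratios) are illustrative assumptions, not values from the paper; the point is only that, for a fixed total cohort, oversampling the exposed group narrows the sampling variability of the log risk ratio.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2000                            # total cohort size (assumed)
p_exp = 0.05                        # ZIKVP prevalence in source population
risk_exp, risk_unexp = 0.10, 0.02   # outcome risks (assumed)

def log_rr_sd(n_exp, n_unexp):
    """Empirical SD of the log risk ratio over simulated cohorts."""
    est = []
    for _ in range(2000):
        a = rng.binomial(n_exp, risk_exp)       # outcomes among exposed
        c = rng.binomial(n_unexp, risk_unexp)   # outcomes among unexposed
        if a == 0 or c == 0:
            continue                            # RR undefined; skip replicate
        est.append(np.log((a / n_exp) / (c / n_unexp)))
    return np.std(est)

# (a) census/sample of source population: exposed:unexposed follows p_exp.
n_e = int(N * p_exp)
print("source-population design  :", log_rr_sd(n_e, N - n_e))
# (b) a priori exposure groups: oversample exposed to a 1:3 ratio.
print("exposure-stratified design:", log_rr_sd(N // 4, 3 * N // 4))
```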
Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
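To see why the nested baseline is computationally expensive, consider this minimal single-objective sketch in which every upper-level evaluation triggers a complete lower-level optimization. The toy objectives are assumptions chosen for illustration; the paper's hybrid evolutionary-cum-local-search algorithm and its multi-objective test problems are not reproduced here.

```python
# Naive nested procedure for a toy bilevel problem (single-objective):
# upper level  min_x F(x, y*(x)),  lower level  y*(x) = argmin_y f(x, y).
# Every upper-level evaluation solves the lower-level task from scratch,
# which is the costly baseline that evolutionary bilevel methods avoid.
import numpy as np
from scipy.optimize import minimize

def lower_level(x: float) -> float:
    # f(x, y) = (y - x**2)**2, so the exact follower response is y*(x) = x**2.
    res = minimize(lambda y: (y[0] - x**2) ** 2, x0=[0.0])
    return res.x[0]

def upper_level(x: np.ndarray) -> float:
    y = lower_level(x[0])          # inner optimization on every call
    return (x[0] - 1.0) ** 2 + (y - 1.0) ** 2

res = minimize(upper_level, x0=[0.0], method="Nelder-Mead")
print("x* ~", res.x[0], " y* ~", lower_level(res.x[0]))  # expect both near 1
```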
Accuracy and Calibration of Computational Approaches for Inpatient Mortality Predictive Modeling.
Nakas, Christos T; Schütz, Narayan; Werners, Marcus; Leichtle, Alexander B
2016-01-01
Electronic Health Record (EHR) data can be a key resource for decision-making support in clinical practice in the "big data" era. The complete database from early 2012 to late 2015 involving hospital admissions to Inselspital Bern, the largest Swiss University Hospital, was used in this study, involving over 100,000 admissions. Age, sex, and initial laboratory test results were the features/variables of interest for each admission, the outcome being inpatient mortality. Computational decision support systems were utilized for the calculation of the risk of inpatient mortality. We assessed the recently proposed Acute Laboratory Risk of Mortality Score (ALaRMS) model, and further built generalized linear models, generalized estimating equations, artificial neural networks, and decision tree systems for the predictive modeling of the risk of inpatient mortality. The Area Under the ROC Curve (AUC) for ALaRMS marginally corresponded to the anticipated accuracy (AUC = 0.858). Penalized logistic regression methodology provided a better result (AUC = 0.872). Decision tree and neural network-based methodology provided even higher predictive performance (up to AUC = 0.912 and 0.906, respectively). Additionally, decision tree-based methods can efficiently handle Electronic Health Record (EHR) data that have a significant amount of missing records (in up to >50% of the studied features) eliminating the need for imputation in order to have complete data. In conclusion, we show that statistical learning methodology can provide superior predictive performance in comparison to existing methods and can also be production ready. Statistical modeling procedures provided unbiased, well-calibrated models that can be efficient decision support tools for predicting inpatient mortality and assigning preventive measures.
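A hedged, synthetic-data sketch of the two modeling ingredients the abstract highlights: penalized (L2) logistic regression, which requires imputation first, and a gradient-boosted decision tree learner that natively tolerates the heavy missingness typical of EHR features. The data generator and hyperparameters are illustrative assumptions, not the Inselspital dataset or the study's exact models.

```python
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n, p = 5000, 10
X = rng.normal(size=(n, p))                    # stand-in laboratory values
logit = X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2] - 3.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # rare outcome (mortality)
X[rng.random(X.shape) < 0.4] = np.nan          # heavy missingness, as in EHRs

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Penalized (L2) logistic regression needs complete data: mean imputation.
mu = np.nanmean(Xtr, axis=0)
imp = lambda A: np.where(np.isnan(A), mu, A)
lr = LogisticRegression(C=1.0, max_iter=1000).fit(imp(Xtr), ytr)
print("penalized LR AUC :",
      roc_auc_score(yte, lr.predict_proba(imp(Xte))[:, 1]))

# Gradient-boosted trees split on NaN directly -- no imputation required.
gb = HistGradientBoostingClassifier(random_state=0).fit(Xtr, ytr)
print("boosted trees AUC:", roc_auc_score(yte, gb.predict_proba(Xte)[:, 1]))
```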
Identifiability of PBPK Models with Applications to ...
Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.
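A hedged numerical illustration of the problem: in the toy one-compartment model below, bioavailability F and volume V enter the observable only through the ratio F/V, so the sensitivity matrix of the outputs with respect to (F, V, CL) is rank-deficient and the three parameters are not locally identifiable. The model, dose, and sampling design are assumptions for illustration; this is not the paper's discrete-identifiability theorem.

```python
# Local identifiability check via the numerical rank of the sensitivity
# matrix J, with J[i, j] = d y(t_i) / d theta_j (finite differences).
import numpy as np

t = np.linspace(0.5, 24.0, 20)   # sampling times in hours (assumed design)
D = 100.0                        # administered dose (assumed)

def model(theta: np.ndarray) -> np.ndarray:
    F, V, CL = theta             # bioavailability, volume, clearance
    return (F * D / V) * np.exp(-(CL / V) * t)

def sensitivity_rank(theta: np.ndarray, h: float = 1e-6):
    y0 = model(theta)
    cols = []
    for k in range(theta.size):
        step = np.zeros_like(theta)
        step[k] = h * theta[k]               # relative perturbation
        cols.append((model(theta + step) - y0) / step[k])
    s = np.linalg.svd(np.column_stack(cols), compute_uv=False)
    return int(np.sum(s > s[0] * 1e-8)), s

rank, s = sensitivity_rank(np.array([0.8, 10.0, 2.0]))
print("numerical rank:", rank, "of 3")  # 2: only F/V and CL/V are estimable
```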
Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo
2013-01-01
Background: Because of the ethical and medico-legal aspects involved in training cutaneous surgical skills on living patients, human cadavers and living animals, the search for alternative and effective forms of training simulation is necessary. Aims: To propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. Materials and Methods: One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for the training simulation. Results: A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions and with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. Conclusion: A teaching methodology for the principles of cutaneous surgery using a chicken-skin bench model, which is versatile, portable, easy to assemble, and inexpensive, is an alternative and complementary option to the armamentarium of methods based on other bench models described. PMID:23723471
Combining EEG and eye movement recording in free viewing: Pitfalls and possibilities.
Nikolaev, Andrey R; Meghanathan, Radha Nila; van Leeuwen, Cees
2016-08-01
Co-registration of EEG and eye movement has promise for investigating perceptual processes in free viewing conditions, provided certain methodological challenges can be addressed. Most of these arise from the self-paced character of eye movements in free viewing conditions. Successive eye movements occur within short time intervals. Their evoked activity is likely to distort the EEG signal during fixation. Due to the non-uniform distribution of fixation durations, these distortions are systematic, survive across-trials averaging, and can become a source of confounding. We illustrate this problem with effects of sequential eye movements on the evoked potentials and time-frequency components of EEG and propose a solution based on matching of eye movement characteristics between experimental conditions. The proposal leads to a discussion of which eye movement characteristics are to be matched, depending on the EEG activity of interest. We also compare segmentation of EEG into saccade-related epochs relative to saccade and fixation onsets and discuss the problem of baseline selection and its solution. Further recommendations are given for implementing EEG-eye movement co-registration in free viewing conditions. By resolving some of the methodological problems involved, we aim to facilitate the transition from the traditional stimulus-response paradigm to the study of visual perception in more naturalistic conditions. Copyright © 2016 Elsevier Inc. All rights reserved.
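One concrete way to implement the proposed matching is to subsample epochs so that the fixation-duration histograms of two conditions agree bin by bin. The sketch below is a minimal version of that idea using synthetic durations and assumed bin edges; the authors' actual matching procedure may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(2)

def match_by_duration(dur_a, dur_b, bins):
    """Subsample each group per histogram bin down to the common count."""
    ia, ib = np.digitize(dur_a, bins), np.digitize(dur_b, bins)
    keep_a, keep_b = [], []
    for b in np.unique(np.concatenate([ia, ib])):
        a_idx, b_idx = np.flatnonzero(ia == b), np.flatnonzero(ib == b)
        n = min(len(a_idx), len(b_idx))
        keep_a.extend(rng.choice(a_idx, n, replace=False))
        keep_b.extend(rng.choice(b_idx, n, replace=False))
    return (np.array(sorted(keep_a), dtype=int),
            np.array(sorted(keep_b), dtype=int))

# Synthetic example: condition B has systematically longer fixations.
dur_a = rng.gamma(4.0, 60.0, 800)   # fixation durations in ms
dur_b = rng.gamma(5.0, 60.0, 900)
ka, kb = match_by_duration(dur_a, dur_b, bins=np.arange(0, 1200, 50))
print(dur_a.mean(), dur_b.mean())          # unmatched means differ
print(dur_a[ka].mean(), dur_b[kb].mean())  # matched means nearly agree
```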
Gay, J Rebecca; Korre, Anna
2009-07-01
The authors have previously published a methodology which combines quantitative probabilistic human health risk assessment and spatial statistical methods (geostatistics) to produce an assessment, incorporating uncertainty, of risks to human health from exposure to contaminated land. The model assumes a constant soil to plant concentration factor (CF(veg)) when calculating intake of contaminants. This model is modified here to enhance its use in a situation where CF(veg) varies according to soil pH, as is the case for cadmium. The original methodology uses sequential indicator simulation (SIS) to map soil concentration estimates for one contaminant across a site. A real, age-stratified population is mapped across the contaminated area, and intake of soil contaminants by individuals is calculated probabilistically using an adaptation of the Contaminated Land Exposure Assessment (CLEA) model. The proposed improvement involves not only the geostatistical estimation of the contaminant concentration, but also that of soil pH, which in turn leads to a variable CF(veg) estimate which influences the human intake results. The results presented demonstrate that taking pH into account can influence the outcome of the risk assessment greatly. It is proposed that a similar adaptation could be used for other combinations of soil variables which influence CF(veg).
Advanced engineering tools for design and fabrication of a custom nasal prosthesis
NASA Astrophysics Data System (ADS)
Oliveira, Inês; Leal, Nuno; Silva, Pedro; da Costa Ferreira, A.; Neto, Rui J.; Lino, F. Jorge; Reis, Ana
2012-09-01
Unexpected external defects resulting from neoplasms, burns, congenital malformations, trauma or other diseases, particularly when involving partial or total loss of an external organ, can be emotionally devastating. These defects can be restored with prostheses obtained by different techniques, materials and methods. The increase in patient numbers and cost constraints leads to the need to explore new techniques that can increase efficiency. The main goal of this project was to develop a fully engineering-based manufacturing process for soft-tissue prostheses that could provide faster and less expensive options in the manufacturing of customized prostheses, while being able to reproduce the highest degree of detail with the maximum comfort for the patient. Design/methodology/approach - This case report describes treatment using a silicone prosthesis with anatomic retention for an 80-year-old woman with a rhinectomy. The proposed methodology integrates non-contact structured light scanning, CT and reverse engineering with CAD/CAM and additive manufacturing technology. Findings - The proposed protocol showed encouraging results, since it proved to be a better solution for fabricating custom-made facial prostheses for asymmetrical organs than conventional approaches. The process allows the attainment of prostheses with minimum contact and discomfort for the patient, disclosing excellent results in terms of aesthetics, prosthesis retention, and time and resources consumed.
Wind speed time series reconstruction using a hybrid neural genetic approach
NASA Astrophysics Data System (ADS)
Rodriguez, H.; Flores, J. J.; Puig, V.; Morales, L.; Guerra, A.; Calderon, F.
2017-11-01
Currently, electric energy is used in practically all modern human activities. Most of the energy produced comes from fossil fuels, causing irreversible damage to the environment. Lately, there has been an effort by nations to produce energy using clean methods, such as solar and wind energy, among others. Wind energy is one of the cleanest alternatives. However, wind speed is not constant, making planning and operation of electric power systems a difficult activity. Knowing in advance the amount of raw material (wind speed) available for energy production allows us to estimate the energy to be generated by the power plant, supporting maintenance planning, operational management, and optimal operational cost. For these reasons, forecasting wind speed becomes a necessary task. The forecasting process involves the use of past observations of the variable to forecast (wind speed). To measure wind speed, weather stations use devices called anemometers, but due to poor maintenance, connection errors, or natural wear, they may produce false or missing data. In this work, a hybrid methodology is proposed that uses a compact genetic algorithm with an artificial neural network (ANN) to reconstruct wind speed time series. The proposed methodology reconstructs the time series using an ANN defined by a compact genetic algorithm.
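A minimal sketch of the reconstruction stage alone: a small MLP trained on lagged windows rolls its predictions forward across a simulated sensor outage. The synthetic series, the fixed topology, and the lag length are assumptions; in the paper the network definition is itself searched by the compact genetic algorithm, which is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
# Synthetic hourly wind speed: daily periodicity plus noise (stand-in data).
t = np.arange(24 * 120)
wind = 6 + 2.5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.8, t.size)

LAG = 24  # predict each sample from the previous 24 hours
X = np.stack([wind[i:i + LAG] for i in range(t.size - LAG)])
y = wind[LAG:]

# Fixed topology here; the paper's compact GA would search this definition.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
net.fit(X, y)

# Reconstruct a simulated 24-hour outage by rolling predictions forward.
series = wind.copy()
gap = slice(1500, 1524)
for i in range(gap.start, gap.stop):
    series[i] = net.predict(series[i - LAG:i].reshape(1, -1))[0]
print("RMSE over gap:", np.sqrt(np.mean((series[gap] - wind[gap]) ** 2)))
```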
[Social network analysis: a method to improve safety in healthcare organizations].
Marqués Sánchez, Pilar; González Pérez, Marta Eva; Agra Varela, Yolanda; Vega Núñez, Jorge; Pinto Carral, Arrate; Quiroga Sánchez, Enedina
2013-01-01
Patient safety depends on the culture of the healthcare organization, which involves relationships between professionals. This article proposes that the study of these relations should be conducted from a network perspective, using a methodology called Social Network Analysis (SNA). This methodology comprises a set of mathematical constructs grounded in graph theory. With SNA we can characterize an individual's position in the network (centrality) or the cohesion among team members. Thus, SNA makes it possible to study aspects related to safety, such as the kinds of links that can increase commitment among professionals, how to build those links, which nodes have more prestige in the team in generating confidence or a collaborative network, and which professionals serve as intermediaries between the subgroups of a team to transmit information or smooth conflicts; all of these are useful aspects in establishing a safety culture. SNA can analyze the relations among professionals, their level of communication in reporting errors and spontaneously seeking help, and the coordination between departments participating in projects that enhance safety. Professionals thus relate through a network and use a common language, which helps to build a culture. In summary, we propose an approach to safety culture from an SNA perspective that would complement other commonly used methods.
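As a hedged illustration of the constructs named above, the sketch below builds a toy ward communication network (the names and ties are invented) and computes degree centrality, betweenness, and overall density with networkx.

```python
import networkx as nx

# Toy communication network among ward professionals (illustrative only).
G = nx.Graph()
G.add_edges_from([
    ("nurse_A", "nurse_B"), ("nurse_A", "nurse_C"), ("nurse_B", "nurse_C"),
    ("physician_X", "physician_Y"),
    ("nurse_C", "supervisor"), ("supervisor", "physician_X"),  # bridging ties
])

# Degree centrality: who is most connected (visibility/prestige in the team).
print(nx.degree_centrality(G))
# Betweenness: who brokers information between subgroups; here the
# 'supervisor' mediates between the nursing and physician clusters.
print(nx.betweenness_centrality(G))
# A simple cohesion summary for the whole team.
print("density:", nx.density(G))
```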
A methodology for creating greenways through multidisciplinary sustainable landscape planning.
Pena, Selma Beatriz; Abreu, Maria Manuela; Teles, Rui; Espírito-Santo, Maria Dalila
2010-01-01
This research proposes a methodology for defining greenways via sustainable planning. This approach includes the analysis and discussion of culture and natural processes that occur in the landscape. The proposed methodology is structured in three phases: eco-cultural analysis; synthesis and diagnosis; and proposal. An interdisciplinary approach provides an assessment of the relationships between landscape structure and landscape dynamics, which are essential to any landscape management or land use. The landscape eco-cultural analysis provides a biophysical, dynamic (geomorphologic rate), vegetation (habitats from directive 92/43/EEC) and cultural characterisation. The knowledge obtained by this analysis then supports the definition of priority actions to stabilise the landscape and the management measures for the habitats. After the analysis and diagnosis phases, a proposal for the development of sustainable greenways can be achieved. This methodology was applied to a study area of the Azambuja Municipality in the Lisbon Metropolitan Area (Portugal). The application of the proposed methodology to the study area shows that landscape stability is crucial for greenway users in order to appreciate the landscape and its natural and cultural elements in a sustainable and healthy way, both by cycling or by foot. A balanced landscape will increase the value of greenways and in return, they can develop socio-economic activities with benefits for rural communities. Copyright 2009 Elsevier Ltd. All rights reserved.
A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks.
Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua
2015-12-17
Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.
NASA Astrophysics Data System (ADS)
Babaveisi, Vahid; Paydar, Mohammad Mahdi; Safaei, Abdul Sattar
2018-07-01
This study aims to discuss the solution methodology for a closed-loop supply chain (CLSC) network that includes the collection of used products as well as distribution of the new products. This supply chain is presented as representative of the problems that can be solved by the proposed meta-heuristic algorithms. A mathematical model is designed for a CLSC that involves three objective functions: maximizing the profit and minimizing the total risk and the shortages of products. Since three objective functions are considered, a multi-objective solution methodology can be advantageous. Therefore, several approaches have been studied: an NSGA-II algorithm is first utilized, and then the results are validated using MOSA and MOPSO algorithms. Priority-based encoding, which is used in all the algorithms, is the core of the solution computations. To compare the performance of the meta-heuristics, random numerical instances are evaluated by four criteria: mean ideal distance, spread of non-dominated solutions, the number of Pareto solutions, and CPU time. In order to enhance the performance of the algorithms, the Taguchi method is used for parameter tuning. Finally, sensitivity analyses are performed and the computational results are presented based on the sensitivity analyses in parameter tuning.
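A building block shared by NSGA-II, MOSA, and MOPSO, and by metrics such as the number of Pareto solutions, is extracting the non-dominated set from a population of objective vectors. Below is a minimal sketch with all objectives minimized; the random vectors stand in for (-profit, risk, shortage) evaluations and are not from the paper's instances.

```python
import numpy as np

def pareto_front(F: np.ndarray) -> np.ndarray:
    """Indices of non-dominated rows of F (all objectives minimized)."""
    keep = np.ones(F.shape[0], dtype=bool)
    for i in range(F.shape[0]):
        # Row j dominates row i if j <= i in every objective and < in one.
        dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominated.any():
            keep[i] = False
    return np.flatnonzero(keep)

rng = np.random.default_rng(4)
F = rng.random((200, 3))     # stand-in (-profit, risk, shortage) vectors
front = pareto_front(F)
print(len(front), "non-dominated solutions out of", len(F))
```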
M.E.366-J embodiment design project: Portable foot restraint
NASA Technical Reports Server (NTRS)
Heaton, Randall; Meyer, Eikar; Schmidt, Davey; Enders, Kevin
1994-01-01
During space shuttle operations, astronauts require support to carry out tasks in the weightless environment. In the past, portable foot restraints (PFR) with orientations adjustable in pitch, roll, and yaw provided this support for payload bay operations. These foot restraints, however, were designed for specific tasks with a load limit of 111.2 Newtons. Since the original design, new applications for foot restraints have been identified. New designs for the foot restraints have been created to boost the operational work load to 444.8 Newtons and decrease setup times. What remains to be designed is an interface between the restraint system and the extravehicular mobility unit (EMU) boots. NASA provided a proposed locking device involving a spring-loaded mechanism. This locking mechanism must withstand loads of 1334.4 Newtons in any direction and weigh less than 222.4 Newtons. This paper develops an embodiment design for the interface between the PFR and the EMU boots. This involves the design of the locking mechanism and of a removable cleat that allows the boot to interface with this mechanism. The design team used the Pahl and Beitz engineering design methodology to present the systematic development, structural analysis, and production considerations of the embodiment design. This methodology provides a basis for understanding the justification behind the decisions made in the design.
Writing implementation research grant proposals: ten key ingredients
2012-01-01
Background All investigators seeking funding to conduct implementation research face the challenges of preparing a high-quality proposal and demonstrating their capacity to conduct the proposed study. Applicants need to demonstrate the progressive nature of their research agenda and their ability to build cumulatively upon the literature and their own preliminary studies. Because implementation science is an emerging field involving complex and multilevel processes, many investigators may not feel equipped to write competitive proposals, and this concern is pronounced among early stage implementation researchers. Discussion This article addresses the challenges of preparing grant applications that succeed in the emerging field of dissemination and implementation. We summarize ten ingredients that are important in implementation research grants. For each, we provide examples of how preliminary data, background literature, and narrative detail in the application can strengthen the application. Summary Every investigator struggles with the challenge of fitting into a page-limited application the research background, methodological detail, and information that can convey the project’s feasibility and likelihood of success. While no application can include a high level of detail about every ingredient, addressing the ten ingredients summarized in this article can help assure reviewers of the significance, feasibility, and impact of the proposed research. PMID:23062065
This document describes a proposed methodology for setting levels of concern (LOCs) for atrazine in natural freshwater systems to prevent unacceptably adverse effects on the aquatic plant communities in those systems. LOCs regarding effects on humans and possible effects on amph...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... Request; Methodological Studies for the Population Assessment of Tobacco and Health (PATH) Study SUMMARY... Collection: Title: Methodological Studies for Population Assessment of Tobacco and Health (PATH) Study. Type... methodological studies to improve the PATH study instrumentation and data collection procedures. These...
2013-01-01
Background The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Results Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. Conclusions In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. PMID:23442253
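The marginal of a Poisson-Gamma hierarchy is negative binomial, which suggests a minimal caricature of the approach: moment-match a negative binomial to the bulk of the window coverage and flag windows in its extreme tails. The sketch below uses synthetic coverage, a crude robust fit, and arbitrary tail cut-offs; the paper's full hierarchical Bayesian treatment (including the Poisson-Lognormal variant) is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic per-window coverage: overdispersed baseline (mean ~50) plus one
# amplified region (3x) and one deleted region (0.1x).
cov = rng.negative_binomial(10, 10 / 60, size=1000).astype(float)
cov[300:320] *= 3.0
cov[700:715] *= 0.1

# Crude robust moment match of a negative binomial to the bulk coverage.
m = np.median(cov)
v = np.var(cov[cov < np.quantile(cov, 0.95)])
p = m / v                  # scipy's NB: mean = r(1-p)/p, var = r(1-p)/p**2
r = m * p / (1 - p)

lo = stats.nbinom.ppf(1e-4, r, p)      # lower tail: candidate deletions
hi = stats.nbinom.ppf(1 - 1e-4, r, p)  # upper tail: candidate amplifications
print("deletions     :", np.flatnonzero(cov < lo))
print("amplifications:", np.flatnonzero(cov > hi))
```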
de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne
2016-08-03
The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of implementation contexts, rather than merely describing change.
Automated flood extent identification using WorldView imagery for the insurance industry
NASA Astrophysics Data System (ADS)
Geller, Christina
2017-10-01
Flooding is the most common and costly natural disaster around the world, causing the loss of human life and billions in economic and insured losses each year. In 2016, pluvial and fluvial floods caused an estimated 5.69 billion USD in losses worldwide with the most severe events occurring in Germany, France, China, and the United States. While catastrophe modeling has begun to help bridge the knowledge gap about the risk of fluvial flooding, understanding the extent of a flood - pluvial and fluvial - in near real-time allows insurance companies around the world to quantify the loss of property that their clients face during a flooding event and proactively respond. To develop this real-time, global analysis of flooded areas and the associated losses, a new methodology utilizing optical multi-spectral imagery from DigitalGlobe (DGI) WorldView satellite suite is proposed for the extraction of pluvial and fluvial flood extents. This methodology involves identifying flooded areas visible to the sensor, filling in the gaps left by the built environment (i.e. buildings, trees) with a nearest neighbor calculation, and comparing the footprint against an Industry Exposure Database (IE) to calculate a loss estimate. Full-automation of the methodology allows production of flood extents and associated losses anywhere around the world as required. The methodology has been tested and proven effective for the 2016 flood in Louisiana, USA.
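One standard ingredient of optical water mapping, sketched below on assumed band arrays: an NDWI threshold labels visibly flooded pixels, and a nearest-neighbour fill assigns obstructed pixels (buildings, trees) the label of the closest observed pixel, mirroring the gap-filling step described above. The proprietary DigitalGlobe processing and the Industry Exposure Database loss comparison are not reproduced.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def flood_mask(green, nir, thresh=0.3):
    """Water mask from NDWI = (G - NIR) / (G + NIR); threshold is assumed."""
    ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)
    return ndwi > thresh

def fill_gaps(mask, obstructed):
    """Give each obstructed pixel the label of its nearest observed pixel."""
    _, (iy, ix) = distance_transform_edt(obstructed, return_indices=True)
    return mask[iy, ix]

rng = np.random.default_rng(6)
green, nir = rng.random((2, 100, 100))     # stand-in reflectance bands
obstructed = rng.random((100, 100)) < 0.1  # stand-in building/tree mask
flooded = fill_gaps(flood_mask(green, nir), obstructed)
print("flooded fraction:", flooded.mean())
```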
Mammana, Sabrina B; Berton, Paula; Camargo, Alejandra B; Lascalea, Gustavo E; Altamirano, Jorgelina C
2017-05-01
An analytical methodology based on coprecipitation-assisted coacervative extraction coupled to HPLC-UV was developed for determination of five organophosphorus pesticides (OPPs), including fenitrothion, guthion, parathion, methidathion, and chlorpyrifos, in water samples. It involves a green technique leading to an efficient and simple analytical methodology suitable for high-throughput analysis. Relevant physicochemical variables were studied and optimized with respect to the analytical response of each OPP. Under optimized conditions, the resulting methodology was as follows: an aliquot of 9 mL of water sample was placed into a centrifuge tube and 0.5 mL sodium citrate 0.1 M, pH 4; 0.08 mL Al2(SO4)3 0.1 M; and 0.7 mL SDS 0.1 M were added and homogenized. After centrifugation the supernatant was discarded. A 700 μL aliquot of the coacervate-rich phase obtained was dissolved with 300 μL of methanol and 20 μL of the resulting solution was analyzed by HPLC-UV. The resulting LODs ranged within 0.7-2.5 ng/mL and the achieved RSD and recovery values were <8% (n = 3) and >81%, respectively. The proposed analytical methodology was successfully applied for the analysis of the five OPPs in water samples for human consumption from different locations in Mendoza. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Adly, Amr A.; Abd-El-Hafiz, Salwa K.
2014-01-01
Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper. PMID:26257939
The kidney allocation score: methodological problems, moral concerns and unintended consequences.
Hippen, B
2009-07-01
The growing disparity between the demand for and supply of kidneys for transplantation has generated interest in alternative systems of allocating kidneys from deceased donors. This personal viewpoint focuses attention on the Kidney Allocation Score (KAS) proposal promulgated by the UNOS/OPTN Kidney Committee. I identify several methodological and moral flaws in the proposed system, concluding that any iteration of the KAS proposal should be met with more skepticism than sanguinity.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... DEPARTMENT OF COMMERCE International Trade Administration Antidumping Methodologies in Proceedings Involving Non-Market Economies: Valuing the Factor of Production: Labor; Correction to Request for Comment AGENCY: Import Administration, International Trade Administration, Department of Commerce DATES: Effective Date: March 1, 2011. FOR FURTHER...
Measuring political polarization: Twitter shows the two sides of Venezuela
NASA Astrophysics Data System (ADS)
Morales, A. J.; Borondo, J.; Losada, J. C.; Benito, R. M.
2015-03-01
We say that a population is perfectly polarized when divided in two groups of the same size and opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding a good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.
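The index can be made concrete with a small sketch. Below is a simplified dipole-style variant on opinions scaled to [-1, 1]: it approaches 1 for two equal-sized groups at opposite extremes and falls toward 0 for a single consensus group. The exact normalisation in the paper may differ, and the opinion samples are synthetic.

```python
import numpy as np

def polarization_index(opinions: np.ndarray) -> float:
    """Dipole-style polarization on opinions in [-1, 1] (simplified)."""
    pos, neg = opinions[opinions > 0], opinions[opinions <= 0]
    if len(pos) == 0 or len(neg) == 0:
        return 0.0                                   # one-sided: no poles
    dA = abs(len(pos) - len(neg)) / len(opinions)    # population imbalance
    d = (pos.mean() - neg.mean()) / 2.0              # half pole separation
    return (1.0 - dA) * d

rng = np.random.default_rng(7)
consensus = rng.normal(0.0, 0.1, 5000).clip(-1, 1)
two_poles = np.concatenate([rng.normal(0.8, 0.05, 2500),
                            rng.normal(-0.8, 0.05, 2500)]).clip(-1, 1)
print(polarization_index(consensus))   # low: single group around zero
print(polarization_index(two_poles))   # near 1: perfectly polarized
```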
An Evolutionary Method for Financial Forecasting in Microscopic High-Speed Trading Environment.
Huang, Chien-Feng; Li, Hsu-Chih
2017-01-01
The advancement of information technology in financial applications nowadays has led to fast market-driven events that prompt flash decision-making and actions issued by computer algorithms. As a result, today's markets experience intense activity in a highly dynamic environment where trading systems respond to one another at a much faster pace than before. This new breed of technology involves the implementation of high-speed trading strategies which generate a significant portion of the activity in financial markets and present researchers with a wealth of information not available in traditional low-speed trading environments. In this study, we aim at developing feasible computational intelligence methodologies, particularly genetic algorithms (GA), to shed light on high-speed trading research using stock price data at the microscopic level. Our empirical results show that the proposed GA-based system is able to improve the accuracy of price movement prediction significantly, and we expect this GA-based methodology to advance the current state of research for high-speed trading and other relevant financial applications.
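For readers unfamiliar with GA-based rule search, here is a toy sketch: a plain generational GA tunes the two window lengths of a moving-average crossover rule on a synthetic price path. Everything in it (the price model, the rule, the operators, the hyperparameters) is an illustrative assumption; the paper's microscopic high-speed system is far richer.

```python
import numpy as np

rng = np.random.default_rng(8)
price = 500 + np.cumsum(rng.normal(0.02, 1.0, 5000))  # synthetic price path

def sma(x, w):
    return np.convolve(x, np.ones(w) / w, mode="valid")

def fitness(genome):
    """Cumulative P&L of a fast/slow moving-average crossover rule."""
    fast, slow = sorted(max(int(g), 2) for g in genome)
    slow = max(slow, fast + 1)
    f = sma(price, fast)[-(len(price) - slow + 1):]   # align at the end
    s = sma(price, slow)
    pos = np.where(f > s, 1.0, -1.0)                  # long/short signal
    rets = np.diff(price[-len(pos):])
    return float(np.sum(pos[:-1] * rets))

# Generational GA: tournament selection, blend crossover, Gaussian mutation.
pop = rng.uniform(2, 200, size=(30, 2))
for _ in range(40):
    fit = np.array([fitness(g) for g in pop])
    nxt = []
    for _ in range(len(pop)):
        i, j, k, l = rng.integers(0, len(pop), 4)
        a = pop[i] if fit[i] > fit[j] else pop[j]     # tournament winners
        b = pop[k] if fit[k] > fit[l] else pop[l]
        child = 0.5 * (a + b) + rng.normal(0, 5, 2)   # crossover + mutation
        nxt.append(np.clip(child, 2, 250))
    pop = np.array(nxt)

best = pop[np.argmax([fitness(g) for g in pop])]
print("best (fast, slow) windows:", sorted(int(g) for g in best))
```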
Flexible Macroblock Ordering for Context-Aware Ultrasound Video Transmission over Mobile WiMAX
Martini, Maria G.; Hewage, Chaminda T. E. R.
2010-01-01
The most recent network technologies are enabling a variety of new applications, thanks to the provision of increased bandwidth and better management of Quality of Service. Nevertheless, telemedical services involving multimedia data are still lagging behind, due to the concern of the end users, that is, clinicians and also patients, about the low quality provided. Indeed, emerging network technologies should be appropriately exploited by designing the transmission strategy focusing on quality provision for end users. Stemming from this principle, we propose here a context-aware transmission strategy for medical video transmission over WiMAX systems. Context, in terms of regions of interest (ROI) in a specific session, is taken into account for the identification of multiple regions of interest, and compression/transmission strategies are tailored to such context information. We present a methodology based on H.264 medical video compression and Flexible Macroblock Ordering (FMO) for ROI identification. Two different unequal error protection methodologies, providing higher protection to the most diagnostically relevant data, are presented. PMID:20827292
The colloquial approach: An active learning technique
NASA Astrophysics Data System (ADS)
Arce, Pedro
1994-09-01
This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.
A stochastic method for stand-alone photovoltaic system sizing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabral, Claudia Valeria Tavora; Filho, Delly Oliveira; Martins, Jose Helvecio
Photovoltaic systems utilize solar energy to generate electrical energy to meet load demands. Optimal sizing of these systems includes the characterization of solar radiation. Solar radiation at the Earth's surface has random characteristics and has been the focus of various academic studies. The objective of this study was to stochastically analyze the parameters involved in the sizing of photovoltaic generators and to develop a methodology for sizing stand-alone photovoltaic systems. Energy storage for isolated systems and solar radiation were analyzed stochastically due to their random behavior. For the development of the proposed methodology, stochastic analyses were studied, including the Markov chain and the beta probability density function. The obtained results were compared with those of stand-alone sizing using the deterministic Sandia method, with the stochastic model presenting more reliable values. Both models present advantages and disadvantages; however, the stochastic one is more complex and provides more reliable and realistic results.
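To make the two stochastic ingredients concrete, here is a minimal sketch chaining a two-state (cloudy/clear) Markov model of day type with a beta-distributed daily clearness index per state. The transition matrix, beta parameters, and extraterrestrial irradiation value are assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(9)

# Two-state Markov chain for day type: 0 = cloudy, 1 = clear (assumed).
P = np.array([[0.6, 0.4],    # cloudy -> {cloudy, clear}
              [0.3, 0.7]])   # clear  -> {cloudy, clear}
# Beta parameters for the daily clearness index in each state (assumed).
BETA = {0: (2.0, 5.0), 1: (6.0, 2.0)}

def synthesize_days(n_days, state=1):
    kt = np.empty(n_days)
    for d in range(n_days):
        state = rng.choice(2, p=P[state])     # next day's type
        a, b = BETA[state]
        kt[d] = rng.beta(a, b)                # clearness index in (0, 1)
    return kt

H0 = 6.5  # assumed mean daily extraterrestrial irradiation, kWh/m^2
daily = synthesize_days(365) * H0
print("mean daily irradiation:", round(daily.mean(), 2), "kWh/m^2")
# Feeding many synthetic years into an energy balance of the battery state
# of charge yields a loss-of-load probability per candidate system size,
# which is the stochastic sizing criterion.
```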
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, mainly from green through yellow to red depicting low, medium to high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
Andrade, Luís Renato Balbão; Amaral, Fernando Gonçalves
2012-01-01
Nanotechnology is a multidisciplinary set of techniques to manipulate matter at the nanoscale, more precisely particles below 100 nm, whose characteristics, due to their small size, are essentially different from those of the same materials in macro form. Regarding these new properties of the materials, there are knowledge gaps about the effects of these particles on the human organism and the environment. Although still considered an emerging technology, it is growing increasingly fast, as are the number of products using nanotechnologies at some level of production and the number of researchers involved with the subject. Given this scenario and based on the related literature, a comprehensive occupational health and safety methodology for research laboratories with activities in nanotechnologies was developed, based on the ILO guidelines for safety and health at work systems, to which a number of nano-specific recommendations were added. The work intends to offer food for thought on controlling the risks associated with nanotechnologies.
Multi-scale structural community organisation of the human genome.
Boulos, Rasha E; Tremblay, Nicolas; Arneodo, Alain; Borgnat, Pierre; Audit, Benjamin
2017-04-11
Structural interaction frequency matrices between all genome loci are now experimentally achievable thanks to high-throughput chromosome conformation capture technologies. This creates a new methodological challenge for computational biology: objectively extracting from these data the structural motifs characteristic of genome organisation. We deployed a fast multi-scale community mining algorithm based on spectral graph wavelets to characterise the networks of intra-chromosomal interactions in human cell lines. We observed that structural domains exist at all sizes up to chromosome length and demonstrated that the set of structural communities forms a hierarchy of chromosome segments. Hence, at all scales, chromosome folding predominantly involves interactions between neighbouring sites rather than the formation of links between distant loci. Multi-scale structural decomposition of human chromosomes provides an original framework in which to question structural organisation and its relationship to functional regulation across scales. By construction, the proposed methodology is independent of the precise assembly of the reference genome and is thus directly applicable to genomes whose assembly is not fully determined.
NASA Technical Reports Server (NTRS)
Wiegmann, Douglas A.
2005-01-01
The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. This effort used expert judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj from Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs was funded by NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The proposed project was funded separately but supported the existing Rutgers program.
Design and analysis of multiple diseases genome-wide association studies without controls.
Chen, Zhongxue; Huang, Hanwen; Ng, Hon Keung Tony
2012-11-15
In genome-wide association studies (GWAS), multiple diseases with shared controls is one common case-control study design. If data obtained from these studies are appropriately analyzed, this design can have several advantages, such as improving statistical power in detecting associations and reducing the time and cost of data collection. In this paper, we propose a study design for GWAS that involves multiple diseases but no controls, together with a corresponding statistical analysis strategy. Through a simulation study, we show that the statistical association test with the proposed study design is more powerful than the test with a single disease sharing common controls, and that it has power comparable to the overall test based on the whole dataset including the controls. We also apply the proposed method to a real GWAS dataset to illustrate the methodology and the advantages of the proposed design. Some possible limitations of this study design and testing method, and their solutions, are also discussed. Our findings indicate that the proposed study design and statistical analysis strategy can be more efficient than the usual case-control GWAS as well as those with shared controls. Copyright © 2012 Elsevier B.V. All rights reserved.
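The abstract does not state the test statistic; one plausible stand-in, sketched here purely as an assumption, is a chi-square test of homogeneity of allele counts across disease cohorts, which can flag a SNP without any control group.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Allele counts (minor, major) at one SNP for three disease cohorts
# sharing no control group; the numbers are invented for illustration.
counts = np.array([
    [180, 820],   # disease 1
    [240, 760],   # disease 2
    [150, 850],   # disease 3
])

# Under H0 the minor-allele frequency is equal in all diseases; a
# significant result suggests association with at least one disease.
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3g}")
```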
Development of Methodology to Gather Seated Anthropometry Data in a Microgravity Environment
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar; Young, Karen; Mesloh, Miranda
2010-01-01
The Constellation Program is designing a new vehicle based on new anthropometric requirements. These requirements specify the need to account for a spinal elongation factor in anthropometric measurements involving the spine, such as eye height and seated height. However, to date there are no data relating spinal elongation to a seated posture; only data relating spinal elongation to stature have been collected in microgravity. Therefore, it was proposed to collect seated height in microgravity to provide the Constellation designers appropriate data for their analyses. This document describes the process by which the best method to collect seated height in microgravity was developed.
Low-cost educational robotics applied to physics teaching in Brazil
NASA Astrophysics Data System (ADS)
Souza, Marcos A. M.; Duarte, José R. R.
2015-07-01
In this paper, we propose some strategies and methodologies for teaching high-school physics topics through an educational robotics show. This exhibition was part of a set of actions promoted by a Brazilian government program of incentives for teaching activities, whose primary focus is the training of teachers, the improvement of teaching in public schools, the dissemination of science, and the formation of new scientists and researchers. By means of workshops, banners and robotics prototyping, we were able to create a connection between the study areas and their surroundings, making learning meaningful and accessible for the students involved and contributing to their cognitive development.
Maori responsiveness in health and medical research: key issues for researchers (part 1).
Sporle, Andrew; Koea, Jonathan
2004-08-06
Application for contestable government-research funding and ethical approval requires researchers to outline how their intended research project contributes to Maori development or advancement. When formulating their research proposals, the key issues for researchers are research utility, defining Maori, informed consent, confidentiality, issues with human tissues and genetic material, participant remuneration and recognition (koha), intellectual property, and involvement of local Maori health or social services. The most common Maori responsiveness issues in research applications can be readily approached by researchers who address straightforward methodological concerns, by working through precedents established by peers and colleagues, as well as by working with end-users of their research.
Social Media Participation in Urban Planning: a New way to Interact and Take Decisions
NASA Astrophysics Data System (ADS)
López-Ornelas, E.; Abascal-Mena, R.; Zepeda-Hernández, S.
2017-09-01
Social media participation can be very important when a decision has to be made about a topic related to urban planning. Textual analysis to identify the sentiment about a topic, or community detection and user analysis to identify the actors involved in a discussion, can be very valuable to the people or institutions that have to make such a decision. In this paper we propose a methodological design to analyse participation in social media. We study the installation of a new airport in Mexico City as a case study to highlight the importance of conducting an analysis of this nature.
Nabieva, T N
1993-01-01
Behavioral experiments were carried out in cats following a methodology which simulates complexly organized, nonautomatized behavior with elements of generalization and abstraction. On the basis of the results of the performance of test tasks by animals with partial destruction of the magnocellular basal nucleus, a conclusion was reached regarding the participation of this formation in the structural-functional support of complex integrative forms of activity and of cognitive and gnostic processes. The proposed mechanism of the involvement of the basal nucleus in gnostic and cognitive processes is the nonspecific support of the system of structures which participate directly in thinking and learning.
Das, Arpita; Bhattacharya, Mahua
2011-01-01
In the present work, the authors developed a treatment planning system implementing genetic-based neuro-fuzzy approaches for accurate analysis of the shape and margin of tumor masses appearing in the breast on digital mammograms. A complicated classifier structure invites the problems of overlearning and misclassification. In the proposed methodology, a genetic algorithm (GA) is used to search for effective input feature vectors, combined with an adaptive neuro-fuzzy model for the final classification of the different boundaries of tumor masses. The study involves 200 digitized mammograms from the MIAS and other databases and shows an 86% correct classification rate.
A Selective Review of Group Selection in High-Dimensional Models
Huang, Jian; Breheny, Patrick; Ma, Shuangge
2013-01-01
Grouping structures arise naturally in many statistical modeling problems. Several methods have been proposed for variable selection that respect grouping structure in variables. Examples include the group LASSO and several concave group selection methods. In this article, we give a selective review of group selection concerning methodological developments, theoretical properties and computational algorithms. We pay particular attention to group selection methods involving concave penalties. We address both group selection and bi-level selection methods. We describe several applications of these methods in nonparametric additive models, semiparametric regression, seemingly unrelated regressions, genomic data analysis and genome wide association studies. We also highlight some issues that require further study. PMID:24174707
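As a concrete instance of the penalties reviewed, the group LASSO proximal step (block soft-thresholding) can be written in a few lines; this is the standard operator from the literature, not code from the review itself.

```python
import numpy as np

def prox_group_lasso(beta, groups, lam):
    """Block soft-thresholding: prox of lam * sum_g ||beta_g||_2.

    beta   : coefficient vector
    groups : list of index arrays, one per group
    lam    : penalty level (already scaled by the step size)
    """
    out = beta.copy()
    for g in groups:
        norm = np.linalg.norm(beta[g])
        # Whole group is zeroed out unless its norm exceeds lam.
        out[g] = 0.0 if norm <= lam else (1 - lam / norm) * beta[g]
    return out

beta = np.array([0.5, -0.2, 3.0, 0.1, -4.0])
groups = [np.array([0, 1]), np.array([2, 3, 4])]
print(prox_group_lasso(beta, groups, lam=1.0))
```

Iterating this step inside proximal gradient descent yields group-sparse solutions; concave group penalties replace the shrinkage rule but keep the same group-wise structure.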
Preparation of water-soluble magnetic nanocrystals using aryl diazonium salt chemistry.
Griffete, Nébéwia; Herbst, Frédéric; Pinson, Jean; Ammar, Souad; Mangeney, Claire
2011-02-16
A novel and facile methodology for the in situ surface functionalization of Fe(3)O(4) nanoparticles is proposed, based on the use of aryl diazonium salts chemistry. The grafting reaction involves the formation of diazoates in a basic medium. These species are unstable and dediazonize along a homolytic pathway to give aryl radicals which further react with the Fe(3)O(4) NPs during their formation and stop their growth. Advantages of the present approach rely not only on the simplicity, rapidity, and efficiency of the procedure but also on the formation of strong Fe(3)O(4)-aryl surface bonds, highly suitable for further applications.
A Methodological Proposal for Learning Games Selection and Quality Assessment
ERIC Educational Resources Information Center
Dondi, Claudio; Moretti, Michela
2007-01-01
This paper presents a methodological proposal elaborated in the framework of two European projects dealing with game-based learning, both of which have focused on "quality" aspects in order to create suitable tools that support European educators, practitioners and lifelong learners in selecting and assessing learning games for use in…
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
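The loss functions themselves are not reproduced in the abstract; a common form of the inverted normal loss function, used here as an assumed stand-in for the paper's revised Taguchi and modified inverted normal losses, maps a process deviation to a bounded economic loss.

```python
import numpy as np

def inverted_normal_loss(y, target, max_loss, gamma):
    """INLF: loss rises smoothly from 0 at the target and saturates
    at max_loss for large deviations; gamma sets the shape."""
    return max_loss * (1.0 - np.exp(-((y - target) ** 2) / (2 * gamma ** 2)))

# Example: a pressure deviation scenario (illustrative numbers only).
y = np.linspace(8.0, 12.0, 5)          # measured process values
loss = inverted_normal_loss(y, target=10.0, max_loss=1e6, gamma=0.8)
for yi, li in zip(y, loss):
    print(f"y = {yi:5.2f}  ->  loss = ${li:,.0f}")

# The total economic consequence of a scenario would then integrate
# several such losses (production, asset, environmental, human health).
```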
NASA Astrophysics Data System (ADS)
Ganesh, V.; Muthurasu, A.
2012-04-01
In this paper, we propose various strategies for enzyme immobilization on electrodes (both metal and semiconductor electrodes). In general, the proposed methodology involves two critical steps: (1) chemical modification of substrates using functional monolayers [Langmuir-Blodgett (LB) films and/or self-assembled monolayers (SAMs)] and (2) anchoring of a target enzyme through specific chemical and physical interactions with the terminal functionality of the modified films. There are essentially three ways to immobilize an enzyme on chemically modified electrodes. The first method consists of an electrostatic interaction between the enzyme and the terminal functional groups present within the chemically modified films. The second and third methods involve the introduction of nanomaterials followed by enzyme immobilization using both physical and chemical adsorption processes. As a proof of principle, in this work we demonstrate the sensing and catalytic activity of horseradish peroxidase (HRP) anchored onto SAM-modified indium tin oxide (ITO) electrodes towards hydrogen peroxide (H2O2). Structural characterization of the modified electrodes is performed using X-ray photoelectron spectroscopy (XPS), atomic force microscopy (AFM) and contact angle measurements. The binding events and the enzymatic reactions are monitored using electrochemical techniques, mainly cyclic voltammetry (CV).
Use of Action Research in Nursing Education
Pehler, Shelley-Rae; Stombaugh, Angela
2016-01-01
Purpose. The purpose of this article is to describe action research in nursing education and to propose a definition of action research for providing guidelines for research proposals and criteria for assessing potential publications for nursing higher education. Methods. The first part of this project involved a search of the literature on action research in nursing higher education from 1994 to 2013. Searches were conducted in the CINAHL and MEDLINE databases. Applying the criteria identified, 80 publications were reviewed. The second part of the project involved a literature review of action research methodology from several disciplines to assist in assessing articles in this review. Results. This article summarizes the nursing higher education literature reviewed and provides processes and content related to four topic areas in nursing higher education. The descriptions assist researchers in learning more about the complexity of both the action research process and the varied outcomes. The literature review of action research in many disciplines along with the review of action research in higher education provided a framework for developing a nursing-education-centric definition of action research. Conclusions. Although guidelines for developing action research and criteria for publication are suggested, continued development of methods for synthesizing action research is recommended. PMID:28078138
A variational Bayes spatiotemporal model for electromagnetic brain mapping.
Nathoo, F S; Babul, A; Moiseev, A; Virji-Babul, N; Beg, M F
2014-03-01
In this article, we present a new variational Bayes approach for solving the neuroelectromagnetic inverse problem arising in studies involving electroencephalography (EEG) and magnetoencephalography (MEG). This high-dimensional spatiotemporal estimation problem involves the recovery of time-varying neural activity at a large number of locations within the brain, from electromagnetic signals recorded at a relatively small number of external locations on or near the scalp. Framing this problem within the context of spatial variable selection for an underdetermined functional linear model, we propose a spatial mixture formulation where the profile of electrical activity within the brain is represented through location-specific spike-and-slab priors based on a spatial logistic specification. The prior specification accommodates spatial clustering in brain activation, while also allowing for the inclusion of auxiliary information derived from alternative imaging modalities, such as functional magnetic resonance imaging (fMRI). We develop a variational Bayes approach for computing estimates of neural source activity, and incorporate a nonparametric bootstrap for interval estimation. The proposed methodology is compared with several alternative approaches through simulation studies, and is applied to the analysis of a multimodal neuroimaging study examining the neural response to face perception using EEG, MEG, and fMRI. © 2013, The International Biometric Society.
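A minimal generative sketch of a location-specific spike-and-slab prior with a spatial logistic inclusion probability may clarify the prior structure; the dimensions, hyperparameters and the fMRI-like hotspot below are all invented, and this illustrates only the prior, not the variational inference itself.

```python
import numpy as np

rng = np.random.default_rng(1)

n_loc, n_time = 50, 20
coords = rng.uniform(0, 1, size=(n_loc, 2))   # source locations (assumed)

# Spatial logistic field: activation probability is higher near an
# auxiliary (fMRI-derived) hotspot placed at (0.5, 0.5).
dist = np.linalg.norm(coords - 0.5, axis=1)
p_active = 1.0 / (1.0 + np.exp(-(2.0 - 8.0 * dist)))

# Spike-and-slab draw: inactive sources are exactly zero (the spike);
# active sources receive time courses from the slab distribution.
active = rng.uniform(size=n_loc) < p_active
activity = np.zeros((n_loc, n_time))
activity[active] = rng.standard_normal((active.sum(), n_time))

print(f"{active.sum()} of {n_loc} sources active under the prior")
```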
A study of overproduction and enhanced secretion of enzymes. Quarterly report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dashek, W.V.
1993-09-01
Wood decay within forests, a significant renewable photosynthetic energy resource, is caused primarily by Basidiomycetous fungi, e.g., white rot fungi. These organisms possess the ability to degrade lignin, cellulose and hemicellulose, the main organic polymers of wood. In the case of the white rot fungi, e.g., Coriolus versicolor, this capacity results from the fungus's ability to elaborate extracellular cellulolytic and ligninolytic enzymes. With regard to the latter, at least one of the enzymes, polyphenol oxidase (PPO), appears within a defined growth medium. This proposal focuses on the over-production and enhanced secretion of PPO, cellulase and lignin peroxidase. There are two major sections to the proposal: (1) overproduction of lignocellulolytic enzymes by genetic engineering methodologies and (2) hyper-production and enhanced secretion of these enzymes by biochemical/electron-microscopical techniques; the biochemical/electron-microscopical method involves substrate induction and the time-dependent addition of respiration and PPO enzymes.
NASA Astrophysics Data System (ADS)
Chevrié, Mathieu; Farges, Christophe; Sabatier, Jocelyn; Guillemard, Franck; Pradere, Laetitia
2017-04-01
In the automotive field, reducing electric conductor dimensions is significant for decreasing the embedded mass and the manufacturing costs. It is thus essential to develop tools to optimize the wire diameter according to thermal constraints, and protection algorithms to maintain a high level of safety. In order to develop such tools and algorithms, accurate electro-thermal models of electric wires are required. However, the solutions of the thermal equation lead to implicit fractional transfer functions involving an exponential that cannot be embedded in an on-board automotive computer. This paper thus proposes an integer-order transfer function approximation methodology, based on a spatial discretization, for this class of fractional transfer functions. Moreover, the H2-norm is used to minimize the approximation error. The accuracy of the proposed approach is confirmed with measured data on a 1.5 mm2 wire implemented in a dedicated test bench.
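The paper's exact discretization and H2-optimal fit are not given in the abstract; the following sketch only illustrates the general idea of replacing the distributed thermal dynamics by a finite integer-order state-space model via spatial discretization. The dimensions, boundary conditions and parameter values are invented.

```python
import numpy as np
from scipy.signal import StateSpace, step

# 1D heat equation discretized into N cells: x' = A x + B u, y = C x.
# The resulting integer-order transfer function C (sI - A)^-1 B stands
# in for the implicit fractional/exponential transfer of the wire.
N = 20
alpha = 1.0          # normalized diffusivity (assumed)
h = 1.0 / N          # cell size

A = np.zeros((N, N))
for i in range(N):
    A[i, i] = -2.0
    if i > 0:
        A[i, i - 1] = 1.0
    if i < N - 1:
        A[i, i + 1] = 1.0
A *= alpha / h**2
A[0, 0] = -alpha / h**2                      # heated end: flux input
B = np.zeros((N, 1)); B[0, 0] = alpha / h    # heat injected in cell 0
C = np.zeros((1, N)); C[0, N // 2] = 1.0     # temperature at mid-wire
D = np.zeros((1, 1))

sys = StateSpace(A, B, C, D)                 # integer-order approximation
t, y = step(sys, N=500)
print(f"mid-wire step response reaches {y[-1]:.3f} at t = {t[-1]:.2f}")
```

Increasing N trades computational cost against approximation error; the paper instead selects the discretization by minimizing the H2-norm of that error.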
2013-01-01
Background There is a need for qualitative research to help develop case conceptualisations to guide the development of Metacognitive Therapy interventions for Eating Disorders. Method A qualitative study informed by grounded theory methodology was conducted involving open-ended interviews with 27 women aged 18–55 years, who were seeking or receiving treatment for a diagnosed ED. Results The categories identified in this study appeared to be consistent with a metacognitive model including constructs of a Cognitive Attentional Syndrome and metacognitive beliefs. These categories appear to be transdiagnostic, and the interaction between the categories is proposed to explain the maintenance of EDs. Conclusions The transdiagnostic model proposed may be useful to guide the development of future metacognitive therapy interventions for EDs with the hope that this will lead to improved outcomes for individuals with EDs. PMID:24999403
Semi-automated knowledge discovery: identifying and profiling human trafficking
NASA Astrophysics Data System (ADS)
Poelmans, Jonas; Elzinga, Paul; Ignatov, Dmitry I.; Kuznetsov, Sergei O.
2012-11-01
We propose an iterative and human-centred knowledge discovery methodology based on formal concept analysis. The proposed approach recognizes the important role of the domain expert in mining real-world enterprise applications and makes use of specific domain knowledge, including human intelligence and domain-specific constraints. Our approach was empirically validated at the Amsterdam-Amstelland police to identify suspects and victims of human trafficking in 266,157 suspicious activity reports. Based on guidelines of the Attorney Generals of the Netherlands, we first defined multiple early warning indicators that were used to index the police reports. Using concept lattices, we revealed numerous unknown human trafficking and loverboy suspects. In-depth investigation by the police resulted in a confirmation of their involvement in illegal activities and in actual arrests being made. Our human-centred approach was embedded into operational policing practice and is now successfully used on a daily basis to cope with the vastly growing amount of unstructured information.
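The formal concept analysis underlying the approach can be illustrated on a toy context of reports and indicators; the brute-force enumeration below is for exposition only (real applications use algorithms such as NextClosure), and the indicator names are invented.

```python
from itertools import combinations

# Toy formal context: police reports (objects) x warning indicators
# (attributes). Names are invented for illustration.
context = {
    "report1": {"cash_payments", "no_id", "frequent_moves"},
    "report2": {"cash_payments", "no_id"},
    "report3": {"no_id", "frequent_moves"},
    "report4": {"cash_payments"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects having all attributes in attrs."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by all objects in objs."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

# Enumerate formal concepts (extent, intent) by closing every attribute
# subset; every concept arises as such a closure.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        objs = extent(set(attrs))
        concepts.add((frozenset(objs), frozenset(intent(objs))))

for ext, itn in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(ext), "<->", sorted(itn))
```

Ordered by inclusion of extents, these concepts form the lattice that investigators browse to spot groups of reports sharing suspicious indicator combinations.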
Moral deliberation and nursing ethics cases: elements of a methodological proposal.
Schneider, Dulcinéia Ghizoni; Ramos, Flávia Regina Souza
2012-11-01
A qualitative study with an exploratory, descriptive and documentary design, conducted with the objective of identifying the elements of a method for the analysis of accusations of, and proceedings for, professional ethics infringements. The method is based on underlying elements identified inductively during analysis of professional ethics hearings judged by and filed in the archives of the Regional Nursing Board of Santa Catarina, Brazil, between 1999 and 2007. The strategies developed were based on the results of an analysis of the findings of fact (occurrences/infractions, causes and outcomes) contained in the records of 128 professional ethics hearings and on the structural elements (statements, rules and practices) identified in five example professional ethics cases. The strategies suggested for evaluating accusations of ethics infringements and the procedures involved in deliberating on ethics hearings constitute a generic proposal that will require adaptation to the context of specific professional ethics accusations.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... DEPARTMENT OF COMMERCE International Trade Administration Methodological Change for Implementation..., the Department of Commerce (``the Department'') will implement a methodological change to reduce... administrative reviews involving merchandise from the PRC and Vietnam. Methodological Change In antidumping duty...
The Development of a Checklist to Enhance Methodological Quality in Intervention Programs.
Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2016-01-01
The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed.
NASA Astrophysics Data System (ADS)
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
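The Saint-Venant system itself is beyond a short sketch, but the FPE-versus-MC comparison can be illustrated on a scalar Ornstein-Uhlenbeck surrogate: one finite-difference solve of the Fokker-Planck equation against an ensemble of trajectory simulations. All parameters below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Surrogate stochastic flow variable: dX = -theta X dt + sigma dW.
# Its Fokker-Planck equation is
#   dp/dt = d/dx(theta x p) + (sigma^2 / 2) d^2p/dx^2
theta, sigma, T = 1.0, 0.5, 1.0

# --- FPE: a single explicit finite-difference solve for the density ---
x = np.linspace(-3, 3, 401)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / (sigma**2 / 2)          # explicit-scheme stability bound
p = np.exp(-(x - 1.0) ** 2 / 0.02)         # X(0) ~ N(1, 0.01)
p /= p.sum() * dx
for _ in range(int(T / dt)):
    drift = np.gradient(theta * x * p, dx)
    diffusion = 0.5 * sigma**2 * np.gradient(np.gradient(p, dx), dx)
    p = np.clip(p + dt * (drift + diffusion), 0.0, None)
    p /= p.sum() * dx                      # keep the density normalized

mean_fpe = (x * p).sum() * dx
var_fpe = (x**2 * p).sum() * dx - mean_fpe**2

# --- MC: many trajectory simulations of the SDE, for comparison ---
n = 20_000
X = 1.0 + 0.1 * rng.standard_normal(n)
for _ in range(int(T / dt)):
    X += -theta * X * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)

print(f"FPE mean = {mean_fpe:+.3f}, var = {var_fpe:.3f}")
print(f"MC  mean = {X.mean():+.3f}, var = {X.var():.3f}")
```

Note the asymmetry the paper exploits: the FPE result comes from a single deterministic solve of the density, while the MC estimate needs thousands of trajectory simulations.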
Boote, Jonathan; Baird, Wendy; Beecroft, Claire
2010-04-01
To review published examples of public involvement in research design, and to synthesise the contributions made by members of the public as well as the identified barriers, tensions and facilitating strategies. Systematic literature search and narrative review. Seven papers were identified covering the following topics: breast-feeding, antiretroviral and nutrition interventions; paediatric resuscitation; exercise and cognitive behavioural therapy; hormone replacement therapy and breast cancer; stroke; and parents' experiences of having a pre-term baby. Six papers reported public involvement in the development of a clinical trial, while one reported public involvement in the development of a mixed-methods study. Group meetings were the most common method of public involvement. The contributions that members of the public made to research design were: review of consent procedures and patient information sheets; outcome suggestions; review of the acceptability of data collection procedures; and recommendations on the timing of recruitment of potential participants into the study and the timing of follow-up. Numerous barriers, tensions and facilitating strategies were identified. The issues raised here should assist researchers in developing research proposals with members of the public. Substantive and methodological directions for further research on the impact of public involvement in research design are set out. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
Metzger, Lia; Ahalt, Cyrus; Kushel, Margot; Riker, Alissa; Williams, Brie
2017-09-11
Purpose The rapidly increasing number of older adults cycling through local criminal justice systems (jails, probation, and parole) suggests a need for greater collaboration among a diverse group of local stakeholders, including professionals from healthcare delivery, public health, and criminal justice, and directly affected individuals, their families, and advocates. The purpose of this paper is to develop a framework that local communities can use to understand and begin to address the needs of criminal justice-involved older adults. Design/methodology/approach The framework comprised five steps: solicit input from community stakeholders to identify pressing challenges facing criminal justice-involved older adults; conduct needs assessments of criminal justice-involved older adults and the professionals working with them; implement quick-response interventions based on the needs assessments; share findings with community stakeholders and generate public feedback; and engage an interdisciplinary group to develop an action plan to optimize services. Findings A five-step framework for creating an interdisciplinary community response is an effective approach to action planning and broad stakeholder engagement on behalf of older adults cycling through the criminal justice system. Originality/value This study proposes the Criminal Justice Involved Older Adults in Need of Treatment Initiative Framework for establishing an interdisciplinary community response to the growing population of medically and socially vulnerable criminal justice-involved older adults.
Hierarchical Brain Networks Active in Approach and Avoidance Goal Pursuit
Spielberg, Jeffrey M.; Heller, Wendy; Miller, Gregory A.
2013-01-01
Effective approach/avoidance goal pursuit is critical for attaining long-term health and well-being. Research on the neural correlates of key goal-pursuit processes (e.g., motivation) has long been of interest, with lateralization in prefrontal cortex being a particularly fruitful target of investigation. However, this literature has often been limited by a lack of spatial specificity and has not delineated the precise aspects of approach/avoidance motivation involved. Additionally, the relationships among brain regions (i.e., network connectivity) vital to goal-pursuit remain largely unexplored. Specificity in location, process, and network relationship is vital for moving beyond gross characterizations of function and identifying the precise cortical mechanisms involved in motivation. The present paper integrates research using more spatially specific methodologies (e.g., functional magnetic resonance imaging) with the rich psychological literature on approach/avoidance to propose an integrative network model that takes advantage of the strengths of each of these literatures. PMID:23785328
Metasynthesis and bricolage: an artistic exercise of creating a collage of meaning.
Kinn, Liv Grethe; Holgersen, Helge; Ekeland, Tor-Johan; Davidson, Larry
2013-09-01
During the past decades, new approaches to synthesizing qualitative data have been developed. However, this methodology continues to face significant philosophical and practical challenges. By reviewing the literature on this topic, our overall aim in this article is to explore the systematic and creative research processes involved in the act of metasynthesizing. By investigating synthesizing processes borrowed from two studies, we discuss matters of transparency and transferability in relation to how multiple qualitative studies are interpreted and transformed into one narrative. We propose concepts such as bricolage, metaphor, playfulness, and abduction as ideas that might enhance understanding of the importance of combinations of scientific and artistic approaches to the way the synthesizer "puzzles together" an interpretive account of qualitative studies. This study can benefit researchers by increasing their awareness of the artistic processes involved in qualitative analysis and metasynthesis to expand the domain and methods of their fields.
Rockfall exposures in Montserrat mountain
NASA Astrophysics Data System (ADS)
Fontquerni Gorchs, Sara; Vilaplana Fernández, Joan Manuel; Guinau Sellés, Marta; Jesús Royán Cordero, Manuel
2015-04-01
This study presents the methodology developed to analyze the exposure level at a 1:25,000 scale, and the results obtained by applying it to a substantial part of the Montaña de Montserrat Natural Park (PNMM) for vehicles, both with and without considering their occupants. The development of this proposal is part of an ongoing study which focuses in more depth on the analysis of rockfall risk exposure at different scales and in different natural and social contexts. The research project applies a methodology to evaluate the rockfall exposure level based on the product of the frequency of occurrence of the event and an exposure function of the vulnerable element at a 1:25,000 scale, although the working scale of the study was 1:10,000. The proposed methodology to calculate the exposure level is based on six phases: 1- Identification, classification and inventory of every element potentially at risk. 2- Zoning of the frequency of occurrence of the event in the studied area. 3- Design of the exposure function for each studied element. 4- Obtaining the exposure index, defined as the product of the frequency of occurrence and the exposure function of the vulnerable element, through GIS analysis with ArcGIS software (ESRI). 5- Obtaining the exposure level by grouping the numerical values of the exposure index into categories. 6- Production of the exposure zoning map. The types of vulnerable elements considered in the study as a whole are: vehicles in motion, people in vehicles in motion, people on paths, permanent elements, and people in buildings. Each typology contains all elements with the same characteristics, and an exposure function has been designed for each of them. For the exposure calculation, two groups of elements have been considered: first the group of elements with no people involved, and then the same group of elements with people involved. This is a first comprehensive and synthetic work on rockfall exposure on the Montserrat Mountain. It is important to mention that the exposure level calculation was obtained from natural hazard data that do not account for protection by defense works. The results of this work enable us to consider the best strategies to reduce rockfall risk in the PNMM. It is clear that, apart from the required structural defense works, some of which have already been built, the implementation of non-structural strategies is, in the medium and long term, the best policy to mitigate the risk. In the PNMM case, rethinking mobility and traffic management on the mountain access roads would definitely help to minimize the geological risk.
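Phase 4, the exposure index as the product of frequency and exposure function, lends itself to a raster implementation; the toy sketch below uses invented values rather than PNMM data.

```python
import numpy as np

# Toy raster implementation of the exposure index: the product of a
# rockfall frequency map and an element-specific exposure function.
freq = np.array([[0.0, 0.2, 0.5],
                 [0.1, 0.4, 0.8],
                 [0.0, 0.1, 0.3]])      # events / year per cell (assumed)

# Exposure function for "people in vehicles in motion": fraction of
# time the element is present in a cell (traffic density x occupancy).
vehicles_per_day, seconds_in_cell, occupancy = 800, 4.0, 1.6
presence = vehicles_per_day * seconds_in_cell / 86_400.0
exposure_fn = presence * occupancy

index = freq * exposure_fn              # exposure index per cell (phase 4)

# Phase 5: group the numeric index into qualitative exposure levels.
levels = np.digitize(index, bins=[1e-4, 1e-3, 1e-2])
names = np.array(["very low", "low", "moderate", "high"])
print(names[levels])
```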
Introducing a new bond reactivity index: Philicities for natural bond orbitals.
Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel
2017-12-22
In the present work, a new methodology for obtaining reactivity indices (philicities) is proposed. It is based on reactivity functions such as the Fukui function or the dual descriptor, and makes it possible to project the information from reactivity functions onto molecular orbitals, instead of onto the atoms of the molecule (atomic reactivity indices). The methodology focuses on the molecules' natural bond orbitals (bond reactivity indices) because these orbitals have the advantage of being localized, allowing the reaction site of an electrophile or nucleophile to be determined within a very precise molecular region. This methodology provides a "philicity" index for every NBO, and a representative set of molecules has been used to test the new definition. A new methodology has also been developed to compare the "finite difference" and "frontier molecular orbital" approximations. To facilitate their use, the proposed methodology, as well as the calculation of the new indices, has been implemented in a new version of the UCA-FUKUI software. In addition, condensation schemes based on atomic populations from the "atoms in molecules" theory, the Hirshfeld population analysis, the Mulliken approximation (with a minimal basis set) and electrostatic potential-derived charges have also been implemented, including the calculation of the "bond reactivity indices" defined in previous studies.
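The NBO projection itself requires quantum-chemistry output, but the classical condensed (atomic) indices that the paper generalizes can be sketched from finite-difference charges; all numbers below are invented for a toy four-atom molecule.

```python
import numpy as np

# Finite-difference condensed Fukui indices from atomic charges of the
# N-1, N and N+1 electron systems (charges invented; the paper's NBO
# projection replaces atoms with natural bond orbitals).
q_N   = np.array([-0.30, 0.10, 0.05, 0.15])     # neutral (N electrons)
q_Nm1 = np.array([-0.05, 0.25, 0.30, 0.50])     # cation (N-1 electrons)
q_Np1 = np.array([-0.70, -0.10, -0.05, -0.15])  # anion (N+1 electrons)

f_plus  = q_N - q_Np1        # susceptibility to nucleophilic attack
f_minus = q_Nm1 - q_N        # susceptibility to electrophilic attack
dual    = f_plus - f_minus   # dual descriptor: >0 marks electrophilic sites

# Global electrophilicity omega = mu^2 / (2 eta) from assumed I and A (eV).
I, A = 9.0, 1.0
mu, eta = -(I + A) / 2, I - A
omega = mu**2 / (2 * eta)

print("local philicity omega * f+ :", np.round(omega * f_plus, 3))
print("dual descriptor            :", np.round(dual, 3))
```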
ERIC Educational Resources Information Center
Palma, Lisiane Celia; Pedrozo, Eugênio Ávila
2015-01-01
Several papers propose analytical methods relating to the inclusion of sustainability in courses and universities. However, as sustainability is a complex subject, methodological proposals on the topic must avoid making disjointed analyses which focus exclusively on curricula or on organisational strategy, as often seen in the literature.…
The Common Topoi of STEM Discourse: An Apologia and Methodological Proposal, with Pilot Survey
ERIC Educational Resources Information Center
Walsh, Lynda
2010-01-01
In this article, the author proposes a methodology for the rhetorical analysis of scientific, technical, mathematical, and engineering (STEM) discourse based on the common topics (topoi) of this discourse. Beginning with work by Miller, Prelli, and other rhetoricians of STEM discourse--but factoring in related studies in cognitive linguistics--she…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-28
... . You can search for the document by entering ``Public Notice '' in the Search bar. If necessary, use... the time and cost burden for this proposed collection, including the validity of the methodology and.... Methodology: The Bureau of Consular Affairs will be posting this form on Department of State Web sites to give...
The Construction and Analysis of a Science Story: A Proposed Methodology
ERIC Educational Resources Information Center
Klassen, Stephen
2009-01-01
Science educators are beginning to establish a theoretical and methodological foundation for constructing and using stories in science teaching. At the same time, it is not clear to what degree science stories that have recently been written adhere to the guidelines that are being proposed. The author has written a story about Louis Slotin, which…
Action Research as a Congruent Methodology for Understanding Wikis: The Case of Wikiversity
ERIC Educational Resources Information Center
Lawler, Cormac
2008-01-01
It is proposed that action research is an appropriate methodology for studying wikis, and is akin to research "the wiki way". This proposal is contextualised within the case of Wikiversity, a project of the Wikimedia Foundation. A framework for a participative research project is outlined, and challenges and implications of such a…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-06
... our estimate of the burden of the proposed collection, including the validity of the methodology and... conduct of the training or internship. Methodology: The collection will be submitted to the Department by... sponsor organization, or during the investigation of a complaint or incident. Dated: May 22, 2012. Robin J...
Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika
2017-01-01
Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.
Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika
2017-01-01
ABSTRACT Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness. PMID:28901217
Fuzzy logic controllers for electrotechnical devices - On-site tuning approach
NASA Astrophysics Data System (ADS)
Hissel, D.; Maussion, P.; Faucher, J.
2001-12-01
Fuzzy logic nowadays offers an interesting alternative to designers of nonlinear control laws for electrical or electromechanical systems. However, due to the huge number of tuning parameters, this kind of control is used in only a few industrial applications. This paper proposes a new, very simple, on-site tuning strategy for a PID-like fuzzy logic controller. Using the experimental design methodology, we propose sets of optimized pre-established settings for this kind of fuzzy controller. The proposed parameters depend on only one on-site open-loop identification test. In this way, this on-site tuning methodology is comparable to the Ziegler-Nichols method for conventional controllers. Experimental results (on a permanent magnet synchronous motor and on a DC/DC converter) underline the efficiency of this tuning methodology. Finally, the field of validity of the proposed pre-established settings is given.
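The paper's pre-established settings come from a designed-experiments study and are not reproduced in the abstract; the sketch below only illustrates the workflow of mapping one open-loop identification test to the scaling gains of a PID-like fuzzy controller. The formulas are invented placeholders in the spirit of Ziegler-Nichols, not the paper's table.

```python
def identify_foptd(step_test):
    """Stand-in for the on-site open-loop identification: return static
    gain K, time constant T and delay L from a step test (a real fit
    would use, e.g., the 28%/63% response-time method)."""
    return step_test  # (K, T, L) assumed already extracted

def fuzzy_scaling_gains(K, T, L):
    """Placeholder mapping from (K, T, L) to the scaling factors of a
    PID-like fuzzy controller: ge (error), gde (error derivative),
    gu (output). NOT the paper's pre-established settings."""
    ge = 1.0 / K
    gde = T / (10.0 * K)
    gu = 1.2 * T / (K * L)
    return ge, gde, gu

K, T, L = identify_foptd((2.0, 8.0, 0.5))   # assumed test result
ge, gde, gu = fuzzy_scaling_gains(K, T, L)
print(f"ge = {ge:.3f}, gde = {gde:.3f}, gu = {gu:.3f}")
```

The practical point mirrors Ziegler-Nichols: one cheap identification experiment fixes every free gain, so the fuzzy controller needs no expert tuning on site.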
NASA Astrophysics Data System (ADS)
Xu, Kun; Xu, Guo-Qing; Zheng, Chun-Hua
2016-04-01
The wheel-rail adhesion control for regenerative braking systems of high speed electric multiple unit trains is crucial to maintaining the stability, improving the adhesion utilization, and achieving deep energy recovery. There remain technical challenges mainly because of the nonlinear, uncertain, and varying features of wheel-rail contact conditions. This research analyzes the torque transmitting behavior during regenerative braking, and proposes a novel methodology to detect the wheel-rail adhesion stability. Then, applications to the wheel slip prevention during braking are investigated, and the optimal slip ratio control scheme is proposed, which is based on a novel optimal reference generation of the slip ratio and a robust sliding mode control. The proposed methodology achieves the optimal braking performance without the wheel-rail contact information. Numerical simulation results for uncertain slippery rails verify the effectiveness of the proposed methodology.
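A quarter-wheel toy model shows the structure of such an optimal-slip-ratio sliding mode law; the adhesion curve, gains and vehicle parameters below are illustrative assumptions, not the paper's design.

```python
import numpy as np

# Quarter-wheel sketch of optimal-slip-ratio tracking with a
# sliding-mode braking-torque law (all values assumed).
m, r, J, g, dt = 12_000.0, 0.46, 120.0, 9.81, 1e-3
mu_p, lam_p = 0.15, 0.10                    # peak adhesion and its slip

def mu(lam):                                 # assumed adhesion-slip curve
    return 2 * mu_p * lam_p * lam / (lam_p**2 + lam**2)

lam_ref = lam_p                              # optimal slip-ratio target
k_sw, phi = 4_000.0, 0.02                    # switching gain, boundary layer

v, omega = 70.0, 70.0 / r                    # initial speed, zero slip
T_eq = mu(lam_ref) * m * g * r               # equivalent-control torque
for _ in range(int(5.0 / dt)):
    lam = max(0.0, (v - omega * r) / max(v, 1e-3))
    s = lam - lam_ref                        # sliding surface
    # saturated (boundary-layer) switching term limits chattering
    T_brake = T_eq - k_sw * np.clip(s / phi, -1.0, 1.0)
    F = mu(lam) * m * g                      # wheel-rail adhesion force
    v += -(F / m) * dt                       # train deceleration
    omega = max(0.0, omega + (F * r - T_brake) / J * dt)
print(f"after 5 s: v = {v:.1f} m/s, slip = {lam:.3f} (target {lam_ref})")
```

The paper's contribution lies in generating the optimal slip-ratio reference online without contact information; here `lam_ref` is simply fixed at the assumed adhesion peak.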
Combining user logging with eye tracking for interactive and dynamic applications.
Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise
2015-12-01
User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions cause changes in the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained in this particular position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking with user logging (mouse and keyboard actions) with cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking-namely, marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and static stimuli in multiple research fields.
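The core referencing step can be sketched as replaying a merged interaction/gaze log against a viewport state; the fields and event format below are invented, not the authors' software.

```python
from dataclasses import dataclass

# Referencing gaze screen coordinates to map coordinates by replaying
# the user-interaction log against a simple viewport model (assumed).
@dataclass
class Viewport:
    center_x: float      # map units at the screen centre
    center_y: float
    units_per_px: float  # current zoom level
    width: int = 1920
    height: int = 1080

    def screen_to_map(self, px, py):
        mx = self.center_x + (px - self.width / 2) * self.units_per_px
        my = self.center_y - (py - self.height / 2) * self.units_per_px
        return mx, my

    def apply(self, event):
        kind, *args = event
        if kind == "pan":                 # pixels dragged
            dx, dy = args
            self.center_x -= dx * self.units_per_px
            self.center_y += dy * self.units_per_px
        elif kind == "zoom":              # zoom factor about the centre
            self.units_per_px /= args[0]

# Replay a merged, time-ordered log of interactions and gaze samples.
vp = Viewport(center_x=500_000.0, center_y=2_100_000.0, units_per_px=10.0)
log = [("pan", 120, -40), ("gaze", 900, 560), ("zoom", 2.0), ("gaze", 900, 560)]
for ev in log:
    if ev[0] == "gaze":
        print("gaze at map coords:", vp.screen_to_map(ev[1], ev[2]))
    else:
        vp.apply(ev)
```

The same gaze pixel yields different geographic coordinates before and after the zoom event, which is exactly the problem the combined logging solves across many participants.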
NASA Technical Reports Server (NTRS)
Hermann, Robert
1997-01-01
The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work between the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u); and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation system of the type encountered in physics problems (e.g. fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than considering CA as a 'discretization' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of the jet bundles and/or sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.
Computational Modeling of Mixed Solids for CO2 Capture Sorbents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Yuhua
2015-01-01
Since current technologies for capturing CO2 to fight global climate change are still too energy intensive, there is a critical need for the development of new materials that can capture CO2 reversibly with acceptable energy costs. Accordingly, solid sorbents have been proposed for CO2 capture applications through a reversible chemical transformation. By combining thermodynamic database mining with first-principles density functional theory and phonon lattice dynamics calculations, a theoretical screening methodology to identify the most promising CO2 sorbent candidates from the vast array of possible solid materials has been proposed and validated. The calculated thermodynamic properties of different classes of solid materials versus temperature and pressure changes were further used to evaluate the equilibrium properties of the CO2 adsorption/desorption cycles. According to the requirements imposed by pre- and post-combustion technologies, and based on our calculated thermodynamic properties for the CO2 capture reactions of the solids of interest, we were able to screen only those solid materials for which lower capture energy costs are expected at the desired pressure and temperature conditions. Only the selected CO2 sorbent candidates were further considered for experimental validation. The ab initio thermodynamic technique has the advantage of identifying the thermodynamic properties of CO2 capture reactions without any experimental input beyond crystallographic structural information of the solid phases involved. Such a methodology not only can be used to search for good candidates in existing databases of solid materials, but can also provide guidelines for synthesizing new materials. In this presentation, we apply our screening methodology to mixed solid systems in order to adjust the turnover temperature, to help develop CO2 capture technologies.
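The screening criterion can be illustrated with the turnover temperature T* at which the capture reaction free energy, approximated as dG(T) = dH - T dS with constant dH and dS, changes sign; the candidate values and temperature window below are invented placeholders, not the paper's computed data.

```python
# Screening sketch: keep sorbents whose turnover temperature T*
# (where dG = dH - T dS = 0) falls inside the window of the target
# capture technology. All numbers are illustrative.
candidates = {
    # sorbent oxide: (dH [kJ/mol CO2], dS [J/mol/K]) of its capture
    # reaction, e.g. MO + CO2 -> MCO3 (values assumed)
    "Li2O": (-200.0, -160.0),
    "MgO":  (-100.0, -175.0),
    "CaO":  (-178.0, -160.0),
    "Na2O": (-250.0, -155.0),
}
T_window = (700.0, 1300.0)   # high-T capture loop, K (assumed)

for name, (dH, dS) in candidates.items():
    T_star = 1e3 * dH / dS   # K; dH converted from kJ to J
    ok = T_window[0] <= T_star <= T_window[1]
    print(f"{name:5s} T* = {T_star:7.1f} K  {'keep' if ok else 'reject'}")
```

Mixing two solids shifts the effective dH and dS of the capture reaction, which is precisely the lever the presentation uses to tune T* into the desired window.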
New methodology for fast prediction of wheel wear evolution
NASA Astrophysics Data System (ADS)
Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.
2017-07-01
In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics with time. However, one of the principal drawbacks of existing methodologies for calculating wear evolution is the computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps. The first is the substitution of calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is the substitution of the dynamic calculation (time-integration calculations) by a quasi-static calculation (the solution of the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction of computational cost while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow the conclusion that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
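A compressed sketch of such a fast wear loop: a quasi-static solution at one characteristic point stands in for the whole network, feeding an Archard-type profile update over large distance steps. The contact stand-in and coefficients are invented, not the paper's calibrated model.

```python
# Fast wear-evolution loop with invented parameters.
K_WEAR = 3e-8          # wear coefficient (assumed, dimensionless)
HARDNESS = 2.4e9       # wheel steel hardness in Pa (assumed)
AREA = 1e-4            # nominal contact patch area in m^2 (assumed)

def quasi_static_contact(depth_mm):
    """Toy stand-in for the quasi-static vehicle solution: normal force
    [N] and sliding distance per metre rolled [-], both drifting
    slightly as the profile wears."""
    N = 60_000.0 * (1.0 + 0.05 * depth_mm)
    slip = 0.002 * (1.0 + 0.3 * depth_mm)
    return N, slip

depth_mm, step_km, total_km = 0.0, 5_000, 200_000
for _ in range(total_km // step_km):
    N, slip = quasi_static_contact(depth_mm)
    # Archard: worn volume ~ K * N * sliding distance / hardness
    dV = K_WEAR * N * slip * (step_km * 1e3) / HARDNESS   # m^3
    depth_mm += dV / AREA * 1e3                           # -> mm
print(f"predicted wear depth after {total_km} km: {depth_mm:.2f} mm")
```

Because each distance step needs only one quasi-static solve instead of a full time integration, the loop remains cheap even over hundreds of thousands of kilometres.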
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
... parties to comment on these methodological issues described above. Request for Comment on Interim Industry... comments. \\15\\ Indicator: GNI per capita, Atlas Method (current US$) is obtained from http://data.worldbank... methodology, the Department has encountered a number of methodological and practical challenges that must be...
Methodological challenges to human medical study.
Zhong, Yixin; Liu, Baoyan; Qu, Hua; Xie, Qi
2014-09-01
With the transformation of the modern medical pattern, medical studies are confronted with methodological challenges. By analyzing the two methodologies used in the study of physical-matter systems and information systems, the article points out that traditional Chinese medicine (TCM), especially treatment based on syndrome differentiation, embodies the information conception of methodological positions, while western medicine represents the matter conception. It proposes a new way of thinking about the combination of TCM and western medicine by combining the two kinds of methodological approaches.
A methodology for the evaluation of the human-bioclimatic performance of open spaces
NASA Astrophysics Data System (ADS)
Charalampopoulos, Ioannis; Tsiros, Ioannis; Chronopoulou-Sereli, Aik.; Matzarakis, Andreas
2017-05-01
The purpose of this paper is to present a simple methodology to improve the evaluation of the human-biometeorological benefits of open spaces. It is based on two groups of new indices built on the well-known PET index. This simple methodology, along with the accompanying indices, allows a qualitative and quantitative evaluation of the climatic behavior of the selected sites. The proposed methodology was applied in human-biometeorology research in the city of Athens, Greece. The results of this study are in line with those of other related studies, indicating the considerable influence of the sky view factor (SVF), the presence of vegetation and the building materials on human-biometeorological conditions. The proposed methodology may provide new insights into the decision-making process related to the best configuration of urban open spaces.
A case study analysis to examine motorcycle crashes in Bogota, Colombia.
Jimenez, Adriana; Bocarejo, Juan Pablo; Zarama, Roberto; Yerpez, Joël
2015-02-01
Contributory factors to motorcycle crashes vary among populations depending on several aspects, such as the users' profiles, the composition and density of traffic, and the infrastructure features. A better understanding of local motorcycle crashes can be reached in those places where a comprehensive analysis is performed. This paper presents the results obtained from a case study analysis of 400 police records of accidents involving motorcycles in Bogota. To achieve a deeper level of understanding of how these accidents occur, we propose a systemic approach that uses available crash data. The methodology is inspired by accident prototypical scenarios, a tool for analysis developed in France. When grouping cases we identified three categories: solo motorcycle accidents, motorcyclist and pedestrian accidents, and accidents involving a motorcycle and another vehicle. Within these categories we undertook in-depth analyses of 32 groups of accidents, obtaining valuable information to better comprehend motorcyclists' road crashes in a local context. Recurrent contributory factors in the groups of accidents include: inexperienced motorcyclists, wide urban roads that incite speeding and risky overtaking maneuvers, flowing urban roads that encourage high speed and increased interaction between vehicles, and lack of infrastructure maintenance. The results obtained are a valuable asset to define measures conveniently adapted to the group of accidents on which we want to act. The methodology presented in this paper is applicable to the study of road crashes involving all types of actors, not only motorcyclists, and in contexts different from those presented in Bogota. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bennett, C.; Dunne, J. F.; Trimby, S.; Richardson, D.
2017-02-01
A recurrent non-linear autoregressive with exogenous input (NARX) neural network is proposed, and a suitable fully-recurrent training methodology is adapted and tuned, for reconstructing cylinder pressure in multi-cylinder IC engines using measured crank kinematics. This type of indirect sensing is important for cost-effective closed-loop combustion control and for On-Board Diagnostics. The challenge addressed is to accurately predict cylinder pressure traces within the cycle under generalisation conditions, i.e. using data not previously seen by the network during training. This involves direct construction and calibration of a suitable inverse crank dynamic model, which, owing to singular behaviour at top-dead-centre (TDC), has proved difficult via physical model construction, calibration, and inversion. The NARX architecture is specialised and adapted to cylinder pressure reconstruction, using a fully-recurrent training methodology which is needed because the alternatives are too slow and unreliable for practical network training on production engines. The fully-recurrent Robust Adaptive Gradient Descent (RAGD) algorithm is tuned initially using synthesised crank kinematics, and then tested on real engine data to assess the reconstruction capability. Real data is obtained from a 1.125 l, 3-cylinder, in-line, direct injection spark ignition (DISI) engine involving synchronised measurements of crank kinematics and cylinder pressure across a range of steady-state speed and load conditions. The paper shows that a RAGD-trained NARX network using both crank velocity and crank acceleration as input information provides fast and robust training. By using the optimum epoch identified during RAGD training, acceptably accurate cylinder pressures, and especially accurate location-of-peak-pressure, can be reconstructed robustly under generalisation conditions, making it the most practical NARX configuration and recurrent training methodology for use on production engines.
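The paper's fully-recurrent RAGD training is specific to its implementation; the sketch below only illustrates the general NARX idea in series-parallel (teacher-forced) form, using scikit-learn's MLPRegressor on synthetic crank kinematics. All signals and lag counts here are invented for illustration.

```python
# Minimal series-parallel NARX sketch: an MLP maps lagged crank kinematics and
# past pressure samples to the next cylinder-pressure sample. The paper trains
# fully recurrently (RAGD); here teacher forcing is used for simplicity.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_narx_dataset(omega, alpha, p, n_lags=4):
    """Stack lagged crank velocity (omega), acceleration (alpha) and past
    pressure (p) into a regression matrix for one-step-ahead prediction."""
    X, y = [], []
    for t in range(n_lags, len(p)):
        X.append(np.concatenate([omega[t-n_lags:t], alpha[t-n_lags:t], p[t-n_lags:t]]))
        y.append(p[t])
    return np.array(X), np.array(y)

theta = np.linspace(0, 8*np.pi, 2000)                        # crank angle, two cycles
p = 5 + 40*np.exp(-((np.mod(theta, 4*np.pi) - 2*np.pi)**2))  # toy pressure trace
omega = 100 + 0.02*np.gradient(p)                            # toy crank speed fluctuation
alpha = np.gradient(omega)

X, y = make_narx_dataset(omega, alpha, p)
net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)
print("training R^2:", net.score(X, y))
```

In a parallel (fully recurrent) configuration, as used in the paper, the network's own past pressure predictions would be fed back in place of the measured samples.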
Kesar, Trisha M; Stinear, James W; Wolf, Steven L
2018-05-05
Neuroplasticity is a fundamental yet relatively unexplored process that can impact rehabilitation of lower extremity (LE) movements. Transcranial magnetic stimulation (TMS) has gained widespread application as a non-invasive brain stimulation technique for evaluating neuroplasticity of the corticospinal pathway. However, a majority of TMS studies have been performed on hand muscles, with a paucity of TMS investigations focused on LE muscles. This perspective review paper proposes that there are unique methodological challenges associated with using TMS to evaluate corticospinal excitability of lower limb muscles. The challenges include: (1) the deeper location of the LE motor homunculus; (2) difficulty with targeting individual LE muscles during TMS; and (3) differences in corticospinal circuitry controlling upper and lower limb muscles. We encourage future investigations that modify traditional methodological approaches to help address these challenges. Systematic TMS investigations are needed to determine the extent of overlap in corticomotor maps for different LE muscles. A simple, yet informative methodological solution involves simultaneous recordings from multiple LE muscles, which will provide the added benefit of observing how other relevant muscles co-vary in their responses during targeted TMS assessment directed toward a specific muscle. Furthermore, conventionally used TMS methods (e.g., determination of hot spot location and motor threshold) may need to be modified for TMS studies involving LE muscles. Additional investigations are necessary to determine the influence of testing posture as well as activation state of adjacent and distant LE muscles on TMS-elicited responses. An understanding of these challenges and solutions specific to LE TMS will improve the ability of neurorehabilitation clinicians to interpret TMS literature, and forge novel future directions for neuroscience research focused on elucidating neuroplasticity processes underlying locomotion and gait training.
Near field planar microwave probe sensor for nondestructive condition assessment of wood products
NASA Astrophysics Data System (ADS)
Tiwari, Nilesh Kumar; Singh, Surya Prakash; Akhtar, M. Jaleel
2018-06-01
In this work, a unified methodology based on a newly designed electrically small planar resonant microwave sensor to detect subsurface defects in wood products is presented. The proposed planar sensor involves loading a specially designed coupled microstrip line with a novel small resonating element at its end. The novel design topology of the proposed near-field sensor substantially increases the overall resolution and sensitivity of the microwave scanning system due to the strong localization of the electric field in the electrically small sensing region. A detailed electromagnetic and quasi-static analysis of the near-field scanning mechanism is also described in this work, which helps to understand the physics involved in the proposed scanning mechanism. The prototype of the designed sensor is fabricated on a 0.8 mm Rogers 5880 substrate, and accordingly, the scattering parameters of the sensor under both loaded and unloaded conditions are measured. The measured and simulated scattering parameters under the unloaded condition are compared to validate the fabricated sensor, and a close match between the simulated and measured resonance frequencies is observed. The fabricated sensor is used here for two potential applications, viz., the dielectric sensing of various low-permittivity-contrast dielectric materials and the subsurface imaging of wood products to trace concealed defects and moisture content under a thin paint layer. The proposed resonant sensor can potentially be used to develop a low-profile, low-cost, non-destructive, and non-invasive quality monitoring system for inspecting various types of wood products without peeling off the upper paint coating.
NASA Astrophysics Data System (ADS)
Besson, Ugo; Borghi, Lidia; De Ambrosis, Anna; Mascheretti, Paolo
2010-07-01
We have developed a teaching-learning sequence (TLS) on friction based on a preliminary study involving three dimensions: an analysis of didactic research on the topic, an overview of usual approaches, and a critical analysis of the subject, considered also in its historical development. We found that most of the usual presentations do not take into account the complexity of friction as it emerges from scientific research, may reinforce some inaccurate student conceptions, and favour a limited vision of friction phenomena. The TLS we propose begins by considering a wide range of friction phenomena to favour an initial motivation and a broader view of the topic, and then develops a path of interrelated observations, experiments, and theoretical aspects. It proposes the use of structural models, involving visual representations and stimulating intuition, aimed at helping students build mental models of friction mechanisms. To facilitate reproducibility in school contexts, the sequence is designed as an open-source structure, with a core of contents, conceptual correlations and methodological choices, and a cloud of elements that can be re-designed by teachers. The sequence has been tested in teacher education and in upper secondary school, and has shown positive results in overcoming student difficulties and stimulating richer reasoning based on the structural models we suggested. The proposed path has modified the teachers' view of the topic, producing a motivation to change their traditional presentations. The open structure of the sequence has facilitated its implementation by teachers in school in coherence with the rationale of the proposal.
Achieving Systemic Information Operations for Australian Defence
1999-10-01
is Checkland’s Soft Systems Methodology, and some emphasis is placed on this methodology in the present document. Other soft methodologies also exist... The methodology that will be adopted will be one chosen from the burgeoning field of soft systems theory, for example Checkland’s Soft Systems Methodology (SSM) [8].
Intentionality, degree of damage, and moral judgments.
Berg-Cross, L G
1975-12-01
153 first graders were given Piagetian moral judgment problems with a new simplified methodology as well as the usual story-pair paradigm. The new methodology involved making quantitative judgments about single stories and examined the influence of level of intentionality and degree of damage upon absolute punishment ratings. Contrary to results obtained with a story-pair methodology, it was found that with single stories even 6-year-old children responded to the level of intention in the stories as well as the quantity and quality of damage involved. This suggested that Piaget's methodology may be forcing children to employ a simplifying strategy while under other conditions they are able to perform the mental operations necessary to make complex moral judgments.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-08
...; Comment Request; Study Logistic Formative Research Methodology Studies for the National Children's Study... Collection Title: Study Logistics Formative Research Methodology Studies for the National Children's Study... national longitudinal study of environmental influences (including physical, chemical, biological, and...
Chen, Gang; Adleman, Nancy E; Saad, Ziad S; Leibenluft, Ellen; Cox, Robert W
2014-10-01
All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance-covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations in the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power via combining traditional sphericity correction methods (Greenhouse-Geisser and Huynh-Feldt) with MVT-WS. To validate the MVM methodology, we performed simulations to assess the controllability for false positives and power achievement. A real FMRI dataset was analyzed to demonstrate the capability of the MVM approach. The methodology has been implemented into an open source program 3dMVM in AFNI, and all the statistical tests can be performed through symbolic coding with variable names instead of the tedious process of dummy coding. Our data indicates that the severity of sphericity violation varies substantially across brain regions. The differences among various modeling methodologies were addressed through direct comparisons between the MVM approach and some of the GLM implementations in the field, and the following two issues were raised: a) the improper formulation of test statistics in some univariate GLM implementations when a within-subject factor is involved in a data structure with two or more factors, and b) the unjustified presumption of uniform sphericity violation and the practice of estimating the variance-covariance structure through pooling across brain regions. Published by Elsevier Inc.
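As background to the sphericity corrections the abstract mentions, the following sketch computes the Greenhouse-Geisser epsilon for one within-subject factor from the sample covariance of its levels, using the standard Box formula; this is a textbook illustration, not the 3dMVM implementation.

```python
# Greenhouse-Geisser epsilon for a k-level within-subject factor, computed from
# the sample covariance of the repeated measures (standard Box formula).
import numpy as np

def gg_epsilon(data):
    """data: (n_subjects, k_levels). Returns epsilon in [1/(k-1), 1]."""
    n, k = data.shape
    S = np.cov(data, rowvar=False)               # k x k covariance of levels
    # Orthonormal contrasts: (k-1) orthonormal rows orthogonal to the ones vector.
    Q, _ = np.linalg.qr(np.column_stack([np.ones(k), np.eye(k)[:, :k-1]]))
    M = Q[:, 1:].T                               # (k-1) x k
    T = M @ S @ M.T                              # covariance of contrast scores
    return np.trace(T)**2 / ((k - 1) * np.trace(T @ T))

rng = np.random.default_rng(1)
# Strongly non-spherical toy data: level variances grow with the level index.
data = rng.normal(size=(30, 4)) * np.array([1.0, 1.0, 2.0, 4.0])
print(gg_epsilon(data))   # well below 1, flagging a sphericity violation
```

Multiplying the ANOVA degrees of freedom by this epsilon is the Greenhouse-Geisser correction; the abstract's point is that applying one pooled epsilon to every voxel is not justified when the severity of violation varies across the brain.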
NASA Technical Reports Server (NTRS)
Nakajima, Yukio; Padovan, Joe
1987-01-01
In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving-load problems involving contact-impact type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered. These include the rolling/sliding impact of tires with road obstructions.
Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?
ERIC Educational Resources Information Center
Brondani, Mario; He, Sarah
2013-01-01
Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…
Combining users' activity survey and simulators to evaluate human activity recognition systems.
Azkune, Gorka; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming
2015-04-08
Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies, where experiments with humans are run with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome these problems, based on surveys of users and a synthetic dataset generator tool. Surveys allow capturing how different users perform activities of daily living, while the synthetic dataset generator is used to create properly labelled activity datasets modelled with the information extracted from the surveys. Important aspects, such as sensor noise, varying time lapses and erratic user behaviour, can also be simulated using the tool. The proposed methodology is shown to have very important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset by computing the similarity between sensor occurrence frequencies. It is concluded that the similarity between both datasets is more than significant.
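A minimal sketch of the evaluation step, assuming events are simple lists of sensor activations and that cosine similarity is an acceptable stand-in for the paper's similarity measure:

```python
# Comparing a real and a synthetic activity dataset through sensor occurrence
# frequencies; cosine similarity is used here as one plausible choice.
from collections import Counter
import math

def frequency_vector(events, sensors):
    """events: list of sensor ids fired; returns relative frequencies."""
    counts = Counter(events)
    total = sum(counts.values()) or 1
    return [counts[s] / total for s in sensors]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

sensors = ["door", "fridge", "kettle", "tap", "bed"]
real = ["door", "fridge", "kettle", "kettle", "tap", "bed", "door"]
synthetic = ["door", "fridge", "kettle", "tap", "tap", "bed", "door"]
print(cosine(frequency_vector(real, sensors), frequency_vector(synthetic, sensors)))
```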
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change regarding the rainfall models accepted at the input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II model rainfall, are no longer sufficient. There is an urgent need to clarify the methodology for standardized rainfall hyetographs that takes into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of collections of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs achieved as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic detection of the necessary capacity of retention reservoirs.
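A minimal sketch of the proposed classification idea: each event is reduced to a dimensionless cumulative hyetograph and the curves are grouped with k-means (one common choice of cluster analysis; the paper's exact standardization and clustering settings may differ).

```python
# Each rainfall event is standardized to a dimensionless cumulative hyetograph
# (fraction of total depth vs. fraction of duration); the curves are clustered
# and the centroids serve as candidate synthetic hyetographs.
import numpy as np
from sklearn.cluster import KMeans

def standardize_event(depths, n_points=20):
    """depths: rainfall depth per time step of one event. Returns the
    cumulative mass curve resampled at n_points fractions of duration."""
    cum = np.cumsum(depths) / np.sum(depths)
    t = np.linspace(0, 1, len(depths))
    return np.interp(np.linspace(0, 1, n_points), t, cum)

rng = np.random.default_rng(2)
events = [rng.gamma(2.0, 1.0, size=rng.integers(6, 30)) for _ in range(200)]
X = np.vstack([standardize_event(e) for e in events])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
centroids = km.cluster_centers_        # candidate standardized hyetographs
print(centroids.shape)                 # (4, 20)
```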
Colombini, Daniela; Occhipinti, Enrico; Peluso, Raffaele; Montomoli, Loretta
2012-01-01
In August 2009, an international group was founded with the task of developing a "toolkit for MSD prevention" under the IEA and in collaboration with the World Health Organization. According to the ISO standard 11228 series and the new draft ISO TR 12259 "Application document guides for the potential user", our group developed a preliminary "mapping" methodology for occupational hazards in the craft industry, supported by software (Excel®, free download at www.epmresearch.org). The possible users of the toolkits are: members of health and safety committees; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers providing basic occupational health services; and occupational health and safety specialists. The proposed methodology, using specific key enters and quick assessment criteria, allows a simple identification of ergonomics hazards and estimation of risk. It is thus possible to decide for which occupational hazards a more exhaustive risk assessment is necessary and which occupational consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.). The methodology has been applied in different small and medium Italian craft enterprises: leather goods, food, dental technology, production of artistic ceramics and stained glass, and beekeeping activities. The results are synthetically reported and discussed in this paper.
Appropriate methodologies for empirical bioethics: it's all relative.
Ives, Jonathan; Draper, Heather
2009-05-01
In this article we distinguish between philosophical bioethics (PB), descriptive policy-oriented bioethics (DPOB) and normative policy-oriented bioethics (NPOB). We argue that finding an appropriate methodology for combining empirical data and moral theory depends on the aims of the research endeavour, and that, for the most part, this combination is only required for NPOB. After briefly discussing the debate around the is/ought problem, and suggesting that both sides of this debate misunderstand one another (i.e. one side treats it as a conceptual problem, whilst the other treats it as an empirical claim), we outline and defend a methodological approach to NPOB based on work we have carried out on a project exploring the normative foundations of paternal rights and responsibilities. We suggest that, given the prominent role already played by moral intuition in moral theory, one appropriate way to integrate empirical data and philosophical bioethics is to utilize empirically gathered lay intuition as the foundation for ethical reasoning in NPOB. The method we propose involves a modification of a long-established tradition of non-intervention in qualitative data gathering, combined with a form of reflective equilibrium where the demands of theory and data are given equal weight and a pragmatic compromise is reached.
A graph-based approach to detect spatiotemporal dynamics in satellite image time series
NASA Astrophysics Data System (ADS)
Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal
2017-08-01
Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with such kinds of data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series, generating a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be explored in depth at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the South of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
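A minimal sketch of the evolution-graph construction, assuming image objects are given as sets of pixel ids and that a Jaccard-overlap threshold decides whether two objects at consecutive timestamps are connected (the paper's actual linking criterion may differ):

```python
# Image objects at consecutive timestamps are linked when their spatial overlap
# is high enough; connected components of the graph are the evolution graphs.
import networkx as nx

def jaccard(a, b):
    return len(a & b) / len(a | b)

def build_evolution_graph(series, threshold=0.2):
    """series: list over time of dicts {object_id: set_of_pixel_ids}."""
    G = nx.Graph()
    for t, objects in enumerate(series):
        for oid in objects:
            G.add_node((t, oid))
    for t in range(len(series) - 1):
        for oid, pix in series[t].items():
            for nid, npix in series[t + 1].items():
                if jaccard(pix, npix) >= threshold:
                    G.add_edge((t, oid), (t + 1, nid))
    return G

# Toy series: one stable field and one that splits at the second timestamp.
series = [
    {"a": {1, 2, 3, 4}, "b": {10, 11, 12}},
    {"a": {1, 2, 3, 4}, "b1": {10, 11}, "b2": {12, 13}},
]
G = build_evolution_graph(series)
print([sorted(c) for c in nx.connected_components(G)])
```

Each connected component corresponds to one area of the study site whose history (stability, splits, merges) can then be mined.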
NASA Astrophysics Data System (ADS)
Asoodeh, Mojtaba; Bagheripour, Parisa
2012-01-01
Measurement of compressional, shear, and Stoneley wave velocities, carried out by dipole sonic imager (DSI) logs, provides invaluable data for geophysical interpretation, geomechanical studies and hydrocarbon reservoir characterization. The present study proposes an improved methodology for establishing a quantitative formulation between conventional well logs and sonic wave velocities. First, sonic wave velocities were predicted from conventional well logs using artificial neural network, fuzzy logic, and neuro-fuzzy algorithms. Subsequently, a committee machine with intelligent systems was constructed by virtue of a hybrid genetic algorithm-pattern search technique, while the outputs of the artificial neural network, fuzzy logic and neuro-fuzzy models were used as inputs of the committee machine. It is capable of improving the accuracy of the final prediction by integrating the outputs of the aforementioned intelligent systems. The hybrid genetic algorithm-pattern search tool, embodied in the structure of the committee machine, assigns a weight factor to each individual intelligent system, indicating its involvement in the overall prediction of the DSI parameters. This methodology was implemented in the Asmari formation, which is the major carbonate reservoir rock of Iranian oil fields. A group of 1,640 data points was used to construct the intelligent model, and a group of 800 data points was employed to assess the reliability of the proposed model. The results showed that the committee machine with intelligent systems performed more effectively than the individual intelligent systems performing alone.
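The following sketch shows the committee-machine combination step in miniature. The paper optimizes the weights with a hybrid genetic algorithm-pattern search; here SLSQP from SciPy is used as a convenient stand-in for the same constrained least-squares objective, and the "expert" predictions are simulated.

```python
# Committee-machine sketch: find weights for the predictions of several expert
# models by minimizing the mean-squared error of their weighted sum.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
y = rng.normal(size=200)                       # measured velocities (toy data)
preds = np.vstack([y + rng.normal(scale=s, size=200) for s in (0.2, 0.5, 1.0)])

def mse(w):
    return np.mean((w @ preds - y) ** 2)

res = minimize(mse, x0=np.ones(3) / 3, method="SLSQP",
               bounds=[(0, 1)] * 3,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
print(res.x)   # most weight on the most accurate expert (noise 0.2)
```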
NASA Astrophysics Data System (ADS)
Bazilevs, Y.; Moutsanidis, G.; Bueno, J.; Kamran, K.; Kamensky, D.; Hillman, M. C.; Gomez, H.; Chen, J. S.
2017-07-01
In this two-part paper we begin the development of a new class of methods for modeling fluid-structure interaction (FSI) phenomena for air blast. We aim to develop accurate, robust, and practical computational methodology, which is capable of modeling the dynamics of air blast coupled with the structure response, where the latter involves large, inelastic deformations and disintegration into fragments. An immersed approach is adopted, which leads to an a-priori monolithic FSI formulation with intrinsic contact detection between solid objects, and without formal restrictions on the solid motions. In Part I of this paper, the core air-blast FSI methodology suitable for a variety of discretizations is presented and tested using standard finite elements. Part II of this paper focuses on a particular instantiation of the proposed framework, which couples isogeometric analysis (IGA) based on non-uniform rational B-splines and a reproducing-kernel particle method (RKPM), which is a meshfree technique. The combination of IGA and RKPM is felt to be particularly attractive for the problem class of interest due to the higher-order accuracy and smoothness of both discretizations, and relative simplicity of RKPM in handling fragmentation scenarios. A collection of mostly 2D numerical examples is presented in each of the parts to illustrate the good performance of the proposed air-blast FSI framework.
Baumgartner, Thomas; Jäkle, Frieder; Rulkens, Ron; Zech, Gernot; Lough, Alan J; Manners, Ian
2002-08-28
To obtain mechanistic insight, detailed studies of the intriguing "spontaneous" ambient-temperature ring-opening polymerization (ROP) of the tin-bridged [1]ferrocenophanes Fe(η-C5H4)2SnR2, 3a (R = t-Bu) and 3b (R = Mes), in solution have been performed. The investigations explored the influence of non-nucleophilic additives such as radicals and radical traps, neutral and anionic nucleophiles, Lewis acids, protic species, and other cationic electrophiles. Significantly, two novel methodologies and mechanisms for the ROP of strained [1]ferrocenophanes are proposed based on this study. First, as the addition of amine nucleophiles such as pyridine was found to strongly accelerate the polymerization rate in solution, a new nucleophilically assisted ROP methodology was proposed. This operates at ambient temperature in solution even in the presence of chlorosilanes but, unlike the anionic polymerization of ferrocenophanes, does not involve cyclopentadienyl anions. Second, the addition of small quantities of the electrophilic species H+ and Bu3Sn+ was found to lead to a cationic ROP process. These studies suggest that the "spontaneous" ROP of tin-bridged [1]ferrocenophanes may be a consequence of the presence of spurious, trace quantities of Lewis basic or acidic impurities. The new ROP mechanisms reported are likely to be of general significance for the ROP of other metallocenophanes (e.g., for thermal ROP in the melt) and for other metallacycles containing group 14 elements.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-27
... proposed methodological change to reduce the export price or constructed export price in certain non-market... Magnesium, as upheld in the Mag. Corp. cases, with respect to China and Vietnam. Accordingly, pursuant to... the price. In such cases, the Department would adjust the export price or constructed export price...
Mechanical modulation method for ultrasensitive phase measurements in photonics biosensing.
Patskovsky, S; Maisonneuve, M; Meunier, M; Kabashin, A V
2008-12-22
A novel polarimetry methodology for phase-sensitive measurements in single-reflection geometry is proposed for applications in optical transduction-based biological sensing. The methodology uses alternating step-like chopper-based mechanical phase modulation of the orthogonal s- and p-polarizations of light reflected from the sensing interface, and the extraction of phase information at different harmonics of the modulation. We show that even with a relatively simple experimental arrangement, the methodology provides a resolution of phase measurements as low as 0.007 deg. We also examine the proposed approach in Total Internal Reflection (TIR) and Surface Plasmon Resonance (SPR) geometries. For the TIR geometry, the response appears to be strongly dependent on the prism material, with the best values for high-refractive-index Si. The detection limit for Si-based TIR is estimated as 10^-5 in terms of Refractive Index Units (RIU). The SPR geometry offers a much stronger phase response due to much sharper phase characteristics. With a detection limit of 3.2×10^-7 RIU, the proposed methodology provides one of the best sensitivities among phase-sensitive SPR devices. Advantages of the proposed method include high sensitivity, simplicity of the experimental setup and noise immunity resulting from the high stability of the modulation.
Methodological Behaviorism from the Standpoint of a Radical Behaviorist.
Moore, J
2013-01-01
Methodological behaviorism is the name for a prescriptive orientation to psychological science. Its first and original feature is that the terms and concepts deployed in psychological theories and explanations should be based on observable stimuli and behavior. I argue that the interpretation of the phrase "based on" has changed over the years because of the influence of operationism. Its second feature, which developed after the first and is prominent in contemporary psychology, is that research should emphasize formal testing of a theory that involves mediating theoretical entities from a nonbehavioral dimension according to the hypothetico-deductive method. I argue that for contemporary methodological behaviorism, explanations of the behavior of both participants and scientists appeal to the mediating entities as mental causes, if only indirectly. In contrast to methodological behaviorism is the radical behaviorism of B. F. Skinner. Unlike methodological behaviorism, radical behaviorism conceives of verbal behavior in terms of an operant process that involves antecedent circumstances and reinforcing consequences, rather than in terms of a nonbehavioral process that involves reference and symbolism. In addition, radical behaviorism recognizes private behavioral events and subscribes to research and explanatory practices that do not include testing hypotheses about supposed mediating entities from another dimension. I conclude that methodological behaviorism is actually closer to mentalism than to Skinner's radical behaviorism.
Evaluation of Model-Based Training for Vertical Guidance Logic
NASA Technical Reports Server (NTRS)
Feary, Michael; Palmer, Everett; Sherry, Lance; Polson, Peter; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)
1997-01-01
This paper will summarize the results of a study which introduces a structured, model-based approach to learning how the automated vertical guidance system works on a modern commercial air transport. The study proposes a framework to provide accurate and complete information in an attempt to eliminate confusion about 'what the system is doing'. This study will examine a structured methodology for organizing the ideas on which the system was designed, communicating this information through the training material, and displaying it in the airplane. Previous research on model-based, computer-aided instructional technology has shown reductions in the amount of time to reach a specified level of competence. The lessons learned from the development of these technologies are well suited for use with the design methodology which was used to develop the vertical guidance logic for a large commercial air transport. The design methodology presents the model from which to derive the training material, and the content of the information to be displayed to the operator. The study consists of a 2 × 2 factorial experiment which will compare a new method of training vertical guidance logic and a new type of display. The format of the material used to derive both the training and the display will be provided by the Operational Procedure Methodology. The training condition will compare current training material to the new structured format. The display condition will involve a change of the content of the information displayed into pieces that agree with the concepts with which the system was designed.
Wilkins, Emma L; Morris, Michelle A; Radley, Duncan; Griffiths, Claire
2017-03-01
Geographic Information Systems (GIS) are widely used to measure retail food environments. However, the methods used are heterogeneous, limiting the collation and interpretation of evidence. This problem is amplified by unclear and incomplete reporting of methods. This discussion (i) identifies common dimensions of methodological diversity across GIS-based food environment research (data sources, data extraction methods, food outlet construct definitions, geocoding methods, and access metrics), (ii) reviews the impact of different methodological choices, and (iii) highlights areas where reporting is insufficient. On the basis of this discussion, the Geo-FERN reporting checklist is proposed to support methodological reporting and interpretation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Fathollah Bayati, Mohsen; Sadjadi, Seyed Jafar
2017-01-01
In this paper, new Network Data Envelopment Analysis (NDEA) models are developed to evaluate the efficiency of regional electricity power networks. The primary objective is to account for perturbations in the data by developing new NDEA models based on the adaptation of robust optimization methodology. Furthermore, the efficiency of entire electricity power networks, involving the generation, transmission and distribution stages, is measured. While DEA has been widely used to evaluate the efficiency of the components of electricity power networks during the past two decades, no study has evaluated the efficiency of electricity power networks as a whole. The proposed models are applied to evaluate the efficiency of 16 regional electricity power networks in Iran, and the effect of data uncertainty is also investigated. The results are compared with those of the traditional network DEA and parametric SFA methods. Validity and verification of the proposed models are also investigated. The preliminary results indicate that the proposed models are more reliable than the traditional network DEA model.
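The robust network models of the paper are too involved for a snippet, but the basic building block they extend, the input-oriented CCR DEA efficiency score, is a small linear program, sketched below with illustrative data.

```python
# Classic input-oriented CCR DEA efficiency of one unit, solved as a linear
# program (the paper's robust *network* DEA models extend this building block).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """X: (m_inputs, n_units), Y: (s_outputs, n_units), o: index of the unit.
    min theta  s.t.  X@lam <= theta * X[:, o],  Y@lam >= Y[:, o],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n); c[0] = 1.0                       # minimize theta
    A_ub = np.zeros((m + s, 1 + n)); b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, o]; A_ub[:m, 1:] = X              # X@lam - theta*x_o <= 0
    A_ub[m:, 1:] = -Y; b_ub[m:] = -Y[:, o]                # -Y@lam <= -y_o
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

X = np.array([[2., 3., 6.], [4., 2., 8.]])   # 2 inputs, 3 units
Y = np.array([[1., 1., 1.]])                 # 1 output
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])  # [1.0, 1.0, 0.4]
```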
Balakumar, Pitchai; Inamdar, Mohammed Naseeruddin; Jagadeesh, Gowraganahalli
2013-01-01
An interactive workshop on ‘The Critical Steps for Successful Research: The Research Proposal and Scientific Writing’ was conducted in conjunction with the 64th Annual Conference of the Indian Pharmaceutical Congress-2012 at Chennai, India. In essence, research is performed to enlighten our understanding of a contemporary issue relevant to the needs of society. To accomplish this, a researcher begins search for a novel topic based on purpose, creativity, critical thinking, and logic. This leads to the fundamental pieces of the research endeavor: Question, objective, hypothesis, experimental tools to test the hypothesis, methodology, and data analysis. When correctly performed, research should produce new knowledge. The four cornerstones of good research are the well-formulated protocol or proposal that is well executed, analyzed, discussed and concluded. This recent workshop educated researchers in the critical steps involved in the development of a scientific idea to its successful execution and eventual publication. PMID:23761709
Chaves, Gabriela Costa
2007-01-01
Objective: This study aims to propose a framework for measuring the degree of public health-sensitivity of patent legislation reformed after the World Trade Organization's TRIPS (Trade-Related Aspects of Intellectual Property Rights) Agreement entered into force. Methods: The methodology for establishing and testing the proposed framework involved three main steps: (1) a literature review on TRIPS flexibilities related to the protection of public health and provisions considered "TRIPS-plus"; (2) content validation through consensus techniques (an adaptation of the Delphi method); and (3) an analysis of patent legislation from nineteen Latin American and Caribbean countries. Findings: The results show that the framework detected relevant differences in countries' patent legislation, allowing for country comparisons. Conclusion: The framework's potential usefulness in monitoring patent legislation changes arises from its clear parameters for measuring patent legislation's degree of health sensitivity. Nevertheless, it can be improved by including indicators related to government and organized society initiatives that minimize free-trade agreements' negative effects on access to medicines. PMID:17242758
NASA Technical Reports Server (NTRS)
Shkarayev, S.; Krashantisa, R.; Tessler, A.
2004-01-01
An important and challenging technology aimed at the next generation of aerospace vehicles is that of structural health monitoring. The key problem is to determine accurately, reliably, and in real time the applied loads, stresses, and displacements experienced in flight, with such data establishing an information database for structural health monitoring. The present effort is aimed at developing a finite element-based methodology involving an inverse formulation that employs measured surface strains to recover the applied loads, stresses, and displacements in an aerospace vehicle in real time. The computational procedure uses a standard finite element model (i.e., "direct analysis") of a given airframe, with the subsequent application of the inverse interpolation approach. The inverse interpolation formulation is based on a parametric approximation of the loading and is further constructed through a least-squares minimization of calculated and measured strains. This procedure results in the governing system of linear algebraic equations, providing the unknown coefficients that accurately define the load approximation. Numerical simulations are carried out for problems involving various levels of structural approximation. These include plate-loading examples and an aircraft wing box. Accuracy and computational efficiency of the proposed method are discussed in detail. The experimental validation of the methodology by way of structural testing of an aircraft wing is also discussed.
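A minimal sketch of the inverse step under the stated least-squares formulation: assuming the load is parametrized by a few coefficients and the direct finite element model supplies the strain response to each unit load case, the coefficients follow from one least-squares solve. The sensitivity matrix here is random, standing in for FE output.

```python
# Inverse interpolation sketch: the load is a linear combination of basis load
# cases; the strain response to each basis case (from the direct FE analysis)
# forms a sensitivity matrix, and the load coefficients follow from a
# least-squares fit to the measured strains.
import numpy as np

rng = np.random.default_rng(4)
n_gauges, n_params = 40, 5

# A: strain at each gauge per unit of each load parameter (FE output; random here).
A = rng.normal(size=(n_gauges, n_params))
p_true = np.array([1.0, -0.5, 2.0, 0.0, 0.3])          # "flight" load coefficients
eps_measured = A @ p_true + rng.normal(scale=0.05, size=n_gauges)  # noisy gauges

p_hat, *_ = np.linalg.lstsq(A, eps_measured, rcond=None)
print(np.round(p_hat, 2))        # close to p_true; loads recovered in one solve
```

Because the solve is a single linear system, the recovery can run in real time once the sensitivity matrix has been precomputed from the direct analysis.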
Computational synchronization of microarray data with application to Plasmodium falciparum.
Zhao, Wei; Dauwels, Justin; Niles, Jacquin C; Cao, Jianshu
2012-06-21
Microarrays are widely used to investigate the blood stage of Plasmodium falciparum infection. Starting with synchronized cells, gene expression levels are continually measured over the 48-hour intra-erythrocytic cycle (IDC). However, the cell population gradually loses synchrony during the experiment. As a result, the microarray measurements are blurred. In this paper, we propose a generalized deconvolution approach to reconstruct the intrinsic expression pattern, and apply it to P. falciparum IDC microarray data. We develop a statistical model for the decay of synchrony among cells, and reconstruct the expression pattern through statistical inference. The proposed method can handle microarray measurements with noise and missing data. The original gene expression patterns become more apparent in the reconstructed profiles, making it easier to analyze and interpret the data. We hypothesize that reconstructed gene expression patterns represent better temporally resolved expression profiles that can be probabilistically modeled to match changes in expression level to IDC transitions. In particular, we identify transcriptionally regulated protein kinases putatively involved in regulating the P. falciparum IDC. By analyzing publicly available microarray data sets for the P. falciparum IDC, protein kinases are ranked in terms of their likelihood to be involved in regulating transitions between the ring, trophozoite and schizont developmental stages of the P. falciparum IDC. In our theoretical framework, a few protein kinases have high probability rankings, and could potentially be involved in regulating these developmental transitions. This study proposes a new methodology for extracting intrinsic expression patterns from microarray data. By applying this method to P. falciparum microarray data, several protein kinases are predicted to play a significant role in the P. falciparum IDC. Earlier experiments have indeed confirmed that several of these kinases are involved in this process. Overall, these results indicate that further functional analysis of these additional putative protein kinases may reveal new insights into how the P. falciparum IDC is regulated.
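A minimal sketch of the generalized deconvolution idea, assuming a Gaussian synchrony-loss kernel whose width grows with time and Tikhonov regularization for the inversion (the paper's statistical model and inference are more elaborate):

```python
# The measured population profile is the intrinsic single-cell profile blurred
# by a spreading kernel (cells drifting out of synchrony); Tikhonov-regularized
# least squares inverts the blur.
import numpy as np

T = 48                                            # hours in the IDC
t = np.arange(T)
sigma = 0.15 * (t + 1)                            # synchrony loss grows with time

# K[i, j]: weight of intrinsic expression at phase j in the measurement at time i
K = np.exp(-0.5 * ((t[None, :] - t[:, None]) / sigma[:, None]) ** 2)
K /= K.sum(axis=1, keepdims=True)

intrinsic = np.exp(-0.5 * ((t - 30) / 3.0) ** 2)  # sharp expression peak at 30 h
measured = K @ intrinsic + np.random.default_rng(5).normal(scale=0.01, size=T)

alpha = 1e-2                                      # regularization strength
recovered = np.linalg.solve(K.T @ K + alpha * np.eye(T), K.T @ measured)
print(float(measured.max()), float(recovered.max()))  # recovered peak is sharper
```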
Classification of Alzheimer's Patients through Ubiquitous Computing.
Nieto-Reyes, Alicia; Duque, Rafael; Montaña, José Luis; Lage, Carmen
2017-07-21
Functional data analysis and artificial neural networks are the building blocks of the proposed methodology, which distinguishes the movement patterns of Alzheimer's patients at different stages of the disease and classifies new patients to their appropriate stage. The movement patterns are obtained by the accelerometer device of Android smartphones that the patients carry while moving freely. The proposed methodology is relevant in that it is flexible regarding the type of data to which it is applied. To exemplify this, a novel real three-dimensional functional dataset is analyzed in which each datum is observed in a different time domain: not only is each datum observed at a different frequency, but the domain of each datum also has a different length. The obtained classification success rate of 83% indicates the potential of the proposed methodology.
A methodology for secure recovery of spacecrafts based on a trusted hardware platform
NASA Astrophysics Data System (ADS)
Juliato, Marcio; Gebotys, Catherine
2017-02-01
This paper proposes a methodology for the secure recovery of spacecraft and of their cryptographic capabilities in emergency scenarios resulting from major unintentional failures and malicious attacks. The proposed approach employs trusted modules to achieve higher reliability and security levels in space missions due to the presence of integrity-check capabilities as well as secure recovery mechanisms. Additionally, several recovery protocols are thoroughly discussed and analyzed against a wide variety of attacks. Exhaustive search attacks are analyzed in a wide variety of contexts and shown to be infeasible, independently of the computational power of attackers. Experimental results have shown that the proposed methodology allows for the fast and secure recovery of spacecraft, demanding minimal implementation area, power consumption and bandwidth.
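The paper's protocols are tied to its trusted-module design; as a generic illustration of the integrity-check ingredient, the sketch below verifies a recovery image with a keyed MAC before it would be installed. Names and the 32-byte key are of course illustrative.

```python
# Generic integrity-check sketch: a keyed MAC lets a trusted module verify that
# a recovery image came from the ground segment and was not altered. The actual
# on-board protocols in the paper are more elaborate than this.
import hmac, hashlib, secrets

key = secrets.token_bytes(32)                  # shared secret held by the trusted module

def tag(image: bytes) -> bytes:
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_and_install(image: bytes, received_tag: bytes) -> bool:
    # compare_digest avoids leaking information through timing
    return hmac.compare_digest(tag(image), received_tag)

image = b"recovery firmware v2"
t = tag(image)
print(verify_and_install(image, t))            # True
print(verify_and_install(image + b"!", t))     # False: tampering detected
```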
Towards Methodologies for Building Knowledge-Based Instructional Systems.
ERIC Educational Resources Information Center
Duchastel, Philippe
1992-01-01
Examines the processes involved in building instructional systems that are based on artificial intelligence and hypermedia technologies. Traditional instructional systems design methodology is discussed; design issues including system architecture and learning strategies are addressed; and a new methodology for building knowledge-based…
Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E
2014-09-01
This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of the model components' stoichiometry and formation enthalpies, the proposed modelling methodology successfully integrates the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in biochemical reactors. The methodology has been implemented in a plant-wide modelling framework in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, two case studies are described as illustrative examples of the capability of the methodology. In the first one, a predenitrification-nitrification dynamic process is analysed, with the aim of demonstrating the easy integration of the methodology into any system. In the second case study, the simulation of a thermal model for an ATAD shows the potential of the proposed methodology for analysing the effect of ventilation and influent characterization. Copyright © 2014 Elsevier Ltd. All rights reserved.
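At the core of the methodology is Hess's law: with formation enthalpies for every model component, the heat of any transformation follows from its stoichiometry alone. A minimal sketch, using standard-state values for a familiar reaction rather than the paper's component set:

```python
# Hess's law sketch: the reaction enthalpy is the stoichiometry-weighted sum of
# the formation enthalpies of products minus reactants.
def reaction_enthalpy(stoich, h_f):
    """stoich: {species: coefficient}, negative for reactants, positive for
    products. h_f: standard formation enthalpies in kJ/mol."""
    return sum(nu * h_f[sp] for sp, nu in stoich.items())

h_f = {"C6H12O6": -1273.3, "O2": 0.0, "CO2": -393.5, "H2O": -285.8}  # kJ/mol
# Aerobic oxidation of glucose: C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O
combustion = {"C6H12O6": -1, "O2": -6, "CO2": 6, "H2O": 6}
print(reaction_enthalpy(combustion, h_f))   # about -2803 kJ/mol: strongly exothermic
```

In the paper's matrix structure, the same stoichiometric vectors drive both the mass balances and this enthalpy computation, which is what allows heat fluxes to be tracked dynamically.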
Valapour, Maryam; Paulson, Kristin M; Hilde, Alisha
2013-04-01
Publication is one of the primary rewards in the academic research community and is the first step in the dissemination of a new discovery that could lead to recognition and opportunity. Because of this, the publication of research can serve as a tacit endorsement of the methodology behind the science. This becomes a problem when vulnerable populations that are incapable of giving legitimate informed consent, such as prisoners, are used in research. The problem is especially critical in the field of transplant research, in which unverified consent can enable research that exploits the vulnerabilities of prisoners, especially those awaiting execution. Because the doctrine of informed consent is central to the protection of vulnerable populations, we have performed a historical analysis of the standards of informed consent in codes of international human subject protections to form the foundation for our limit and ban recommendations: (1) limit the publication of transplant research involving prisoners in general and (2) ban the publication of transplant research involving executed prisoners in particular. Copyright © 2013 American Association for the Study of Liver Diseases.
NASA Astrophysics Data System (ADS)
Schmidt, S.; Heyns, P. S.; de Villiers, J. P.
2018-02-01
In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on the operating condition, to detect changes in the machine condition. Information from optimised machine-condition and operating-condition hidden Markov models is statistically combined to generate a discrepancy signal, which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
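A minimal sketch of the discrepancy-signal idea, assuming a Gaussian HMM fitted to healthy-gearbox features with hmmlearn and a windowed negative log-likelihood as the discrepancy; the paper's two-phase scheme additionally conditions on the inferred operating condition, which is omitted here.

```python
# An HMM is fitted to features from the healthy gearbox only; windows of new
# data that the healthy model explains poorly (low log-likelihood) raise the
# discrepancy signal.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(6)
healthy = rng.normal(0.0, 1.0, size=(2000, 1))          # healthy vibration feature
model = hmm.GaussianHMM(n_components=3, random_state=0).fit(healthy)

def discrepancy(signal, win=100):
    """Negative average log-likelihood per window under the healthy model."""
    return np.array([-model.score(signal[i:i+win]) / win
                     for i in range(0, len(signal) - win + 1, win)])

faulty = np.concatenate([rng.normal(0.0, 1.0, (1000, 1)),    # healthy portion
                         rng.normal(1.5, 2.0, (1000, 1))])   # fault develops
d = discrepancy(faulty)
print(d.round(2))   # discrepancy jumps in the second half of the record
```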
NASA Astrophysics Data System (ADS)
de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.
In previous works a methodology was defined based on the design of a genetic algorithm (GAP) and an incremental training technique adapted to learning series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for a period of eight years of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules with returns in the testing period far superior to those obtained by applying the usual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a market different from that of previous work.
Cost benefits of advanced software: A review of methodology used at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla N.
1993-01-01
To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.
Assembly line performance and modeling
NASA Astrophysics Data System (ADS)
Rane, Arun B.; Sunnapwar, Vivek K.
2017-09-01
The automobile sector forms the backbone of manufacturing. The vehicle assembly line is an important section of an automobile plant where repetitive tasks are performed one after another at different workstations. In this thesis, a methodology is proposed to reduce cycle time and the time lost due to important factors like equipment failure, shortage of inventory, absenteeism, set-up, material handling, rejection and fatigue, in order to improve output within given cost constraints. Various relationships between these factors, the corresponding costs and output are established through a scientific approach. The methodology is validated in three different vehicle assembly plants. The proposed methodology may help practitioners optimize assembly lines using lean techniques.
Evaluating Payments for Environmental Services: Methodological Challenges
2016-01-01
Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It revises and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850
NASA Astrophysics Data System (ADS)
Orlaineta-Agüero, S.; Del Sol-Fernández, S.; Sánchez-Guzmán, D.; García-Salcedo, R.
2017-01-01
In the present work we show the implementation of a learning sequence based on an active learning methodology for teaching Physics. This proposal aims to promote better learning in high school students through the use of a comic book combined with different low-cost experimental activities for teaching the electrical concepts of current, resistance and voltage. We consider that this kind of strategy can easily be extrapolated to higher education, such as engineering at the college/university level, and to other disciplines of science. To evaluate this proposal, we used some conceptual questions from the Electric Circuits Concept Evaluation survey developed by Sokoloff; the results were analysed with the normalized conceptual gain proposed by Hake, to gauge the effectiveness of the methodology, and with the concentration factor proposed by Bao and Redish, to identify the models that the students presented before and after instruction. We found that this methodology was more effective than the implementation of traditional lectures alone. We consider that these results cannot be generalized, but they gave us the opportunity to examine many important approaches in Physics education; finally, we will continue to apply the same experiment with more students, at the same and upper levels of education, to confirm and validate the effectiveness of this methodological proposal.
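For reference, the two analysis tools named in the abstract are short formulas; the sketch below implements Hake's normalized gain and the Bao-Redish concentration factor with made-up class data.

```python
# Hake's normalized gain for pre/post scores (in percent), and the Bao-Redish
# concentration factor for the answer distribution of one multiple-choice item.
import math

def normalized_gain(pre, post):
    """Hake's g = (post - pre) / (100 - pre), scores in percent."""
    return (post - pre) / (100.0 - pre)

def concentration_factor(counts):
    """counts: list of answer counts over the m choices of one question.
    C in [0, 1]: 0 = uniformly spread answers, 1 = everyone picks one choice."""
    m, N = len(counts), sum(counts)
    return (math.sqrt(m) / (math.sqrt(m) - 1)) * (
        math.sqrt(sum(n * n for n in counts)) / N - 1 / math.sqrt(m))

print(normalized_gain(35.0, 60.0))            # 0.385: "medium gain" in Hake's bands
print(concentration_factor([25, 0, 0, 0, 0])) # 1.0: one model dominates the class
print(concentration_factor([5, 5, 5, 5, 5]))  # 0.0: answers fully spread out
```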
Finite-time H∞ control for linear continuous system with norm-bounded disturbance
NASA Astrophysics Data System (ADS)
Meng, Qingyi; Shen, Yanjun
2009-04-01
In this paper, the definition of finite-time H∞ control is presented. The system under consideration is subject to a time-varying norm-bounded exogenous disturbance. The main aim of this paper is the design of a state feedback controller which ensures that the closed-loop system is finite-time bounded (FTB) and reduces the effect of the disturbance input on the controlled output to a prescribed level. A sufficient condition is presented for the solvability of this problem, which can be reduced to a feasibility problem involving linear matrix inequalities (LMIs). A detailed solving method is proposed for the restricted linear matrix inequalities. Finally, examples are given to show the validity of the methodology.
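Since the synthesis reduces to an LMI feasibility problem, a small sketch may illustrate the flavour of such computations. The following is a minimal example, assuming the cvxpy package and an illustrative second-order plant; it solves only the classical stabilization LMI A Q + Q Aᵀ + B Y + Yᵀ Bᵀ ≺ 0 with K = Y Q⁻¹, not the paper's full finite-time H∞ conditions, which add FTB and disturbance-attenuation constraints.

```python
# A minimal sketch, assuming cvxpy and an illustrative plant: solve
#   A Q + Q A' + B Y + Y' B' < 0,  Q > 0,  K = Y Q^{-1}.
# The paper's finite-time H-infinity synthesis adds further LMI
# constraints (FTB bounds, attenuation level); those are not shown.
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0], [2.0, -1.0]])   # assumed unstable plant
B = np.array([[0.0], [1.0]])
n, m = A.shape[0], B.shape[1]

Q = cp.Variable((n, n), symmetric=True)
Y = cp.Variable((m, n))
eps = 1e-6
lmi = A @ Q + Q @ A.T + B @ Y + Y.T @ B.T
prob = cp.Problem(cp.Minimize(0),
                  [Q >> eps * np.eye(n), lmi << -eps * np.eye(n)])
prob.solve()

K = Y.value @ np.linalg.inv(Q.value)      # state-feedback gain u = K x
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K))
```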
NASA Technical Reports Server (NTRS)
Moe, Karen L.; Perkins, Dorothy C.; Szczur, Martha R.
1987-01-01
The user support environment (USE), a set of software tools providing a flexible, standard interactive user interface to Space Station systems, platforms, and payloads, is described in detail. Included in the USE concept are a user interface language, a run time environment and user interface management system, support tools, and standards for human interaction methods. The goals and challenges of the USE are discussed, as well as a methodology, based on prototype demonstrations, for involving users in the process of validating the USE concepts. By prototyping the key concepts and salient features of the proposed user interface standards, the user's ability to respond is greatly enhanced.
Managing the climate commons at the nexus of ecology, behaviour and economics
NASA Astrophysics Data System (ADS)
Tavoni, Alessandro; Levin, Simon
2014-12-01
Sustainably managing coupled ecological-economic systems requires not only an understanding of the environmental factors that affect them, but also knowledge of the interactions and feedback cycles that operate between resource dynamics and activities attributable to human intervention. The socioeconomic dynamics, in turn, call for an investigation of the behavioural drivers behind human action. We argue that a multidisciplinary approach is needed in order to tackle the increasingly pressing and intertwined environmental challenges faced by modern societies. Academic contributions to climate change policy have been constrained by methodological and terminological differences, so we discuss how programmes aimed at cross-disciplinary education and involvement in governance may help to unlock scholars' potential to propose new solutions.
A time-parallel approach to strong-constraint four-dimensional variational data assimilation
NASA Astrophysics Data System (ADS)
Rao, Vishwas; Sandu, Adrian
2016-05-01
A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple sub-intervals, which allows the cost function and gradient computations to be parallelized. The solutions of the continuity equations across interval boundaries are added as constraints. The augmented Lagrangian approach leads to a formulation of the variational data assimilation problem different from weakly constrained 4D-Var. A combination of serial and parallel 4D-Var to increase performance is also explored. The methodology is illustrated on data assimilation problems involving the Lorenz-96 and shallow water models.
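The Lorenz-96 system used as a test problem above is simple to reproduce. Below is a minimal sketch, assuming the standard formulation dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F with cyclic indexing, integrated with a fixed-step RK4 scheme; the 4D-Var machinery itself is not reproduced here.

```python
# A minimal sketch of the Lorenz-96 test problem with fixed-step RK4.
import numpy as np

def lorenz96(x, F=8.0):
    """Right-hand side of Lorenz-96; np.roll handles cyclic indexing."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    k1 = lorenz96(x, F)
    k2 = lorenz96(x + 0.5 * dt * k1, F)
    k3 = lorenz96(x + 0.5 * dt * k2, F)
    k4 = lorenz96(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 * np.ones(40)
x[19] += 0.01                 # small perturbation to trigger chaos
for _ in range(500):          # a "truth" trajectory for twin experiments
    x = rk4_step(x, 0.05)
print(x[:5])
```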
Eye movements when viewing advertisements
Higgins, Emily; Leinenger, Mallorie; Rayner, Keith
2013-01-01
In this selective review, we examine key findings on eye movements when viewing advertisements. We begin with a brief, general introduction to the properties and neural underpinnings of saccadic eye movements. Next, we provide an overview of eye movement behavior during reading, scene perception, and visual search, since each of these activities is, at various times, involved in viewing ads. We then review the literature on eye movements when viewing print ads and warning labels (of the kind that appear on alcohol and tobacco ads), before turning to a consideration of advertisements in dynamic media (television and the Internet). Finally, we propose topics and methodological approaches that may prove to be useful in future research. PMID:24672500
NASA Astrophysics Data System (ADS)
Acri, Antonio; Offner, Guenter; Nijman, Eugene; Rejlek, Jan
2016-10-01
Noise legislation and increasing customer demands drive the Noise Vibration and Harshness (NVH) development of modern commercial vehicles. In order to meet the stringent legislative requirements for vehicle noise emission, exact knowledge of all vehicle noise sources and their acoustic behavior is required. Transfer path analysis (TPA) is a fairly well established technique for estimating and ranking individual low-frequency noise or vibration contributions via the different transmission paths. Transmission paths from different sources to target points of interest, and their contributions, can be analyzed by applying TPA. This technique is applied to test measurements, which only become available on prototypes at the end of the design process. In order to overcome the limits of TPA, a numerical transfer path analysis methodology based on the substructuring of a multibody system is proposed in this paper. Being based on numerical simulation, this methodology can be performed from the first steps of the design process. The main target of the proposed methodology is to obtain information on the noise source contributions of a dynamic system when multiple forces act on the system simultaneously. The contributions of these forces are investigated with particular focus on distributed or moving forces. In this paper, the mathematical basics of the proposed methodology and its advantages in comparison with TPA are discussed. Then, a dynamic system is investigated with a combination of two methods. Being based on the dynamic substructuring (DS) of the investigated model, the proposed methodology requires the evaluation of the contact forces at the interfaces, which are computed with a flexible multi-body dynamic (FMBD) simulation. Then, the structure-borne noise paths are computed with the wave based method (WBM). As an example application, a 4-cylinder engine is investigated and the proposed methodology is applied to the engine block. The aim is to obtain accurate and clear relationships between the excitations and responses of the simulated dynamic system, analyzing the noise and vibration sources inside a car engine and showing the main advantages of a numerical methodology.
Construction of Gene Regulatory Networks Using Recurrent Neural Networks and Swarm Intelligence.
Khan, Abhinandan; Mandal, Sudip; Pal, Rajat Kumar; Saha, Goutam
2016-01-01
We have proposed a methodology for the reverse engineering of biologically plausible gene regulatory networks from temporal genetic expression data. We have used established information and fundamental mathematical theory for this purpose. We have employed the Recurrent Neural Network formalism to accurately extract the underlying dynamics present in the time series expression data. We have introduced a new hybrid swarm intelligence framework for the accurate training of the model parameters. The proposed methodology was first applied to a small artificial network, and the results obtained suggest that it can produce the best results available in the contemporary literature, to the best of our knowledge. Subsequently, we implemented our proposed framework on experimental (in vivo) datasets. Finally, we investigated two medium-sized genetic networks (in silico) extracted from GeneNetWeaver, to understand how the proposed algorithm scales up with network size. Additionally, we implemented our proposed algorithm with half the number of time points. The results indicate that a 50% reduction in the number of time points does not significantly affect the accuracy of the proposed methodology, with a maximum of just over 15% deterioration in the worst case.
A temperature compensation methodology for piezoelectric based sensor devices
NASA Astrophysics Data System (ADS)
Wang, Dong F.; Lou, Xueqiao; Bao, Aijian; Yang, Xu; Zhao, Ji
2017-08-01
A temperature compensation methodology that pairs a negative temperature coefficient thermistor with the temperature characteristics of a piezoelectric material is proposed to improve the measurement accuracy of piezoelectric sensing based devices. The piezoelectric element is characterized using a disk-shaped structure, which is also used to verify the effectiveness of the proposed compensation method. With the proposed temperature compensation method, the measured output voltage shows a nearly linear relationship with the applied pressure over a temperature range of 25-65 °C. As a result, the maximum measurement accuracy is improved by 40%, and the higher the temperature, the more effective the method. The effective temperature range of the proposed method is theoretically analyzed by introducing the constant coefficient of the thermistor (B), the resistance at the initial temperature (R0), and the paralleled resistance (Rx). The proposed methodology not only eliminates the influence of the piezoelectric material's temperature-dependent characteristics on sensing accuracy but also decreases the power consumption of piezoelectric sensing based devices through the simplified sensing structure.
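To illustrate the compensation idea, here is a minimal sketch assuming the standard Beta model for an NTC thermistor, R(T) = R0·exp(B(1/T − 1/T0)), with a fixed resistor Rx in parallel to flatten the network response; the component values (R0, B, Rx) are illustrative assumptions, not the paper's.

```python
# A minimal sketch of an NTC-thermistor compensation network, assuming
# the standard Beta model; R0, B and Rx are illustrative assumptions.
import numpy as np

def ntc_resistance(T_kelvin, R0=10e3, B=3950.0, T0=298.15):
    """NTC resistance at temperature T (Beta-model approximation)."""
    return R0 * np.exp(B * (1.0 / T_kelvin - 1.0 / T0))

def compensated_resistance(T_kelvin, Rx=10e3, **ntc_kwargs):
    """Thermistor in parallel with Rx: the combination varies less steeply."""
    Rt = ntc_resistance(T_kelvin, **ntc_kwargs)
    return Rt * Rx / (Rt + Rx)

for T_c in (25, 45, 65):                  # the paper's 25-65 °C range
    T = T_c + 273.15
    print(T_c, ntc_resistance(T), compensated_resistance(T))
```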
School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.
ERIC Educational Resources Information Center
Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others
1998-01-01
Focuses on methodological issues involved in choosing instruments to monitor behavior once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physician cooperation. Emerging school-based methodologies are discussed…
Re-Turning Feminist Methodologies: From a Social to an Ecological Epistemology
ERIC Educational Resources Information Center
Hughes, Christina; Lury, Celia
2013-01-01
This paper proposes an ecological methodology in order to re-think the concept of situatedness in ways that can take into account that we live in relation to, and are of, a more-and-other-than-human world. In doing so, the paper proposes that situatedness should be understood in terms of processes of co-invention that, fractally and recursively,…
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Farooq, Mohammad U.
1986-01-01
The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.
Aguilar-López, Ricardo; Mata-Machuca, Juan L
2016-01-01
This paper proposes a synchronization methodology for two chaotic oscillators under the framework of identical synchronization and master-slave configuration. The proposed methodology is based on state observer design within the framework of control theory; the observer structure provides finite-time synchronization convergence by cancelling the upper bounds of the main nonlinearities of the chaotic oscillator. This is shown via an analysis of the dynamics of the so-called synchronization error. Numerical experiments corroborate the satisfactory results of the proposed scheme.
Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D
2015-09-01
Pesticides are frequently responsible for human poisonings, and often information on the substance involved is lacking. The great variety of pesticides that could be responsible for an intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme would help minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology presented here, we use a liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hu, Tengfei; Mao, Jingqiao; Pan, Shunqi; Dai, Lingquan; Zhang, Peipei; Xu, Diandian; Dai, Huichao
2018-07-01
Reservoir operations significantly alter the hydrological regime of the downstream river and river-connected lake, which has far-reaching impacts on the lake ecosystem. To facilitate the management of lakes connected to regulated rivers, the following information must be provided: (1) the response of lake water levels to reservoir operation schedules in the near future and (2) the importance of different rivers in terms of affecting the water levels in different lake regions of interest. We develop an integrated modeling and analytical methodology for the water level management of such lakes. The data-driven method is used to model the lake level as it has the potential of producing quick and accurate predictions. A new genetic algorithm-based synchronized search is proposed to optimize input variable time lags and data-driven model parameters simultaneously. The methodology also involves the orthogonal design and range analysis for extracting the influence of an individual river from that of all the rivers. The integrated methodology is applied to the second largest freshwater lake in China, the Dongting Lake. The results show that: (1) the antecedent lake levels are of crucial importance for the current lake level prediction; (2) the selected river discharge time lags reflect the spatial heterogeneity of the rivers' impacts on lake level changes; (3) the predicted lake levels are in very good agreement with the observed data (RMSE ≤ 0.091 m; R2 ≥ 0.9986). This study demonstrates the practical potential of the integrated methodology, which can provide both the lake level responses to future dam releases and the relative contributions of different rivers to lake level changes.
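A minimal sketch of the GA-based synchronized search idea follows: a genome jointly encodes a binary mask over candidate discharge time lags and one model hyperparameter, and fitness is the validation error of a simple ridge-regression surrogate on synthetic series. The real study couples the search with its data-driven lake-level model; everything below (series, operators, parameters) is an illustrative assumption.

```python
# A minimal GA-style joint search over input time lags and a hyperparameter.
import numpy as np

rng = np.random.default_rng(0)
T, max_lag = 600, 8
x = rng.normal(size=T)                       # synthetic "discharge" series
y = 0.6 * np.roll(x, 3) + 0.3 * np.roll(x, 5) + 0.05 * rng.normal(size=T)

def fitness(genome):
    """Genome = binary lag mask + ridge exponent; returns negative val-RMSE."""
    mask, lam = genome[:max_lag], 10.0 ** genome[max_lag]
    lags = np.flatnonzero(mask)
    if lags.size == 0:
        return -np.inf
    X = np.column_stack([np.roll(x, int(l) + 1) for l in lags])[max_lag:]
    yy = y[max_lag:]
    n_tr = int(0.7 * len(yy))
    w = np.linalg.solve(X[:n_tr].T @ X[:n_tr] + lam * np.eye(X.shape[1]),
                        X[:n_tr].T @ yy[:n_tr])
    return -np.sqrt(np.mean((X[n_tr:] @ w - yy[n_tr:]) ** 2))

pop = [np.append(rng.integers(0, 2, max_lag), rng.uniform(-3, 1))
       for _ in range(30)]
for gen in range(40):
    pop.sort(key=fitness, reverse=True)
    elite, children = pop[:10], []
    for _ in range(20):                      # crossover + bit-flip mutation
        a, b = rng.choice(10, 2, replace=False)
        cut = rng.integers(1, max_lag)
        child = np.append(elite[a][:cut], elite[b][cut:])
        if rng.random() < 0.3:
            i = rng.integers(0, max_lag)
            child[i] = 1.0 - child[i]
        children.append(child)
    pop = elite + children
best = max(pop, key=fitness)
print("selected lags:", np.flatnonzero(best[:max_lag]) + 1)
```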
Quality Assurance of UMLS Semantic Type Assignments Using SNOMED CT Hierarchies.
Gu, H; Chen, Y; He, Z; Halper, M; Chen, L
2016-01-01
The Unified Medical Language System (UMLS) is one of the largest biomedical terminological systems, with over 2.5 million concepts in its Metathesaurus repository. The UMLS's Semantic Network (SN), with its collection of 133 high-level semantic types, serves as an abstraction layer on top of the Metathesaurus. In particular, the SN elaborates an aspect of the Metathesaurus's concepts via the assignment of one or more types to each concept. Due to the scope and complexity of the Metathesaurus, errors are all but inevitable in this semantic-type assignment process. The objective was to develop a semi-automated methodology to help assure the quality of semantic-type assignments within the UMLS. The methodology uses a cross-validation strategy involving SNOMED CT's hierarchies in combination with UMLS semantic types. Semantically uniform, disjoint concept groups are generated programmatically by partitioning the collection of all concepts in the same SNOMED CT hierarchy according to their respective semantic-type assignments in the UMLS. Domain experts are then called upon to review the concepts in any group having a small number of concepts. Our hypothesis is that a semantic-type assignment combination applicable to only a very small number of concepts in a SNOMED CT hierarchy is an indicator of potential problems. The methodology was applied to the UMLS 2013AA release along with the SNOMED CT release from January 2013. An overall error rate of 33% was found for concepts flagged by the quality-assurance methodology. Supporting our hypothesis, that number was four times higher than the error rate found in control samples. The results show that the quality-assurance methodology can aid in the effective and efficient identification of UMLS semantic-type assignment errors.
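The partitioning step lends itself to a compact sketch: concepts within one SNOMED CT hierarchy are grouped by their exact set of UMLS semantic-type assignments, and small groups are flagged for expert review. The concept identifiers, type names, and size threshold below are hypothetical placeholders.

```python
# A minimal sketch of the grouping/flagging step; all data are placeholders.
from collections import defaultdict

concept_types = {                      # concept CUI -> semantic types (toy)
    "C0011849": {"Disease or Syndrome"},
    "C0020538": {"Disease or Syndrome"},
    "C0027051": {"Disease or Syndrome", "Finding"},
}

groups = defaultdict(list)
for cui, types in concept_types.items():
    groups[frozenset(types)].append(cui)   # one group per type combination

THRESHOLD = 1                          # "small group" cutoff (assumed)
for type_combo, cuis in groups.items():
    if len(cuis) <= THRESHOLD:         # rare combination -> expert review
        print("review:", sorted(type_combo), cuis)
```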
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-23
... collection: Extension of the time frame required to complete approved and ongoing methodological research on... methodological research on the National Crime Victimization Survey. (2) Title of the Form/Collection: National.... This generic clearance will cover methodological research that will use existing or new sampled...
77 FR 15092 - U.S. Energy Information Administration; Proposed Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... conducted under this clearance will generally be methodological studies of 500 cases or less. The samples... conducted under this clearance will generally be methodological studies of 500 cases or less, but will... the methodological design, sampling procedures (where possible) and questionnaires of the full scale...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... methodological studies conducted during the Vanguard phase will inform the implementation and analysis plan for... Research Methodology Studies for the National Children's Study SUMMARY: In compliance with the requirement... Collection: Title: Environmental Science Formative Research Methodology Studies for the National Children's...
76 FR 72134 - Annual Charges for Use of Government Lands
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-22
... revise the methodology used to compute these annual charges. Under the proposed rule, the Commission would create a fee schedule based on the U.S. Bureau of Land Management's (BLM) methodology for calculating rental rates for linear rights of way. This methodology includes a land value per acre, an...
Investigating the Effectiveness of Special Education: An Analysis of Methodology.
ERIC Educational Resources Information Center
Tindal, Gerald
1985-01-01
The review examines evaluations of the efficacy of special education programs for mildly disabled children. The author suggests that serious methodological flaws make our present knowledge in this area very weak and proposes a methodology to address and overcome many of the limitations of previous research. (Author)
NASA Astrophysics Data System (ADS)
Mitchell, Matthew Wesley
This dissertation proposes a new understanding of Paul's Gentile mission and its relationship to his so-called "conversion." This dissertation examines the origins of Paul's mission to the Gentiles, and locates it in his claims to have been personally commissioned to undertake such a mission by Jesus. Specifically, I argue that it is the rejection of Paul's claim to be an apostle, a claim founded upon his "conversion" experience, that precipitates his mission to the Gentiles. In arguing this view, I draw upon Ferdinand Christian Baur's nineteenth century theories concerning both the unreliability of Acts as a historical source, and his proposal of a clear division between Paul and the other apostles. In establishing the methodological and theoretical framework of the dissertation, I discuss the "New Perspective on Paul" that has dominated New Testament scholarship over the past thirty years. My study is also informed methodologically by the growing interest in rhetorical criticism among biblical scholars, although the emphasis of this dissertation bears more of a resemblance to the approach of the New Rhetoric than the categories of classical, Greco-Roman rhetoric. The textual component of this work falls into two stages. The first contains a full examination of Paul's "conversion passages" in Galatians 1:15--17 and 1 Corinthians 15:8, attempting to situate these seemingly unusual self-descriptions in their cultural contexts. The second involves an examination of F. C. Baur's presentation of Paul, and the reception of Baur's views among biblical scholars throughout the years following his scholarly activity. This dissertation makes two claims, each of which can stand on its own as an important contribution to scholarship. My first claim is that components of Baur's work support my proposal concerning Paul's Gentile mission and his experience of apostolic rejection, and that this proposal has much to commend it as an explanation of a perennial scholarly puzzle. My second claim is methodological, as I demonstrate that scholarly writings about Paul and his modern interpreters are themselves exercises in argumentation, and thus are not to be accepted uncritically, or without close attention to the rhetorical practices they utilize.
Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P
2014-01-01
Six Sigma methodology has been successfully applied to daily operations by several leading global private firms, including GE and Motorola, to leverage their net profits. Comparatively few studies have examined whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for integrating Six Sigma methodology into the large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will benefit greatly from the integration of Six Sigma methodology in the mass production of Penicillin G and/or its conversion to 6-APA.
A methodology of SiP testing based on boundary scan
NASA Astrophysics Data System (ADS)
Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo
2017-10-01
System in Package (SiP) technology plays an important role in portable, aerospace and military electronics, owing to its microminiaturization, light weight, high density, and high reliability. At present, SiP system testing faces problems of system complexity and fault localization as system scale increases exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary scan technology. Combining the characteristics of SiP systems with the boundary scan theory of PCB circuits and embedded core testing, a specific testing methodology and process is proposed. The hardware requirements of the SiP system under test are provided, and the hardware platform for the testing has been constructed. The proposed methodology offers high test efficiency and accurate fault localization.
Classification of Alzheimer’s Patients through Ubiquitous Computing
Nieto-Reyes, Alicia; Duque, Rafael; Montaña, José Luis; Lage, Carmen
2017-01-01
Functional data analysis and artificial neural networks are the building blocks of the proposed methodology, which distinguishes the movement patterns of Alzheimer’s patients at different stages of the disease and classifies new patients to their appropriate stage. The movement patterns are obtained by the accelerometer of Android smartphones that the patients carry while moving freely. The proposed methodology is relevant in that it is flexible regarding the type of data to which it is applied. To exemplify this, a novel real three-dimensional functional dataset is analyzed in which each datum is observed over a different time domain. Not only is each datum observed at a different frequency, but the domains also have different lengths. The obtained classification success rate of 83% indicates the potential of the proposed methodology. PMID:28753975
2017-09-01
with new methodologies of intratumoral phylogenetic analyses, will yield pivotal information in elucidating the key genes involved in the evolution of PCa... combined with both clinical and experimental genetic data produced by this study, may empower patients and doctors to make personalized treatment decisions...
Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N
2015-03-01
A signal processing methodology is proposed in this paper for the effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of the ultrasonic signals and the application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain sizes, with and without defects. The influence of probe frequency and the data length of a signal on the EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of grain size in the range of 30-210 μm investigated in this study. This methodology is successfully employed for the detection of defects in 50 mm thick coarse-grained austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved compared to the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal-to-noise ratio. The methodology was further employed for the successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
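A minimal sketch of EEMD-based reconstruction in the spirit of the approach above is shown below, assuming the open-source PyEMD package (installable as EMD-signal) and a synthetic noisy echo; the paper's signal-minimisation step on the selected IMFs is not reproduced, and the kept IMF range is an assumption that should be chosen by inspecting the decomposition.

```python
# A minimal EEMD denoising sketch, assuming PyEMD (pip install EMD-signal).
import numpy as np
from PyEMD import EEMD

fs = 100e6                                   # sampling rate (assumed)
t = np.arange(2000) / fs
rng = np.random.default_rng(1)
echo = np.exp(-((t - 10e-6) / 1e-6) ** 2) * np.sin(2 * np.pi * 5e6 * t)
signal = echo + 0.5 * rng.normal(size=t.size)   # grain-noise-like corruption

eemd = EEMD(trials=50, noise_width=0.2)
imfs = eemd.eemd(signal, t)                  # rows are IMFs, fast -> slow

selected = imfs[1:4].sum(axis=0)             # kept IMF range (assumption)
snr = 10 * np.log10(np.sum(echo**2) / np.sum((selected - echo) ** 2))
print(f"{imfs.shape[0]} IMFs, reconstruction SNR = {snr:.1f} dB")
```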
Modeling Operations Other Than War: Non-Combatants in Combat Modeling
1994-09-01
supposition that non-combatants are an essential feature in OOTW. The model proposal includes a methodology for civilian unit decision making. This model also includes... A numerical example demonstrated that the model appeared to perform in an acceptable manner, in that it produced output within a reasonable range. During the
Critical Communicative Methodology: Informing Real Social Transformation through Research
ERIC Educational Resources Information Center
Gomez, Aitor; Puigvert, Lidia; Flecha, Ramon
2011-01-01
The critical communicative methodology (CCM) is a methodological response to the dialogic turn of societies and sciences that has already had an important impact in transforming situations of inequality and exclusion. Research conducted with the CCM implies continuous and egalitarian dialogue among researchers and the people involved in the…
Christoforou, Christoforos; Christou-Champi, Spyros; Constantinidou, Fofi; Theodorou, Maria
2015-01-01
Eye-tracking has been extensively used to quantify audience preferences in the context of marketing and advertising research, primarily in methodologies involving static images or stimuli (i.e., advertising, shelf testing, and website usability). However, these methodologies do not generalize to narrative-based video stimuli where a specific storyline is meant to be communicated to the audience. In this paper, a novel metric based on eye-gaze dispersion (both within and across viewings) that quantifies the impact of narrative-based video stimuli on the preferences of large audiences is presented. The metric is validated in predicting the performance of video advertisements aired during the 2014 Super Bowl final. In particular, the metric is shown to explain 70% of the variance in likeability scores of the 2014 Super Bowl ads as measured by the USA TODAY Ad-Meter. In addition, by comparing the proposed metric with Heart Rate Variability (HRV) indices, we have associated the metric with biological processes relating to attention allocation. The underlying idea behind the proposed metric suggests a shift in perspective when it comes to evaluating narrative-based video stimuli. In particular, it suggests that audience preferences on video are modulated by the level of viewers' lack of attention allocation. The proposed metric can be calculated on any narrative-based video stimuli (i.e., movie, narrative content, emotional content, etc.), and thus has the potential to facilitate the use of such stimuli in several contexts: prediction of audience preferences of movies, quantitative assessment of entertainment pieces, prediction of the impact of movie trailers, identification of group and individual differences in the study of attention-deficit disorders, and the study of desensitization to media violence. PMID:26029135
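A within-frame gaze-dispersion measure of the kind underlying the metric can be sketched compactly; the paper's exact definition, which combines dispersion within and across viewings, is not reproduced here, and the gaze data below are synthetic.

```python
# A minimal sketch of per-frame gaze dispersion across viewers.
import numpy as np

rng = np.random.default_rng(0)
gaze = rng.uniform(0, 1, size=(20, 300, 2))      # 20 viewers, 300 frames, (x, y)

def frame_dispersion(points):
    """RMS distance of gaze points from their centroid for one frame."""
    centroid = points.mean(axis=0)
    return np.sqrt(np.mean(np.sum((points - centroid) ** 2, axis=1)))

dispersion = np.array([frame_dispersion(gaze[:, f, :])
                       for f in range(gaze.shape[1])])
print("mean dispersion over the ad:", dispersion.mean())
```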
Prediction of Nucleotide Binding Peptides Using Star Graph Topological Indices.
Liu, Yong; Munteanu, Cristian R; Fernández Blanco, Enrique; Tan, Zhiliang; Santos Del Riego, Antonino; Pazos, Alejandro
2015-11-01
The nucleotide binding proteins are involved in many important cellular processes, such as the transmission of genetic information or energy transfer and storage. Therefore, the screening of new peptides for this biological function is an important research topic. The current study proposes a mixed methodology to obtain the first classification model able to predict new nucleotide binding peptides using only the amino acid sequence. The methodology uses a Star graph molecular descriptor of the peptide sequences and Machine Learning techniques to find the best classifier. The best model is a Random Forest classifier based on two features of the embedded and non-embedded graphs. The performance of the model is excellent, considering similar models in the field, with an Area Under the Receiver Operating Characteristic Curve (AUROC) value of 0.938 and a true positive rate (TPR) of 0.886 (test subset). The prediction of new nucleotide binding peptides with this model could be useful for drug target studies in drug development. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
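The classification stage maps naturally onto a short scikit-learn sketch: a Random Forest on two features, evaluated by AUROC, mirroring the model class and metric reported above. The two features here are synthetic placeholders for the embedded and non-embedded Star-graph indices.

```python
# A minimal Random Forest + AUROC sketch; features are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))                    # two "topological indices"
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUROC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```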
NASA Astrophysics Data System (ADS)
Torremorell, Maria Carme Boqué; de Nicolás, Montserrat Alguacil; Valls, Mercè Pañellas
Teacher training at the Blanquerna Faculty of Psychology and Educational and Sports Sciences (FPCEE), in Barcelona, has a long pedagogical tradition based on teaching innovation. Its educational style is characterised by methods focused on the students' involvement and on close collaboration with teaching practice centres. Within a core subject in the Teacher Training diploma course, students were asked to assess different methodological proposals aimed at promoting the development of their personal, social, and professional competences. In the assessment surveys, from a sample of 145 students, the proportion of scores rated very satisfactory or satisfactory ranged from 83.4% to 95.8% across the set of methodological actions under analysis. Data obtained in this first research phase were very useful for designing basic training modules for the new Teacher Training Degree. In the second phase (in progress), active teachers are asked for their perception of the orientation of the practicum, its connection with the end-of-course assignment, and the student's influence on innovation processes at school.
Usability-driven evolution of a space instrument
NASA Astrophysics Data System (ADS)
McCalden, Alec
2012-09-01
The use of resources in the cradle-to-grave timeline of a space instrument might be significantly improved by considering the concept of usability from the start of the mission. The methodology proposed here includes giving early priority in a programme to the iterative development of a simulator that models instrument operation, and allowing this to evolve ahead of the actual instrument specification and fabrication. The advantages include reduction of risk in software development by shifting much of it to earlier in a programme than is typical, plus a test programme that uses and thereby proves the same support systems that may be used for flight. A new development flow for an instrument is suggested, showing how the system engineering phases used by the space agencies could be reworked in line with these ideas. This methodology is also likely to contribute to a better understanding between the various disciplines involved in the creation of a new instrument. The result should better capture the science needs, implement them more accurately with less wasted effort, and more fully allow the best ideas from all team members to be considered.
2Loud?: Community mapping of exposure to traffic noise with mobile phones.
Leao, Simone; Ong, Kok-Leong; Krezel, Adam
2014-10-01
Despite ample medical evidence of the adverse impacts of traffic noise on health, most policies for traffic noise management are arbitrary or incomplete, resulting in serious social and economic impacts. Surprisingly, there is limited information about citizens' exposure to traffic noise worldwide. This paper presents the 2Loud? mobile phone application, developed and tested as a methodology to monitor, assess and map the level of citizens' exposure to traffic noise, with a focus on the night period and indoor locations, since sleep disturbance is one of the major triggers of ill health related to traffic noise. Based on a community participation experiment using the 2Loud? mobile phone application in a region close to freeways in Australia, the results of this research indicate a good level of accuracy for noise monitoring by mobile phones and also demonstrate significant levels of indoor night exposure to traffic noise in the study area. The proposed methodology, through the data produced and the participatory process involved, can potentially assist in planning and management towards healthier urban environments.
Bathke, Arne C.; Friedrich, Sarah; Pauly, Markus; Konietschke, Frank; Staffen, Wolfgang; Strobl, Nicolas; Höller, Yvonne
2018-01-01
To date, there is a lack of satisfactory inferential techniques for the analysis of multivariate data in factorial designs, when only minimal assumptions on the data can be made. Presently available methods are limited to very particular study designs or assume either multivariate normality or equal covariance matrices across groups, or they do not allow for an assessment of the interaction effects across within-subjects and between-subjects variables. We propose and methodologically validate a parametric bootstrap approach that does not suffer from any of the above limitations, and thus provides a rather general and comprehensive methodological route to inference for multivariate and repeated measures data. As an example application, we consider data from two different Alzheimer's disease (AD) examination modalities that may be used for precise and early diagnosis, namely, single-photon emission computed tomography (SPECT) and electroencephalogram (EEG). These data violate the assumptions of classical multivariate methods, and indeed classical methods would not have yielded the same conclusions with regard to some of the factors involved. PMID:29565679
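The parametric bootstrap idea can be sketched for the simplest case of comparing two multivariate group means under unequal covariances; the paper's machinery handles general factorial and repeated measures designs, and the statistic and data below are illustrative assumptions.

```python
# A minimal parametric bootstrap sketch for a two-group multivariate
# mean comparison with group-specific covariance estimates.
import numpy as np

rng = np.random.default_rng(0)
g1 = rng.multivariate_normal([0.0, 0.0], [[1, .3], [.3, 2]], size=40)
g2 = rng.multivariate_normal([0.4, 0.1], [[2, -.2], [-.2, 1]], size=35)

def stat(a, b):
    d = a.mean(axis=0) - b.mean(axis=0)
    return float(d @ d)                  # squared Euclidean mean difference

obs = stat(g1, g2)
null_mean = np.vstack([g1, g2]).mean(axis=0)   # null: equal group means
boot = []
for _ in range(2000):                    # resample under the null,
    b1 = rng.multivariate_normal(null_mean, np.cov(g1.T), size=len(g1))
    b2 = rng.multivariate_normal(null_mean, np.cov(g2.T), size=len(g2))
    boot.append(stat(b1, b2))            # honouring unequal covariances
print("bootstrap p-value:", np.mean(np.array(boot) >= obs))
```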
Community Engagement for Big Epidemiology: Deliberative Democracy as a Tool
McWhirter, Rebekah E.; Critchley, Christine R.; Nicol, Dianne; Chalmers, Don; Whitton, Tess; Otlowski, Margaret; Burgess, Michael M.; Dickinson, Joanne L.
2014-01-01
Public trust is critical in any project requiring significant public support, both in monetary terms and to encourage participation. The research community has widely recognized the centrality of public trust, garnered through community consultation, to the success of large-scale epidemiology. This paper examines the potential utility of the deliberative democracy methodology within the public health research setting. A deliberative democracy event was undertaken in Tasmania, Australia, as part of a wider program of community consultation regarding the potential development of a Tasmanian Biobank. Twenty-five Tasmanians of diverse backgrounds participated in two weekends of deliberation, involving elements of information gathering, discussion, identification of issues, and formation of group resolutions. Participants demonstrated strong support for a Tasmanian Biobank, and their deliberations resulted in specific proposals in relation to consent; privacy; return of results; governance; funding; and commercialization and benefit sharing. They exhibited a high degree of satisfaction with the event, and confidence in the outcomes. Deliberative democracy methodology is a useful tool for community engagement that addresses some of the limitations of traditional consultation methods. PMID:25563457
Image-guided endobronchial ultrasound
NASA Astrophysics Data System (ADS)
Higgins, William E.; Zang, Xiaonan; Cheirsilp, Ronnarit; Byrnes, Patrick; Kuhlengel, Trevor; Bascom, Rebecca; Toth, Jennifer
2016-03-01
Endobronchial ultrasound (EBUS) is now recommended as a standard procedure for in vivo verification of extraluminal diagnostic sites during cancer-staging bronchoscopy. Yet, physicians vary considerably in their skills at using EBUS effectively. Regarding existing bronchoscopy guidance systems, studies have shown their effectiveness in the lung-cancer management process. With such a system, a patient's X-ray computed tomography (CT) scan is used to plan a procedure to regions of interest (ROIs). This plan is then used during follow-on guided bronchoscopy. Recent clinical guidelines for lung cancer, however, also dictate using positron emission tomography (PET) imaging for identifying suspicious ROIs and aiding in the cancer-staging process. While researchers have attempted to use guided bronchoscopy systems in tandem with PET imaging and EBUS, no true EBUS-centric guidance system exists. We now propose a full multimodal image-based methodology for guiding EBUS. The complete methodology involves two components: 1) a procedure planning protocol that gives bronchoscope movements appropriate for live EBUS positioning; and 2) a guidance strategy and associated system graphical user interface (GUI) designed for image-guided EBUS. We present results demonstrating the operation of the system.
Keereetaweep, Jantana; Chapman, Kent D.
2016-01-01
The endocannabinoids N-arachidonoylethanolamide (or anandamide, AEA) and 2-arachidonoylglycerol (2-AG) belong to the larger groups of N-acylethanolamines (NAEs) and monoacylglycerol (MAG) lipid classes, respectively. They are biologically active lipid molecules that activate G-protein-coupled cannabinoid receptors found in various organisms. After AEA and 2-AG were discovered in the 1990s, they have been extensively documented to have a broad range of physiological functions. Along with AEA, several NAEs, for example, N-palmitoylethanolamine (PEA), N-stearoylethanolamine (SEA), and N-oleoylethanolamine (OEA) are also present in tissues, usually at much larger concentrations than AEA. Any perturbation that involves the endocannabinoid pathway may subsequently alter basal level or metabolism of these lipid mediators. Further, the altered levels of these molecules often reflect pathological conditions associated with tissue damage. Robust and sensitive methodologies to analyze these lipid mediators are essential to understanding how they act as endocannabinoids. The recent advances in mass spectrometry allow researchers to develop lipidomics approaches and several methodologies have been proposed to quantify endocannabinoids in various biological systems. PMID:26839710
NASA Astrophysics Data System (ADS)
Felipe-Sesé, Luis; López-Alba, Elías; Siegmann, Philip; Díaz, Francisco A.
2016-12-01
A low-cost approach for three-dimensional (3-D) full-field displacement measurement is applied to the analysis of the large displacements involved in two different mechanical events. The method is based on a combination of fringe projection and two-dimensional digital image correlation (DIC) techniques. The two techniques are employed simultaneously using an RGB camera and a color encoding method; therefore, it is possible to measure in-plane and out-of-plane displacements at the same time with only one camera, even at high speeds. The potential of the proposed methodology is demonstrated through the analysis of large displacements during contact experiments on a soft material block. The displacement results were successfully compared with those obtained using a commercial 3D-DIC system. Moreover, the analysis of displacements during an impact test on a metal plate was performed to emphasize the applicability of the methodology to dynamic events. The results show a good level of agreement, highlighting the potential of FP + 2D DIC as a low-cost alternative for the analysis of large-deformation problems.
Fujinaga, Aiichiro; Uchiyama, Iwao; Morisawa, Shinsuke; Yoneda, Minoru; Sasamoto, Yuzuru
2012-01-01
In Japan, environmental standards for contaminants in groundwater and in leachate from soil are set with the assumption that they are used for drinking water over a human lifetime. Where there is neither a well nor groundwater used for drinking, the standard is thus too severe. Therefore, remediation based on these standards incurs excessive effort and cost. In contrast, the environmental-assessment procedure used in the United States and the Netherlands considers the site conditions (land use, existing wells, etc.); however, a risk assessment is required for each site. Therefore, this study proposes a new framework for judging contamination in Japan by considering the merits of the environmental standards used and a method for risk assessment. The framework involves setting risk-based concentrations that are attainable remediation goals for contaminants in soil and groundwater. The framework was then applied to a model contaminated site for risk management, and the results are discussed regarding the effectiveness and applicability of the new methodology. © 2011 Society for Risk Analysis.
Samsi, Siddharth; Krishnamurthy, Ashok K.; Gurcan, Metin N.
2012-01-01
Follicular Lymphoma (FL) is one of the most common non-Hodgkin lymphomas in the United States. Diagnosis and grading of FL are based on the review of histopathological tissue sections under a microscope and are influenced by human factors such as fatigue and reader bias. Computer-aided image analysis tools can help improve the accuracy of diagnosis and grading and act as another tool at the pathologist's disposal. Our group has been developing algorithms for identifying follicles in immunohistochemical images. These algorithms have been tested and validated on small images extracted from whole slide images. However, using these algorithms to analyze the entire whole slide image requires significant changes to the processing methodology, since the images are relatively large (on the order of 100k × 100k pixels). In this paper we discuss the challenges involved in analyzing whole slide images and propose potential computational methodologies for addressing these challenges. We discuss the use of parallel computing tools on commodity clusters and compare the performance of the serial and parallel implementations of our approach. PMID:22962572
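One of the computational methodologies discussed, tile-based parallel processing, can be sketched with the standard library: the slide is split into independent tiles, each processed in a worker process. Here the tiles are generated in memory and the "analysis" is a toy threshold; a real pipeline would read tiles from the whole-slide file and aggregate the per-tile results.

```python
# A minimal tile-parallel processing sketch; tiles and the threshold test
# are placeholders for reading and segmenting a real whole-slide image.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

TILE = 1024
GRID = (8, 8)                                 # stands in for ~100k x 100k

def process_tile(ij):
    """Process one tile; returns (tile coords, detected-pixel count)."""
    i, j = ij
    rng = np.random.default_rng(i * GRID[1] + j)
    tile = rng.integers(0, 256, size=(TILE, TILE), dtype=np.uint8)
    return ij, int((tile > 200).sum())        # toy segmentation criterion

if __name__ == "__main__":
    coords = [(i, j) for i in range(GRID[0]) for j in range(GRID[1])]
    with ProcessPoolExecutor() as pool:       # one tile per worker task
        for ij, count in pool.map(process_tile, coords):
            print(ij, count)
```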
An Evolutionary Method for Financial Forecasting in Microscopic High-Speed Trading Environment
Li, Hsu-Chih
2017-01-01
The advancement of information technology in financial applications nowadays has led to fast market-driven events that prompt flash decision-making and actions issued by computer algorithms. As a result, today's markets experience intense activity in the highly dynamic environment, where trading systems respond to one another at a much faster pace than before. This new breed of technology involves the implementation of high-speed trading strategies which generate a significant portion of the activity in financial markets and present researchers with a wealth of information not available in traditional low-speed trading environments. In this study, we aim to develop feasible computational intelligence methodologies, particularly genetic algorithms (GA), to shed light on high-speed trading research using price data of stocks at the microscopic level. Our empirical results show that the proposed GA-based system is able to significantly improve the accuracy of the prediction of price movement, and we expect this GA-based methodology to advance the current state of research for high-speed trading and other relevant financial applications. PMID:28316618
Faux-Pas Test: A Proposal of a Standardized Short Version.
Fernández-Modamio, Mar; Arrieta-Rodríguez, Marta; Bengochea-Seco, Rosario; Santacoloma-Cabero, Iciar; Gómez de Tojeiro-Roce, Juan; García-Polavieja, Bárbara; González-Fraile, Eduardo; Martín-Carrasco, Manuel; Griffin, Kim; Gil-Sanz, David
2018-06-26
Previous research on theory of mind suggests that people with schizophrenia have difficulties with complex mentalization tasks that involve the integration of cognition and affective mental states. One of the tools most commonly used to assess theory of mind is the Faux-Pas Test. However, it presents two main methodological problems: 1) the lack of a standard scoring system; 2) the different versions are not comparable due to a lack of information on the stories used. These methodological problems make it difficult to draw conclusions about performance on this test by people with schizophrenia. The aim of this study was to develop a reduced version of the Faux-Pas test with adequate psychometric properties. The test was administered to control and clinical groups. Interrater and test-retest reliability were analyzed for each story in order to select the set of 10 stories included in the final reduced version. The shortened version showed good psychometric properties for controls and patients: test-retest reliability of 0.97 and 0.78, inter-rater reliability of 0.95 and 0.87 and Cronbach's alpha of 0.82 and 0.72.
An ontological case base engineering methodology for diabetes management.
El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema
2014-08-01
Ontology engineering covers issues related to ontology development and use. In a Case Based Reasoning (CBR) system, ontology plays two main roles: the first as the case base and the second as the domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It mainly focuses on case representation in the form of ontology to support semantic case retrieval and enhance all knowledge-intensive CBR processes. A case study on a diabetes diagnosis case base is provided to evaluate the proposed methodology.
Finding user personal interests by tweet-mining using advanced machine learning algorithm in R
NASA Astrophysics Data System (ADS)
Krithika, L. B.; Roy, P.; Asha Jerlin, M.
2017-11-01
Social media plays a key role in every individual's life as an outlet for expressing personal views, likes and dislikes. The methodology proposed here is a sharp departure from the traditional techniques of inferring the interests of a user from the tweets that he/she posts or receives. It is shown that the topics of interest inferred by the proposed methodology are far superior to the topics extracted by state-of-the-art techniques such as using topic models (Labelled LDA) on tweets. Based upon the proposed methodology, a system has been built, "Who is interested in what", which can infer the interests of millions of Twitter users. A novel mechanism is proposed to infer the topics of interest of individual users in the Twitter social network. It has been observed that in Twitter, a user generally follows experts on various topics of his/her interest in order to acquire information on those topics. A methodology based on social annotations is used to first deduce the topical expertise of popular Twitter users and then transitively infer the interests of the users who follow them.
Generalized Predictive and Neural Generalized Predictive Control of Aerospace Systems
NASA Technical Reports Server (NTRS)
Kelkar, Atul G.
2000-01-01
The research work presented in this thesis addresses the problem of robust control of uncertain linear and nonlinear systems using the Neural network-based Generalized Predictive Control (NGPC) methodology. A brief overview of predictive control and its comparison with Linear Quadratic (LQ) control is given to emphasize the advantages and drawbacks of predictive control methods. It is shown that the Generalized Predictive Control (GPC) methodology overcomes the drawbacks associated with traditional LQ control as well as conventional predictive control methods. It is shown that, in spite of its model-based nature, GPC has good robustness properties, being a special case of receding horizon control. Conditions for choosing the tuning parameters of GPC to ensure closed-loop stability are derived. A neural network-based GPC architecture is proposed for the control of linear and nonlinear uncertain systems. A methodology to account for parametric uncertainty in the system is proposed, using the on-line training capability of a multi-layer neural network. Several simulation examples and results from real-time experiments are given to demonstrate the effectiveness of the proposed methodology.
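A minimal sketch of unconstrained GPC on a discrete-time linear model may clarify the receding-horizon mechanics: stack the horizon predictions Y = F x + Φ U, minimize ||Y − r||² + λ||U||² in closed form, and apply only the first input at each step. The plant matrices, horizon, and weights are illustrative assumptions, and the neural-network model and robustness analysis of the thesis are not reproduced.

```python
# A minimal unconstrained GPC sketch on an assumed discrete-time plant.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
Np, lam, r = 10, 0.1, 1.0                    # horizon, weight, setpoint

F = np.vstack([C @ np.linalg.matrix_power(A, k + 1) for k in range(Np)])
Phi = np.zeros((Np, Np))
for k in range(Np):                          # lower-triangular step responses
    for j in range(k + 1):
        Phi[k, j] = (C @ np.linalg.matrix_power(A, k - j) @ B)[0, 0]

K = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Np), Phi.T)

x = np.zeros((2, 1))
for t in range(30):
    U = K @ (r * np.ones((Np, 1)) - F @ x)   # optimal input sequence
    u = U[0, 0]                              # receding horizon: first input
    x = A @ x + B * u
    print(t, (C @ x).item())
```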
Petraco, Ricardo; Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P
2018-01-01
Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test's performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy as well as to propose a sample-independent methodology to calculate and display the accuracy of diagnostic tests. We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test's performance against a reference gold standard.
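The distribution dependence of accuracy is easy to reproduce numerically. A minimal sketch, assuming a rapid test that equals the gold standard plus fixed Gaussian noise (threshold, noise level, and distributions are illustrative, not the study's simulation settings):

```python
import numpy as np

# Sketch: identical numerical relationship between a rapid test and a gold
# standard, but different cholesterol distributions, yields different
# "diagnostic accuracy".
rng = np.random.default_rng(0)

def accuracy(chol_gold, threshold=200.0, noise_sd=10.0):
    chol_rapid = chol_gold + rng.normal(0.0, noise_sd, chol_gold.size)
    agree = (chol_rapid > threshold) == (chol_gold > threshold)
    return agree.mean()

# Sample 1: values concentrated near the threshold -> many borderline cases.
near = rng.normal(200.0, 15.0, 10_000)
# Sample 2: values far from the threshold -> few borderline cases.
far = np.concatenate([rng.normal(150, 15, 5_000), rng.normal(250, 15, 5_000)])

print(f"accuracy near threshold: {accuracy(near):.2f}")    # lower
print(f"accuracy far from threshold: {accuracy(far):.2f}") # higher
```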
Athanasiou, Lambros S; Rigas, George A; Sakellarios, Antonis I; Exarchos, Themis P; Siogkas, Panagiotis K; Naka, Katerina K; Panetta, Daniele; Pelosi, Gualtiero; Vozzi, Federico; Michalis, Lampros K; Parodi, Oberdan; Fotiadis, Dimitrios I
2015-10-01
A framework for the inflation of micro-CT and histology data using intravascular ultrasound (IVUS) images is presented. The proposed methodology consists of three steps. In the first step, the micro-CT/histological images are manually co-registered with IVUS by experts using fiducial points as landmarks. In the second step, the lumen of both the micro-CT/histological images and the IVUS images is automatically segmented. Finally, in the third step, the micro-CT/histological images are inflated by applying a transformation method to each image. The transformation method is based on the difference between the IVUS and micro-CT/histological contours. In order to validate the proposed image inflation methodology, plaque areas in the inflated micro-CT and histological images are compared with those in the IVUS images. The proposed methodology for inflating micro-CT/histological images increases the sensitivity of plaque area matching between the inflated and the IVUS images (by 7% and 22% in histological and micro-CT images, respectively). Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Serra, Roger; Lopez, Lautaro
2018-05-01
Different approaches to the detection of damage based on dynamic measurement of structures have appeared in recent decades. They were based, amongst others, on changes in natural frequencies, modal curvatures, strain energy or flexibility. Wavelet analysis has also been used to detect abnormalities in modal shapes induced by damage. However, the majority of previous work used signals not corrupted by noise. Moreover, the damage influence on each mode shape was studied separately. This paper proposes a new methodology based on a combined modal wavelet transform strategy that copes with noisy signals while, at the same time, being able to extract the relevant information from each mode shape. The proposed methodology is then compared with the most frequently used and widely studied methods from the bibliography. To evaluate the performance of each method, their capacity to detect and localize damage is analyzed in different cases. The comparison is done by simulating the oscillations of a cantilever steel beam with and without defect as a numerical case. The proposed methodology proved to outperform classical methods when dealing with noisy signals.
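The underlying principle is that a local stiffness loss perturbs a mode shape's curvature near the damage. A simplified curvature-based illustration follows (plain second differences on a synthetic mode shape; the paper's method combines wavelet transforms across modes, which this sketch does not attempt):

```python
import numpy as np

# A local anomaly in a mode shape shows up as a peak in the curvature
# difference between damaged and intact shapes. Shapes and the anomaly
# are synthetic, for illustration only.
x = np.linspace(0.0, 1.0, 201)                 # normalized beam length
intact = np.sin(np.pi * x / 2.0)               # first cantilever-like mode
damaged = intact + 2e-4 * np.exp(-((x - 0.4) / 0.02) ** 2)  # local anomaly

def curvature(shape, dx):
    return np.gradient(np.gradient(shape, dx), dx)

dx = x[1] - x[0]
diff = np.abs(curvature(damaged, dx) - curvature(intact, dx))
print(f"estimated damage location: x = {x[np.argmax(diff)]:.2f}")  # ~0.40
```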
An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.
Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph
2010-06-01
We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with those of standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and multiple testing of single single-nucleotide polymorphisms (SNPs) with single phenotypes. The practical relevance of the approach is illustrated by an application to asthma, in which SNP/phenotype combinations are identified that reach overall significance and that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
75 FR 62403 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-08
... Project: 2011-2014 National Survey on Drug Use and Health: Methodological Field Tests (OMB No. 0930-0290..., SAMHSA received a three-year renewal of its generic clearance for methodological field tests. This will be a request for another renewal of the generic approval to continue methodological tests over the...
Determination of smoke plume and layer heights using scanning lidar data
Vladimir A. Kovalev; Alexander Petkov; Cyle Wold; Shawn Urbanski; Wei Min Hao
2009-01-01
The methodology of using mobile scanning lidar data for investigation of smoke plume rise and high-resolution smoke dispersion is considered. The methodology is based on the lidar-signal transformation proposed recently [Appl. Opt. 48, 2559 (2009)]. In this study, similar methodology is used to create the atmospheric heterogeneity height indicator (HHI...
77 FR 10767 - Rate Adjustments for Indian Irrigation Projects
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... Irrigation Project on the proposed rates about the following issues: (1) The methodology for O&M rate setting... BIA's responses are provided below. Comment: The BIA's methodology for setting the 2013 O&M assessment rate was unreasonable. Response: The methodology used by the BIA to determine the 2013 O&M assessment...
Selecting a Targeting Method to Identify BPL Households in India
ERIC Educational Resources Information Center
Alkire, Sabina; Seth, Suman
2013-01-01
This paper proposes how to select a methodology to target multidimensionally poor households, and how to update that targeting exercise periodically. We present this methodology in the context of discussions regarding the selection of a targeting methodology in India. In 1992, 1997, and 2002 the Indian government identified households that are…
1987-03-01
contends his soft systems methodology is such an approach. [Ref. 2: pp. 105-107] Overview of this methodology: it is meant for addressing fuzzy, ill...could form the basis of office systems development: Checkland's (1981) soft systems methodology, Pava's (1983) sociotechnical design, and Mumford and
78 FR 77399 - Basic Health Program: Proposed Federal Funding Methodology for Program Year 2015
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-23
... American Indians and Alaska Natives F. Example Application of the BHP Funding Methodology III. Collection... effectively 138 percent due to the application of a required 5 percent income disregard in determining the... correct errors in applying the methodology (such as mathematical errors). Under section 1331(d)(3)(ii) of...
A Call for a New National Norming Methodology.
ERIC Educational Resources Information Center
Ligon, Glynn; Mangino, Evangelina
Issues related to achieving adequate national norms are reviewed, and a new methodology is proposed that would work to provide a true measure of national achievement levels on an annual basis and would enable reporting results in current-year norms. Statistical methodology and technology could combine to create a national norming process that…
2016-12-22
assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming...the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool...
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.
2009-01-01
Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.
Raut, Savita V; Yadav, Dinkar M
2018-03-28
This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual information-based voxel selection framework. Previously, fMRI signal analysis has been conducted using the empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former methodology loses frequency content, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology, in which the frequency content is preserved by decomposing the raw fMRI signal using the geometric mean rather than the arithmetic mean, and the voxels are selected from the EMCD signal using GMCD components rather than the raw fMRI signal. The proposed methodologies are adopted for predicting the neural response. Experiments are conducted on the openly available fMRI data of six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effects of the number of selected voxels and the selection constraints are analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.
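A sketch of one decomposition step follows, assuming (as the name suggests) that GMCD replaces the arithmetic mean of the upper/lower signal envelopes used in EMCD-style decompositions with a geometric mean; the published algorithm may differ in detail.

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import interp1d

# One envelope-mean extraction step. The geometric variant shifts the
# envelopes to positive values so the geometric mean is defined.
def mean_curve(signal, geometric=True):
    t = np.arange(signal.size)
    maxima = argrelextrema(signal, np.greater)[0]
    minima = argrelextrema(signal, np.less)[0]
    upper = interp1d(maxima, signal[maxima], fill_value="extrapolate")(t)
    lower = interp1d(minima, signal[minima], fill_value="extrapolate")(t)
    if geometric:
        shift = 1.0 - min(lower.min(), upper.min())
        return np.sqrt((upper + shift) * (lower + shift)) - shift
    return 0.5 * (upper + lower)  # EMCD-style arithmetic mean

t = np.linspace(0, 4 * np.pi, 500)
sig = np.sin(t) + 0.3 * np.sin(5 * t)
component = sig - mean_curve(sig)   # one extracted oscillatory component
print(component[:3])
```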
Multimodal hybrid reasoning methodology for personalized wellbeing services.
Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong
2016-02-01
A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users in adopting daily routines that form a best practice of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than specific individuals. These recommendations are general in nature and fit the community at a certain level, but they are not relevant to every individual's specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user's specific needs and personal interests. The methodology integrates the rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) approaches in a linear combination that enables personalization of recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts' past experiences, and PBR uses users' personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations, the CBR part suggests the top three relevant physical activities for executing the recommended plan, and the PBR part filters out irrelevant recommendations using the user's personal preferences and interests. To evaluate the methodology, a baseline-RBR system is developed, which is improved first using ranged rules (the modified-RBR system) and ultimately using a hybrid-CBR. A comparison of the results of these systems shows that hybrid-CBR outperforms the modified-RBR and baseline-RBR systems. Hybrid-CBR yields a recall of 0.94, a precision of 0.97, an f-score of 0.95, and low Type I and Type II errors. Copyright © 2015 Elsevier Ltd. All rights reserved.
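A minimal sketch of the RBR → CBR → PBR pipeline described above; the rules, cases, and thresholds are illustrative placeholders, not the paper's knowledge base.

```python
# Sketch of the hybrid reasoning combination: guideline rules propose a plan,
# a nearest past case suggests activities, and preferences filter the result.

def rbr(user):
    # Explicit guideline rule: BMI-based goal recommendation (illustrative).
    bmi = user["weight_kg"] / user["height_m"] ** 2
    return ["weight-loss plan"] if bmi >= 25 else ["maintenance plan"]

def cbr(user, cases):
    # Nearest past case by age suggests activities that worked before.
    best = min(cases, key=lambda c: abs(c["age"] - user["age"]))
    return best["activities"][:3]

def pbr(recommendations, user):
    # Preference filter: drop activities the user dislikes.
    return [r for r in recommendations if r not in user["dislikes"]]

cases = [{"age": 42, "activities": ["brisk walking", "cycling", "swimming"]}]
user = {"weight_kg": 88, "height_m": 1.75, "age": 40, "dislikes": {"swimming"}}

plan = rbr(user)
activities = pbr(cbr(user, cases), user)
print(plan, activities)
```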
Mjøsund, Nina Helen; Eriksson, Monica; Espnes, Geir Arild; Haaland-Øverby, Mette; Jensen, Sven Liang; Norheim, Irene; Kjus, Solveig Helene Høymork; Portaasen, Inger-Lill; Vinje, Hege Forbech
2017-01-01
The aim of this study was to examine how service user involvement can contribute to the development of interpretative phenomenological analysis methodology and enhance research quality. Interpretative phenomenological analysis is a qualitative methodology used in nursing research internationally to understand human experiences that are essential to the participants. Service user involvement is requested in nursing research. We share experiences from 4 years of collaboration (2012-2015) on a mental health promotion project, which involved an advisory team. The team comprised five research advisors who either had a diagnosis of, or were related to a person with, severe mental illness. They collaborated with the research fellow throughout the entire research process and have co-authored this article. We examined the joint process of analysing the empirical data from interviews. Our analytical discussions were audiotaped, transcribed and subsequently interpreted following the guidelines for good qualitative analysis in interpretative phenomenological analysis studies. The advisory team became 'the researcher's helping hand'. Multiple perspectives influenced the qualitative analysis, which gave more insightful interpretations of nuances, complexity, richness or ambiguity in the interviewed participants' accounts. The outcome of the service user involvement was increased breadth and depth in the findings. Service user involvement improved the research quality in a nursing research project on mental health promotion. The interpretative element of interpretative phenomenological analysis was enhanced by the emergence of multiple perspectives in the qualitative analysis of the empirical data. We argue that service user involvement and interpretative phenomenological analysis methodology can mutually reinforce each other and strengthen qualitative methodology. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
A participative model for undertaking and evaluating scientific communication in Earth Observation
NASA Astrophysics Data System (ADS)
L'Astorina, Alba; Tomasoni, Irene
2015-04-01
Public communication of Science and Technology (PCST) is an integral part of the mission of the Italian National Research Council (CNR) and is widely carried out among the scientific community. Recently it has also become a research field investigating practices, channels, tools and models of public engagement and their impact on the relation between Science and Society. Understanding such aspects is increasingly considered relevant for effective and aware outreach. Within this context, CNR has adopted some innovative communication approaches addressed to different publics, such as stakeholders, users, media, young people and the general public, using participative methodologies. Besides being communication practices promoting scientific culture, such initiatives aim at understanding the models underlying the relationship between the scientific community and the public. To what extent do scientists open their communication and involvement strategies to discussion? Do they have a real exchange with their publics in order to evaluate the effectiveness of the participatory techniques they adopt in communicating and disseminating their activities? In this paper we present a case study of a communication and educational proposal recently developed by CNR in order to promote a mutual exchange between Education/School and Research, which are the most important actors in the production and revision of scientific knowledge. The proposal brings an ongoing CNR research project (its steps, subjects, tools, activities, costs, etc.) into classrooms, making use of interactive Earth Sciences workshops conducted directly by researchers. The ongoing CNR project shared with students investigates innovative methodologies of Earth Observation supporting the agricultural sector in Lombardy. It aims at exploiting Aerospace Earth Observation (EO) tools to develop dedicated agricultural downstream services that will bring added economic value and benefits for Lombardy public administrations and citizens. This initiative aims at introducing students to the world of research and scientific production and, vice versa, at connecting scientists with the educational world, its language and its teaching models. The Research-School exchange is mutual and real. The goal is thus twofold: introducing students to a critical and concrete vision of the scientific process, and inviting scientists to reflect on PCST activities, participative models and their critical aspects. In doing so, researchers have the chance to open a dialogue with the educational world, to better understand it, its gaps, needs and reasoning, and, as a result, to improve their own communication and involvement approaches. At the same time, schools, being co-players of a scientific research project and following scientists side by side in their procedures, can actively participate, give personal contributions and provide feedback. The initiative represents an attempt at 'participative research' in which researchers and students can freely express their expectations, acquire information, test new approaches and build a piece of knowledge together. The proposal makes use of participative methodologies and qualitative tools to evaluate the involvement of students, teachers and researchers and to analyze the communication model implied in the relations among them. In the EGU presentation, the first results of this evaluation process will be reported.
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were evaluated to determine their environmental impact. Sustainability analysis was conducted using SolidWorks. As a result, high-density PE yields the lowest environmental impact while retaining a high maximum stress value.
Zhang, Qin; Yao, Quanying
2018-05-01
The dynamic uncertain causality graph (DUCG) is a newly presented framework for uncertain causality representation and probabilistic reasoning. It has been successfully applied to online fault diagnoses of large, complex industrial systems, and to disease diagnoses. This paper extends the DUCG to model more complex cases than could previously be modeled, e.g., the case in which statistical data are in different groups with or without overlap, and some domain knowledge and actions (new variables with uncertain causalities) are introduced. In other words, this paper proposes to use -mode, -mode, and -mode of the DUCG to model such complex cases and then transform them into either the standard -mode or the standard -mode. In the former situation, if no directed cyclic graph is involved, the transformed result is simply a Bayesian network (BN), and existing inference methods for BNs can be applied. In the latter situation, an inference method based on the DUCG is proposed. Examples are provided to illustrate the methodology.
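(The mode names above were lost in extraction and are left blank.) Once the transformation yields a standard BN, ordinary probabilistic inference applies; a minimal enumeration sketch for a two-node cause-symptom network, with illustrative probabilities that are not from the paper:

```python
# P(cause | symptom) by direct enumeration (Bayes' rule) on a tiny BN.
p_cause = {True: 0.01, False: 0.99}
p_symptom_given_cause = {True: 0.9, False: 0.05}

joint = {c: p_cause[c] * p_symptom_given_cause[c] for c in (True, False)}
posterior = joint[True] / sum(joint.values())
print(f"P(cause | symptom) = {posterior:.3f}")  # ~0.154
```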
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halls, Benjamin R.; Meyer, Terrence R.; Kastengren, Alan L.
2015-01-01
The complex geometry and large index-of-refraction gradients that occur near the point of impingement of binary liquid jets present a challenging environment for optical interrogation. A simultaneous quadruple-tracer x-ray fluorescence and line-of-sight radiography technique is proposed as a means of distinguishing and quantifying individual liquid component distributions prior to, during, and after jet impact. Two different pairs of fluorescence tracers are seeded into each liquid stream to maximize their attenuation ratio for reabsorption correction and differentiation of the two fluids during mixing. This approach for instantaneous correction of x-ray fluorescence reabsorption is compared with a more time-intensive approach of using stereographic reconstruction of x-ray attenuation along multiple lines of sight. The proposed methodology addresses the need for a quantitative measurement technique capable of interrogating optically complex, near-field liquid distributions in many mixing systems of practical interest involving two or more liquid streams.
Hong, Jongwoo; Kim, Sun-Je; Kim, Inki; Yun, Hansik; Mun, Sang-Eun; Rho, Junsuk; Lee, Byoungho
2018-05-14
Simultaneous plasmonic enhancement of nanoscale light-matter interactions in both electric and magnetic terms has been hard to achieve with an easily reproducible fabrication method and a systematic theoretical design rule. In this paper, a novel concept of a flat nanofocusing device is proposed for simultaneously squeezing both electric and magnetic fields into a deep-subwavelength volume (~λ³/538) over a large area. Based on funneled unit cell structures and surface plasmon-assisted coherent interactions between them, a plasmonic metasurface cavity is constructed by periodic arrangement of the unit cell: an array of rectangular nanocavities, each connected to a tapered nanoantenna. The average enhancement factors of the electric and magnetic field intensities reach about 60 and 22 in the nanocavities, respectively. The outstanding performance of the proposed device is verified numerically and experimentally. We expect that this work will expand methodologies involving optical near-field manipulation over large areas and related potential applications, including nanophotonic sensors, nonlinear responses, and quantum interactions.
Proposed reliability cost model
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1973-01-01
The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.
Modified Distribution-Free Goodness-of-Fit Test Statistic.
Chun, So Yeon; Browne, Michael W; Shapiro, Alexander
2018-03-01
Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
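As a reference for the discussion above, Browne's ADF fit statistic takes, roughly and in common notation (exact normalization varies across presentations), the quadratic form

\[
T_{\mathrm{ADF}} = n\,\bigl(s - \sigma(\hat{\theta})\bigr)^{\top} \hat{\Gamma}^{-1} \bigl(s - \sigma(\hat{\theta})\bigr),
\]

where \(s\) stacks the non-duplicated sample covariances, \(\sigma(\hat{\theta})\) the model-implied ones, and \(\hat{\Gamma}\) estimates the asymptotic covariance matrix of \(s\) from fourth-order sample moments. The possible ill-conditioning of the large matrix \(\hat{\Gamma}\) in samples of realistic size is exactly what the proposed modification targets.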
Guidelines for reporting evaluations based on observational methodology.
Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2015-01-01
Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.
NASA Astrophysics Data System (ADS)
Zou, Jie; Gattani, Abhishek
2005-01-01
When completely automated systems don't yield acceptable accuracy, many practical pattern recognition systems involve the human either at the beginning (pre-processing) or towards the end (handling rejects). We believe that it may be more useful to involve the human throughout the recognition process rather than just at the beginning or end. We describe a methodology of interactive visual recognition for human-centered low-throughput applications, Computer Assisted Visual InterActive Recognition (CAVIAR), and discuss the prospects of implementing CAVIAR over the Internet. The novelty of CAVIAR is image-based interaction through a domain-specific parameterized geometrical model, which reduces the semantic gap between humans and computers. The user may interact with the computer any time she considers its response unsatisfactory. The interaction improves the accuracy of the classification features by improving the fit of the computer-proposed model. The computer makes subsequent use of the parameters of the improved model to refine not only its own statistical model-fitting process, but also its internal classifier. The CAVIAR methodology was applied to implement a flower recognition system. The principal conclusions from the evaluation of the system include: 1) the average recognition time of the CAVIAR system is significantly shorter than that of the unaided human; 2) its accuracy is significantly higher than that of the unaided machine; 3) it can be initialized with as few as one training sample per class and still achieve high accuracy; and 4) it demonstrates a self-learning ability. We have also implemented a Mobile CAVIAR system, where a pocket PC, as a client, connects to a server through wireless communication. The motivation behind a mobile platform for CAVIAR is to apply the methodology in a human-centered pervasive environment, where the user can seamlessly interact with the system to classify field data. Deploying CAVIAR to a networked mobile platform poses the challenge of classifying field images and programming under constraints of display size, network bandwidth, processor speed, and memory size. Editing of the computer-proposed model is performed on the handheld, while statistical model fitting and classification take place on the server. The possibility that the user can easily take several photos of the object poses an interesting information fusion problem. The advantage of the Internet is that the patterns identified by different users can be pooled together to benefit all peer users. When users identify patterns with CAVIAR in a networked setting, they also collect training samples and provide opportunities for machine learning from their intervention. CAVIAR implemented over the Internet provides a perfect test bed for, and extends, the concept of the Open Mind Initiative proposed by David Stork. Our experimental evaluation focuses on human time, machine and human accuracy, and machine learning. We devoted much effort to evaluating the use of our image-based user interface and to developing principles for the evaluation of interactive pattern recognition systems. The Internet architecture and Mobile CAVIAR methodology have many applications. We are exploring the directions of teledermatology, face recognition, and education.
FY 1998 Proposed Rail Improvement Program Supplement
DOT National Transportation Integrated Search
1997-01-01
This FY 1998 Proposed Rail Improvement Program Supplement contains those rail plan amendments which have been published subsequent to the FY 1997 Proposed Rail Improvement program supplement. This document also contains the benefit/cost methodology u...
NASA Astrophysics Data System (ADS)
Gobbi, Gian Paolo; Barnaba, Francesca; Bolignano, Andrea; Costabile, Francesca; Di Liberto, Luca; Dionisi, Davide; Drewnick, Frank; Lucarelli, Franco; Manigrasso, Maurizio; Nava, Silvia; Sauvage, Laurent; Sozzi, Roberto; Struckmeier, Caroline; Wille, Holger
2015-04-01
The EC LIFE+2010 DIAPASON Project (Desert dust Impact on Air quality through model-Predictions and Advanced Sensors ObservatioNs, www.diapason-life.eu) aims to contribute new methodologies to assess the contribution of Saharan dust advections to the local PM loads recorded in Europe. To this end, automated Polarization Lidar-Ceilometers (PLCs) were prototyped within DIAPASON to certify the presence of Saharan dust plumes and to support the evaluation of their mass loadings in the lowermost atmosphere. The whole process also involves operational dust forecasts as well as satellite and in-situ observations. The Project is demonstrated in the pilot region of Rome (Central Italy), where three networked DIAPASON PLCs began, in October 2013, a year-round, 24 h/day monitoring of altitude-resolved aerosol backscatter and depolarization profiles. Two intensive observational periods (IOPs) involving chemical analysis and detailed physical characterization of aerosol samples were also carried out in this year-long campaign, namely in Fall 2013 and Spring 2014. These allowed for an extensive interpretation of the PLC observations, highlighting important synergies between the PLC and the in-situ data. The presentation will address the capabilities of the employed PLCs, the agreement of observations with model forecasts of dust advections, retrievals of aerosol properties, and the methodologies developed to detect Saharan advections and to evaluate the relevant mass contribution to PM10. This latter task is intended to provide suggestions on possible improvements to the current EC Guidelines (2011) on this matter. In fact, specific Guidelines are delivered by the European Commission to provide the Member States with a common method to assess the Saharan dust contribution to the currently legislated PM-related Air Quality metrics. The DIAPASON experience shows that improvements can be proposed to make the current EC Methodology more robust and flexible. The methodology DIAPASON recommends has been designed and validated taking advantage of the PLC observations and highlights the benefits of the operational use of such systems in routine Air Quality applications. Concurrently, PLC activities are contributing to the COST Action "TOPROF", a European effort aiming at the setup and operational use of lidar-ceilometer networks for meteorological and safety purposes.
A healthcare Lean Six Sigma System for postanesthesia care unit workflow improvement.
Kuo, Alex Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Lee, Te-Shu
2011-01-01
The aim of this article is to propose a new model, the Healthcare Lean Six Sigma System, that integrates the Lean and Six Sigma methodologies to improve workflow in a postanesthesia care unit. The methodology of the proposed model is fully described. A postanesthesia care unit case study is also used to demonstrate the benefits of the Healthcare Lean Six Sigma System model, which combines the Lean and Six Sigma methodologies. The new model bridges the service gaps between health care providers and patients, balances the requirements of health care managers, and delivers health care services to patients by combining the speed of Lean with the high-quality principles of Six Sigma. The full benefits of the new model will be realized when it is applied at both the strategic and operational levels. For further research, we will examine how the proposed model can be used in different real-world case studies.
Bayesian Local Contamination Models for Multivariate Outliers
Page, Garritt L.; Dunson, David B.
2013-01-01
In studies where data are generated from multiple locations or sources, it is common for there to exist observations that are quite unlike the majority. Motivated by the application of establishing a reference value in an inter-laboratory setting when outlying labs are present, we propose a local contamination model that is able to accommodate unusual multivariate realizations in a flexible way. The proposed method models the process level of a hierarchical model using a mixture with a parametric component and a possibly nonparametric contamination. Much of the flexibility in the methodology is achieved by allowing varying random subsets of the elements in the lab-specific mean vectors to be allocated to the contamination component. Computational methods are developed, and the methodology is compared to three other possible approaches using a simulation study. We apply the proposed method to a NIST/NOAA-sponsored inter-laboratory study which motivated the methodological development.
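A generative sketch of the local-contamination idea, assuming element-wise random allocation of each lab's mean vector to a broad contamination component (all values illustrative; the paper's model is Bayesian and fit by posterior computation, which this sketch omits):

```python
import numpy as np

# Each lab's mean vector has a random subset of elements drawn from a broad
# contamination component instead of the parametric (consensus) component.
rng = np.random.default_rng(1)
consensus, scale = np.array([10.0, 5.0, 2.0]), 0.1

def lab_mean(p_contaminated=0.2):
    contaminated = rng.random(consensus.size) < p_contaminated
    clean = rng.normal(consensus, scale)
    outlying = rng.normal(consensus, 10 * scale)   # broad contamination
    return np.where(contaminated, outlying, clean)

labs = np.array([lab_mean() for _ in range(8)])
print(labs.round(2))
```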
Possible Improvements of the ACE Diversity Interchange Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel V.; Zhou, Ning; Makarov, Yuri V.
2010-07-26
The North American Electric Reliability Corporation (NERC) grid is operated by about 131 balancing authorities (BAs). Within each BA, operators are responsible for managing the imbalance (caused by both load and wind). As wind penetration levels increase, the challenge of managing power variation increases. Working independently, a balancing authority with limited regulating/load-following generation and high wind power penetration faces significant challenges. The benefits of BA cooperation and consolidation increase when there is significant wind energy penetration. To explore the benefits of BA cooperation, this paper investigates the ACE sharing approach. A technology called ACE diversity interchange (ADI) is already in use in the western interconnection. A new methodology extending ADI is proposed in this paper; the proposed advanced ADI overcomes some limitations of conventional ADI. Simulations using real statistical data from CAISO and BPA have shown the high performance of the proposed advanced ADI methodology.
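The netting idea behind ADI can be sketched in a few lines; the proportional allocation rule below is purely illustrative and is not the paper's advanced ADI.

```python
# Balancing authorities with opposite-sign area control errors (ACEs) offset
# each other, so each regulates only a share of the net error.
def adi_adjusted(aces):
    net = sum(aces)
    gross = sum(abs(a) for a in aces)
    if gross == 0.0:
        return [0.0] * len(aces)
    # Each BA keeps a share of the net error proportional to its |ACE|.
    return [net * abs(a) / gross for a in aces]

aces = [50.0, -30.0, -10.0]   # MW; opposite signs partially cancel
print(adi_adjusted(aces))     # adjusted ACEs sum to the net error of +10 MW
```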
Spousal Involvement and CPAP Adherence: A Dyadic Perspective
Ye, Lichuan; Malhotra, Atul; Kayser, Karen; Willis, Danny G.; Horowitz, June; Aloia, Mark; Weaver, Terri E.
2014-01-01
Poor adherence to continuous positive airway pressure (CPAP) treatment is associated with substantial health care costs, morbidity and mortality, and has been a leading obstacle in the effective management of obstructive sleep apnea (OSA). Successful interventions to improve CPAP adherence may ultimately include a variety of components. For patients living with spouses (a term used here for all domestic partners), the spouse will likely be an integral component of any successful intervention. Developing an understanding of the role of spouses in adherence to CPAP has been identified as a critical research need. This review expands the investigation of CPAP adherence to a broader context, from an exclusive focus on individual patients to a dyadic perspective encompassing both patients and their spouses. A conceptual framework based on social support and social control theories is proposed to understand spousal involvement in CPAP adherence. Methodologies for future investigations are discussed, along with implications for developing interventions that engage both patients and their spouses to improve CPAP use.
Leal, Cristian Oliveira Benevides Sanches; Teixeira, Carmen Fontes de Souza
2017-10-01
This is a theoretical essay on the development of the concept of solidarity, a word used in the regulatory framework and in political proposals to reorient the Brazilian Unified Health System (SUS). The methodology consisted of mapping authors who address aspects of human action related to this theme in Durkheim's tradition, linking them to his followers, such as Marcel Mauss and authors from the "anti-utilitarianism" movement in the social sciences. Solidarity is one way of expressing a "gift" and appears as a multidimensional action in which duty and freedom, instrumental interest and disinterest, interpose and interlace. The planning and execution of sanitary surveillance (VISA) actions requires comprehension of the organizational forms and solidary relationship management among the agents involved in health risk control, transcending the strongly normative character of the prevailing supervision actions. The development of associative actions involving sanitary surveillance professionals, economic agents and consumers, aimed at sharing responsibility for the health risk control of products, services and environments subject to sanitary surveillance action, is suggested.
A Negative Selection Immune System Inspired Methodology for Fault Diagnosis of Wind Turbines.
Alizadeh, Esmaeil; Meskin, Nader; Khorasani, Khashayar
2017-11-01
High operational and maintenance costs represent major economic constraints in the wind turbine (WT) industry. These concerns have made investigation into fault diagnosis of WT systems an extremely important and active area of research. In this paper, an immune system (IS) inspired methodology for performing fault detection and isolation (FDI) of a WT system is proposed and developed. The proposed scheme is based on the self/nonself discrimination paradigm of a biological IS. Specifically, the negative selection mechanism [the negative selection algorithm (NSA)] of the human body is utilized. In this paper, a hierarchical bank of NSAs is designed to detect and isolate both individual and simultaneously occurring faults common to WTs. A smoothing moving window filter is then utilized to further improve the reliability and performance of the FDI scheme. Moreover, the performance of our proposed scheme is compared with that of another state-of-the-art data-driven technique, namely support vector machines (SVMs), to demonstrate and illustrate the superiority and advantages of our proposed NSA-based FDI scheme. Finally, a nonparametric statistical comparison test is implemented to evaluate our proposed methodology against the SVM under various fault severities.
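A minimal negative selection sketch follows; the dimensions, thresholds, and Euclidean matching rule are illustrative choices, not the paper's hierarchical NSA bank.

```python
import numpy as np

# Random detectors that match no "self" (healthy) sample are kept; a new
# sample matching any detector is flagged as faulty.
rng = np.random.default_rng(2)
radius = 0.15

self_samples = rng.uniform(0.4, 0.6, size=(200, 2))  # healthy operating region

detectors = []
while len(detectors) < 50:
    d = rng.uniform(0.0, 1.0, size=2)
    if np.min(np.linalg.norm(self_samples - d, axis=1)) > radius:
        detectors.append(d)   # detector survives negative selection
detectors = np.array(detectors)

def is_faulty(sample):
    return bool(np.any(np.linalg.norm(detectors - sample, axis=1) < radius))

print(is_faulty(np.array([0.5, 0.5])))   # False: inside the self region
print(is_faulty(np.array([0.9, 0.1])))   # likely True: non-self region
```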
ERIC Educational Resources Information Center
Spillane, James P.; Camburn, Eric M.; Pustejovsky, James; Pareja, Amber Stitziel; Lewis, Geoff
2008-01-01
Purpose: This paper is concerned with the epistemological and methodological challenges involved in studying the distribution of leadership across people within the school--the leader-plus aspect of a distributed perspective, which it aims to investigate. Design/methodology/approach: The paper examines the entailments of the distributed…
NASA Astrophysics Data System (ADS)
Qyyum, Muhammad Abdul; Long, Nguyen Van Duc; Minh, Le Quang; Lee, Moonyong
2018-01-01
Design optimization of the single mixed refrigerant (SMR) natural gas liquefaction (LNG) process involves highly non-linear interactions between decision variables, constraints, and the objective function. These non-linear interactions lead to irreversibility, which deteriorates the energy efficiency of the LNG process. In this study, a simple and highly efficient hybrid modified coordinate descent (HMCD) algorithm is proposed to cope with the optimization of the natural gas liquefaction process. The single mixed refrigerant process was modeled in Aspen Hysys® and then connected to a Microsoft Visual Studio environment. The proposed optimization algorithm provided improved results compared to other existing methodologies in finding the optimal condition of the complex mixed refrigerant natural gas liquefaction process. By applying the proposed optimization algorithm, the SMR process can be designed with a specific compression power of 0.2555 kW, equivalent to a 44.3% energy saving compared to the base case. Furthermore, the coefficient of performance (COP) can be enhanced by up to 34.7% compared to the base case. The proposed optimization algorithm provides a deep understanding of the optimization of the liquefaction process from both technical and numerical perspectives. In addition, the HMCD algorithm can be employed for any mixed refrigerant based liquefaction process in the natural gas industry.
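A sketch of the coordinate-descent idea follows, with a toy quadratic standing in for the simulator-based objective; the one-variable-at-a-time grid search with a shrinking span is illustrative, and the published HMCD differs in detail.

```python
import numpy as np

# Optimize one decision variable at a time over a grid around the incumbent,
# shrinking the search span when no coordinate move improves the objective.
def coordinate_descent(f, x, bounds, n_grid=11, shrink=0.5, tol=1e-6):
    spans = np.array([hi - lo for lo, hi in bounds], dtype=float)
    best = f(x)
    while spans.max() > tol:
        improved = False
        for i, (lo, hi) in enumerate(bounds):
            grid = np.clip(
                np.linspace(x[i] - spans[i], x[i] + spans[i], n_grid), lo, hi)
            for g in grid:
                trial = x.copy()
                trial[i] = g
                val = f(trial)
                if val < best:
                    best, x, improved = val, trial, True
        if not improved:
            spans *= shrink   # refine the search around the incumbent
    return x, best

# Toy stand-in for the black-box LNG simulation objective (e.g., specific
# compression power as a function of refrigerant composition/pressures).
obj = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2 + 0.1 * v[0] * v[1]
x_opt, f_opt = coordinate_descent(obj, np.array([0.0, 0.0]), [(-5, 5), (-5, 5)])
print(x_opt.round(3), round(f_opt, 6))
```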