GIS-Based Accessibility Analysis of Urban Emergency Shelters: The Case of Adana City
NASA Astrophysics Data System (ADS)
Unal, M.; Uslu, C.
2016-10-01
Accessibility analysis of urban emergency shelters can help support urban disaster prevention planning. Pre-disaster emergency evacuation zoning has become a significant topic in disaster prevention and mitigation research. In this study, we assessed the serviceability of urban emergency shelters in terms of maximum capacity, usability, sufficiency and a certain walking-time limit by employing the spatial analysis techniques of GIS-Network Analyst. The methodology included the following aspects: the distribution analysis of emergency evacuation demands, the calculation of shelter space accessibility and the optimization of evacuation destinations. This methodology was applied to Adana, a city in Turkey, which is located within the Alpine-Himalayan orogenic system, the second major earthquake belt after the Pacific Belt. The proposed methodology was found useful for understanding the spatial distribution of urban emergency shelters more accurately and for establishing effective future urban disaster prevention planning. Additionally, this research provided a feasible way to support emergency management in terms of shelter construction, pre-disaster evacuation drills and rescue operations.
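As a rough, hypothetical illustration of the walking-time accessibility check this abstract describes (not the authors' GIS-Network Analyst workflow), the following Python sketch assigns evacuation demand to the nearest shelter reachable within an assumed time limit, subject to shelter capacity; the street network, demand counts, capacities and walking speed are all invented.

```python
# Illustrative sketch of a walking-time accessibility check for emergency
# shelters, in the spirit of the network-based workflow described above.
# The network, demand counts, shelter capacities and the 15-minute limit are
# hypothetical assumptions, not data from the Adana study.
import networkx as nx

WALK_SPEED_M_PER_MIN = 80          # assumed average walking speed
TIME_LIMIT_MIN = 15                # assumed evacuation walking-time limit

# Toy street network: nodes are junctions, edge weights are lengths in metres.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 400), ("B", "C", 300), ("C", "D", 500),
    ("B", "E", 600), ("E", "F", 350), ("D", "F", 450),
], weight="length_m")

demand = {"A": 250, "E": 400, "D": 180}      # evacuees at each demand node
shelters = {"C": 500, "F": 300}              # shelter node -> capacity
remaining = dict(shelters)

for node, people in demand.items():
    lengths = nx.single_source_dijkstra_path_length(G, node, weight="length_m")
    reachable = []
    for shelter in shelters:
        if shelter in lengths:
            walk_min = lengths[shelter] / WALK_SPEED_M_PER_MIN
            if walk_min <= TIME_LIMIT_MIN:
                reachable.append((walk_min, shelter))
    # Nearest reachable shelter that still has enough capacity, if any.
    chosen = next((s for _, s in sorted(reachable) if remaining[s] >= people), None)
    if chosen:
        remaining[chosen] -= people
    print(f"{node}: {people} evacuees -> {chosen or 'unserved (no reachable capacity)'}")
```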
Learning in a Chaotic Environment
ERIC Educational Resources Information Center
Goldman, Ellen; Plack, Margaret; Roche, Colleen; Smith, Jeffrey; Turley, Catherine
2009-01-01
Purpose: The purpose of this study is to understand how, when, and why emergency medicine residents learn while working in the chaotic environment of a hospital emergency room. Design/methodology/approach: This research used a qualitative interview methodology with thematic data analysis that was verified with the entire population of learners.…
Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A
2012-11-01
Ninety percent of emergency incidents occur in developing countries, and this share is only expected to grow as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. This paper analyses current quantitative methods for assessing emergency care in developing countries and proposes a more appropriate method. Currently accepted methods to quantitatively assess the efficacy of emergency care systems cannot be applied in most developing countries because of weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing-country conditions. As a result, although emergency care in the developing world is growing rapidly, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool that can be handled by low-resource, developing countries. By relying on the most basic universal parameters, the simplest calculations and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential in the establishment and growth of new emergency care systems worldwide.
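The abstract does not reproduce the TEWS score itself; purely as a hypothetical sketch of the "basic parameters, simplest calculations" idea, the snippet below sums pre-assigned vital-sign sub-scores and compares the total against an assumed threshold. The sub-scores and cut-off are invented and would come from a locally validated scoring table in practice.

```python
# Purely hypothetical illustration of a simple, low-resource triage scoring
# aggregation: sub-scores per parameter are assumed to come from a local
# scoring table (not reproduced here), and the aggregation is just a sum
# compared against an assumed threshold.
def triage_score(subscores):
    """subscores: dict of parameter name -> integer sub-score (0 = normal)."""
    return sum(subscores.values())

patient = {"respiratory_rate": 2, "heart_rate": 1, "systolic_bp": 0,
           "temperature": 0, "level_of_consciousness": 1, "mobility": 1}

URGENT_THRESHOLD = 5   # assumed cut-off, for illustration only
score = triage_score(patient)
print(f"score={score} -> {'urgent review' if score >= URGENT_THRESHOLD else 'routine'}")
```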
Emerging and recurrent issues in drug development.
Anello, C
This paper reviews several emerging and recurrent issues relating to the drug development process. These emerging issues include changes to the FDA regulatory environment, internationalization of drug development, advances in computer technology and visualization tools, and efforts to incorporate meta-analysis methodology. Recurrent issues include: renewed interest in statistical methods for handling subgroups in the design and analysis of clinical trials; renewed interest in alternatives to the 'intention-to-treat' analysis in the presence of non-compliance in randomized clinical trials; renewed interest in methodology to address the multiplicities resulting from a variety of sources inherent in the drug development process; and renewed interest in methods to assure data integrity. These emerging and recurrent issues provide a continuing challenge to the international community of statisticians involved in drug development. Moreover, the involvement of statisticians with different perspectives continues to enrich the field and contributes to improvement in the public health.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
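A minimal sketch of the control-chart analysis described here, using an individuals (XmR) chart on weekly mean ED length of stay; the data, the seven-week baseline, and the eight-point run rule shown are invented for illustration and are not the study's data.

```python
# Minimal sketch of a QI-style individuals (XmR) control chart on weekly mean
# ED length of stay. All numbers are invented for illustration.
import numpy as np

weekly_mean_los_min = np.array([112, 118, 109, 115, 120, 111, 108,        # baseline (7 weeks)
                                 98, 101,  95, 100,  97,  93,  96, 99])   # post-change (8 weeks)

baseline = weekly_mean_los_min[:7]
centre = baseline.mean()
mr_bar = np.abs(np.diff(baseline)).mean()              # average moving range
ucl, lcl = centre + 2.66 * mr_bar, centre - 2.66 * mr_bar

print(f"centre={centre:.1f}  UCL={ucl:.1f}  LCL={lcl:.1f}")
for week, value in enumerate(weekly_mean_los_min, start=1):
    flag = "special cause" if (value > ucl or value < lcl) else ""
    print(f"week {week:2d}: {value:5.1f} {flag}")

# A common run rule: 8 or more consecutive points on one side of the centre
# line also signals a shift, even if no single point crosses a limit.
longest_run, current = 0, 0
for value in weekly_mean_los_min:
    current = current + 1 if value < centre else 0
    longest_run = max(longest_run, current)
print(f"longest run below centre line: {longest_run} points")
```

The key point the abstract makes is visible in this kind of plot-free summary: because the time sequence is preserved, a sustained shift can be flagged well before a full trial sample is accrued.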
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.
Methodology discourses as boundary work in the construction of engineering education.
Beddoes, Kacey
2014-04-01
Engineering education research is a new field that emerged in the social sciences over the past 10 years. This analysis of engineering education research demonstrates that methodology discourses have played a central role in the construction and development of the field of engineering education, and that they have done so primarily through boundary work. This article thus contributes to science and technology studies literature by examining the role of methodology discourses in an emerging social science field. I begin with an overview of engineering education research before situating the case within relevant bodies of literature on methodology discourses and boundary work. I then identify two methodology discourses--rigor and methodological diversity--and discuss how they contribute to the construction and development of engineering education research. The article concludes with a discussion of how the findings relate to prior research on methodology discourses and boundary work and implications for future research.
Aircraft optimization by a system approach: Achievements and trends
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1992-01-01
Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as enabling technology for practical implementation of the methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bri Rolston
2005-06-01
Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, both black hat and white hat. Once a solid understanding of cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.
Current and Emerging Forces Impacting Special Education.
ERIC Educational Resources Information Center
Yates, James R.
Using the methodology of force field analysis, the paper develops possible futures for special education based on current trends. Demographic forces impacting special education include age changes, ethnicity changes, the needs of emerging language minorities, specific change in the youth population, environmental factors and the incidence of…
Research fronts analysis: A bibliometric to identify emerging fields of research
NASA Astrophysics Data System (ADS)
Miwa, Sayaka; Ando, Satoko
Research fronts analysis identifies emerging areas of research through observing co-clustering in highly cited papers. This article introduces the concept of research fronts analysis, explains its methodology and provides case examples. It also demonstrates developing research fronts in Japan by looking at the past winners of the Thomson Reuters Research Fronts Awards. Research fronts analysis is currently being used by the Japanese government to determine new trends in science and technology. Information professionals can also utilize this bibliometric as a research evaluation tool.
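A toy sketch of the co-citation clustering that underlies research fronts analysis: highly cited papers that are frequently cited together are grouped into "fronts". The paper set, co-citation counts and the two-cluster cut below are assumptions for illustration only.

```python
# Minimal sketch of co-citation clustering for research fronts analysis.
# The papers and co-citation counts are invented.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

papers = ["P1", "P2", "P3", "P4", "P5"]
# Symmetric co-citation counts between highly cited papers.
cocite = np.array([
    [0, 9, 8, 1, 0],
    [9, 0, 7, 0, 1],
    [8, 7, 0, 1, 0],
    [1, 0, 1, 0, 6],
    [0, 1, 0, 6, 0],
], dtype=float)

# Turn similarity into a distance and cluster (average linkage).
dist = 1.0 / (1.0 + cocite)
np.fill_diagonal(dist, 0.0)
labels = fcluster(linkage(squareform(dist), method="average"), t=2, criterion="maxclust")

for cluster in sorted(set(labels)):
    members = [p for p, c in zip(papers, labels) if c == cluster]
    print(f"research front {cluster}: {members}")
```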
Perspectives Do Matter: "Joint Screen", a Promising Methodology for Multimodal Interaction Analysis
ERIC Educational Resources Information Center
Arend, Béatrice; Sunnen, Patrick; Fixmer, Pierre; Sujbert, Monika
2014-01-01
This paper discusses theoretical and methodological issues arising from a video-based research design and the emergent tool "Joint Screen" when grasping joint activity. We share our reflections regarding the combined reading of four synchronised camera perspectives in one screen. By these means we reconstruct and analyse…
Toward a Learning Behavior Tracking Methodology for CA-for-SLA
ERIC Educational Resources Information Center
Markee, Numa
2008-01-01
This paper is principally about methodology. It first summarizes five issues in the emerging research agenda of conversation analysis-for-second language acquisition (CA-for-SLA), and develops empirically based analyses of classroom talk that occurs over several days and months to illustrate how a longitudinal learning behavior tracking (LBT)…
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
Reflecting the popularity of ginseng in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
Fault Tree Analysis: An Emerging Methodology for Instructional Science.
ERIC Educational Resources Information Center
Wood, R. Kent; And Others
1979-01-01
Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)
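A small worked example of the quantitative side of Fault Tree Analysis, propagating assumed basic-event probabilities through AND/OR gates to a top event, framed loosely in an instructional setting; the events and numbers are illustrative, not taken from the article.

```python
# Sketch of fault-tree quantification: basic-event probabilities are combined
# through AND/OR gates (assuming independent events) to give the top-event
# probability. Events and probabilities are invented for illustration.
from functools import reduce

def and_gate(*probs):
    """All inputs must fail (independence assumed)."""
    return reduce(lambda a, b: a * b, probs)

def or_gate(*probs):
    """At least one input fails (independence assumed): 1 - prod(1 - p)."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Example top event: "learner fails assessment" if materials are unclear OR
# (no practice opportunities AND no instructor feedback).
p_unclear_materials = 0.05
p_no_practice = 0.10
p_no_feedback = 0.20

p_top = or_gate(p_unclear_materials, and_gate(p_no_practice, p_no_feedback))
print(f"P(top event) = {p_top:.4f}")   # 1 - (1 - 0.05) * (1 - 0.02) = 0.069
```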
Dent, Andrew W; Asadpour, Ali; Weiland, Tracey J; Paltridge, Debbie
2008-02-01
Fellows of the Australasian College for Emergency Medicine (FACEM) have opportunities to participate in a range of continuing professional development activities. To inform FACEM and assist those involved in planning continuing professional development interventions for FACEM, we undertook a learning needs analysis of emergency physicians. Exploratory study using survey methodology. Following questionnaire development by iterative feedback with emergency physicians and researchers, a mailed survey was distributed to all FACEM. The survey comprised eight items on work and demographic characteristics of FACEM, and 194 items on attitudes to existing learning opportunities, barriers to learning, and perceived learning needs and preferences. Fifty-eight percent (503/854) of all FACEM surveyed responded to the questionnaire, almost half of whom attained their FACEM after year 2000. The sample comprised mostly males (72.8%) with mean age of the sample 41.6 years, similar to ACEM database. Most respondents reported working in ACEM accredited hospitals (89%), major referral hospitals (54%), and practiced on both children and adults (78%). FACEM reported working on average 26.7 clinical hours per week with those at private hospitals working a greater proportion of clinical hours than other hospital types. As the first of six related reports, this paper documents the methodology used, including questionnaire development, and provides the demographics of responding FACEM, including the clinical and non-clinical hours worked and type of hospital of principal employment.
Waugh, Sheldon
2015-02-05
The use of detailed methodologies and legitimate justifications of settings in spatial analysis is imperative for locating areas of significance. Studies that omit this step may enact interventions in improper areas.
Challenges to the Learning Organization in the Context of Generational Diversity and Social Networks
ERIC Educational Resources Information Center
Kaminska, Renata; Borzillo, Stefano
2018-01-01
Purpose: The purpose of this paper is to gain a better understanding of the challenges to the emergence of a learning organization (LO) posed by a context of generational diversity and an enterprise social networking system (ESNS). Design/methodology/approach: This study uses a qualitative methodology based on an analysis of 20 semi-structured…
The potential for congressional use of emergent telecommunications: An exploratory assessment
NASA Technical Reports Server (NTRS)
Wood, F. B.
1974-01-01
A study of the use of newly emerging communications technology for improving the understanding between members of Congress and their constituents was conducted. The study employed a number of specific methodologies such as interdisciplinary systems model building, technology analysis, a sample survey, and semi-structured interviews using sketches of the emergent channels. The following configurations were identified as representative of emergent channel characteristics: (1) the teleconference, (2) the videoconference, (3) the videophone, (4) cable television, (5) cable television polling, and (6) information retrieval. Analysis of the interview data resulted in an overview of the current congressional-constituent communication system and an assessment of the potential for emergent telecommunications, as perceived by congressmen and senior staff from 40 offices in the stratified judgement sample.
The Rescue911 Emergency Response Information System (ERIS): A Systems Development Project Case
ERIC Educational Resources Information Center
Cohen, Jason F.; Thiel, Franz H.
2010-01-01
This teaching case presents a systems development project useful for courses in object-oriented analysis and design. The case has a strong focus on the business, methodology, modeling and implementation aspects of systems development. The case is centered on a fictitious ambulance and emergency services company (Rescue911). The case describes that…
An emerging discourse: toward epistemic diversity in nursing.
Georges, Jane M
2003-01-01
Grounded in a postmodern feminist methodology, this article undertakes an initial analysis of a newly emerging discourse in contemporary nursing academia in the United States. Two currently prominent discourses in nursing, a dominant discourse informed by the processes and values of "science" in the Enlightenment sense and a concurrent marginalized discourse informed by postmodernism, are described as a context for the emerging discourse. A genealogy informed by the work of Foucault is presented as a basis for an analysis of the power effects resulting from the conflict between these 2 discourses. Finally, 3 recent texts in nursing are analyzed and common themes identified as indicative of a new intertextual discourse, termed "epistemic diversity," emerging from this discursive conflict.
ERIC Educational Resources Information Center
Coad, Jane; Evans, Ruth
2008-01-01
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…
Sociocultural Meanings of Nanotechnology: Research Methodologies
NASA Astrophysics Data System (ADS)
Bainbridge, William Sims
2004-06-01
This article identifies six social-science research methodologies that will be useful for charting the sociocultural meaning of nanotechnology: web-based questionnaires, vignette experiments, analysis of web linkages, recommender systems, quantitative content analysis, and qualitative textual analysis. Data from a range of sources are used to illustrate how the methods can delineate the intellectual content and institutional structure of the emerging nanotechnology culture. Such methods will make it possible in future to test hypotheses such as that there are two competing definitions of nanotechnology - the technical-scientific and the science-fiction - that are influencing public perceptions by different routes and in different directions.
Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.
2013-01-01
Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689
ERIC Educational Resources Information Center
Bradford, Deborah J.
2010-01-01
The purpose of the study was to understand and appreciate the methodologies and procedures used in determining the extent to which an information technology (IT) organization within the eleven member State University Systems (SUS) of Florida planned, implemented, and diffused emerging educational technologies. Key findings found how critical it…
Mrad Nakhlé, M; Farah, W; Ziade, N; Abboud, M; Gerard, J; Zaarour, R; Saliba, N; Dabar, G; Abdel Massih, T; Zoghbi, A; Coussa-Koniski, M-L; Annesi-Maesano, I
2013-12-01
The effects of air pollution on human health have been the subject of much public health research. Several techniques and methods of analysis have been developed. Thus, Beirut Air Pollution and Health Effects (BAPHE) was designed to develop a methodology adapted to the context of the city of Beirut in order to quantify the short-term health effects of air pollution. The quality of data collected from emergency units was analyzed in order to properly estimate hospitalizations via these units. This study examined the process of selecting and validating health and pollution indicators. The different sources of data from emergency units were not correlated. BAPHE was therefore reoriented towards collecting health data from the emergency registry of each hospital. A pilot study determined the appropriate health indicators for BAPHE and created a classification methodology for data collection. In Lebanon, several studies have attempted to indirectly assess the impact of air pollution on health. They had limitations and weaknesses and offered no recommendations regarding the sources and quality of data. The present analysis will be useful for BAPHE and for planning further studies. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
ERIC Educational Resources Information Center
Firmin, Michael W.; Gilson, Krista Merrick
2007-01-01
Using rigorous qualitative research methodology, twenty-four college students receiving their undergraduate degrees in three years were interviewed. Following analysis of the semi-structured interview transcripts and coding, themes emerged, indicating that these students possessed self-discipline, self-motivation, and drive. Overall, the results…
Issues in Longitudinal Research on Motivation
ERIC Educational Resources Information Center
Stoel, Reinoud D.; Roeleveld, Jaap; Peetsma, Thea; van den Wittenboer, Godfried; Hox, Joop
2006-01-01
This paper discusses two methodological issues regarding the analysis of longitudinal data using structural equation modeling that emerged during the reconsideration of the analysis of a recent study on the relationship between academic motivation and language achievement in elementary education [Stoel R.D., Peetsma, T.T.D. and Roeleveld, J.…
Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)
NASA Technical Reports Server (NTRS)
1984-01-01
The HARDMAN methodology was applied to the various configurations of employment for an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements, and associated costs, of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division), with operating and maintenance support addressed through the general support maintenance echelon.
Who should be undertaking population-based surveys in humanitarian emergencies?
Spiegel, Paul B
2007-01-01
Background Timely and accurate data are necessary to prioritise and effectively respond to humanitarian emergencies. 30-by-30 cluster surveys are commonly used in humanitarian emergencies because of their purported simplicity and reasonable validity and precision. Agencies have increasingly used 30-by-30 cluster surveys to undertake measurements beyond immunisation coverage and nutritional status. Methodological errors in cluster surveys have likely occurred for decades in humanitarian emergencies, often with unknown or unevaluated consequences. Discussion Most surveys in humanitarian emergencies are done by non-governmental organisations (NGOs). Some undertake good quality surveys while others have an already overburdened staff with limited epidemiological skills. Manuals explaining cluster survey methodology are available and in use. However, it is debatable as to whether using standardised, 'cookbook' survey methodologies are appropriate. Coordination of surveys is often lacking. If a coordinating body is established, as recommended, it is questionable whether it should have sole authority to release surveys due to insufficient independence. Donors should provide sufficient funding for personnel, training, and survey implementation, and not solely for direct programme implementation. Summary A dedicated corps of trained epidemiologists needs to be identified and made available to undertake surveys in humanitarian emergencies. NGOs in the field may need to form an alliance with certain specialised agencies or pool technically capable personnel. If NGOs continue to do surveys by themselves, a simple training manual with sample survey questionnaires, methodology, standardised files for data entry and analysis, and manual for interpretation should be developed and modified locally for each situation. At the beginning of an emergency, a central coordinating body should be established that has sufficient authority to set survey standards, coordinate when and where surveys should be undertaken and act as a survey repository. Technical expertise is expensive and donors must pay for it. As donors increasingly demand evidence-based programming, they have an obligation to ensure that sufficient funds are provided so organisations have adequate technical staff. PMID:17543107
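A simplified sketch of the 30-by-30 design discussed here: 30 clusters drawn with probability proportional to size (PPS), 30 subjects surveyed per cluster, and a coverage estimate with a rough cluster-level standard error. All population figures and responses below are simulated, not survey data.

```python
# Simplified sketch of a 30-by-30 cluster survey: systematic PPS selection of
# 30 clusters, 30 subjects per cluster, and a coverage estimate. All figures
# are simulated for illustration.
import random

random.seed(1)
villages = {f"village_{i}": random.randint(200, 5000) for i in range(1, 101)}

# Systematic PPS selection of 30 clusters.
total = sum(villages.values())
interval = total / 30
start = random.uniform(0, interval)
targets = iter(start + k * interval for k in range(30))
clusters, cum, t = [], 0, None
t = next(targets)
for name, size in villages.items():
    cum += size
    while t is not None and t <= cum:
        clusters.append(name)
        t = next(targets, None)

# Simulated responses: 30 subjects per cluster, True = covered.
responses = [(c, [random.random() < 0.7 for _ in range(30)]) for c in clusters]
cluster_props = [sum(r) / len(r) for _, r in responses]
p_hat = sum(cluster_props) / len(cluster_props)

# Between-cluster variability drives the precision of a cluster survey.
var_between = sum((p - p_hat) ** 2 for p in cluster_props) / (len(cluster_props) - 1)
se = (var_between / len(cluster_props)) ** 0.5
print(f"{len(clusters)} clusters; estimated coverage = {p_hat:.2%} (approx. SE {se:.3f})")
```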
Skyscape Archaeology: an emerging interdiscipline for archaeoastronomers and archaeologists
NASA Astrophysics Data System (ADS)
Henty, Liz
2016-02-01
For historical reasons, archaeoastronomy and archaeology differ in their approach to prehistoric monuments, and this has created a divide between the disciplines, which adopt seemingly incompatible methodologies. The reasons behind the impasse will be explored to show how these different approaches gave rise to their respective methods. Archaeological investigations tend to concentrate on single-site analysis, whereas archaeoastronomical surveys tend to be data driven, based on the examination of a large number of similar sites. A comparison will be made between traditional archaeoastronomical data gathering and an emerging methodology which looks at sites on a small scale and combines archaeology and astronomy. Silva's recent research in Portugal and this author's survey in Scotland have explored this methodology and termed it skyscape archaeology. This paper argues that this type of phenomenological skyscape archaeology offers an alternative to large-scale statistical studies which analyse astronomical data obtained from a large number of superficially similar archaeological sites.
Evaluation of massively parallel sequencing for forensic DNA methylation profiling.
Richards, Rebecca; Patel, Jayshree; Stevenson, Kate; Harbison, SallyAnn
2018-05-11
Epigenetics is an emerging area of interest in forensic science. DNA methylation, a type of epigenetic modification, can be applied to chronological age estimation, identical twin differentiation and body fluid identification. However, there is not yet an agreed, established methodology for targeted detection and analysis of DNA methylation markers in forensic research. Recently a massively parallel sequencing-based approach has been suggested. The use of massively parallel sequencing is well established in clinical epigenetics and is emerging as a new technology in the forensic field. This review investigates the potential benefits, limitations and considerations of this technique for the analysis of DNA methylation in a forensic context. The importance of a robust protocol, regardless of the methodology used, that minimises potential sources of bias is highlighted. This article is protected by copyright. All rights reserved.
Resting-State Functional Connectivity in the Infant Brain: Methods, Pitfalls, and Potentiality
Mongerson, Chandler R. L.; Jennings, Russell W.; Borsook, David; Becerra, Lino; Bajic, Dusica
2017-01-01
Early brain development is characterized by rapid growth and perpetual reconfiguration, driven by a dynamic milieu of heterogeneous processes. Postnatal brain plasticity is associated with increased vulnerability to environmental stimuli. However, little is known regarding the ontogeny and temporal manifestations of inter- and intra-regional functional connectivity that comprise functional brain networks. Resting-state functional magnetic resonance imaging (rs-fMRI) has emerged as a promising non-invasive neuroinvestigative tool, measuring spontaneous fluctuations in blood oxygen level dependent (BOLD) signal at rest that reflect baseline neuronal activity. Over the past decade, its application has expanded to infant populations providing unprecedented insight into functional organization of the developing brain, as well as early biomarkers of abnormal states. However, many methodological issues of rs-fMRI analysis need to be resolved prior to standardization of the technique to infant populations. As a primary goal, this methodological manuscript will (1) present a robust methodological protocol to extract and assess resting-state networks in early infancy using independent component analysis (ICA), such that investigators without previous knowledge in the field can implement the analysis and reliably obtain viable results consistent with previous literature; (2) review the current methodological challenges and ethical considerations associated with emerging field of infant rs-fMRI analysis; and (3) discuss the significance of rs-fMRI application in infants for future investigations of neurodevelopment in the context of early life stressors and pathological processes. The overarching goal is to catalyze efforts toward development of robust, infant-specific acquisition, and preprocessing pipelines, as well as promote greater transparency by researchers regarding methods used. PMID:28856131
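As a conceptual stand-in for the ICA step of the protocol (not the authors' infant-specific pipeline), the sketch below runs scikit-learn's FastICA on a simulated voxels-by-time matrix to recover spatial component maps; real rs-fMRI analysis adds preprocessing such as motion correction, filtering and registration.

```python
# Conceptual stand-in for spatial ICA on resting-state data: FastICA applied
# to a simulated voxels-by-time matrix. Not the authors' pipeline.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_voxels, n_timepoints, n_networks = 500, 120, 3

# Simulate three "networks": sparse spatial maps mixed by random time courses.
spatial_maps = rng.random((n_voxels, n_networks)) * (rng.random((n_voxels, n_networks)) > 0.8)
time_courses = rng.standard_normal((n_networks, n_timepoints))
data = spatial_maps @ time_courses + 0.1 * rng.standard_normal((n_voxels, n_timepoints))

# Spatial ICA: rows are voxels, so the recovered sources are spatial maps.
ica = FastICA(n_components=n_networks, random_state=0, max_iter=1000)
estimated_maps = ica.fit_transform(data)        # shape (n_voxels, n_networks)
print("estimated spatial maps:", estimated_maps.shape)
```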
Mantzoukas, Stefanos
2009-04-01
Evidence-based practice has become an imperative for efficient, effective and safe practice. Furthermore, evidence emerging from published research is considered a valid knowledge source for guiding practice. The aim of this paper is to review all research articles published in the top 10 general nursing journals for the years 2000-2006 to identify the methodologies used, the types of evidence these studies produced and the issues they addressed. Quantitative content analysis was used to study all published research papers of the top 10 general nursing journals for the years 2000-2006. The abstracts of all research articles were analysed with regard to the methodologies of enquiry, the types of evidence produced and the issues studied, and percentages were calculated to enable conclusions to be drawn. For the category methodologies used, the results were 7% experimental, 6% quasi-experimental, 39% non-experimental, 2% ethnographical studies, 7% phenomenological, 4% grounded theory, 1% action research, 1% case study, 15% unspecified, 5.5% other, 0.5% meta-synthesis, 2% meta-analysis, 5% literature reviews and 3% secondary analysis. For the category types of evidence, the results were 4% hypothesis/theory testing, 11% evaluative, 5% comparative, 2% correlational, 46% descriptive, 5% interpretative and 27% exploratory. For the category issues of study, the results were 45% practice/clinical, 8% educational, 11% professional, 3% spiritual/ethical/metaphysical, 26% health promotion and 7% managerial/policy. Published studies can provide adequate evidence for practice if nursing journals conceptualise evidence emerging from non-experimental and qualitative studies as relevant types of evidence for practice and develop appropriate mechanisms for assessing their validity. Nursing journals also need to increase and encourage the publication of studies that implement RCT methodology, systematic reviews, meta-synthesis and meta-analysis methodologies. Finally, nursing journals need to encourage more high-quality research evidence deriving from interpretative, theory-testing and evaluative types of studies that are practice relevant.
Jiang, Jheng Jie; Lee, Chon Lin; Fang, Meng Der; Boyd, Kenneth G.; Gibb, Stuart W.
2015-01-01
This paper presents a methodology based on multivariate data analysis for characterizing potential source contributions of emerging contaminants (ECs) detected in 26 river water samples across multi-scape regions during dry and wet seasons. Based on this methodology, we unveil an approach toward potential source contributions of ECs, a concept we refer to as the “Pharmaco-signature.” Exploratory analysis of data points has been carried out by unsupervised pattern recognition (hierarchical cluster analysis, HCA) and receptor model (principal component analysis-multiple linear regression, PCA-MLR) in an attempt to demonstrate significant source contributions of ECs in different land-use zone. Robust cluster solutions grouped the database according to different EC profiles. PCA-MLR identified that 58.9% of the mean summed ECs were contributed by domestic impact, 9.7% by antibiotics application, and 31.4% by drug abuse. Diclofenac, ibuprofen, codeine, ampicillin, tetracycline, and erythromycin-H2O have significant pollution risk quotients (RQ>1), indicating potentially high risk to aquatic organisms in Taiwan. PMID:25874375
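A simplified sketch of the PCA-MLR receptor-model idea: factors extracted from standardised EC concentrations are used as regressors for the summed EC load, and normalised coefficients are read as approximate percentage contributions. The concentration matrix and the three-factor choice below are simulated assumptions; in practice, factor interpretation rests on the loadings.

```python
# Simplified PCA-MLR sketch: PCA on standardised concentrations, then a linear
# regression of the summed EC load on the factor scores. Data are simulated.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n_samples, n_compounds = 26, 20
concentrations = np.abs(rng.normal(50, 20, size=(n_samples, n_compounds)))  # ng/L, simulated

z = StandardScaler().fit_transform(concentrations)
pca = PCA(n_components=3)
scores = pca.fit_transform(z)                       # factor scores per sample

total_ec = concentrations.sum(axis=1)
mlr = LinearRegression().fit(scores, total_ec)

# Crude normalisation of coefficients into percentage contributions.
contrib = np.abs(mlr.coef_) / np.abs(mlr.coef_).sum() * 100
for i, pct in enumerate(contrib, start=1):
    print(f"factor {i}: ~{pct:.1f}% of summed ECs (interpretation depends on loadings)")
```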
Gonzalez, Miriam; Clarke, Diana E; Pereira, Asha; Boyce-Gaudreau, Krystal; Waldman, Celeste; Demczuk, Lisa; Legare, Carol
2017-08-01
Visits to emergency departments for substance use/abuse are common worldwide. However, emergency department health care providers perceive substance-using patients as a challenging group to manage which can lead to negative attitudes. Providing education or experience-based exercises may impact positively on behaviors towards this patient population. Whether staff attitudes are similarly impacted by knowledge acquired through educational interventions remains unknown. To synthesize available evidence on the relationship between new knowledge gained through substance use educational interventions and emergency department health care providers' attitudes towards patients with substance-related presentations. Health care providers working in urban and rural emergency departments of healthcare facilities worldwide providing care to adult patients with substance-related presentations. Quantitative papers examining the impact of substance use educational interventions on health care providers' attitudes towards substance using patients. Experimental and non-experimental study designs. Emergency department staff attitudes towards patients presenting with substance use/abuse. A three-step search strategy was conducted in August 2015 with a search update in March 2017. Studies published since 1995 in English, French or Spanish were considered for inclusion. Two reviewers assessed studies for methodological quality using critical appraisal checklists from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI). Reviewers agreed on JBI-MAStARI methodological criteria a study must meet in order to be included in the review (e.g. appropriate use of statistical analysis). The data extraction instrument from JBI-MAStARI was used. As statistical pooling of the data was not possible, the findings are presented in narrative form. A total of 900 articles were identified as relevant for this review. Following abstract and full text screening, four articles were selected and assessed for methodological quality. One article met methodological criteria for inclusion in the review: use of random assignment and comparable study groups and measurement outcomes in a reliable and consistent manner. The included study was a cluster randomized controlled trial. Participants were emergency medicine residents with a mean age of 30 years. The study assessed the impact of a skills-based educational intervention on residents' attitudes, knowledge and practice towards patients with alcohol problems. While knowledge and practice behaviors improved one year following the intervention, there were no significant differences between groups on attitudinal measures. Employing educational interventions to improve the attitudes of emergency department staff towards individuals with drug and alcohol related presentations is not supported by evidence.
Visualizing diurnal population change in urban areas for emergency management.
Kobayashi, Tetsuo; Medina, Richard M; Cova, Thomas J
2011-01-01
There is an increasing need for a quick, simple method to represent diurnal population change in metropolitan areas for effective emergency management and risk analysis. Many geographic studies rely on decennial U.S. Census data that assume that urban populations are static in space and time. This has obvious limitations in the context of dynamic geographic problems. The U.S. Department of Transportation publishes population data at the transportation analysis zone level in fifteen-minute increments. This level of spatial and temporal detail allows for improved dynamic population modeling. This article presents a methodology for visualizing and analyzing diurnal population change for metropolitan areas based on this readily available data. Areal interpolation within a geographic information system is used to create twenty-four (one per hour) population surfaces for the larger metropolitan area of Salt Lake County, Utah. The resulting surfaces represent diurnal population change for an average workday and are easily combined to produce an animation that illustrates population dynamics throughout the day. A case study of using the method to visualize population distributions in an emergency management context is provided using two scenarios: a chemical release and a dirty bomb in Salt Lake County. This methodology can be used to address a wide variety of problems in emergency management.
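A minimal sketch of the areal-weighting step behind such diurnal population surfaces: transportation analysis zone (TAZ) populations for one time slice are spread over grid cells in proportion to intersection areas. The intersection table and populations are invented; in practice a GIS overlay would supply them, and the step is repeated per time slice.

```python
# Minimal areal-weighting sketch: TAZ populations for one time slice are
# distributed to grid cells in proportion to TAZ/cell intersection areas.
# The populations and intersection areas are invented.
import pandas as pd

taz_population_14h = {"TAZ1": 1200, "TAZ2": 800, "TAZ3": 1500}   # 14:00 slice
taz_area = {"TAZ1": 10.0, "TAZ2": 5.0, "TAZ3": 8.0}              # km^2

# Intersection areas (km^2) between TAZs and grid cells, e.g. from a GIS overlay.
intersections = pd.DataFrame([
    ("TAZ1", "cell_A", 6.0), ("TAZ1", "cell_B", 4.0),
    ("TAZ2", "cell_B", 5.0),
    ("TAZ3", "cell_B", 2.0), ("TAZ3", "cell_C", 6.0),
], columns=["taz", "cell", "area_km2"])

intersections["pop_share"] = intersections.apply(
    lambda r: taz_population_14h[r.taz] * r.area_km2 / taz_area[r.taz], axis=1)
cell_population = intersections.groupby("cell")["pop_share"].sum()
print(cell_population)   # one surface; repeat per hourly or 15-minute slice
```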
A Method for Evaluating the Safety Impacts of Air Traffic Automation
NASA Technical Reports Server (NTRS)
Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Bonesteel, Charles
1998-01-01
This report describes a methodology for analyzing the safety and operational impacts of emerging air traffic technologies. The approach integrates traditional reliability models of the system infrastructure with models that analyze the environment within which the system operates, and models of how the system responds to different scenarios. Products of the analysis include safety measures such as predicted incident rates, predicted accident statistics, and false alarm rates; and operational availability data. The report demonstrates the methodology with an analysis of the operation of the Center-TRACON Automation System at Dallas-Fort Worth International Airport.
An innovative and shared methodology for event reconstruction using images in forensic science.
Milliet, Quentin; Jendly, Manon; Delémont, Olivier
2015-09-01
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Open space suitability analysis for emergency shelter after an earthquake
NASA Astrophysics Data System (ADS)
Anhorn, J.; Khazai, B.
2014-06-01
In an emergency situation shelter space is crucial for people affected by natural hazards. Emergency planners in disaster relief and mass care can greatly benefit from a sound methodology that identifies suitable shelter areas and sites where shelter services need to be improved. A methodology to rank suitability of open spaces for contingency planning and placement of shelter in the immediate aftermath of a disaster is introduced. The Open Space Suitability Index (OSSI) uses the combination of two different measures: a qualitative evaluation criterion for the suitability and manageability of open spaces to be used as shelter sites, and a second quantitative criterion using a capacitated accessibility analysis based on network analysis. For the qualitative assessment, implementation issues, environmental considerations, and basic utility supply are the main categories to rank candidate shelter sites. Geographic Information System (GIS) is used to reveal spatial patterns of shelter demand. Advantages and limitations of this method are discussed on the basis of a case study in Kathmandu Metropolitan City (KMC). According to the results, out of 410 open spaces under investigation, 12.2% have to be considered not suitable (Category D and E) while 10.7% are Category A and 17.6% are Category B. Almost two third (59.5%) are fairly suitable (Category C).
Open space suitability analysis for emergency shelter after an earthquake
NASA Astrophysics Data System (ADS)
Anhorn, J.; Khazai, B.
2015-04-01
In an emergency situation shelter space is crucial for people affected by natural hazards. Emergency planners in disaster relief and mass care can greatly benefit from a sound methodology that identifies suitable shelter areas and sites where shelter services need to be improved. A methodology to rank suitability of open spaces for contingency planning and placement of shelter in the immediate aftermath of a disaster is introduced. The Open Space Suitability Index uses the combination of two different measures: a qualitative evaluation criterion for the suitability and manageability of open spaces to be used as shelter sites and another quantitative criterion using a capacitated accessibility analysis based on network analysis. For the qualitative assessment implementation issues, environmental considerations and basic utility supply are the main categories to rank candidate shelter sites. A geographic information system is used to reveal spatial patterns of shelter demand. Advantages and limitations of this method are discussed on the basis of an earthquake hazard case study in the Kathmandu Metropolitan City. According to the results, out of 410 open spaces under investigation, 12.2% have to be considered not suitable (Category D and E) while 10.7% are Category A and 17.6% are Category B. Almost two-thirds (59.55%) are fairly suitable (Category C).
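A hypothetical sketch of how the two OSSI ingredients named in these abstracts, a qualitative site score and a capacitated accessibility measure, might be combined into ranked categories; the weights, scales and A-E cut-offs below are invented for illustration and are not the published index.

```python
# Hypothetical combination of a qualitative suitability score with a
# capacity-to-accessible-demand ratio into A-E categories. The scoring scale,
# weights and cut-offs are assumptions, not the published OSSI formula.
def suitability_category(qual_score, capacity, accessible_demand,
                         w_qual=0.5, w_quant=0.5):
    """qual_score in [0, 1]; returns an illustrative A-E category and index."""
    quant_score = min(capacity / accessible_demand, 1.0) if accessible_demand else 1.0
    index = w_qual * qual_score + w_quant * quant_score
    for cutoff, cat in [(0.8, "A"), (0.65, "B"), (0.45, "C"), (0.25, "D")]:
        if index >= cutoff:
            return cat, index
    return "E", index

sites = {"site_1": (0.9, 20000, 18000), "site_2": (0.4, 5000, 14000)}
for name, (qual, cap, demand) in sites.items():
    cat, idx = suitability_category(qual, cap, demand)
    print(f"{name}: index={idx:.2f} -> Category {cat}")
```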
Forero, Roberto; Nahidi, Shizar; De Costa, Josephine; Mohsin, Mohammed; Fitzgerald, Gerry; Gibson, Nick; McCarthy, Sally; Aboagye-Sarfo, Patrick
2018-02-17
The main objective of this methodological manuscript was to illustrate the role of qualitative research in emergency settings. We outline rigorous criteria applied to a qualitative study assessing perceptions and experiences of staff working in Australian emergency departments. We used an integrated mixed-methodology framework to identify different perspectives and experiences of emergency department staff during the implementation of a time-target government policy. The qualitative study comprised interviews with 119 participants across 16 hospitals. The interviews were conducted in 2015-2016 and the data were managed using NVivo version 11. We conducted the analysis in three stages, namely: conceptual framework, comparison and contrast, and hypothesis development. We concluded with the implementation of the four-dimension criteria (credibility, dependability, confirmability and transferability) to assess the robustness of the study. We adapted the four-dimension criteria to assess the rigour of large-scale qualitative research in the emergency department context. The criteria comprised strategies such as building the research team; preparing data collection guidelines; defining and obtaining adequate participation; reaching data saturation; and ensuring high levels of consistency and inter-coder agreement. Based on the findings, the proposed framework satisfied the four-dimension criteria and generated potential qualitative research applications to emergency medicine research. We have added a methodological contribution to the ongoing debate about rigour in qualitative research, which we hope will guide future studies on this topic in emergency care research. The study also provides recommendations for conducting future mixed-methods studies. Future papers in this series will use the results from the qualitative data and the empirical findings from longitudinal data linkage to further identify factors associated with ED performance; they will be reported separately.
A theoretical framework for negotiating the path of emergency management multi-agency coordination.
Curnin, Steven; Owen, Christine; Paton, Douglas; Brooks, Benjamin
2015-03-01
Multi-agency coordination represents a significant challenge in emergency management. The need for liaison officers working in strategic level emergency operations centres to play organizational boundary spanning roles within multi-agency coordination arrangements that are enacted in complex and dynamic emergency response scenarios creates significant research and practical challenges. The aim of the paper is to address a gap in the literature regarding the concept of multi-agency coordination from a human-environment interaction perspective. We present a theoretical framework for facilitating multi-agency coordination in emergency management that is grounded in human factors and ergonomics using the methodology of core-task analysis. As a result we believe the framework will enable liaison officers to cope more efficiently within the work domain. In addition, we provide suggestions for extending the theory of core-task analysis to an alternate high reliability environment. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Data Analysis Tools and Methods for Improving the Interaction Design in E-Learning
ERIC Educational Resources Information Center
Popescu, Paul Stefan
2015-01-01
In this digital era, learning from data gathered from different software systems may have a great impact on the quality of the interaction experience. There are two main directions that come to enhance this emerging research domain, Intelligent Data Analysis (IDA) and Human Computer Interaction (HCI). HCI specific research methodologies can be…
An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies
NASA Technical Reports Server (NTRS)
Kostiuk, Peter F.; Adams, Milton B.; Allinger, Deborah F.; Rosch, Gene; Kuchar, James
1998-01-01
The continuing growth of air traffic will place demands on NASA's Air Traffic Management (ATM) system that cannot be accommodated without the creation of significant delays and economic impacts. To deal with this situation, work has begun to develop new approaches to providing a safe and economical air transportation infrastructure. Many of these emerging air transport technologies will represent radically new approaches to ATM, both for ground and air operations.
Tracking Concept Development through Semiotic Evolution
ERIC Educational Resources Information Center
Ronen, Ilana
2015-01-01
A qualitative research focused on a case study aiming to monitor emergent knowledge in a discourse group by tracking the development of the concept "goal." The analysis, based on "Semiotic Evolution" methodology facilitates the description of interactions between personal perceptions in the group discourse, illustrating the…
MASQOT: a method for cDNA microarray spot quality control
Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan
2005-01-01
Background cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to prevent the risk of receiving illusive analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots in the range of hundreds of thousands or more. Results A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior in identifying unreliable data compared to other evaluated methodologies. Conclusion The proposed methodology for cDNA microarray spot quality control generates non-discrete values of spot quality which can be utilized as weights in subsequent analysis procedures as well as to discard spots of undesired quality using the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442
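A sketch of the multivariate discriminant analysis idea behind MASQOT-style spot quality control, using scikit-learn's linear discriminant analysis on simulated spot descriptors; the descriptors, labels and the use of the posterior probability as a quality weight are assumptions for illustration, not the published model.

```python
# Sketch of discriminant-analysis spot quality control: a model trained on
# spot descriptors labelled by visual inspection grades new spots with a
# continuous quality score. Descriptors and labels are simulated.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)
# Descriptors per spot: size, circularity, signal-to-background (simulated).
good = rng.normal([100, 0.9, 8.0], [10, 0.05, 1.5], size=(200, 3))
bad = rng.normal([60, 0.6, 2.0], [25, 0.15, 1.0], size=(200, 3))
X = np.vstack([good, bad])
y = np.array([1] * 200 + [0] * 200)

lda = LinearDiscriminantAnalysis().fit(X, y)
new_spots = np.array([[95, 0.88, 7.0], [70, 0.55, 1.5]])
quality = lda.predict_proba(new_spots)[:, 1]      # continuous quality weight in [0, 1]
print(quality)   # usable as weights in downstream analysis, or thresholded
```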
Boix, C; Ibáñez, M; Fabregat-Safont, D; Morales, E; Pastor, L; Sancho, J V; Sánchez-Ramírez, J E; Hernández, F
2016-01-01
In this work, two analytical methodologies based on liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) were developed for quantification of emerging pollutants identified in sewage sludge after a previous wide-scope screening. The target list included 13 emerging contaminants (ECs): thiabendazole, acesulfame, fenofibric acid, valsartan, irbesartan, salicylic acid, diclofenac, carbamazepine, 4-aminoantipyrine (4-AA), 4-acetyl aminoantipyrine (4-AAA), 4-formyl aminoantipyrine (4-FAA), venlafaxine and benzoylecgonine. The aqueous and solid phases of the sewage sludge were analyzed making use of Solid-Phase Extraction (SPE) and UltraSonic Extraction (USE) for sample treatment, respectively. The methods were validated at three concentration levels: 0.2, 2 and 20 μg L⁻¹ for the aqueous phase, and 50, 500 and 2000 μg kg⁻¹ for the solid phase of the sludge. In general, the method was satisfactorily validated, showing good recoveries (70-120%) and precision (RSD < 20%). The limit of quantification (LOQ) was below 0.1 μg L⁻¹ in the aqueous phase and below 50 μg kg⁻¹ in the solid phase for the majority of the analytes. The method's applicability was tested by analysis of samples from a wider study on the degradation of emerging pollutants in sewage sludge under anaerobic digestion. The key benefits of these methodologies are: • SPE and USE are appropriate sample treatment procedures to extract the selected emerging contaminants from the aqueous phase of the sewage sludge and the solid residue. • LC-MS/MS is highly suitable for determining emerging contaminants in both sludge phases. • To our knowledge, the main metabolites of dipyrone had not been studied before in sewage sludge.
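A worked sketch of the validation arithmetic the abstract refers to, checking mean recovery against the 70-120% window and precision against RSD < 20% at one spiking level; the replicate measurements and blank value are invented.

```python
# Worked sketch of recovery and precision checks at one spiking level.
# Measured values are invented; acceptance criteria follow the 70-120%
# recovery and RSD < 20% targets quoted in the text.
import numpy as np

spiked_level = 2.0                                    # μg/L, aqueous phase
measured = np.array([1.82, 1.95, 2.10, 1.88, 2.05])   # replicate determinations
blank = 0.05                                          # native background

recoveries = (measured - blank) / spiked_level * 100
mean_recovery = recoveries.mean()
rsd = measured.std(ddof=1) / measured.mean() * 100

ok = 70 <= mean_recovery <= 120 and rsd < 20
print(f"mean recovery = {mean_recovery:.1f}%  RSD = {rsd:.1f}%  -> {'pass' if ok else 'fail'}")
```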
Process improvement methodologies uncover unexpected gaps in stroke care.
Kuner, Anthony D; Schemmel, Andrew J; Pooler, B Dustin; Yu, John-Paul J
2018-01-01
Background The diagnosis and treatment of acute stroke requires timed and coordinated effort across multiple clinical teams. Purpose To analyze the frequency and temporal distribution of emergent stroke evaluations (ESEs) to identify potential contributory workflow factors that may delay the initiation and subsequent evaluation of emergency department stroke patients. Material and Methods A total of 719 sentinel ESEs with concurrent neuroimaging were identified over a 22-month retrospective time period. Frequency data were tabulated and odds ratios calculated. Results Of all ESEs, 5% occurred between 01:00 and 07:00. ESEs were most frequent during the late morning and early afternoon hours (10:00-14:00). Unexpectedly, there was a statistically significant decline in the frequency of ESEs occurring at the 14:00 time point. Conclusion Temporal analysis of ESEs in the emergency department allowed us to identify an unexpected decrease in ESEs and, through process improvement methodologies (Lean and Six Sigma), to identify potential workflow elements contributing to this observation.
O'Dwyer, Gisele; Machado, Cristiani Vieira; Alves, Renan Paes; Salvador, Fernanda Gonçalves
2016-06-01
Mobile prehospital care is a key component of emergency care. The aim of this study was to analyze the implementation of the State of Rio de Janeiro's Mobile Emergency Medical Service (SAMU, acronym in Portuguese). The methodology employed included document analysis, visits to six SAMU emergency call centers, and semistructured interviews conducted with 12 local and state emergency care coordinators. The study's conceptual framework was based on Giddens' theory of structuration. Intergovernmental conflicts were observed between the state and municipal governments, and between municipal governments. Despite the shortage of hospital beds, the SAMUs in periphery regions were better integrated with the emergency care network than the metropolitan SAMUs. The steering committees were not very active and weaknesses were observed relating to the limited role played by the state government in funding, management, and monitoring. It was concluded that the SAMU implementation process in the state was marked by political tensions and management and coordination weaknesses. As a result, serious drawbacks remain in the coordination of the SAMU with the other health services and the regionalization of emergency care in the state.
Emerging Concepts and Methodologies in Cancer Biomarker Discovery.
Lu, Meixia; Zhang, Jinxiang; Zhang, Lanjing
2017-01-01
Cancer biomarker discovery is a critical part of cancer prevention and treatment. Despite decades of effort, only a small number of cancer biomarkers have been identified for and validated in clinical settings. Conceptual and methodological breakthroughs may help accelerate the discovery of additional cancer biomarkers, particularly for use in diagnostics. In this review, we have attempted to review the emerging concepts in cancer biomarker discovery, including real-world evidence, open access data, and data paucity in rare or uncommon cancers. We have also summarized the recent methodological progress in cancer biomarker discovery, such as high-throughput sequencing, liquid biopsy, big data, artificial intelligence (AI), and deep learning and neural networks. Much attention has been given to the methodological details and comparison of the methodologies. Notably, these concepts and methodologies interact with each other and will likely lead to synergistic effects when carefully combined. Newer, more innovative concepts and methodologies are emerging as the current ones become mainstream and widely applied in the field. Some future challenges are also discussed. This review contributes to the development of future theoretical frameworks and technologies in cancer biomarker discovery and will contribute to the discovery of more useful cancer biomarkers.
Microgenetic Learning Analytics Methods: Workshop Report
ERIC Educational Resources Information Center
Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin
2016-01-01
Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…
Preparing School Leaders to Work with and in Community
ERIC Educational Resources Information Center
FitzGerald, Anne Marie; Militello, Matthew
2016-01-01
We used Q methodology, a form of factor analysis, to explore and establish correlations across the perceptions of key stakeholders (i.e., deans, faculty members, doctoral students) about how doctoral programs in educational leadership engage in work with diverse communities. Four distinct viewpoints emerged suggesting the ongoing need to: develop…
An Emergent Phenomenon of American Indian Secondary Students' Career Development Process
ERIC Educational Resources Information Center
Flynn, Stephen V.; Duncan, Kelly J.; Evenson, Lori L.
2013-01-01
Nine single-race American Indian secondary students' career development experiences were examined through a phenomenological methodology. All 9 participants were in the transition period starting in late secondary school (age 18). Data sources included individual interviews and journal analysis. The phenomenon of American Indian secondary…
The Study of Socio-Biospheric Problems.
ERIC Educational Resources Information Center
Scott, Andrew M.
Concepts, tools, and a methodology are needed which will permit the analysis of emergent socio-biospheric problems and facilitate their effective management. Many contemporary problems may be characterized as socio-biospheric; for example, pollution of the seas, acid rain, the growth of cities, and an atmosphere loaded with carcinogens. However,…
A Call for Conducting Multivariate Mixed Analyses
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.
2016-01-01
Several authors have written methodological works that provide an introductory- and/or intermediate-level guide to conducting mixed analyses. Although these works have been useful for beginning and emergent mixed researchers, with very few exceptions, works are lacking that describe and illustrate advanced-level mixed analysis approaches. Thus,…
The Application of Survival Analysis to the Study of Psychotherapy Termination
ERIC Educational Resources Information Center
Corning, Alexandra F.; Malofeeva, Elena V.
2004-01-01
The state of the psychotherapy termination literature to date might best be characterized as inconclusive. Despite decades of studies, almost no predictors of premature termination have emerged consistently. An examination of this literature reveals a number of recurrent methodological-analytical problems that likely have contributed substantially…
Turn Allocation in Japanese Business Meetings: Emergence of Institutionality
ERIC Educational Resources Information Center
Murayama, Emi
2012-01-01
This dissertation uses conversation analysis as a theoretical and methodological framework to examine the organization of in-house business meetings that are conducted in Japanese. In particular, this study focuses on how institutionality becomes apparent within the participants' interactions. The data consists of six videotaped in-house…
Affective Responses of Students Who Witness Classroom Cheating
ERIC Educational Resources Information Center
Firmin, Michael W.; Burger, Amanda; Blosser, Matthew
2009-01-01
For this study, 82 general psychology students (51 females, 31 males) witnessed a peer cheating while completing a test. Following the incident, we tape recorded semi-structured interviews with each student who saw the cheating event for later analysis. Using qualitative coding and methodology, themes emerged regarding students' emotional…
NASA Technical Reports Server (NTRS)
Olivas, J. D.; Melroy, P.; McDanels, S.; Wallace, T.; Zapata, M. C.
2006-01-01
In connection with the accident investigation of the space shuttle Columbia, an analysis methodology utilizing well-established microscopic and spectroscopic techniques was implemented for evaluating the environment to which the exterior fused silica glass was exposed. Through the implementation of optical microscopy, scanning electron microscopy, energy dispersive spectroscopy, transmission electron microscopy, and electron diffraction, details emerged regarding the manner in which a charred metallic deposit layer formed on top of the exposed glass. Due to the nature of the substrate and the materials deposited, the methodology allowed for a more detailed analysis of the vehicle breakup. By contrast, similar analytical methodologies on metallic substrates have proven challenging due to the strong potential for error resulting from substrate contamination. This information proved valuable not only to those involved in investigating the breakup of Columbia, but also provides a potential guide for investigating future high-altitude and high-energy accidents.
Meyer, Travis S; Muething, Joseph Z; Lima, Gustavo Amoras Souza; Torres, Breno Raemy Rangel; del Rosario, Trystyn Keia; Gomes, José Orlando; Lambert, James H
2012-01-01
Radiological nuclear emergency responders must be able to coordinate evacuation and relief efforts following the release of radioactive material into populated areas. In order to respond quickly and effectively to a nuclear emergency, high-level coordination is needed between a number of large, independent organizations, including police, military, hazmat, and transportation authorities. Given the complexity, scale, time-pressure, and potential negative consequences inherent in radiological emergency responses, tracking and communicating information that will assist decision makers during a crisis is crucial. The emergency response team at the Angra dos Reis nuclear power facility, located outside of Rio de Janeiro, Brazil, presently conducts emergency response simulations once every two years to prepare organizational leaders for real-life emergency situations. However, current exercises are conducted without the aid of electronic or software tools, resulting in possible cognitive overload and delays in decision-making. This paper describes the development of a decision support system employing systems methodologies, including cognitive task analysis and human-machine interface design. The decision support system can aid the coordination team by automating cognitive functions and improving information sharing. A prototype of the design will be evaluated by plant officials in Brazil and incorporated into a future trial run of a response simulation.
Coller, Hilary A
2017-09-01
Emerging technologies for the analysis of genome-wide information in single cells have the potential to transform many fields of biology, including our understanding of cell states, the response of cells to external stimuli, mosaicism, and intratumor heterogeneity. At Experimental Biology 2017 in Chicago, Physiological Genomics hosted a symposium in which five leaders in the field of single cell genomics presented their recent research. The speakers discussed emerging methodologies in single cell analysis and critical issues for the analysis of single cell data. Also discussed were applications of single cell genomics to understanding the different types of cells within an organism or tissue and the basis for cell-to-cell variability in response to stimuli. Copyright © 2017 the American Physiological Society.
Costello, Tracy J; Falk, Catherine T; Ye, Kenny Q
2003-01-01
The Framingham Heart Study data, as well as a related simulated data set, were generously provided to the participants of the Genetic Analysis Workshop 13 in order that newly developed and emerging statistical methodologies could be tested on that well-characterized data set. The impetus driving the development of novel methods is to elucidate the contributions of genes, environment, and interactions between and among them, as well as to allow comparison between and validation of methods. The seven papers that comprise this group used data-mining methodologies (tree-based methods, neural networks, discriminant analysis, and Bayesian variable selection) in an attempt to identify the underlying genetics of cardiovascular disease and related traits in the presence of environmental and genetic covariates. Data-mining strategies are gaining popularity because they are extremely flexible and may have greater efficiency and potential in identifying the factors involved in complex disorders. While the methods grouped together here constitute a diverse collection, some papers asked similar questions with very different methods, while others used the same underlying methodology to ask very different questions. This paper briefly describes the data-mining methodologies applied to the Genetic Analysis Workshop 13 data sets and the results of those investigations. Copyright 2003 Wiley-Liss, Inc.
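As a minimal illustration of one of the data-mining strategies named above (tree-based methods), the sketch below ranks covariates by importance for a simulated binary outcome. The variable names, effect sizes, and data are invented and are unrelated to the Framingham or GAW13 data.

```python
# Sketch of a tree-based data-mining approach on synthetic data.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 1000
X = pd.DataFrame({
    "marker_1": rng.integers(0, 3, n),      # genotype coded 0/1/2
    "marker_2": rng.integers(0, 3, n),
    "age": rng.normal(55, 10, n),
    "bmi": rng.normal(27, 4, n),
    "smoking": rng.integers(0, 2, n),
})
# Simulated cardiovascular outcome influenced by marker_1, age and smoking.
logit = -6 + 0.8 * X["marker_1"] + 0.07 * X["age"] + 0.9 * X["smoking"]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
importances = pd.Series(model.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False))
```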
Qualitative data collection and analysis methods: the INSTINCT trial.
Meurer, William J; Frederiksen, Shirley M; Majersik, Jennifer J; Zhang, Lingling; Sandretto, Annette; Scott, Phillip A
2007-11-01
Patient care practices often lag behind current scientific evidence and professional guidelines. The failure of such knowledge translation (KT) efforts may reflect inadequate assessment and management of specific barriers confronting both physicians and patients at the point of treatment level. Effective KT in this setting may benefit from the use of qualitative methods to identify and overcome these barriers. Qualitative methodology allows in-depth exploration of the barriers involved in adopting practice change and has been infrequently used in emergency medicine research. The authors describe the methodology for qualitative analysis within the INcreasing Stroke Treatment through INteractive behavioral Change Tactics (INSTINCT) trial. This includes processes for valid data collection and reliable analysis of the textual data from focus group and interview transcripts. INSTINCT is a 24-hospital, randomized, controlled study that is designed to evaluate a system-based barrier assessment and interactive educational intervention to increase appropriate tissue plasminogen activator (tPA) use in ischemic stroke. Intervention hospitals undergo baseline barrier assessment using both qualitative as well as quantitative (survey) techniques. Investigators obtain data on local barriers to tPA use, as well as information on local attitudes, knowledge, and beliefs regarding acute stroke treatment. Targeted groups at each site include emergency physicians, emergency nurses, neurologists, radiologists, and hospital administrators. Transcript analysis using NVivo7 with a predefined barrier taxonomy is described. This will provide both qualitative insight on thrombolytic use and importance of specific barrier types for each site. The qualitative findings subsequently direct the form of professional education efforts and system interventions at treatment sites.
Dehnavieh, Reza; Ebrahimipour, Hossein; Molavi-Taleghani, Yasamin; Vafaee-Najar, Ali; Hekmat, Somayeh Noori; Esmailzdeh, Hamid
2015-01-01
Introduction: The pediatric emergency department is considered a high-risk area, and blood transfusion is known as a unique clinical measure; therefore this study was conducted with the purpose of performing a proactive risk assessment of the blood transfusion process in the Pediatric Emergency Department of Qaem education-treatment center in Mashhad, using the Healthcare Failure Mode and Effects Analysis (HFMEA) methodology. Methodology: This cross-sectional study analyzed the failure modes and effects of the blood transfusion process with a mixed quantitative-qualitative method. The proactive HFMEA was used to identify and analyze the potential failures of the process. The information for the items in the HFMEA forms was collected after obtaining consensus from an expert panel via interview and focus group discussion sessions. Results: A total of 77 failure modes were identified for 24 sub-processes within 8 blood transfusion processes. In total, 13 failure modes were identified as non-acceptable risks (hazard score above 8) in the blood transfusion process and were transferred to the decision tree. Root causes of high-risk modes were discussed in cause-effect meetings and were classified based on the UK National Health Service (NHS) approved classification model. Action types were classified in the form of acceptance (11.6%), control (74.2%) and elimination (14.2%). Recommendations were placed in 7 categories using TRIZ (the "Theory of Inventive Problem Solving"). Conclusion: Re-engineering the process for the required changes, standardizing and updating the blood transfusion procedure, root cause analysis of catastrophic transfusion events, patient identification bracelets, training classes and educational pamphlets to raise staff awareness, and monthly meetings of the transfusion medicine committee were all adopted as executive strategies in the pediatric emergency work agenda. PMID:25560332
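For readers unfamiliar with HFMEA scoring, the sketch below shows the basic hazard-score arithmetic (severity × probability, with scores above 8 passed to the decision tree). The failure modes and ratings are invented examples, not those identified in the study.

```python
# Minimal HFMEA-style hazard scoring sketch (invented failure modes and ratings).
# HFMEA rates each failure mode for severity and probability on 1-4 scales;
# the hazard score is their product, and scores above 8 proceed to the decision tree.
failure_modes = [
    # (description, severity 1-4, probability 1-4)
    ("Patient identification bracelet missing", 4, 2),
    ("Wrong blood unit issued", 4, 1),
    ("Transfusion record not documented", 2, 3),
    ("Infusion rate set incorrectly", 3, 3),
]

for description, severity, probability in failure_modes:
    hazard = severity * probability
    flagged = hazard > 8
    print(f"{description:45s} hazard={hazard:2d}  to decision tree: {flagged}")
```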
Who Owns Educational Theory? Big Data, Algorithms and the Expert Power of Education Data Science
ERIC Educational Resources Information Center
Williamson, Ben
2017-01-01
"Education data science" is an emerging methodological field which possesses the algorithm-driven technologies required to generate insights and knowledge from educational big data. This article consists of an analysis of the Lytics Lab, Stanford University's laboratory for research and development in learning analytics, and the Center…
ERIC Educational Resources Information Center
Moses, Lindsey; Kelly, Laura Beth
2017-01-01
In this study, the researchers examined how first-grade students initially positioned as struggling readers took up literacy practices to reposition themselves as capable competent readers and part of a literate community of practice over an academic year. Using positive discourse analysis and case study methodology, the researchers documented and…
Evidence-Based Leadership Development: The 4L Framework
ERIC Educational Resources Information Center
Scott, Shelleyann; Webber, Charles F.
2008-01-01
Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…
Knowledge Management in Doctoral Education toward Knowledge Economy
ERIC Educational Resources Information Center
Stamou, Adamantia
2017-01-01
Purpose: The purpose of this paper is to investigate the role and the scope of knowledge management (KM) in doctoral education, in the emerging knowledge economy (KE) context, and the recommendation of a framework for KM in doctoral education. Design/Methodology/Approach: An extended literature analysis was conducted to elaborate the role and the…
Design and construction principles in nature and architecture.
Knippers, Jan; Speck, Thomas
2012-03-01
This paper will focus on how the emerging scientific discipline of biomimetics can bring new insights into the field of architecture. An analysis of both architectural and biological methodologies will show important aspects connecting these two. The foundation of this paper is a case study of convertible structures based on elastic plant movements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core, fuels design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
ERIC Educational Resources Information Center
Hawley, Todd S.; Hostetler, Andrew L.
2017-01-01
In this manuscript, the authors explore self-study as an emerging research methodology with the potential to open up spaces of inquiry for researchers, graduate students, and teachers in a broad array of fields. They argue that the fields of career and technical education (CTE), adult education and technology can leverage self-study methodology in…
Ghimire, Santosh R; Johnston, John M
2017-09-01
We propose a modified eco-efficiency (EE) framework and novel sustainability analysis methodology for green infrastructure (GI) practices used in water resource management. Green infrastructure practices such as rainwater harvesting (RWH), rain gardens, porous pavements, and green roofs are emerging as viable strategies for climate change adaptation. The modified framework includes 4 economic, 11 environmental, and 3 social indicators. Using 6 indicators from the framework, at least 1 from each dimension of sustainability, we demonstrate the methodology to analyze RWH designs. We use life cycle assessment and life cycle cost assessment to calculate the sustainability indicators of 20 design configurations as Decision Management Objectives (DMOs). Five DMOs emerged as relatively more sustainable along the EE analysis Tradeoff Line, and we used Data Envelopment Analysis (DEA), a widely applied statistical approach, to quantify the modified EE measures as DMO sustainability scores. We also addressed the subjectivity and sensitivity analysis requirements of sustainability analysis, and we evaluated the performance of 10 weighting schemes that included classical DEA, equal weights, National Institute of Standards and Technology's stakeholder panel, Eco-Indicator 99, Sustainable Society Foundation's Sustainable Society Index, and 5 derived schemes. We improved upon classical DEA by applying the weighting schemes to identify sustainability scores that ranged from 0.18 to 1.0, avoiding the nonuniqueness problem and revealing the least to most sustainable DMOs. Our methodology provides a more comprehensive view of water resource management and is generally applicable to GI and industrial, environmental, and engineered systems to explore the sustainability space of alternative design configurations. Integr Environ Assess Manag 2017;13:821-831. Published 2017. This article is a US Government work and is in the public domain in the USA. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
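The sketch below shows a classical input-oriented DEA efficiency calculation (multiplier form) of the kind referred to as "classical DEA" above, implemented as a small linear program. The DMO input and output values are invented for illustration and are not the study's indicator data.

```python
# Sketch of classical (CCR, input-oriented, multiplier form) DEA with scipy.
import numpy as np
from scipy.optimize import linprog

# rows = DMOs (design configurations); columns = indicators
inputs = np.array([   # e.g. life cycle cost, energy use
    [10.0, 3.0],
    [8.0, 4.0],
    [12.0, 2.5],
    [9.0, 3.5],
])
outputs = np.array([  # e.g. water saved, social benefit score
    [5.0, 2.0],
    [4.5, 2.5],
    [6.0, 1.8],
    [5.5, 2.2],
])

n_dmo, m = inputs.shape
_, s = outputs.shape

for o in range(n_dmo):
    # Variables are the output weights u (s of them) followed by input weights v (m of them).
    c = np.concatenate([-outputs[o], np.zeros(m)])           # maximize weighted outputs of DMO o
    A_eq = np.concatenate([np.zeros(s), inputs[o]]).reshape(1, -1)
    b_eq = [1.0]                                             # weighted inputs of DMO o = 1
    A_ub = np.hstack([outputs, -inputs])                     # weighted outputs <= weighted inputs
    b_ub = np.zeros(n_dmo)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    print(f"DMO {o + 1}: efficiency score = {-res.fun:.3f}")
```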
Lerner, E Brooke; Garrison, Herbert G; Nichol, Graham; Maio, Ronald F; Lookman, Hunaid A; Sheahan, William D; Franz, Timothy R; Austad, James D; Ginster, Aaron M; Spaite, Daniel W
2012-02-01
Calculating the cost of an emergency medical services (EMS) system using a standardized method is important for determining the value of EMS. This article describes the development of a methodology for calculating the cost of an EMS system to its community. This includes a tool for calculating the cost of EMS (the "cost workbook") and detailed directions for determining cost (the "cost guide"). The 12-step process that was developed is consistent with current theories of health economics, applicable to prehospital care, flexible enough to be used in varying sizes and types of EMS systems, and comprehensive enough to provide meaningful conclusions. It was developed by an expert panel (the EMS Cost Analysis Project [EMSCAP] investigator team) in an iterative process that included pilot testing the process in three diverse communities. The iterative process allowed ongoing modification of the toolkit during the development phase, based upon direct, practical, ongoing interaction with the EMS systems that were using the toolkit. The resulting methodology estimates EMS system costs within a user-defined community, allowing either the number of patients treated or the estimated number of lives saved by EMS to be assessed in light of the cost of those efforts. Much controversy exists about the cost of EMS and whether the resources spent for this purpose are justified. However, the existence of a validated toolkit that provides a standardized process will allow meaningful assessments and comparisons to be made and will supply objective information to inform EMS and community officials who are tasked with determining the utilization of scarce societal resources. © 2012 by the Society for Academic Emergency Medicine.
What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.
Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy
2016-01-01
When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use 'research synthesis' as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.
Coetzee, Tanya; Hoffmann, Willem A; de Roubaix, Malcolm
2015-10-01
The amended research ethics policy at a South African University required the ethics review of undergraduate research projects, prompting the need to explore the content and teaching approach of research ethics education in health science undergraduate programs. Two qualitative data collection strategies were used: document analysis (syllabi and study guides) and semi-structured interviews with research methodology coordinators. Five main themes emerged: (a) timing of research ethics courses, (b) research ethics course content, (c) sub-optimal use of creative classroom activities to facilitate research ethics lectures, (d) understanding the need for undergraduate project research ethics review, and (e) research ethics capacity training for research methodology lecturers and undergraduate project supervisors. © The Author(s) 2015.
[A functional analysis of healthcare auditors' skills in Venezuela, 2008].
Chirinos-Muñoz, Mónica S
2010-10-01
This study used functional analysis to identify the basic, working, specific and generic skills and values which a health service auditor must have. The functional analysis technique was implemented with 10 experts, identifying specific, basic and generic skills and values by means of deductive logic. A functional map was obtained which started by establishing a key purpose, based on improving healthcare and service quality, from which three key functions emerged. The main functions and skill units were then broken down into the competency elements defining what a health service auditor is able to do. This functional map (following functional analysis methodology) shows in detail the simple and complex tasks which a healthcare auditor should apply in the workplace, adopting a forward management approach for improving healthcare and health service quality. This methodology, based on logical-deductive consensus building, provides expert consensual information validating each element of the overall skill set.
Doumouras, Aristithes G; Gomez, David; Haas, Barbara; Boyes, Donald M; Nathens, Avery B
2012-09-01
The regionalization of medical services has resulted in improved outcomes and greater compliance with existing guidelines. For certain "time-critical" conditions intimately associated with emergency medicine, early intervention has demonstrated mortality benefits. For these conditions, then, appropriate triage within a regionalized system at first diagnosis is paramount, ideally occurring in the field by emergency medical services (EMS) personnel. Therefore, EMS ground transport access is an important metric in the ongoing evaluation of a regionalized care system for time-critical emergency services. To our knowledge, no studies have demonstrated how methodologies for calculating EMS ground transport access differ in their estimates of access over the same study area for the same resource. This study uses two methodologies to calculate EMS ground transport access to trauma center care in a single study area to explore their manifestations and critically evaluate the differences between the methodologies. Two methodologies were compared in their estimations of EMS ground transport access to trauma center care: a routing methodology (RM) and an as-the-crow-flies methodology (ACFM). These methodologies were adaptations of the only two methodologies that had been previously used in the literature to calculate EMS ground transport access to time-critical emergency services across the United States. The RM and ACFM were applied to the nine Level I and Level II trauma centers within the province of Ontario by creating trauma center catchment areas at 30, 45, 60, and 120 minutes and calculating the population and area encompassed by the catchments. Because the methodologies were identical for measuring air access, this study looks specifically at EMS ground transport access. Catchments for the province were created for each methodology at each time interval, and their populations and areas were significantly different at all time periods. Specifically, the RM calculated significantly larger populations at every time interval while the ACFM calculated larger catchment area sizes. This trend is counterintuitive (i.e., larger catchment should mean higher populations), and it was found to be most disparate at the shortest time intervals (under 60 minutes). Through critical evaluation of the differences, the authors elucidated that the ACFM could calculate road access in areas with no roads and overestimates access in low-density areas compared to the RM, potentially affecting delivery of care decisions. Based on these results, the authors believe that future methodologies for calculating EMS ground transport access must incorporate a continuous and valid route through the road network as well as use travel speeds appropriate to the road segments traveled; alternatively, we feel that variation in methods for calculating road distances would have little effect on realized access. Overall, as more complex models for calculating EMS ground transport access become used, there needs to be a standard methodology to improve and to compare it to. Based on these findings, the authors believe that this should be the RM. © 2012 by the Society for Academic Emergency Medicine.
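The contrast between the two methodologies can be shown with a toy calculation: a straight-line (as-the-crow-flies) travel-time estimate at an assumed average speed versus a shortest-path travel time over a road graph. The coordinates, road graph, and speeds below are invented and do not represent the Ontario trauma system.

```python
# Toy comparison of the two catchment methodologies (invented data).
import math
import networkx as nx

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

trauma_centre = (43.66, -79.39)
communities = {"A": (43.80, -79.20), "B": (44.10, -79.60), "C": (43.50, -80.10)}

# As-the-crow-flies methodology: straight-line distance at an assumed 80 km/h.
for name, coord in communities.items():
    minutes = haversine_km(trauma_centre, coord) / 80.0 * 60
    print(f"ACFM  {name}: {minutes:5.1f} min, within 30 min: {minutes <= 30}")

# Routing methodology: shortest travel time through an (invented) road graph,
# with edge times in minutes reflecting road class and speed.
G = nx.Graph()
G.add_weighted_edges_from([
    ("TC", "A", 25), ("TC", "X", 20), ("X", "B", 35), ("TC", "C", 48),
], weight="minutes")
times = nx.single_source_dijkstra_path_length(G, "TC", weight="minutes")
for name in communities:
    t = times.get(name, math.inf)
    print(f"RM    {name}: {t:5.1f} min, within 30 min: {t <= 30}")
```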
Martin, B C
2000-01-01
The high cost of emergency department (ED) care is often viewed as an area for achieving cost savings through reduced utilization for inappropriate conditions. The implementation of outpatient prospective payment for Medicare ED patients heightens scrutiny of costs and utilization in the ED versus primary care settings. Data from hospital clinical records, financial records, and a provider survey were used to develop a costing methodology and complete a comparative analysis of the cost of care for three diagnoses by setting. Total costs were significantly higher in the ED, due primarily to differences in ancillary tests and prescription drugs ordered.
An application of Six Sigma methodology to turnover intentions in health care.
Taner, Mehmet
2009-01-01
The purpose of this study is to show how the principles of Six Sigma can be applied to the high turnover problem of doctors in medical emergency services and paramedic backup. Six Sigma's define-measure-analyse-improve-control (DMAIC) cycle was applied to reduce the turnover rate of doctors in an organisation operating in emergency services. Variables of the model were determined. Exploratory factor analysis, multiple regression, analysis of variance (ANOVA) and Gage R&R were employed for the analysis. Personal burnout/stress and dissatisfaction with salary were found to be the "vital few" variables. The organisation took a new approach, improving its initiatives on doctors' working conditions. The sigma level of the process increased. New policy and process changes were found to effectively decrease the incidence of turnover intentions. The improved process was standardised and institutionalised. This study is one of the few papers in the literature that elaborates on the turnover problem of doctors working in emergency and paramedic backup services.
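For reference, the sigma level reported in DMAIC projects is typically derived from the defect rate; the sketch below applies the conventional calculation (with the customary 1.5-sigma shift) to invented before/after turnover counts, not the study's figures.

```python
# Sketch of the sigma-level calculation used in DMAIC projects (invented counts):
# treat each departure within the study window as a "defect" per doctor employed.
from scipy.stats import norm

def sigma_level(defects, opportunities):
    dpmo = defects / opportunities * 1_000_000
    # Conventional long-term-to-short-term conversion adds the 1.5-sigma shift.
    return norm.ppf(1 - dpmo / 1_000_000) + 1.5, dpmo

before = sigma_level(defects=18, opportunities=120)   # 18 of 120 doctors left
after = sigma_level(defects=6, opportunities=120)     # after the improvements

print(f"before: DPMO = {before[1]:,.0f}, sigma level = {before[0]:.2f}")
print(f"after : DPMO = {after[1]:,.0f}, sigma level = {after[0]:.2f}")
```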
ERIC Educational Resources Information Center
Poole, Wendy; Fallon, Gerald
2015-01-01
This paper examines increasing privatisation of education in the province of British Columbia, Canada. Conceptually, the paper is informed by theories of privatisation and social justice; and methodologically, it uses policy analysis to examine documents and financial records obtained from government departments. The paper critically analyses…
Changing Family Habits: A Case Study into Climate Change Mitigation Behavior in Families
ERIC Educational Resources Information Center
Leger, Michel T.; Pruneau, Diane
2012-01-01
A case-study methodology was used to explore the process of change as experienced by 3 suburban families in an attempt to incorporate climate change mitigation behavior into their day to day life. Cross-case analysis of the findings revealed the emergence of three major conceptual themes associated with behavior adoption: collectively applied…
An Analysis of Career Tracks in the Design of IS Curricula in the U.S.
ERIC Educational Resources Information Center
Hwang, Drew; Soe, Louise L.
2010-01-01
Studies of undergraduate curricula in the field of Information Systems (IS) over the past two decades demonstrate a continual process of development and change. Many factors influence curriculum design, including new technologies and methodologies, and emerging subfields and subject areas. However, one deficit in the literature about IS curriculum…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.
2001-01-19
The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.
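A benefit-cost comparison of this kind ultimately reduces to expected-damage arithmetic; the sketch below shows the shape of such a calculation with invented probabilities and costs, not figures from the report.

```python
# Back-of-the-envelope expected-damage comparison of the kind a gate-closure
# risk analysis supports; all probabilities and costs below are invented.
events_per_year = 0.05          # frequency of events requiring emergency closure
p_fail_current = 0.20           # closure fails to meet the timing rule, current design
p_fail_modified = 0.02          # after the proposed modification
damage_cost = 50_000_000        # cost of flooding the powerhouse and environs ($)
modification_cost = 4_000_000   # one-time cost of the modification ($)
horizon_years = 30

def expected_damage(p_fail):
    return events_per_year * p_fail * damage_cost * horizon_years

benefit = expected_damage(p_fail_current) - expected_damage(p_fail_modified)
print(f"expected damage avoided over {horizon_years} y: ${benefit:,.0f}")
print(f"benefit/cost ratio: {benefit / modification_cost:.2f}")
```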
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and also through the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology both at data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.
Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A
2013-04-01
The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical for both the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) Clinical Trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high quality decision outcome and a rational and transparent decision process. This development contributes to stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision process and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
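The additive multi-attribute value model at the core of such an evaluation is straightforward to compute; the sketch below ranks four hypothetical platforms on hypothetical attributes and weights (not the EXTEND panel's actual criteria) and includes a simple one-way sensitivity check.

```python
# Minimal additive multi-attribute value sketch for ranking imaging platforms;
# attributes, weights and scores are hypothetical.
import numpy as np

attributes = ["processing speed", "lesion-volume accuracy", "workflow fit", "cost"]
weights = np.array([0.30, 0.40, 0.20, 0.10])          # as if elicited from a panel
# Rows = candidate platforms, columns = attribute value scores on a 0-1 scale
# (already converted so that higher is always better, including cost).
scores = np.array([
    [0.9, 0.8, 0.7, 0.6],   # platform 1
    [0.6, 0.9, 0.8, 0.5],   # platform 2
    [0.7, 0.6, 0.9, 0.9],   # platform 3
    [0.8, 0.7, 0.6, 0.8],   # platform 4
])

overall = scores @ weights
print("overall values:", np.round(overall, 3))
print("preferred platform:", int(np.argmax(overall)) + 1)

# One-way sensitivity: vary the weight on accuracy and renormalize the rest.
for w_acc in (0.2, 0.4, 0.6):
    w = weights.copy()
    w[1] = w_acc
    w = w / w.sum()
    print(f"accuracy weight {w_acc:.1f} -> best platform {int(np.argmax(scores @ w)) + 1}")
```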
White, Andrew A; Wright, Seth W; Blanco, Roberto; Lemonds, Brent; Sisco, Janice; Bledsoe, Sandy; Irwin, Cindy; Isenhour, Jennifer; Pichert, James W
2004-10-01
Identifying the etiologies of adverse outcomes is an important first step in improving patient safety and reducing malpractice risks. However, relatively little is known about the causes of emergency department-related adverse outcomes. The objective was to describe a method for identification of common causes of adverse outcomes in an emergency department. This methodology potentially can suggest ways to improve care and might provide a model for identification of factors associated with adverse outcomes. This was a retrospective analysis of 74 consecutive files opened by a malpractice insurer between 1995 and 2000. Each risk-management file was analyzed to identify potential causes of adverse outcomes. The main outcomes were rater-assigned codes for alleged problems with care (e.g., failures of communication or problems related to diagnosis). About 50% of cases were related to injuries or abdominal complaints. A contributing cause was found in 92% of cases, and most had more than one contributing cause. The most frequent contributing categories included failure to diagnose (45%), supervision problems (31%), communication problems (30%), patient behavior (24%), administrative problems (20%), and documentation (20%). Specific relating factors within these categories, such as lack of timely resident supervision and failure to follow policies and procedures, were identified. This project documented that an aggregate analysis of risk-management files has the potential to identify shared causes related to real or perceived adverse outcomes. Several potentially correctable systems problems were identified using this methodology. These simple, descriptive management tools may be useful in identifying issues for problem solving and can be easily learned by physicians and managers.
Capabilities, methodologies, and use of the cambio file-translation application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lasche, George P.
2007-03-01
This report describes the capabilities, methodologies, and uses of the Cambio computer application, designed to automatically read and display nuclear spectral data files of any known format in the world and to convert spectral data to one of several commonly used analysis formats. To further assist responders, Cambio incorporates an analysis method based on non-linear fitting techniques found in open literature and implemented in openly published source code in the late 1980s. A brief description is provided of how Cambio works, of what basic formats it can currently read, and how it can be used. Cambio was developed at Sandia National Laboratories and is provided as a free service to assist nuclear emergency response analysts anywhere in the world in the fight against nuclear terrorism.
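As a generic illustration of the kind of non-linear fitting mentioned above, the sketch below fits a single Gaussian photopeak plus a linear background to simulated spectrum counts with scipy; this is not Cambio's actual algorithm, file handling, or peak model.

```python
# Illustrative non-linear peak fit (Gaussian photopeak on a linear background).
import numpy as np
from scipy.optimize import curve_fit

def peak_model(channel, amplitude, centroid, sigma, slope, intercept):
    gauss = amplitude * np.exp(-0.5 * ((channel - centroid) / sigma) ** 2)
    return gauss + slope * channel + intercept

rng = np.random.default_rng(2)
channels = np.arange(600, 700, dtype=float)
true = peak_model(channels, 500, 662, 3.0, -0.2, 180)   # a Cs-137-like peak, invented scale
counts = rng.poisson(true).astype(float)

p0 = [counts.max(), channels[np.argmax(counts)], 2.0, 0.0, counts.min()]
popt, pcov = curve_fit(peak_model, channels, counts, p0=p0)
amplitude, centroid, sigma, *_ = popt
net_area = amplitude * sigma * np.sqrt(2 * np.pi)
print(f"centroid = {centroid:.2f} ch, FWHM = {2.355 * sigma:.2f} ch, net area = {net_area:.0f}")
```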
Cho, Brian H; Lopez, Joseph; Means, Jessica; Lopez, Sandra; Milton, Jacqueline; Tufaro, Anthony P; May, James W; Dorafshar, Amir H
2017-12-01
Conflicts of interest (COI) are an emerging area of discussion within the field of plastic surgery. Recently, several reports have found that research studies that disclose COI are associated with publication of positive outcomes. We hypothesize that this association is driven by higher-quality studies receiving industry funding. This study aimed to investigate the association between industry support and study methodological quality. We reviewed all entries in Plastic and Reconstructive Surgery, Annals of Plastic Surgery, and Journal of Plastic, Reconstructive, and Aesthetic Surgery within a 1-year period encompassing 2013. All clinical research articles were analyzed. Studies were evaluated blindly for methodology quality based on a validated scoring system. An ordinal logistic regression model was used to examine the association between methodology score and COI. A total of 1474 articles were reviewed, of which 483 met our inclusion criteria. These articles underwent methodological quality scoring. Conflicts of interest were reported in 28 (5.8%) of these articles. After adjusting for article characteristics in the ordinal logistic regression analysis, there was no significant association between articles with COI and higher methodological scores (P = 0.7636). Plastic surgery studies that disclose COI are not associated with higher methodological quality when compared with studies that do not disclose COI. These findings suggest that although the presence of COI is associated with positive findings, the association is not shown to be driven by higher-quality studies.
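The ordinal logistic model described can be fitted, for example, with statsmodels' OrderedModel (available in statsmodels 0.12 and later); the sketch below uses simulated data with an invented effect structure, not the journal review data.

```python
# Sketch of an ordinal logistic regression of methodology score on a COI indicator.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(3)
n = 483
coi = rng.integers(0, 2, n)                       # conflict-of-interest disclosed (0/1)
log_n = rng.normal(4.0, 1.0, n)                   # log sample size of each study
latent = 0.1 * coi + 0.3 * log_n + rng.logistic(size=n)
score = np.digitize(latent, bins=[1.0, 1.8, 2.6]) # four ordered quality categories

X = pd.DataFrame({"coi": coi, "log_n": log_n})    # no constant: OrderedModel adds thresholds
endog = pd.Series(pd.Categorical(score, categories=[0, 1, 2, 3], ordered=True))
result = OrderedModel(endog, X, distr="logit").fit(method="bfgs", disp=False)
print(result.summary())
print("odds ratio for COI:", float(np.exp(result.params["coi"])))
```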
Islam, Nadia Shilpi; Khan, Suhaila; Kwon, Simona; Jang, Deeana; Ro, Marguerite; Trinh-Shevrin, Chau
2011-01-01
There are close to 15 million Asian Americans living in the United States, and they represent the fastest growing populations in the country. By the year 2050, there will be an estimated 33.4 million Asian Americans living in the country. However, their health needs remain poorly understood and there is a critical lack of data disaggregated by Asian American ethnic subgroups, primary language, and geography. This paper examines methodological issues, challenges, and potential solutions to addressing the collection, analysis, and reporting of disaggregated (or, granular) data on Asian Americans. The article explores emerging efforts to increase granular data through the use of innovative study design and analysis techniques. Concerted efforts to implement these techniques will be critical to the future development of sound research, health programs, and policy efforts targeting this and other minority populations. PMID:21099084
Virtopsy: An integration of forensic science and imageology
Joseph, T. Isaac; Girish, K. L.; Sathyan, Pradeesh; Kiran, M. Shashi; Vidya, S.
2017-01-01
In an era where noninvasive and minimally invasive techniques are heralding medical innovations and health science technology, necrological analysis is not bereft of this wave. Virtopsy is virtual autopsy. It is a new-age complimentary documentation approach to identify and analyze the details of demise. Utilizing virtual autopsy for orofacial forensic examination is an emerging specialty which holds a plethora of potential for future trends in forensic science. Being a noninvasive technique, it is a rapid method which facilitates the medicolegal process and aids in the delivery of justice. The present article is an overview of this emerging methodology. PMID:29657485
ERIC Educational Resources Information Center
Darwin, Stephen
2011-01-01
Cultural-historical activity theory (CHAT), founded on the seminal work of Vygotsky and evolving in the subsequent work of Leont'ev and Engestrom, continues to emerge as a robust and increasingly widely used conceptual framework for the research and analysis of the complex social mediation of human learning and development. Yet there remains…
ERIC Educational Resources Information Center
Huws, Ursula; Jagger, Nick; O'Regan, Siobhan
Inexpensive telecommunications, the spread of computing, and globalization are creating major change in the location of work within and between countries. Because no tools have yet been developed to investigate the new spatial employment patterns, a cluster analysis involving more than 50 variables and 206 countries was performed to group…
ERIC Educational Resources Information Center
Pereira Querol, Marco A.; Suutari, Timo; Seppanen, Laura
2010-01-01
The purpose of this paper is to present theoretical tools for understanding the dynamics of change and learning during the emergence and development of environmental management activities. The methodology consists of a historical analysis of a case of biogas production that took place in the Southwest region of Finland. The theoretical tools used…
ERIC Educational Resources Information Center
Rugen, Brian David
2009-01-01
Contemporary research suggests that forming a professional identity is crucial to the process of becoming a teacher. Furthermore, a "narrative turn" has emerged as a major methodological influence for the study of identity in research on teaching. A guiding assumption of traditional narrative research is that stories act as…
Emerging Demands for Public Policies in Rio De Janeiro: Educational Prevention of Social Risks
ERIC Educational Resources Information Center
Gomes Da Silva, Magda Maria Ventura; Garcia, Maria del Pilar Quicios
2016-01-01
This paper disseminates some results of an international research project on social risk manifestations published in eight periodicals in Rio de Janeiro from July 2013 to December 2014. The research sample coincides with the population: 541 news items, which constitute 1,255 analytical units. The methodology consisted of a content analysis of the news,…
Mjøsund, Nina Helen; Eriksson, Monica; Espnes, Geir Arild; Haaland-Øverby, Mette; Jensen, Sven Liang; Norheim, Irene; Kjus, Solveig Helene Høymork; Portaasen, Inger-Lill; Vinje, Hege Forbech
2017-01-01
The aim of this study was to examine how service user involvement can contribute to the development of interpretative phenomenological analysis methodology and enhance research quality. Interpretative phenomenological analysis is a qualitative methodology used in nursing research internationally to understand human experiences that are essential to the participants. Service user involvement is requested in nursing research. We share experiences from 4 years of collaboration (2012-2015) on a mental health promotion project, which involved an advisory team. Five research advisors either with a diagnosis or related to a person with severe mental illness constituted the team. They collaborated with the research fellow throughout the entire research process and have co-authored this article. We examined the joint process of analysing the empirical data from interviews. Our analytical discussions were audiotaped, transcribed and subsequently interpreted following the guidelines for good qualitative analysis in interpretative phenomenological analysis studies. The advisory team became 'the researcher's helping hand'. Multiple perspectives influenced the qualitative analysis, which gave more insightful interpretations of nuances, complexity, richness or ambiguity in the interviewed participants' accounts. The outcome of the service user involvement was increased breadth and depth in findings. Service user involvement improved the research quality in a nursing research project on mental health promotion. The interpretative element of interpretative phenomenological analysis was enhanced by the emergence of multiple perspectives in the qualitative analysis of the empirical data. We argue that service user involvement and interpretative phenomenological analysis methodology can mutually reinforce each other and strengthen qualitative methodology. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
Binder, Andrew R; Cacciatore, Michael A; Scheufele, Dietram A; Shaw, Bret R; Corley, Elizabeth A
2012-10-01
This study presents a systematic comparison of two alternative measures of citizens' perceptions of risks and benefits of emerging technologies. By focusing on two specific issues (nanotechnology and biofuels), we derive several insights for the measurement of public views of science. Most importantly, our analyses reveal that relying on global, single-item measures may lead to invalid inferences regarding external influences on public perceptions, particularly those related to cognitive schema and media use. Beyond these methodological implications, this analysis suggests several reasons why researchers in the area of public attitudes toward science must revisit notions of measurement in order to accurately inform the general public, policymakers, scientists, and journalists about trends in public opinion toward emerging technologies.
Studies of planning behavior of aircraft pilots in normal, abnormal and emergency situations
NASA Technical Reports Server (NTRS)
Johannsen, G.; Rouse, W. B.; Hillmann, K.
1981-01-01
A methodology for the study of planning is presented and the results of applying the methodology within two experimental investigations of planning behavior of aircraft pilots in normal, abnormal, and emergency situations are discussed. Beyond showing that the methodology yields consistent results, these experiments also lead to concepts in terms of a dichotomy between event driven and time driven planning, subtle effects of automation on planning, and the relationship of planning to workload and flight performance.
An introduction to instrumental variables analysis: part 1.
Bennett, Derrick A
2010-01-01
There are several examples in the medical literature where the associations of treatment effects predicted by observational studies have been refuted by evidence from subsequent large-scale randomised trials. This is because non-experimental studies are subject to confounding, which cannot be entirely eliminated even if all known confounders have been measured in the study, as there may be unknown confounders. The aim of this 2-part methodological primer is to introduce an emerging methodology for estimating treatment effects using observational data in the absence of good randomised evidence, known as the method of instrumental variables. Copyright © 2010 S. Karger AG, Basel.
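A minimal simulated example of the instrumental-variables idea, using manual two-stage least squares: the instrument Z shifts the treatment X but affects the outcome Y only through X, while an unmeasured confounder U biases the naive regression. The data-generating values are invented, and standard errors from this manual second stage are not valid; dedicated IV routines should be used in practice.

```python
# Two-stage least squares sketch on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 5000
Z = rng.integers(0, 2, n).astype(float)          # instrument (e.g. random encouragement)
U = rng.normal(size=n)                           # unmeasured confounder
X = 0.6 * Z + 0.8 * U + rng.normal(size=n)       # treatment received
Y = 1.0 * X + 1.5 * U + rng.normal(size=n)       # true treatment effect = 1.0

# Naive OLS of Y on X is biased upwards by the confounder U.
naive = sm.OLS(Y, sm.add_constant(X)).fit()

# Stage 1: predict X from the instrument; Stage 2: regress Y on the prediction.
stage1 = sm.OLS(X, sm.add_constant(Z)).fit()
stage2 = sm.OLS(Y, sm.add_constant(stage1.fittedvalues)).fit()

print(f"naive OLS estimate : {naive.params[1]:.3f}")
print(f"2SLS (IV) estimate : {stage2.params[1]:.3f}   (true effect 1.0)")
```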
Karami, Manoochehr; Khazaei, Salman
2017-12-06
Clinical decision making based on study results requires valid and correct data collection and analysis. However, there are some common methodological and statistical issues that may be ignored by authors. In the individually matched case-control design, bias arises from performing an unconditional analysis instead of a conditional analysis. Using unconditional logistic regression for matched data imposes a large number of nuisance parameters, which may result in seriously biased estimates.
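As a minimal illustration of the conditional-analysis point, the sketch below fits a conditional logistic regression for 1:1 individually matched pairs by maximizing the conditional likelihood, which for pair-matched data depends only on within-pair covariate differences and so avoids per-pair nuisance intercepts. The simulated data and variable names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def conditional_logit_1to1(x_case, x_control):
    """Conditional logistic regression for 1:1 matched case-control data.
    The conditional likelihood depends only on within-pair covariate
    differences, so no pair-specific nuisance intercepts are estimated."""
    d = x_case - x_control                      # within-pair differences
    def neg_loglik(beta):
        return np.sum(np.logaddexp(0.0, -(d @ beta)))
    res = minimize(neg_loglik, np.zeros(d.shape[1]), method="BFGS")
    return res.x                                # estimated log odds ratios

# Hypothetical simulated matched pairs with a single exposure covariate.
rng = np.random.default_rng(1)
n_pairs, beta_true = 2000, np.array([0.7])
xa = rng.normal(size=(n_pairs, 1))
xb = rng.normal(size=(n_pairs, 1))
# Given exactly one case per pair, P(subject a is the case) follows the
# pair-conditional logistic model with the true coefficient.
p_a_case = 1.0 / (1.0 + np.exp(-(xa - xb) @ beta_true))
a_is_case = rng.random(n_pairs) < p_a_case.ravel()
x_case = np.where(a_is_case[:, None], xa, xb)
x_control = np.where(a_is_case[:, None], xb, xa)

print(conditional_logit_1to1(x_case, x_control))   # estimate should be near 0.7
```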
Financial options methodology for analyzing investments in new technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenning, B.D.
1994-12-31
The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
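A minimal, hypothetical sketch of the kind of financial-options valuation the abstract refers to, using the standard Black-Scholes formula for a European call as a stand-in "real option" on a technology investment; the numbers and the mapping of inputs to project quantities are illustrative assumptions, not values from the report.

```python
import math
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes value of a European call option, used here as a rough
    proxy for the option value of deferring an investment:
    S ~ present value of expected project cash flows, K ~ investment cost,
    T ~ years the decision can be deferred, r ~ risk-free rate,
    sigma ~ volatility of the project value."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm.cdf(d1) - K * math.exp(-r * T) * norm.cdf(d2)

# Hypothetical technology investment: simple NPV looks marginal (100 - 110 < 0),
# yet the option to defer under high uncertainty still carries value.
print(f"option value: {black_scholes_call(S=100, K=110, T=3, r=0.05, sigma=0.45):.1f}")
```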
Financial options methodology for analyzing investments in new technology
NASA Technical Reports Server (NTRS)
Wenning, B. D.
1995-01-01
The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options evaluation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
Bartolucci, Chiara; Lombardo, Giovanni Pietro
2012-08-01
This article examines the scientific-cultural context of the second half of the 1800s, during which psychological science emerged in Italy. The article explores the contribution made by the emergence of the primary research traditions of that period, namely, physiological anthropology and phreniatry, by means of a methodology that combines content analysis with a classical historiographical study of the period. Themes and authors deriving from the various disciplines in the human and natural sciences were identified through a content analysis of the Rivista di Filosofia Scientifica [Journal of Scientific Philosophy], a periodical that is representative of Italian positivism. The analysis highlights the epistemological perspective held by scholars who, distancing themselves from the mechanistic reductionism of the proponents of positivism, integrated a naturalistic and evolutionary conceptualization with the neo-Kantian critique. A clearly delineated naturalistic and differential perspective of scientific research that brought about the birth of psychology as an experimental discipline in Italy in the 1900s emerges from the analysis, including psychology and psychopathology as studied by the phreniatrists Gabriele Buccola, Enrico Morselli, and Eugenio Tanzi; Tito Vignoli and Giuseppe Sergi's work in comparative anthropology; Giulio Fano's approach and contribution to physiology; and Enrico Ferri's contribution to criminology. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
Statistical data mining of streaming motion data for fall detection in assistive environments.
Tasoulis, S K; Doukas, C N; Maglogiannis, I; Plagianakos, V P
2011-01-01
The analysis of human motion data is of interest for activity recognition or emergency event detection, especially in the case of elderly or disabled people living independently in their homes. Several techniques have been proposed for identifying such distress situations using motion, audio or video sensors either on the monitored subject (wearable sensors) or in the surrounding environment. The output of such sensors consists of data streams that require real-time recognition, especially in emergency situations; thus traditional classification approaches may not be applicable for immediate alarm triggering or fall prevention. This paper presents a statistical mining methodology that may be used for the specific problem of real-time fall detection. Visual data are captured from the user's environment using overhead cameras, and motion data are collected from accelerometers on the subject's body; both are fed to the fall detection system. The paper includes the details of the stream data mining methodology incorporated in the system, along with an initial evaluation of the achieved accuracy in detecting falls.
Emohawk: Searching for a "Good" Emergent Narrative
NASA Astrophysics Data System (ADS)
Brom, Cyril; Bída, Michal; Gemrot, Jakub; Kadlec, Rudolf; Plch, Tomáš
We report on the progress we have achieved in development of Emohawk, a 3D virtual reality application with an emergent narrative for teaching high-school students and undergraduates the basics of virtual characters control, emotion modelling, and narrative generation. Besides, we present a new methodology, used in Emohawk, for purposeful authoring of emergent narratives of Façade's complexity. The methodology is based on massive automatic search for stories that are appealing to the audience whilst forbidding the unappealing ones during the design phase.
Brixner, Diana; Maniadakis, Nikos; Kaló, Zoltán; Hu, Shanlian; Shen, Jie; Wijaya, Kalman
2017-09-01
Off-patent pharmaceuticals (OPPs) represent more than 60% of the pharmaceutical market in many emerging countries, where they are frequently evaluated primarily on cost rather than with health technology assessment. OPPs are assumed to be identical to the originators. Branded and unbranded generic versions can, however, vary from the originator in active pharmaceutical ingredients, dosage, consistency formulation, excipients, manufacturing processes, and distribution, for example. These variables can alter the efficacy and safety of the product, negatively impacting both the anticipated cost savings and the population's health. In addition, many health care systems lack the resources or expertise to evaluate such products, and current assessment methods can be complex and difficult to adapt to a health system's needs. Multicriteria decision analysis (MCDA) simple scoring is an evidence-based health technology assessment methodology for evaluating OPPs, especially in emerging countries in which resources are limited but decision makers still must balance affordability with factors such as drug safety, level of interchangeability, manufacturing site and active pharmaceutical ingredient quality, supply track record, and real-life outcomes. MCDA simple scoring can be applied to pharmaceutical pricing, reimbursement, formulary listing, and drug procurement. In November 2015, a workshop was held at the International Society for Pharmacoeconomics and Outcomes Research Annual Meeting in Milan to refine and prioritize criteria that can be used in MCDA simple scoring for OPPs, resulting in an example MCDA process and 22 prioritized criteria that health care systems in emerging countries can easily adapt to their own decision-making processes. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
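A minimal sketch of an MCDA simple (weighted-sum) scoring scheme of the kind described; the criteria, weights, products, and scores below are hypothetical placeholders and are not the 22 prioritized criteria from the workshop.

```python
# Hypothetical MCDA simple scoring: weighted sum of criterion scores on a 0-10 scale.
criteria_weights = {            # assumed priorities; weights sum to 1.0
    "real_life_outcomes": 0.30,
    "api_and_site_quality": 0.25,
    "supply_track_record": 0.20,
    "interchangeability": 0.15,
    "acquisition_cost": 0.10,
}

products = {                    # invented example products and scores
    "originator_brand": {"real_life_outcomes": 9, "api_and_site_quality": 9,
                         "supply_track_record": 8, "interchangeability": 10,
                         "acquisition_cost": 4},
    "generic_candidate": {"real_life_outcomes": 7, "api_and_site_quality": 6,
                          "supply_track_record": 6, "interchangeability": 8,
                          "acquisition_cost": 9},
}

def mcda_score(scores, weights):
    """Weighted-sum MCDA value; higher is better."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in products.items():
    print(f"{name}: {mcda_score(scores, criteria_weights):.2f}")
```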
Chen, Wei; Zeng, Guang
2006-02-01
To establish a comprehensive assessment model of the public health system's emergency response capability in flooding-prone areas. Hierarchy process theory was used to establish the initial assessment framework. The Delphi method was used to screen and choose the final indicators and their weights before an assessment model was set up under the 'synthetic scored method' to assess the emergency response capability of twenty county public health units. We then used analysis of variance (ANOVA) to test whether the model could distinguish the emergency response capability of different county health units, and correlation analysis was used to assess the independence of indicators in the assessment model. A comprehensive model was established, including twenty first-class indicators and fifty-six second-class indicators, and the flood emergency response capability of the public health units was evaluated. Five public health units had higher, ten moderate, and five lower levels of emergency response capability. The assessment model proved to be a good method for differentiating the capability of public health units, using independent indicators. The assessment model we established appears to be practical and reliable.
A Comprehensive Comparison of Current Operating Reserve Methodologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krad, Ibrahim; Ibanez, Eduardo; Gao, Wenzhong
Electric power systems are currently experiencing a paradigm shift from a traditionally static system to a system that is becoming increasingly more dynamic and variable. Emerging technologies are forcing power system operators to adapt to their performance characteristics. These technologies, such as distributed generation and energy storage systems, have changed the traditional idea of a distribution system with power flowing in one direction into a distribution system with bidirectional flows. Variable generation, in the form of wind and solar generation, also increases the variability and uncertainty in the system. As such, power system operators are revisiting the ways in which they treat this evolving power system, namely by modifying their operating reserve methodologies. This paper presents an in-depth analysis of different operating reserve methodologies and investigates their impacts on power system reliability and economic efficiency.
Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares
2011-09-15
Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included into the scope of the LCT philosophy. That is the case of the material and energy flow analysis (MEFA), a tool derived from the industrial metabolism definition. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, the BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify the so-called improvable flows by MEFA, so that the appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacture plant for validation. Fourteen improvable flows have been identified, and seven candidate BAT have been proposed with the aim of reducing these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and selects the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness on improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.
2018-02-15
models and approaches are also valid using other invasive and non-invasive technologies. Finally, we illustrate and experimentally evaluate this... 2017 Project Outline: pattern formation diversity in wild microbial societies; experimental and mathematical analysis methodology; skeleton... chemotaxis, nutrient degradation, and the exchange of amino acids between cells. Using both quantitative experimental methods and several theoretical
Zammar, Guilherme Roberto; Shah, Jatin; Bonilauri Ferreira, Ana Paula; Cofiel, Luciana; Lyles, Kenneth W.; Pietrobon, Ricardo
2010-01-01
Background The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them have complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge, no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of “what if” situations that helped clarify how the method or information from the other field would behave, if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors. PMID:20195374
Methodological challenges collecting parent phone-call healthcare utilization data.
Moreau, Paula; Crawford, Sybil; Sullivan-Bolyai, Susan
2016-02-01
Recommendations by the National Institute of Nursing Research and other groups have strongly encouraged nurses to pay greater attention to cost-effectiveness analysis when conducting research. Given the increasing prominence of translational science and comparative effectiveness research, cost-effectiveness analysis has become a basic tool in determining intervention value in research. Tracking phone-call communication (number of calls and context) with cross-checks between parents and healthcare providers is an example of this type of healthcare utilization data collection. This article identifies some methodological challenges that have emerged in the process of collecting this type of data in a randomized controlled trial: Parent education Through Simulation-Diabetes (PETS-D). We also describe ways in which those challenges have been addressed with comparison data results, and make recommendations for future research. Copyright © 2015 Elsevier Inc. All rights reserved.
Lindberg, Elisabeth; Österberg, Sofia A; Hörberg, Ulrica
2016-01-01
Phenomena in caring science are often complex and laden with meanings. Empirical research with the aim of capturing lived experiences is one way of revealing the complexity. Sometimes, however, results from empirical research need to be further discussed. One way is to further abstract the result and/or philosophically examine it. This has previously been performed and presented in scientific journals and doctoral theses, contributing to a greater understanding of phenomena in caring science. Although the intentions in many of these publications are laudable, the lack of methodological descriptions as well as a theoretical and systematic foundation can contribute to an ambiguity concerning how the results have emerged during the analysis. The aim of this paper is to describe the methodological support for the further abstraction of and/or philosophical examination of empirical findings. When trying to systematize the support procedures, we have used a reflective lifeworld research (RLR) approach. Based on the assumptions in RLR, this article will present methodological support for a theoretical examination that can include two stages. In the first stage, data from several (two or more) empirical results on an essential level are synthesized into a general structure. Sometimes the analysis ends with the general structure, but sometimes there is a need to proceed further. The second stage can then be a philosophical examination, in which the general structure is discussed in relation to a philosophical text, theory, or concept. It is important that the theories are brought in as the final stage after the completion of the analysis. Core dimensions of the described methodological support are, in accordance with RLR, openness, bridling, and reflection. The methodological support cannot be understood as fixed stages, but rather as a guiding light in the search for further meanings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belzer, D.B.; Serot, D.E.; Kellogg, M.A.
1991-03-01
Development of integrated mobilization preparedness policies requires planning estimates of available productive capacity during national emergency conditions. Such estimates must be developed in a manner to allow evaluation of current trends in capacity and the consideration of uncertainties in various data inputs and in engineering assumptions. This study developed estimates of emergency operating capacity (EOC) for 446 manufacturing industries at the 4-digit Standard Industrial Classification (SIC) level of aggregation and for 24 key nonmanufacturing sectors. This volume lays out the general concepts and methods used to develop the emergency operating estimates. The historical analysis of capacity extends from 1974 through 1986. Some nonmanufacturing industries are included. In addition to mining and utilities, key industries in transportation, communication, and services were analyzed. Physical capacity and efficiency of production were measured. 3 refs., 2 figs., 12 tabs. (JF)
Exoplanet Biosignatures: Future Directions
Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y.; Lenardic, Adrian; Reinhard, Christopher T.; Moore, William; Schwieterman, Edward W.; Shkolnik, Evgenya L.; Smith, Harrison B.
2018-01-01
We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets—Biosignatures—Life detection—Bayesian analysis. Astrobiology 18, 779–824. PMID:29938538
Exoplanet Biosignatures: Future Directions.
Walker, Sara I; Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y; Lenardic, Adrian; Reinhard, Christopher T; Moore, William; Schwieterman, Edward W; Shkolnik, Evgenya L; Smith, Harrison B
2018-06-01
We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets-Biosignatures-Life detection-Bayesian analysis. Astrobiology 18, 779-824.
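A minimal sketch of the Bayesian formalism described above, applied to a single hypothetical biosignature observation; the prior, the likelihoods, and the detection scenario are illustrative assumptions rather than values from the paper.

```python
def posterior_probability_of_life(prior_life, p_signal_given_life, p_signal_given_no_life):
    """Bayes' rule: update the prior probability of life on a target planet
    after a candidate biosignature signal is detected."""
    evidence = (p_signal_given_life * prior_life
                + p_signal_given_no_life * (1.0 - prior_life))
    return p_signal_given_life * prior_life / evidence

# Hypothetical numbers: weakly constrained priors and an ambiguous signal
# (abiotic false positives are plausible), illustrating how the posterior
# remains sensitive to the poorly known prior and likelihoods.
for prior in (1e-3, 1e-2, 1e-1):
    post = posterior_probability_of_life(prior,
                                         p_signal_given_life=0.8,
                                         p_signal_given_no_life=0.1)
    print(f"prior {prior:.0e} -> posterior {post:.3f}")
```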
Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest
NASA Technical Reports Server (NTRS)
Rohloff, Kurt
2010-01-01
The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject-matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data for a deeper understanding of societal behaviors that is tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic humans' identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
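A minimal, hypothetical sketch of the flavor of sequential-pattern counting the abstract describes: counting how often a candidate ordered pattern of observations appears within a look-back window before labeled events of interest. The event codes, window length, and matching rule are invented for illustration and are not the authors' actual algorithm.

```python
def pattern_precedes_event(sequence, event_times, pattern, window):
    """Count, over all events, how often `pattern` occurs as an ordered
    (not necessarily contiguous) subsequence of the `window` observations
    immediately preceding each event time."""
    hits = 0
    for t in event_times:
        recent = sequence[max(0, t - window):t]
        i = 0
        for obs in recent:                  # greedy subsequence match
            if obs == pattern[i]:
                i += 1
                if i == len(pattern):
                    hits += 1
                    break
    return hits, len(event_times)

# Hypothetical daily observation codes for one region and labeled unrest events.
sequence = ["calm", "protest", "strike", "calm", "protest", "curfew",
            "riot", "calm", "strike", "protest", "curfew", "riot"]
event_times = [6, 11]                       # indices where riots occurred
hits, total = pattern_precedes_event(sequence, event_times,
                                     pattern=["protest", "curfew"], window=4)
print(f"pattern preceded {hits}/{total} events")
```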
Methodological Issues and Practices in Qualitative Research.
ERIC Educational Resources Information Center
Bradley, Jana
1993-01-01
Discusses methodological issues concerning qualitative research and describes research practices that qualitative researchers use to address these methodological issues. Topics discussed include the researcher as interpreter, the emergent nature of qualitative research, understanding the experience of others, trustworthiness in qualitative…
The Escherichia coli Proteome: Past, Present, and Future Prospects†
Han, Mee-Jung; Lee, Sang Yup
2006-01-01
Proteomics has emerged as an indispensable methodology for large-scale protein analysis in functional genomics. The Escherichia coli proteome has been extensively studied and is well defined in terms of biochemical, biological, and biotechnological data. Even before the entire E. coli proteome was fully elucidated, the largest available data set had been integrated to decipher regulatory circuits and metabolic pathways, providing valuable insights into global cellular physiology and the development of metabolic and cellular engineering strategies. With the recent advent of advanced proteomic technologies, the E. coli proteome has been used for the validation of new technologies and methodologies such as sample prefractionation, protein enrichment, two-dimensional gel electrophoresis, protein detection, mass spectrometry (MS), combinatorial assays with n-dimensional chromatographies and MS, and image analysis software. These important technologies will not only provide a great amount of additional information on the E. coli proteome but also synergistically contribute to other proteomic studies. Here, we review the past development and current status of E. coli proteome research in terms of its biological, biotechnological, and methodological significance and suggest future prospects. PMID:16760308
Institutions and national development in Latin America: a comparative study
Portes, Alejandro; Smith, Lori D.
2013-01-01
We review the theoretical and empirical literatures on the role of institutions in national development as a prelude to presenting a more rigorous and measurable definition of the concept and a methodology to study this relationship at the national and subnational levels. The existing research literature features conflicting definitions of the concept of “institutions” and empirical tests based mostly on reputational indices, with countries as units of analysis. The present study’s methodology is based on a set of five strategic organizations studied comparatively in five Latin American countries. These include key federal agencies, public administrative organizations, and stock exchanges. Systematic analysis of results shows a pattern of differences between economically-oriented institutions and those entrusted with providing basic services to the general population. Consistent differences in institutional quality also emerge across countries, despite similar levels of economic development. Using the algebraic methods developed by Ragin, we test six hypotheses about factors determining the developmental character of particular institutions. Implications of results for theory and for methodological practices of future studies in this field are discussed. PMID:26543407
Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005
ERIC Educational Resources Information Center
Coffman, Julia, Ed.
2005-01-01
This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hongbin; Szilard, Ronaldo; Epiney, Aaron
Under the auspices of the DOE LWRS Program RISMC Industry Application ECCS/LOCA, INL has engaged staff from both South Texas Project (STP) and the Texas A&M University (TAMU) to produce a generic pressurized water reactor (PWR) model including reactor core, clad/fuel design and systems thermal hydraulics based on the South Texas Project (STP) nuclear power plant, a 4-Loop Westinghouse PWR. A RISMC toolkit, named LOCA Toolkit for the U.S. (LOTUS), has been developed for use in this generic PWR plant model to assess safety margins for the proposed NRC 10 CFR 50.46c rule, Emergency Core Cooling System (ECCS) performance during LOCA. This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results. Within this context, a multi-physics best estimate plus uncertainty (MPBEPU) methodology framework is proposed.
Comparative Analysis of Biosurveillance Methodologies
2006-03-01
an average of one new disease emerged annually. Some of the more memorable diseases include Legionnaires' disease in 1977, HIV/AIDS in 1981, West... public must be prepared. The belief that the discovery of antibiotics would lead to a disease-free society has been proven wrong. Kathleen Gensheimer... Branswell, H. "Anthrax Scares may Fuel Growth of Antibiotic Resistance." Canadian Press (2001). CDC website. "Bioterrorism Agents and Disease
The speed-accuracy tradeoff: history, physiology, methodology, and behavior
Heitz, Richard P.
2014-01-01
There are few behavioral effects as ubiquitous as the speed-accuracy tradeoff (SAT). From insects to rodents to primates, the tendency for decision speed to covary with decision accuracy seems an inescapable property of choice behavior. Recently, the SAT has received renewed interest, as neuroscience approaches begin to uncover its neural underpinnings and computational models are compelled to incorporate it as a necessary benchmark. The present work provides a comprehensive overview of SAT. First, I trace its history as a tractable behavioral phenomenon and the role it has played in shaping mathematical descriptions of the decision process. Second, I present a “users guide” of SAT methodology, including a critical review of common experimental manipulations and analysis techniques and a treatment of the typical behavioral patterns that emerge when SAT is manipulated directly. Finally, I review applications of this methodology in several domains. PMID:24966810
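A minimal simulation sketch of the standard accumulator intuition behind the SAT: in a simple drift-diffusion model, raising the decision threshold (an assumed accuracy-emphasis manipulation) slows responses but improves accuracy. All parameter values are illustrative assumptions, not estimates from the review.

```python
import numpy as np

def simulate_ddm(threshold, drift=0.15, noise=1.0, dt=0.005, n_trials=1000, seed=0):
    """Simulate a drift-diffusion model; returns mean RT (s) and accuracy.
    Evidence starts at 0 and accumulates until it hits +threshold (correct)
    or -threshold (error)."""
    rng = np.random.default_rng(seed)
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < threshold:
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        rts.append(t)
        correct.append(x > 0)
    return np.mean(rts), np.mean(correct)

for label, a in [("speed emphasis", 0.6), ("accuracy emphasis", 1.4)]:
    rt, acc = simulate_ddm(threshold=a)
    print(f"{label}: mean RT {rt:.2f} s, accuracy {acc:.2f}")
```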
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
DEPARTMENT OF COMMERCE, Bureau of Industry and Security. Emerging Technology and Research Advisory Committee; Notice of Partially Closed Meeting. The Emerging Technology and Research Advisory Committee (ETRAC... 1. Introductions. 2. Member Discussion: Methodology Options for Identifying Emerging Technologies. 3. Public...
NASA Astrophysics Data System (ADS)
Zhang, Lin
2014-02-01
Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as how learning effectiveness can be understood in relation to different technology features, and how qualitative and quantitative results can be integrated to achieve a broader understanding of technology designs. To address these issues, this paper proposes a meta-analysis method. Detailed explanations about the structure of the methodology and its scientific mechanism are provided for discussion and suggestions. This paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.
Cost-effectiveness Analysis Appraisal and Application: An Emergency Medicine Perspective.
April, Michael D; Murray, Brian P
2017-06-01
Cost-effectiveness is an important goal for emergency care delivery. The many diagnostic, treatment, and disposition decisions made in the emergency department (ED) have a significant impact upon healthcare resource utilization. Cost-effectiveness analysis (CEA) is an analytic tool to optimize these resource allocation decisions through the systematic comparison of costs and effects of alternative healthcare decisions. Yet few emergency medicine leaders and policymakers have any formal training in CEA methodology. This paper provides an introduction to the interpretation and use of CEA with a focus on application to emergency medicine problems and settings. It applies a previously published CEA to the hypothetical case of a patient presenting to the ED with chest pain who requires risk stratification. This paper uses a widely cited checklist to appraise the CEA. This checklist serves as a vehicle for presenting basic CEA terminology and concepts. General topics of focus include measurement of costs and outcomes, incremental analysis, and sensitivity analysis. Integrated throughout the paper are recommendations for good CEA practice with emphasis on the guidelines published by the U.S. Panel on Cost-Effectiveness in Health and Medicine. Unique challenges for emergency medicine CEAs discussed include the projection of long-term outcomes from emergent interventions, costing ED services, and applying study results to diverse patient populations across various ED settings. The discussion also includes an overview of the limitations inherent in applying CEA results to clinical practice to include the lack of incorporation of noncost considerations in CEA (e.g., ethics). After reading this article, emergency medicine leaders and researchers will have an enhanced understanding of the basics of CEA critical appraisal and application. The paper concludes with an overview of economic evaluation resources for readers interested in conducting ED-based economic evaluation studies. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
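A minimal sketch of the incremental analysis at the core of a CEA, computing an incremental cost-effectiveness ratio (ICER) and comparing it against a willingness-to-pay threshold; the strategies, costs, QALYs, and threshold below are hypothetical and are not taken from the cited chest-pain CEA.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (e.g., dollars per quality-adjusted life year, QALY)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical ED chest-pain strategies: usual care vs. usual care plus a rapid test.
usual_cost, usual_qaly = 1_200.0, 9.60
rapid_cost, rapid_qaly = 1_900.0, 9.62

ratio = icer(rapid_cost, rapid_qaly, usual_cost, usual_qaly)
threshold = 100_000            # assumed willingness to pay per QALY
verdict = "cost-effective" if ratio <= threshold else "not cost-effective"
print(f"ICER: ${ratio:,.0f} per QALY -> {verdict} at ${threshold:,} per QALY")
```

In a full CEA this point estimate would be accompanied by sensitivity analyses that vary the cost and effect inputs over plausible ranges.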
Nanotechnology risk perceptions and communication: emerging technologies, emerging challenges.
Pidgeon, Nick; Harthorn, Barbara; Satterfield, Terre
2011-11-01
Nanotechnology involves the fabrication, manipulation, and control of materials at the atomic level and may also bring novel uncertainties and risks. Potential parallels with other controversial technologies mean there is a need to develop a comprehensive understanding of processes of public perception of nanotechnology uncertainties, risks, and benefits, alongside related communication issues. Study of perceptions, at so early a stage in the development trajectory of a technology, is probably unique in the risk perception and communication field. As such it also brings new methodological and conceptual challenges. These include: dealing with the inherent diversity of the nanotechnology field itself; the unfamiliar and intangible nature of the concept, with few analogies to anchor mental models or risk perceptions; and the ethical and value questions underlying many nanotechnology debates. Utilizing the lens of social amplification of risk, and drawing upon the various contributions to this special issue of Risk Analysis on Nanotechnology Risk Perceptions and Communication, nanotechnology may at present be an attenuated hazard. The generic idea of "upstream public engagement" for emerging technologies such as nanotechnology is also discussed, alongside its importance for future work with emerging technologies in the risk communication field. © 2011 Society for Risk Analysis.
School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.
ERIC Educational Resources Information Center
Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others
1998-01-01
Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…
A Methodology for the Emerging: Bringing College and Community Together
ERIC Educational Resources Information Center
Kazanjian, Christopher J.
2012-01-01
The purpose of this qualitative case study is to create a unified humanistic methodology for institutions of higher education engaging undergraduate students and diverse/displaced youth in pro-social group activity. Scholarly researchers have expressed the current methodological disconnect between institutions seeking to accommodate displaced…
Chaos of Textures or "Tapisserie"? A Model for Creative Teacher Education Curriculum Design
ERIC Educational Resources Information Center
Simon, Sue E.
2013-01-01
A tapestry or "tapisserie" methodology, inspired by Denzin and Lincoln's "bricolage" methodology (2000), emerged during the complex task of re-developing teacher education programs at the University of the Sunshine Coast, Queensland, Australia. "Tapisserie" methodology highlights the pivotal task of determining…
Emergent Pedagogy and Affect in Collaborative Research: A Metho-Pedagogical Paradigm
ERIC Educational Resources Information Center
Gallagher, Kathleen; Wessels, Anne
2011-01-01
The widespread turn towards "collaboration" in qualitative research methodologies warrants careful and continuous critique. This paper addresses the possibilities and the challenges of collaborative methodology, and in particular what happens when the line between pedagogy and methodology is blurred in classroom-based ethnographic…
Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis
Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.
2006-01-01
In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
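A minimal sketch of the voxel-wise general linear model idea behind BPM: at each voxel, the primary-modality value is regressed on group membership plus the co-registered value from a second modality entered as a regressor. The data shapes, modality names, and random data are hypothetical; this is an illustrative numpy sketch, not the MATLAB toolbox itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_voxels = 20, 1000

# Hypothetical co-registered data: one value per subject per voxel.
fmri = rng.normal(size=(n_subjects, n_voxels))            # primary modality
perfusion = rng.normal(size=(n_subjects, n_voxels))       # second modality
group = np.repeat([0, 1], n_subjects // 2)                # e.g., patient vs control

t_group = np.empty(n_voxels)
for v in range(n_voxels):
    # Voxel-specific design matrix: intercept, group, and the other modality.
    X = np.column_stack([np.ones(n_subjects), group, perfusion[:, v]])
    beta, res, *_ = np.linalg.lstsq(X, fmri[:, v], rcond=None)
    dof = n_subjects - X.shape[1]
    sigma2 = res[0] / dof                                  # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_group[v] = beta[1] / np.sqrt(cov[1, 1])              # t-statistic for group

print("largest |t| for the group effect:", np.abs(t_group).max().round(2))
```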
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 9
2006-09-01
it does. Several freely down- loadable methodologies have emerged to support the developer in modeling threats to applications and other soft...SECURIS. Model -Driven Develop - ment and Analysis of Secure Information Systems <www.sintef.no/ content/page1_1824.aspx>. 10. The SECURIS Project ...By applying these methods to the SDLC , we can actively reduce the number of known vulnerabilities in software as it is developed . For
The social reality of the imaginary audience: a grounded theory approach.
Bell, Joanna H; Bromnick, Rachel D
2003-01-01
Traditional approaches to understanding the imaginary audience are challenged in this study. Three hundred sixty-one British schoolchildren (aged 14 and 15 years) were asked to express their worries and concerns, using grounded theory methodology. Qualitative responses were collated and coded according to emerging categories, with "what other people think" identified as the central concern. In particular, the findings are used to critique Elkind's (1967) theory of adolescent egocentrism. Data presented in this study suggest that adolescents worry about what other people think because there are real personal and social consequences. Such concerns are seen as being based in social reality and are not imaginary as Elkind suggested. In conclusion, new methodologies which place young people at the center of the analysis are advocated.
Varjoshani, Nasrin Jafari; Hosseini, Mohammad Ali; Khankeh, Hamid Reza; Ahmadi, Fazlollah
2015-01-01
Background: A highly important factor in enhancing quality of patient care and job satisfaction of health care staff is inter-professional communication. Due to the critical nature of the work environment, the large number of staff and units, and complexity of professional tasks and interventions, inter-professional communication in an emergency department is particularly and exceptionally important. Despite its importance, inter-professional communication in emergency department seems unfavorable. Thus, this study was designed to explain barriers to inter-professional communication in an emergency department. Methodology & Methods: This was a qualitative study with content analysis approach, based on interviews conducted with 26 participants selected purposively, with diversity of occupation, position, age, gender, history, and place of work. Interviews were in-depth and semi-structured, and data were analyzed using the inductive content analysis approach. Results: In total, 251 initial codes were extracted from 30 interviews (some of the participants re-interviewed) and in the reducing trend of final results, 5 categories were extracted including overcrowded emergency, stressful emergency environment, not discerning emergency conditions, ineffective management, and inefficient communication channels. Tumultuous atmosphere (physical, mental) was the common theme between categories, and was decided to be the main barrier to effective inter-professional communication. Conclusion: Tumultuous atmosphere (physical-mental) was found to be the most important barrier to inter-professional communication. This study provided a better understanding of these barriers in emergency department, often neglected in most studies. It is held that by reducing environmental turmoil (physical-mental), inter-professional communication can be improved, thereby improving patient care outcomes and personnel job satisfaction. PMID:25560351
Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H
2013-08-01
Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.
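A minimal sketch of one univariate-model ingredient mentioned above, the Benjamini-Hochberg false-discovery-rate correction applied to a vector of per-feature p-values; the p-values are simulated placeholders for an OMICS screen.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean mask of p-values declared significant at FDR `alpha`
    using the Benjamini-Hochberg step-up procedure."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    mask = np.zeros(m, dtype=bool)
    if below.any():
        cutoff = np.nonzero(below)[0].max()    # largest rank meeting the bound
        mask[order[:cutoff + 1]] = True        # reject all smaller p-values
    return mask

# Hypothetical screen: 10,000 features, 50 carrying a real signal.
rng = np.random.default_rng(0)
p_null = rng.uniform(size=9950)
p_signal = rng.uniform(high=1e-4, size=50)
pvals = np.concatenate([p_signal, p_null])
print("features declared significant:", benjamini_hochberg(pvals).sum())
```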
Examining social, physical, and environmental dimensions of tornado vulnerability in Texas.
Siebeneck, Laura
2016-01-01
To develop a vulnerability model that captures the social, physical, and environmental dimensions of tornado vulnerability of Texas counties. Guided by previous research and methodologies proposed in the hazards and emergency management literature, a principal components analysis is used to create a tornado vulnerability index. Data were gathered from open source information available through the US Census Bureau, American Community Surveys, and the Texas Natural Resources Information System. Texas counties. The results of the model yielded three indices that highlight geographic variability of social vulnerability, built environment vulnerability, and tornado hazard throughout Texas. Further analyses suggest that counties with the highest tornado vulnerability include those with high population densities and high tornado risk. This article demonstrates one method for assessing statewide tornado vulnerability and presents how the results of this type of analysis can be applied by emergency managers towards the reduction of tornado vulnerability in their communities.
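A minimal sketch of a principal-components-based vulnerability index of the kind described: county-level indicators are standardized, a PCA is run, and retained component scores are summed into an index. The indicator names and data below are hypothetical, not the study's variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n_counties = 254
# Hypothetical county-level indicators.
indicators = np.column_stack([
    rng.lognormal(mean=4, sigma=1, size=n_counties),   # population density
    rng.uniform(0.05, 0.35, size=n_counties),          # share over age 65
    rng.uniform(0.02, 0.30, size=n_counties),          # mobile-home share
    rng.uniform(0, 15, size=n_counties),               # historical tornado days/yr
])

# Standardize, then PCA via SVD of the centered data matrix.
Z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
explained = S**2 / np.sum(S**2)
n_keep = np.searchsorted(np.cumsum(explained), 0.80) + 1    # keep ~80% of variance
scores = Z @ Vt[:n_keep].T                                   # component scores

vulnerability_index = scores.sum(axis=1)                     # simple additive index
print(f"kept {n_keep} components; most vulnerable county (row index):",
      int(np.argmax(vulnerability_index)))
```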
Modeling operators' emergency response time for chemical processing operations.
Murray, Susan L; Harputlu, Emrah; Mentzer, Ray A; Mannan, M Sam
2014-01-01
Operators have a crucial role during emergencies at a variety of facilities such as chemical processing plants. When an abnormality occurs in the production process, the operator often has limited time to either take corrective actions or evacuate before the situation becomes deadly. It is crucial that system designers and safety professionals can estimate the time required for a response before procedures and facilities are designed and operations are initiated. There are existing industrial engineering techniques to establish time standards for tasks performed at a normal working pace. However, it is reasonable to expect the time required to take action in emergency situations will be different than working at a normal production pace. It is possible that in an emergency, operators will act faster compared to a normal pace. It would be useful for system designers to be able to establish a time range for operators' response times for emergency situations. This article develops a modeling approach to estimate the time standard range for operators taking corrective actions or following evacuation procedures in emergency situations. This will aid engineers and managers in establishing time requirements for operators in emergency situations. The methodology used for this study combines a well-established industrial engineering technique for determining time requirements (predetermined time standard system) and adjustment coefficients for emergency situations developed by the authors. Numerous videos of workers performing well-established tasks at a maximum pace were studied. As an example, one of the tasks analyzed was pit crew workers changing tires as quickly as they could during a race. The operations in these videos were decomposed into basic, fundamental motions (such as walking, reaching for a tool, and bending over) by studying the videos frame by frame. A comparison analysis was then performed between the emergency pace and the normal working pace operations to determine performance coefficients. These coefficients represent the decrease in time required for various basic motions in emergency situations and were used to model an emergency response. This approach will make hazardous operations requiring operator response, alarm management, and evacuation processes easier to design and predict. An application of this methodology is included in the article. The time required for an emergency response was roughly one-third less than for a normal-pace response.
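A minimal, hypothetical sketch of the adjustment idea: a normal-pace predetermined time standard is decomposed into basic motions, and each motion time is scaled by an assumed emergency performance coefficient. The motions, times, and coefficients are invented for illustration and are not the authors' published values.

```python
# Normal-pace basic-motion times (seconds) for a hypothetical response task,
# with assumed emergency-pace coefficients (fraction of normal time required).
motions = [
    ("walk to panel",     8.0, 0.70),
    ("reach for valve",   1.5, 0.75),
    ("turn valve",        4.0, 0.85),
    ("bend to breaker",   2.0, 0.70),
    ("verify indicator",  3.0, 0.90),
]

normal_time = sum(t for _, t, _ in motions)
emergency_time = sum(t * c for _, t, c in motions)

print(f"normal-pace standard:    {normal_time:.1f} s")
print(f"emergency-pace estimate: {emergency_time:.1f} s "
      f"({100 * (1 - emergency_time / normal_time):.0f}% faster)")
```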
Imaging genetics paradigms in depression research: Systematic review and meta-analysis.
Pereira, Lícia P; Köhler, Cristiano A; Stubbs, Brendon; Miskowiak, Kamilla W; Morris, Gerwyn; de Freitas, Bárbara P; Thompson, Trevor; Fernandes, Brisa S; Brunoni, André R; Maes, Michael; Pizzagalli, Diego A; Carvalho, André F
2018-05-17
Imaging genetics studies involving participants with major depressive disorder (MDD) have expanded. Nevertheless, findings have been inconsistent. Thus, we conducted a systematic review and meta-analysis of imaging genetics studies that enrolled MDD participants across major databases through June 30th, 2017. Sixty-five studies met eligibility criteria (N = 4034 MDD participants and 3293 controls), and there was substantial between-study variability in the methodological quality of included studies. However, few replicated findings emerged from this literature with only 22 studies providing data for meta-analyses (882 participants with MDD and 616 controls). Total hippocampal volumes did not significantly vary in MDD participants or controls carrying either the BDNF Val66Met 'Met' (386 participants with MDD and 376 controls) or the 5-HTTLPR short 'S' (310 participants with MDD and 230 controls) risk alleles compared to non-carriers. Heterogeneity across studies was explored through meta-regression and subgroup analyses. Gender distribution, the use of medications, segmentation methods used to measure the hippocampus, and age emerged as potential sources of heterogeneity across studies that assessed the association of 5-HTTLPR short 'S' alleles and hippocampal volumes. Our data also suggest that the methodological quality of included studies, publication year, and the inclusion of brain volume as a covariate contributed to the heterogeneity of studies that assessed the association of the BDNF Val66Met 'Met' risk allele and hippocampal volumes. In exploratory voxel-wise meta-analyses, MDD participants carrying the 5-HTTLPR short 'S' allele had white matter microstructural abnormalities predominantly in the corpus callosum, while carriers of the BDNF Val66Met 'Met' allele had larger gray matter volumes and hyperactivation of the right middle frontal gyrus compared to non-carriers. In conclusion, few replicated findings emerged from imaging genetics studies that included participants with MDD. Nevertheless, we explored and identified specific sources of heterogeneity across studies, which could provide insights to enhance the reproducibility of this emerging field. Copyright © 2018 Elsevier Inc. All rights reserved.
Multi-scaling allometric analysis for urban and regional development
NASA Astrophysics Data System (ADS)
Chen, Yanguang
2017-01-01
The concept of allometric growth is based on scaling relations, and it has been applied to urban and regional analysis for a long time. However, most allometric analyses have been devoted to the single proportional relation between two elements of a geographical system, and few studies focus on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for the studies on spatio-temporal evolution of complex systems. By means of linear algebra, general system theory, and by analogy with the analytical hierarchy process, the concepts of allometric growth can be integrated with the ideas from fractal dimension. Thus a new methodology of geo-spatial analysis and the related theoretical models emerge. Based on the least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. A conclusion is reached that the multiscaling allometric analysis can be employed to make a comprehensive evaluation for the relative levels of urban and regional development, and explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analyses of urban and regional evolution.
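A minimal sketch of the log-log least-squares step that underlies allometric scaling analysis: for two system elements x and y obeying y = a x^b, the scaling exponent b is the slope of ln(y) against ln(x). The city data here are simulated placeholders, and this pairwise fit is only the building block of the multiscaling procedure, not the paper's full algorithm.

```python
import numpy as np

def allometric_exponent(x, y):
    """Estimate the scaling exponent b in y = a * x**b by ordinary
    least squares on the log-log transformed data."""
    b, ln_a = np.polyfit(np.log(x), np.log(y), deg=1)
    return b, np.exp(ln_a)

# Hypothetical city system: urban area scaling sublinearly with population.
rng = np.random.default_rng(0)
population = rng.lognormal(mean=12, sigma=1, size=60)
area = 0.02 * population**0.85 * rng.lognormal(sigma=0.1, size=60)

b, a = allometric_exponent(population, area)
print(f"estimated exponent b = {b:.2f} (value used in the simulation: 0.85)")
```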
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised, as it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and the methodological lessons learned from their application. These tools were used in a case study. Detailed results of this study are being prepared for publication in additional articles. The case study applies a complexity 'lens' to the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implication for the subject under study, and the existing tools associated with those theories which inspired us in the design of the data collection and analysis process. The tools and their application processes are presented in the results section and are followed in the discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.
Event-scale power law recession analysis: quantifying methodological uncertainty
NASA Astrophysics Data System (ADS)
Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.
2017-01-01
The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship between the power-law recession scale parameter and catchment antecedent wetness varies depending on recession definition and fitting choices. Considering study results, we recommend a combination of four key methodological decisions to maximize the quality of fitted recession curves, and to minimize bias in the related populations of fitted recession parameters.
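A minimal sketch of one common way to fit the power-law recession model -dQ/dt = aQ^b to a single recession event: estimate dQ/dt by finite differences and regress log(-dQ/dt) on log(Q). The streamflow series, the differencing scheme, and the fitting choice are illustrative assumptions, and the paper's point is precisely that such methodological choices affect the resulting parameter estimates.

```python
import numpy as np

def fit_power_law_recession(q):
    """Fit -dQ/dt = a * Q**b to one recession event (daily discharge q)
    using backward differences and log-log least squares."""
    dq_dt = np.diff(q)                    # per-day change in discharge
    mask = dq_dt < 0                      # keep strictly receding steps
    y = np.log(-dq_dt[mask])
    x = np.log(q[1:][mask])               # discharge at the end of each step
    b, ln_a = np.polyfit(x, y, deg=1)
    return np.exp(ln_a), b

# Hypothetical recession event with mild observation noise; the noise-free
# construction implies a ~ 0.1 and b ~ 1.5, up to discretization error.
t = np.arange(30.0)
q_true = (0.05 * t + 1.0) ** (-2.0)       # nonlinear-storage-like recession
rng = np.random.default_rng(0)
q_obs = q_true * rng.lognormal(sigma=0.005, size=t.size)

a, b = fit_power_law_recession(q_obs)
print(f"fitted a = {a:.3f}, b = {b:.2f}")
```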
Specific design features of an interpretative phenomenological analysis study.
Wagstaff, Christopher; Williams, Bob
2014-01-01
Report of an innovative use of interpretative phenomenological analysis (IPA) to enable an in-depth study of the experiences of disengagement from mental health services of black men with diagnoses of severe and enduring mental illness. The aim of IPA is to explore the sense that participants make of their personal and social worlds, while recognising the contribution of the researcher in interpreting the participants' interpretations of their experiences. Seven black male research participants were recruited to the study. The components of the study that contribute to the body of literature on IPA research design include: an engagement stage in the research; a second clarifying interview; discussion of clarifying questions and emergent themes with two academic service-users; and a post-interview meeting to discuss the themes emerging from the research study. The paper focuses on the contribution of the four specific design features of the study and how these enabled the researcher to engage with a population that is often deemed 'hard to reach'. The four distinctive methodological developments in the study emphasise the flexibility of IPA. These innovations assisted the researcher in developing a broader double hermeneutic that enabled reporting of the experiences of disengagement from mental health services of black men with diagnoses of severe and enduring mental illness. The distinctive design of this study further emphasises the flexibility of IPA, while simultaneously showing fidelity to the core principles underlying the research methodology.
Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A
2014-01-01
A recent criticism of social epidemiological studies, and multi-level studies in particular, has been a paucity of theory. We present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach that is trans-disciplinary, encompassing both quantitative and qualitative traditions, and that assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising: 1) an emergent phase, 2) a construction phase, and 3) a confirmatory phase. A concurrent triangulated mixed-methods multilevel cross-sectional study design is described. The Emergent Phase uses: interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to: categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory Construction. The study will contribute to defining the role that realism and mixed methods can play in explaining the social determinants and developmental origins of health and disease.
Methodological Approaches in MOOC Research: Retracing the Myth of Proteus
ERIC Educational Resources Information Center
Raffaghelli, Juliana Elisa; Cucchiara, Stefania; Persico, Donatella
2015-01-01
This paper explores the methodological approaches most commonly adopted in the scholarly literature on Massive Open Online Courses (MOOCs), published during the period January 2008-May 2014. In order to identify trends, gaps and criticalities related to the methodological approaches of this emerging field of research, we analysed 60 papers…
ERIC Educational Resources Information Center
Khalil, Deena; Kier, Meredith
2017-01-01
This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…
Diffraction or Reflection? Sketching the Contours of Two Methodologies in Educational Research
ERIC Educational Resources Information Center
Bozalek, Vivienne; Zembylas, Michalinos
2017-01-01
Internationally, an interest is emerging in a growing body of work on what has become known as "diffractive methodologies" drawing attention to ontological aspects of research. Diffractive methodologies have largely been developed in response to a dissatisfaction with practices of "reflexivity", which are seen to be grounded in…
Neurotech for Neuroscience: Unifying Concepts, Organizing Principles, and Emerging Tools
Silver, Rae; Boahen, Kwabena; Grillner, Sten; Kopell, Nancy; Olsen, Kathie L.
2012-01-01
The ability to tackle analysis of the brain at multiple levels simultaneously is emerging from rapid methodological developments. The classical research strategies of “measure,” “model,” and “make” are being applied to the exploration of nervous system function. These include novel conceptual and theoretical approaches, creative use of mathematical modeling, and attempts to build brain-like devices and systems, as well as other developments including instrumentation and statistical modeling (not covered here). Increasingly, these efforts require teams of scientists from a variety of traditional scientific disciplines to work together. The potential of such efforts for understanding directed motor movement, emergence of cognitive function from neuronal activity, and development of neuromimetic computers are described by a team that includes individuals experienced in behavior and neuroscience, mathematics, and engineering. Funding agencies, including the National Science Foundation, explore the potential of these changing frontiers of research for developing research policies and long-term planning. PMID:17978017
Menzin, Joseph; Marton, Jeno P; Menzin, Jordan A; Willke, Richard J; Woodward, Rebecca M; Federico, Victoria
2012-06-25
Researchers and policy makers have determined that accounting for productivity costs, or "indirect costs," may be as important as including direct medical expenditures when evaluating the societal value of health interventions. These costs are also important when estimating the global burden of disease. The estimation of indirect costs is commonly done on a country-specific basis. However, there are few studies that evaluate indirect costs across countries using a consistent methodology. Using the human capital approach, we developed a model that estimates productivity costs as the present value of lifetime earnings (PVLE) lost due to premature mortality. Applying this methodology, the model estimates productivity costs for 29 selected countries, both developed and emerging. We also provide an illustration of how the inclusion of productivity costs contributes to an analysis of the societal burden of smoking. A sensitivity analysis is undertaken to assess productivity costs on the basis of the friction cost approach. PVLE estimates were higher for certain subpopulations, such as men, younger people, and people in developed countries. In the case study, productivity cost estimates from our model showed that productivity loss was a substantial share of the total cost burden of premature mortality due to smoking, accounting for over 75 % of total lifetime costs in the United States and 67 % of total lifetime costs in Brazil. Productivity costs were much lower using the friction cost approach among those of working age. Our PVLE model is a novel tool allowing researchers to incorporate the value of lost productivity due to premature mortality into economic analyses of treatments for diseases or health interventions. We provide PVLE estimates for a number of emerging and developed countries. Including productivity costs in a health economics study allows for a more comprehensive analysis, and, as demonstrated by our illustration, can have important effects on the results and conclusions.
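As a hedged illustration of the human capital calculation described above, the sketch below computes the present value of lifetime earnings lost to a death at a given age, discounting expected future earnings weighted by survival probability. The function name, the flat earnings profile, the survival function, and the 3% discount rate are assumptions for illustration; the published model additionally handles country-specific earnings, labor-force participation, and productivity growth, which are not reproduced here.

```python
def pvle(age, retirement_age, annual_earnings, survival, discount_rate=0.03):
    """Present value of lifetime earnings lost from a death at `age` (sketch).

    annual_earnings : callable, age -> expected gross earnings at that age
    survival        : callable, age -> probability of surviving from `age` to that age
    """
    total = 0.0
    for a in range(age, retirement_age):
        years_ahead = a - age
        total += annual_earnings(a) * survival(a) / (1 + discount_rate) ** years_ahead
    return total

# toy example: flat earnings of 40,000 per year and a simple survival decay
value = pvle(
    age=40,
    retirement_age=65,
    annual_earnings=lambda a: 40_000.0,
    survival=lambda a: 0.99 ** (a - 40),
)
```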
Angelis, Aris; Kanavos, Panos
2016-05-01
In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
Reference Model 6 (RM6): Oscillating Wave Energy Converter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Diana L; Smith, Chris; Jenne, Dale Scott
This report is an addendum to SAND2013-9040: Methodology for Design and Economic Analysis of Marine Energy Conversion (MEC) Technologies. This report describes an Oscillating Water Column Wave Energy Converter reference model design in a complementary manner to Reference Models 1-4 contained in the above report. In this report, a conceptual design for an Oscillating Water Column Wave Energy Converter (WEC) device appropriate for the modeled reference resource site was identified, and a detailed backward bent duct buoy (BBDB) device design was developed using a combination of numerical modeling tools and scaled physical models. Our team used the methodology in SAND2013-9040 for the economic analysis that included costs for designing, manufacturing, deploying, and operating commercial-scale MEC arrays, up to 100 devices. The methodology was applied to identify key cost drivers and to estimate levelized cost of energy (LCOE) for this RM6 Oscillating Water Column device in dollars per kilowatt-hour ($/kWh). Although many costs were difficult to estimate at this time due to the lack of operational experience, the main contribution of this work was to disseminate a detailed set of methodologies and models that allow for an initial cost analysis of this emerging technology. This project is sponsored by the U.S. Department of Energy's (DOE) Wind and Water Power Technologies Program Office (WWPTO), within the Office of Energy Efficiency & Renewable Energy (EERE). Sandia National Laboratories, the lead in this effort, collaborated with partners from National Laboratories, industry, and universities to design and test this reference model.
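For readers unfamiliar with the levelized cost of energy metric referenced above, a minimal sketch follows: LCOE is the ratio of discounted lifetime costs to discounted lifetime energy production, expressed in $/kWh. The function, the numbers, and the 7% discount rate are placeholders rather than RM6 values; the full SAND2013-9040 methodology breaks costs into far more detailed design, manufacturing, deployment, and operating categories.

```python
def lcoe(capital_cost, annual_opex, annual_energy_kwh, lifetime_years, discount_rate=0.07):
    """Simple levelized cost of energy in $/kWh (illustrative formulation).

    Discounts both recurring costs and energy over the project lifetime.
    """
    disc_costs = capital_cost          # capital spent up front (year 0)
    disc_energy = 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1 + discount_rate) ** year
        disc_costs += annual_opex / factor
        disc_energy += annual_energy_kwh / factor
    return disc_costs / disc_energy

# illustrative numbers only, not RM6 values
print(lcoe(capital_cost=5_000_000, annual_opex=250_000,
           annual_energy_kwh=1_500_000, lifetime_years=20))
```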
Prediction of ball and roller bearing thermal and kinematic performance by computer analysis
NASA Technical Reports Server (NTRS)
Pirvics, J.; Kleckner, R. J.
1983-01-01
Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.
Emerging Educational Institutional Decision-Making Matrix
ERIC Educational Resources Information Center
Ashford-Rowe, Kevin H.; Holt, Marnie
2011-01-01
The "emerging educational institutional decision-making matrix" is developed to allow educational institutions to adopt a rigorous and consistent methodology of determining which of the myriad of emerging educational technologies will be the most compelling for the institution, particularly ensuring that it is the educational or pedagogical but…
Mapping a research agenda for the science of team science
Falk-Krzesinski, Holly J; Contractor, Noshir; Fiore, Stephen M; Hall, Kara L; Kane, Cathleen; Keyton, Joann; Klein, Julie Thompson; Spring, Bonnie; Stokols, Daniel; Trochim, William
2012-01-01
An increase in cross-disciplinary, collaborative team science initiatives over the last few decades has spurred interest by multiple stakeholder groups in empirical research on scientific teams, giving rise to an emergent field referred to as the science of team science (SciTS). This study employed a collaborative team science concept-mapping evaluation methodology to develop a comprehensive research agenda for the SciTS field. Its integrative mixed-methods approach combined group process with statistical analysis to derive a conceptual framework that identifies research areas of team science and their relative importance to the emerging SciTS field. The findings from this concept-mapping project constitute a lever for moving SciTS forward at theoretical, empirical, and translational levels. PMID:23223093
Sensitivity analysis of Repast computational ecology models with R/Repast.
Prestes García, Antonio; Rodríguez-Patón, Alfonso
2016-12-01
Computational ecology is an emerging interdisciplinary field founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, individual-based modeling is particularly well suited to capturing the complex temporal and spatial dynamics, as well as the nonlinearities, that arise in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights into the local mechanisms that generate observed global dynamics. Of course, no conclusions about model results can be taken seriously if they are based on a single model execution and are not analyzed carefully. A sound methodology should therefore always underpin the interpretation of model results. Sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty on simulation output, and it should be a compulsory part of any work based on an in silico experimental setup. In this article, we present R/Repast, a GNU R package for running and analyzing Repast Simphony models, accompanied by two worked examples showing how to perform global sensitivity analysis and how to interpret the results.
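As a generic illustration of the kind of global sensitivity analysis the package supports (this is not the R/Repast API itself), the sketch below estimates first-order Sobol indices for a toy model with a standard pick-and-freeze Monte Carlo estimator. The function name and the toy model are assumptions, and inputs are assumed independent and scaled to [0, 1).

```python
import numpy as np

def first_order_sobol(model, n_params, n_samples=10_000, rng=None):
    """Crude Monte Carlo estimate of first-order Sobol indices (sketch).

    model : callable mapping an (n, n_params) array of inputs in [0, 1)
            to an (n,) array of scalar outputs.
    """
    rng = np.random.default_rng(rng)
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    fA, fB = model(A), model(B)
    var_y = np.var(np.concatenate([fA, fB]))
    indices = []
    for i in range(n_params):
        AB = A.copy()
        AB[:, i] = B[:, i]          # "freeze" all inputs except input i
        fAB = model(AB)
        indices.append(np.mean(fB * (fAB - fA)) / var_y)
    return np.array(indices)

# toy stand-in for a simulation: output depends strongly on x0, weakly on x1
toy = lambda X: X[:, 0] + 0.1 * X[:, 1]
print(first_order_sobol(toy, n_params=2))   # roughly [0.99, 0.01]
```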
Violanti, S; Fraschetta, M; Adda, S; Caputo, E
2009-12-01
Within the framework of the Environmental Agencies system's activities, coordinated by ISPRA (the superior institute for environmental protection and research), an intercomparison of measurements was designed and carried out in order to examine measurement problems in depth and to evaluate magnetic fields at power frequencies. The measurements were taken near medium-voltage/low-voltage transformer substations. The project was developed with the contribution of several experts from different Regional Agencies. In three of these regions, substations with specific international standard characteristics were chosen, and a measurement and data analysis protocol was then arranged. Data analysis showed a good level of agreement among the results obtained by the different laboratories. However, a range of problems emerged, both during the preparation of the protocol and the definition of the data analysis procedure and during the execution of the measurements and the reprocessing of the data, because of the spatial and temporal variability of the magnetic field. These problems are of particular interest for establishing a correct measurement methodology whose purpose is comparison with exposure limits, attention values and quality targets.
Cortical Signal Analysis and Advances in Functional Near-Infrared Spectroscopy Signal: A Review.
Kamran, Muhammad A; Mannan, Malik M Naeem; Jeong, Myung Yung
2016-01-01
Functional near-infrared spectroscopy (fNIRS) is a non-invasive neuroimaging modality that simultaneously measures concentration changes of oxy-hemoglobin (HbO) and deoxy-hemoglobin (HbR). It is an emerging cortical imaging modality with a temporal resolution that is acceptable for brain-computer interface applications. Researchers have developed several methods over the last two decades to extract the neuronal-activation-related waveform from the observed fNIRS time series, but there is still no standard method for the analysis of fNIRS data. This article presents a brief review of existing methodologies to model and analyze the activation signal. Its purpose is to give a general overview of the variety of existing methodologies for extracting useful information from measured fNIRS data, including pre-processing steps, effects of the differential path length factor (DPF), variations and attributes of the hemodynamic response function (HRF), extraction of the evoked response, removal of physiological, instrumentation, and environmental noise, and resting/activation-state functional connectivity. Finally, the challenges in the analysis of the fNIRS signal are summarized. PMID:27375458
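One family of approaches the review covers is general linear modeling of the measured time series against a predicted hemodynamic response. The sketch below is offered as a hedged illustration rather than a standard: it convolves a boxcar stimulus with a canonical double-gamma HRF and estimates the activation amplitude by least squares. The HRF parameters, sampling rate, and block design are conventional assumptions, not values prescribed by the article.

```python
import numpy as np
from math import gamma

def double_gamma_hrf(t, a1=6.0, a2=16.0, b=1.0, c=1.0 / 6.0):
    """Canonical double-gamma hemodynamic response function (assumed form)."""
    peak = (t ** (a1 - 1) * np.exp(-t / b)) / (b ** a1 * gamma(a1))
    undershoot = (t ** (a2 - 1) * np.exp(-t / b)) / (b ** a2 * gamma(a2))
    return peak - c * undershoot

fs = 10.0                                  # sampling rate in Hz (assumption)
t = np.arange(0, 60, 1 / fs)
stimulus = ((t % 30) < 10).astype(float)   # 10 s task, 20 s rest blocks
regressor = np.convolve(stimulus, double_gamma_hrf(np.arange(0, 30, 1 / fs)))[: t.size]

# simulated HbO signal with true amplitude 0.8 plus noise, then a GLM fit
rng = np.random.default_rng(0)
y = 0.8 * regressor + 0.05 * rng.standard_normal(t.size)
X = np.column_stack([regressor, np.ones_like(t)])   # design matrix + baseline
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated activation amplitude:", beta[0])
```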
de Bruin, Anique B H
2016-12-01
Since the emergence of the field of 'Educational Neuroscience' (EN) in the late 1990s, a debate has developed about the potential this field holds to influence teaching and learning in the classroom. By now, most agree that the original claims promising direct translation to teaching and learning were too strong. I argue here that research questions in (health professions) education require multi-methodological approaches, including neuroscience, while carefully weighing which (combination of) approaches is most suitable for a given research question. Only through a multi-methodological approach will convergence of evidence emerge, which is so desperately needed for improving teaching and learning in the classroom. However, both researchers and teachers should become aware of the so-called 'seductive allure' of EN; that is, the demonstrable physical location and apparent objectivity of the measurements can be interpreted as yielding more powerful evidence and warranting stronger conclusions than, for example, behavioral experiments, where in fact the reverse is often the case. I conclude that our tendency as researchers to commit to one methodological approach and to address educational research questions from a single methodological perspective is limiting progress in educational science and in translation to education.
Abraham, I L; Chalifoux, Z L; Evers, G C; De Geest, S
1995-04-01
This study compared the conceptual foci and methodological characteristics of research projects which tested the effects of nursing interventions, published in four general nursing research journals with predominantly North American, and two with predominantly European/International authorship and readership. Dimensions and variables of comparison included: nature of subjects, design issues, statistical methodology, statistical power, and types of interventions and outcomes. Although some differences emerged, the most striking and consistent finding was that there were no statistically significant differences (and thus similarities) in the content foci and methodological parameters of the intervention studies published in both groups of journals. We conclude that European/International and North American nursing intervention studies, as reported in major general nursing research journals, are highly similar in the parameters studied, yet in need of overall improvement. Certainly, there is no empirical support for the common (explicit or implicit) ethnocentric American bias that leadership in nursing intervention research resides with and in the United States of America.
Space Station man-machine automation trade-off analysis
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.; Bard, J.; Feinberg, A.
1985-01-01
The man-machine automation tradeoff methodology presented here is one of four research tasks comprising the autonomous spacecraft system technology (ASST) project. ASST was established to identify and study system-level design problems for autonomous spacecraft. Using the Space Station as an example of a spacecraft system requiring a certain level of autonomous control, a system-level man-machine automation tradeoff methodology is presented that: (1) optimizes man-machine mixes for different ground and on-orbit crew functions subject to cost, safety, weight, power, and reliability constraints, and (2) plots the best incorporation plan for new, emerging technologies by weighing cost, relative availability, reliability, safety, importance to out-year missions, and ease of retrofit. Although the methodology takes a fairly straightforward approach to valuing human productivity, it is still sensitive to the important subtleties associated with designing a well-integrated man-machine system. These subtleties include considerations such as crew preference to retain certain spacecraft control functions, or valuing human integration/decision capabilities over equivalent hardware/software where appropriate.
NASA Astrophysics Data System (ADS)
Sobradelo, Rosa; Martí, Joan; Kilburn, Christopher; López, Carmen
2014-05-01
Understanding the potential evolution of a volcanic crisis is crucial to improving the design of effective mitigation strategies. This is especially the case for volcanoes close to densely populated regions, where inappropriate decisions may trigger widespread loss of life, economic disruption and public distress. An outstanding goal for improving the management of volcanic crises, therefore, is to develop objective, real-time methodologies for evaluating how an emergency will develop and how scientists communicate with decision makers. Here we present a new model, BADEMO (Bayesian Decision Model), which applies a general, flexible, probabilistic approach to managing volcanic crises. The model combines the hazard and risk factors that decision makers need for a holistic analysis of a volcanic crisis. These factors include eruption scenarios and their probabilities of occurrence, the vulnerability of populations and their activities, and the costs of false alarms and failed forecasts. The model can be implemented before an emergency, to identify actions for reducing the vulnerability of a district; during an emergency, to identify the optimum mitigating actions and how these may change as new information is obtained; and after an emergency, to assess the effectiveness of a mitigating response and, from the results, to improve strategies before another crisis occurs. As illustrated by a retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands, BADEMO provides the basis for quantifying the uncertainty associated with each recommended action as an emergency evolves, and serves as a mechanism for improving communications between scientists and decision makers.
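The cost-benefit logic behind such a Bayesian decision model can be illustrated with a minimal expected-cost rule. This is a sketch only, with invented function names and numbers; BADEMO itself combines event-tree scenario probabilities with vulnerability and cost terms in a more elaborate way. Given a probability for the hazardous scenario, the rule simply chooses the action whose expected cost is lowest.

```python
def best_action(p_event, cost_of_mitigation, loss_if_unmitigated):
    """Return the action with the lowest expected cost (illustrative rule).

    cost_of_mitigation  : cost of acting (e.g. evacuating), paid in any case
    loss_if_unmitigated : loss if the event occurs and no action was taken
    """
    expected_cost = {
        "mitigate": cost_of_mitigation,
        "wait": p_event * loss_if_unmitigated,
    }
    return min(expected_cost, key=expected_cost.get), expected_cost

# mitigation becomes optimal once p_event exceeds cost/loss = 0.02
print(best_action(p_event=0.05, cost_of_mitigation=2e6, loss_if_unmitigated=1e8))
```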
The Fungal Frontier: A Comparative Analysis of Methods Used in the Study of the Human Gut Mycobiome.
Huseyin, Chloe E; Rubio, Raul Cabrera; O'Sullivan, Orla; Cotter, Paul D; Scanlan, Pauline D
2017-01-01
The human gut is host to a diverse range of fungal species, collectively referred to as the gut "mycobiome". The gut mycobiome is emerging as an area of considerable research interest due to the potential roles of these fungi in human health and disease. However, there is no consensus on which of the available methodologies are best suited to characterizing the human gut mycobiome. The aim of this study is to provide a comparative analysis of several previously published mycobiome-specific culture-dependent and culture-independent methodologies, including choice of culture media, incubation conditions (aerobic versus anaerobic), DNA extraction method, primer set and freezing of fecal samples, to assess their relative merits and suitability for gut mycobiome analysis. There was no significant effect of media type or aeration on culture-dependent results. However, freezing was found to have a significant effect on fungal viability, with significantly lower fungal numbers recovered from frozen samples. DNA extraction method had a significant effect on DNA yield and quality. However, freezing and extraction method did not have any impact on either α or β diversity. There was also considerable variation in the ability of different fungal-specific primer sets to generate PCR products for subsequent sequence analysis. Through this investigation, two DNA extraction methods and one primer set were identified that facilitated the analysis of the mycobiome for all samples in this study. Ultimately, a diverse range of fungal species were recovered using both approaches, with Candida and Saccharomyces identified as the most common fungal species recovered using culture-dependent and culture-independent methods, respectively. As has been apparent from ecological surveys of the bacterial fraction of the gut microbiota, the use of different methodologies can also affect our understanding of gut mycobiome composition and therefore requires careful consideration. Future research into the gut mycobiome needs to adopt a common strategy to minimize potentially confounding effects of methodological choice and to facilitate comparative analysis of datasets.
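For the α- and β-diversity comparisons mentioned above, a minimal sketch of two standard metrics follows: the Shannon index for within-sample diversity and Bray-Curtis dissimilarity between samples. The counts are invented, and the study's actual pipeline, primer sets, and normalization steps are not reproduced here.

```python
import numpy as np

def shannon_index(counts):
    """Shannon (alpha) diversity from a vector of taxon counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def bray_curtis(a, b):
    """Bray-Curtis (beta) dissimilarity between two count vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.abs(a - b).sum() / (a + b).sum()

fresh  = [120, 30, 5, 0]    # hypothetical fungal taxon counts, fresh sample
frozen = [60, 10, 0, 2]     # same sample after freezing (illustrative only)
print(shannon_index(fresh), bray_curtis(fresh, frozen))
```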
Visualizing the development of nanomedicine in Mexico.
Robles-Belmont, Eduardo; Gortari-Rabiela, Rebeca de; Galarza-Barrios, Pilar; Siqueiros-García, Jesús Mario; Ruiz-León, Alejandro Arnulfo
2017-01-01
In this article we present a set of visualizations of Mexico's nanomedicine scientific production data. The visualizations were developed using different methodologies for data analysis and visualization, such as social network analysis, geography-of-science maps, and complex network community analysis. The results provide a multi-dimensional overview of the evolution of nanomedicine in Mexico. Moreover, the visualizations allowed us to identify trends and patterns of collaboration at the national and international levels. Trends were also found in the knowledge structure of themes and disciplines. Finally, we identified the scientific communities in Mexico that are responsible for new knowledge production in this emergent field of science. Copyright: © 2017 Secretaría de Salud
A review on color normalization and color deconvolution methods in histopathology.
Onder, Devrim; Zengin, Selen; Sarioglu, Sulen
2014-01-01
Histopathologists benefit from a wide range of colored dyes that provide useful information about lesions and tissue composition. Despite its advantages, the staining process introduces quite complex variations in stain concentration and correlation, tissue fixation type, and fixation time. Together with improvements in computing power and the development of novel image analysis methods, these imperfections have led to the emergence of several color normalization algorithms. This article is a review of the currently available digital color normalization methods for bright-field histopathology. We describe the proposed color normalization methodologies in detail, together with the lesion and tissue types used in the corresponding experiments. We also present the quantitative validation approaches for each of the proposed methodologies, where available.
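To make the color deconvolution side of this topic concrete, the sketch below implements the familiar optical-density-based stain separation in the spirit of Ruifrok and Johnston: pixel RGB values are converted to optical density and projected onto the inverse of a stain matrix. The function name is invented, and the stain vectors shown are approximate textbook values for hematoxylin, eosin, and DAB that serve only as an illustration, not a calibrated matrix.

```python
import numpy as np

def color_deconvolve(rgb, stain_matrix):
    """Separate stain concentrations from an RGB image (Ruifrok-style sketch).

    rgb          : float array (..., 3) with intensities in (0, 255]
    stain_matrix : (3, 3) array, rows = unit RGB optical-density vectors per stain
    Returns an array (..., 3) of per-pixel stain concentrations.
    """
    od = -np.log10(np.clip(rgb, 1, 255) / 255.0)      # optical density per channel
    return od @ np.linalg.inv(stain_matrix)

# approximate H, E, DAB optical-density vectors (indicative values only)
HED = np.array([[0.65, 0.70, 0.29],
                [0.07, 0.99, 0.11],
                [0.27, 0.57, 0.78]])

rgb_pixel = np.array([[180.0, 120.0, 200.0]])         # one synthetic RGB pixel
print(color_deconvolve(rgb_pixel, HED))
```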
Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience
ERIC Educational Resources Information Center
Zanotti, Francesco
2012-01-01
Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…
Broccoli, Morgan C; Calvello, Emilie J B; Skog, Alexander P; Wachira, Benjamin; Wallis, Lee A
2015-01-01
Objectives We undertook this study in Kenya to understand the community's emergency care needs and barriers they face when trying to access care, and to seek community members’ thoughts regarding high impact solutions to expand access to essential emergency services. Design We used a qualitative research methodology to conduct 59 focus groups with 528 total Kenyan community member participants. Data were coded, aggregated and analysed using the content analysis approach. Setting Participants were uniformly selected from all eight of the historical Kenyan provinces (Central, Coast, Eastern, Nairobi, North Eastern, Nyanza, Rift Valley and Western), with equal rural and urban community representation. Results Socioeconomic and cultural factors play a major role both in seeking and reaching emergency care. Community members in Kenya experience a wide range of medical emergencies, and seem to understand their time-critical nature. They rely on one another for assistance in the face of substantial barriers to care—a lack of: system structure, resources, transportation, trained healthcare providers and initial care at the scene. Conclusions Access to emergency care in Kenya can be improved by encouraging recognition and initial treatment of emergent illness in the community, strengthening the pre-hospital care system, improving emergency care delivery at health facilities and creating new policies at a national level. These community-generated solutions likely have a wider applicability in the region. PMID:26586324
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
Smith, Justin D.
2013-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874
Lopez, Andrea M; Bourgois, Philippe; Wenger, Lynn D; Lorvick, Jennifer; Martinez, Alexis N; Kral, Alex H
2013-03-01
Research with injection drug users (IDUs) benefits from interdisciplinary theoretical and methodological innovation because drug use is illegal, socially sanctioned and often hidden. Despite the increasing visibility of interdisciplinary, mixed methods research projects with IDUs, qualitative components are often subordinated to quantitative approaches and page restrictions in top addiction journals limit detailed reports of complex data collection and analysis logistics, thus minimizing the fuller scientific potential of genuine mixed methods. We present the methodological logistics and conceptual approaches of four mixed-methods research projects that our interdisciplinary team conducted in San Francisco with IDUs over the past two decades. These projects include combinations of participant-observation ethnography, in-depth qualitative interviewing, epidemiological surveys, photo-documentation, and geographic mapping. We adapted Greene et al.'s framework for combining methods in a single research project through: data triangulation, methodological complementarity, methodological initiation, and methodological expansion. We argue that: (1) flexible and self-reflexive methodological procedures allowed us to seize strategic opportunities to document unexpected and sometimes contradictory findings as they emerged to generate new research questions, (2) iteratively mixing methods increased the scope, reliability, and generalizability of our data, and (3) interdisciplinary collaboration contributed to a scientific "value added" that allowed for more robust theoretical and practical findings about drug use and risk-taking. Copyright © 2013 Elsevier B.V. All rights reserved.
A simple landslide susceptibility analysis for hazard and risk assessment in developing countries
NASA Astrophysics Data System (ADS)
Guinau, M.; Vilaplana, J. M.
2003-04-01
In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the information necessary for minimising the effects of catastrophic natural phenomena. Working with polygonal maps in a GIS allowed us to develop a simple methodology, applied to an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photograph interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of the landslides were digitized in order to obtain a digital failure zone map. A digital terrain unit map, in which a series of physical-environmental terrain factors are represented, was also used. Dividing the study area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the failure zone map was superimposed onto the terrain unit map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enabled us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology was tested in this area using the degree of superposition between the susceptibility map and the failure zone map. Implementing the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows a landslide susceptibility analysis to be carried out in areas where landslide records do not exist. This analysis is essential for landslide hazard and risk assessment, which is needed to determine actions for mitigating landslide effects, e.g. land planning and emergency aid actions.
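One simple way to express a numerical relationship between terrain factors and failure zones of the kind described above is a frequency-ratio index, sketched below as an assumption-laden illustration rather than the authors' exact procedure: each factor class is weighted by how over- or under-represented failures are within it, and per-unit weights can then be summed into a susceptibility score. The class boundaries and areas are hypothetical.

```python
def frequency_ratios(class_area, class_failure_area):
    """Frequency ratio per terrain-factor class (illustrative).

    class_area         : dict, class -> total area of the class
    class_failure_area : dict, class -> failure area within the class
    FR > 1 means failures are over-represented in that class.
    """
    total_area = sum(class_area.values())
    total_failure = sum(class_failure_area.values())
    return {
        c: (class_failure_area.get(c, 0.0) / total_failure)
           / (class_area[c] / total_area)
        for c in class_area
    }

# hypothetical slope classes (km2); a unit's susceptibility = sum of its FRs
slope_fr = frequency_ratios(
    class_area={"0-15 deg": 250.0, "15-30 deg": 150.0, ">30 deg": 73.0},
    class_failure_area={"0-15 deg": 1.0, "15-30 deg": 6.0, ">30 deg": 8.0},
)
print(slope_fr)
```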
Radović, V; Ćurčić, L
2012-01-01
Background: The aim of the study was to recommend and establish a concept of appropriate communication between public health services, other competent services and the population in emergencies, as the cornerstone that guarantees that all goals important for community life will be achieved. Methods: We used methodology appropriate for social science: document analysis, a historical approach and comparative analysis. Results: The findings show an urgent need to adopt crisis and emergency risk communication principles, or similar concepts, in Serbia, and to implement effective two-way communication, especially in multiethnic regions. The pragmatic value of the paper lies in the information it provides about the recent improvement of the health workforce and emergency services in emergencies using the new communication concept, and in its role as a source of numerous useful documents published in the USA and of a few recent Serbian examples. Conclusion: The health workforce has a significant role in the process of protecting the population in emergencies. Policy makers should work on finding ways to improve coordination and communication, creating new academic programs, and providing adequate training and financial means, in order to give the health workforce a different role in society and provide visibility. For their part, health workers should rebuild citizens' trust in what they are doing for society's welfare, using all their skills and abilities. PMID:23308348
Ferrazzi, Priscilla; Krupa, Terry
2015-09-01
Studies that seek to understand and improve health care systems benefit from qualitative methods that employ theory to add depth, complexity, and context to analysis. Theories used in health research typically emerge from social science, but these can be inadequate for studying complex health systems. Mental health rehabilitation programs for criminal courts are complicated by their integration within the criminal justice system and by their dual health-and-justice objectives. In a qualitative multiple case study exploring the potential for these mental health court programs in Arctic communities, we assess whether a legal theory, known as therapeutic jurisprudence, functions as a useful methodological theory. Therapeutic jurisprudence, recruited across discipline boundaries, succeeds in guiding our qualitative inquiry at the complex intersection of mental health care and criminal law by providing a framework foundation for directing the study's research questions and the related propositions that focus our analysis. © The Author(s) 2014.
Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H
2016-01-01
Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
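As a schematic of the standard-curve quantitation step (the Qualis-SIS software mentioned above automates this and applies additional quality checks not shown here), the sketch below fits a linear calibration of peak-area ratio against known concentration and then back-calculates the concentration of an unknown. All function names and numbers are invented for illustration.

```python
import numpy as np

def fit_standard_curve(known_conc, area_ratio):
    """Linear calibration: area ratio = slope * concentration + intercept."""
    slope, intercept = np.polyfit(known_conc, area_ratio, 1)
    return slope, intercept

def back_calculate(area_ratio, slope, intercept):
    """Concentration of an unknown sample from its measured area ratio."""
    return (area_ratio - intercept) / slope

# invented calibration points (concentration vs. endogenous/SIS area ratio)
conc  = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
ratio = np.array([0.02, 0.11, 0.21, 1.02, 2.05])
slope, intercept = fit_standard_curve(conc, ratio)
print(back_calculate(0.55, slope, intercept))   # concentration of an unknown
```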
Macroergonomic analysis and design for improved safety and quality performance.
Kleiner, B M
1999-01-01
Macroergonomics, which emerged historically after sociotechnical systems theory, quality management, and ergonomics, is presented as the basis for a needed integrative methodology. A macroergonomics methodology was presented in some detail to demonstrate how aspects of microergonomics, total quality management (TQM), and sociotechnical systems (STS) can be triangulated in a common approach. In the context of this methodology, quality and safety were presented as 2 of several important performance criteria. To demonstrate aspects of the methodology, 2 case studies were summarized with safety and quality performance results where available. The first case manipulated both personnel and technical factors to achieve a "safety culture" at a nuclear site. The concept of safety culture is defined in INSAG-4 (International Atomic Energy Agency, 1991) as "that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance." The second case described a tire manufacturing intervention to improve quality (as defined by Sink and Tuttle, 1989) through joint consideration of technical and social factors. It was suggested that macroergonomics can yield greater performance than can be achieved through ergonomic intervention alone. Whereas case studies help to make the case, more rigorous formative and summative research is needed to refine and validate the proposed methodology, respectively.
Ethical and methodological issues in research with Sami experiencing disability.
Melbøe, Line; Hansen, Ketil Lenert; Johnsen, Bjørn-Eirik; Fedreheim, Gunn Elin; Dinesen, Tone; Minde, Gunn-Tove; Rustad, Marit
2016-01-01
A study of disability among the indigenous Sami people in Norway presented a number of ethical and methodological challenges rarely addressed in the literature. The main study was designed to examine and understand the everyday life, transitions between life stages and democratic participation of Norwegian Sami people experiencing disability. Hence, the purpose of this article is to increase the understanding of possible ethical and methodological issues in research within this field. The article describes and discusses ethical and methodological issues that arose when conducting our study and identifies some strategies for addressing issues like these. The ethical and methodological issues addressed in the article are based on a qualitative study among indigenous Norwegian Sami people experiencing disability. The data in this study were collected through 31 semi-structured in-depth interviews with a total of 24 Sami people experiencing disability and 13 next of kin of Sami people experiencing disability (8 mothers, 2 fathers, 2 sisters and 1 guardian). The researchers identified 4 main areas of ethical and methodological issues. We present these issues chronologically as they emerged in the research process: 1) the concept of knowledge when designing the study, 2) gaining access, 3) data collection and 4) analysis and accountability. The knowledge generated from this study has the potential to benefit future health research, specifically research on Norwegian Sami people experiencing disability, as well as health research concerning indigenous people in general, providing scientifically based insight into important ethical and methodological issues in research with indigenous people experiencing disability.
A Framework for Reliability and Safety Analysis of Complex Space Missions
NASA Technical Reports Server (NTRS)
Evans, John W.; Groen, Frank; Wang, Lui; Austin, Rebekah; Witulski, Art; Mahadevan, Nagabhushan; Cornford, Steven L.; Feather, Martin S.; Lindsey, Nancy
2017-01-01
Long duration and complex mission scenarios are characteristics of NASA's human exploration of Mars, and will provide unprecedented challenges. Systems reliability and safety will become increasingly demanding and management of uncertainty will be increasingly important. NASA's current pioneering strategy recognizes and relies upon assurance of crew and asset safety. In this regard, flexibility to develop and innovate in the emergence of new design environments and methodologies, encompassing modeling of complex systems, is essential to meet the challenges.
Work-based physiological assessment of physically-demanding trades: a methodological overview.
Taylor, Nigel A S; Groeller, Herb
2003-03-01
Technological advances, modified work practices, altered employment strategies, work-related injuries, and the rise in work-related litigation and compensation claims necessitate ongoing trade analysis research. Such research enables the identification and development of gender- and age-neutral skills, physiological attributes and employment standards required to satisfactorily perform critical trade tasks. This paper overviews a methodological approach which may be adopted when seeking to establish trade-specific physiological competencies for physically-demanding trades (occupations). A general template is presented for conducting a trade analysis within physically-demanding trades, such as those encountered within military or emergency service occupations. Two streams of analysis are recommended: the trade analysis and the task analysis. The former involves a progressive dissection of activities and skills into a series of specific tasks (elements), and results in a broad approximation of the types of trade duties, and the links between trade tasks. The latter will lead to the determination of how a task is performed within a trade, and the physiological attributes required to satisfactorily perform that task. The approach described within this paper is designed to provide research outcomes which have high content, criterion-related and construct validities.
Yosha, Amanat M.; Carroll, Jennifer K.; Hendren, Samantha; Salamone, Charcy M.; Sanders, Mechelle; Fiscella, Kevin; Epstein, Ronald M.
2011-01-01
Objective Patient navigation for cancer care assesses and alleviates barriers to health care services. We examined paired perspectives of cancer patients and their navigators to examine the process of patient navigation. We explored the strengths, limitations, and our own lessons learned about adopting the novel methodology of multiperspective analysis. Methods As part of a larger RCT, patients and navigators were interviewed separately. We reviewed interviews with 18 patient-navigator dyads. Dyad summaries were created that explicitly incorporated both patient and navigator perspectives. Emerging themes and verbatim quotations were reflected in the summaries. Results Paired perspectives were valuable in identifying struggles that arose during navigation. These were represented as imbalanced investment and relational amelioration. Patients and navigators had general consensus about important patient needs for cancer care, but characterized these needs differently. Conclusion Our experience with multiperspective analysis revealed a methodology that delivers novel relational findings, but is best conducted de novo rather than as part of a larger study. Practice Implications Multiperspective analysis should be more widely adopted with clear aims and analytic strategy that strengthen the ability to reveal relational dynamics. Navigation training programs should anticipate navigator struggles and provide navigators with tools to manage them. PMID:21255958
Jelinek, George; Mackinlay, Claire; Weiland, Tracey; Hill, Nicole; Gerdtz, Marie
2011-06-01
This study aimed to describe the perceived barriers faced by emergency clinicians in utilising mental health legislation in Australian hospital emergency departments. A semi-structured interview methodology was used to assess what barriers emergency department doctors and nurses perceive in the operation of mental health legislation. Key findings from the interview data were drawn in accordance with the most commonly represented themes. A total of 36 interviews were conducted with 20 members of the Australasian College for Emergency Medicine and 16 members of the College for Emergency Nursing Australasia representing the various Australian jurisdictions. Most concerning to clinicians were the effects of access block and overcrowding on the appropriate use of mental health legislation, and the substandard medical care that mental health patients received as a result of long periods in the emergency department. Many respondents were concerned about the lack of applicability of mental health legislation to the emergency department environment, variation in legislation between States and Territories causing problems for clinicians working interstate, and a lack of knowledge and training in mental health legislation. Many felt that clarification of legislative issues around duty of care and intoxicated or violent patients was required. The authors conclude that access block has detrimental effects on emergency mental health care as it does in other areas of emergency medicine. Consideration should be given to uniform national mental health legislation to better serve the needs of people with mental health emergencies.
Galvin, Rose; Gilleit, Yannick; Wallace, Emma; Cousins, Gráinne; Bolmer, Manon; Rainer, Timothy; Smith, Susan M; Fahey, Tom
2017-03-01
Background: Older adults are frequent users of emergency services and demonstrate high rates of adverse outcomes following emergency care. Objective: To perform a systematic review and meta-analysis of the Identification of Seniors At Risk (ISAR) screening tool, to determine its predictive value in identifying adults ≥65 years at risk of functional decline, unplanned emergency department (ED) readmission, emergency hospitalisation or death within 180 days after the index ED visit/hospitalisation. Methods: A systematic literature search was conducted in PubMed, EMBASE, CINAHL, EBSCO and the Cochrane Library to identify validation and impact analysis studies of the ISAR tool. A pre-specified ISAR score of ≥2 (maximum score 6 points) was used to identify patients at high risk of adverse outcomes. A bivariate random effects model generated pooled estimates of sensitivity and specificity. Statistical heterogeneity was explored and methodological quality was assessed using validated criteria. Results: Thirty-two validation studies (n = 12,939) are included. At ≥2, the pooled sensitivity of the ISAR for predicting ED return, emergency hospitalisation and mortality at 6 months is 0.80 (95% confidence interval (CI) 0.70-0.87), 0.82 (95% CI 0.74-0.88) and 0.87 (95% CI 0.75-0.94), respectively, with a pooled specificity of 0.31 (95% CI 0.24-0.38), 0.32 (95% CI 0.24-0.41) and 0.35 (95% CI 0.26-0.44). Similar values are demonstrated at 30 and 90 days. Three heterogeneous impact analysis studies examined the clinical implementation of the ISAR and reported mixed findings across patient and process outcomes. Conclusions: The ISAR has modest predictive accuracy and may serve as a decision-making adjunct when determining which older adults can be safely discharged. © The Author 2016. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com
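For readers less familiar with the screening metrics pooled above, a minimal sketch shows how sensitivity and specificity are computed from a single study's 2x2 table at the ISAR ≥2 cut-off. The counts are invented, and the meta-analytic pooling itself used a bivariate random-effects model, which is not reproduced here.

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity and specificity from a 2x2 screening table."""
    sensitivity = tp / (tp + fn)   # patients with the outcome flagged by ISAR >= 2
    specificity = tn / (tn + fp)   # patients without the outcome correctly reassured
    return sensitivity, specificity

# invented counts: 100 patients with the outcome, 300 without
print(sens_spec(tp=80, fn=20, fp=210, tn=90))   # -> (0.80, 0.30)
```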
Highlights in emergency medicine medical education research: 2008.
Farrell, Susan E; Coates, Wendy C; Khun, Gloria J; Fisher, Jonathan; Shayne, Philip; Lin, Michelle
2009-12-01
The purpose of this article is to highlight medical education research studies published in 2008 that were methodologically superior and whose outcomes were pertinent to teaching and education in emergency medicine. Through a PubMed search of the English language literature in 2008, 30 medical education research studies were independently identified as hypothesis-testing investigations and measurements of educational interventions. Six reviewers independently rated and scored all articles based on eight anchors, four of which related to methodologic criteria. Articles were ranked according to their total rating score. A ranking agreement among the reviewers of 83% was established a priori as a minimum for highlighting articles in this review. Five medical education research studies met the a priori criteria for inclusion and are reviewed and summarized here. Four of these employed experimental or quasi-experimental methodology. Although technology was not a component of the structured literature search employed to identify the candidate articles for this review, 14 of the articles identified, including four of the five highlighted articles, employed or studied technology as a focus of the educational research. Overall, 36% of the reviewed studies were supported by funding; three of the highlighted articles were funded studies. This review highlights quality medical education research studies published in 2008, with outcomes of relevance to teaching and education in emergency medicine. It focuses on research methodology, notes current trends in the use of technology for learning in emergency medicine, and suggests future avenues for continued rigorous study in education.
Identity and Culture: Theorizing Emergent Environmentalism.
ERIC Educational Resources Information Center
Dillon, Justin; Kelsey, Elin; Duque-Aristizabal, Ana Maria
1999-01-01
Examines the methodology and findings of the emergent environmentalism research project as reported in Environmental Education Research v4 n4. Challenges the ontological stance implicit in the research as well as explicit epistemology. (Author/CCM)
ERIC Educational Resources Information Center
Zimmerman, Aaron Samuel; Kim, Jeong-Hee
2017-01-01
Narrative inquiry has been a popular methodology in different disciplines for the last few decades. Using stories, narrative inquiry illuminates lived experience, serving as a valuable complement to research methodologies that are rooted in positivist epistemologies. In this article, we present a brief introduction to narrative inquiry including…
Sexuality Research in Iran: A Focus on Methodological and Ethical Considerations.
Rahmani, Azam; Merghati-Khoei, Effat; Moghaddam-Banaem, Lida; Zarei, Fatemeh; Montazeri, Ali; Hajizadeh, Ebrahim
2015-07-01
Research on sensitive topics, such as sexuality, could raise technical, methodological, ethical, political, and legal challenges. The aim of this paper was to draw the methodological challenges which the authors confronted during sexuality research with young population in the Iranian culture. This study was an exploratory mixed method one conducted in 2013-14. We interviewed 63 young women aged 18-34 yr in qualitative phase and 265 young women in quantitative phase in (university and non-university) dormitories and in an Adolescent Friendly Center. Data were collected using focus group discussions and individual interviews in the qualitative phase. We employed conventional content analysis to analyze the data. To enhance the rigor of the data, multiple data collection methods, maximum variation sampling, and peer checks were applied. Five main themes emerged from the data: interaction with opposite sex, sexual risk, sexual protective, sex education, and sexual vulnerability. Challenges while conducting sex research have been discussed. These challenges included assumption of promiscuity, language of silence and privacy concerns, and sex segregation policy. We described the strategies applied in our study and the rationales for each strategy. Strategies applied in the present study can be employed in contexts with the similar methodological and moral concerns.
Kurkjian, Katie M; Winz, Michelle; Yang, Jun; Corvese, Kate; Colón, Ana; Levine, Seth J; Mullen, Jessica; Ruth, Donna; Anson-Dwamena, Rexford; Bayleyegn, Tesfaye; Chang, David S
2016-04-01
For the past decade, emergency preparedness campaigns have encouraged households to meet preparedness metrics, such as having a household evacuation plan and emergency supplies of food, water, and medication. To estimate current household preparedness levels and to enhance disaster response planning, the Virginia Department of Health with remote technical assistance from the Centers for Disease Control and Prevention conducted a community health assessment in 2013 in Portsmouth, Virginia. Using the Community Assessment for Public Health Emergency Response (CASPER) methodology with 2-stage cluster sampling, we randomly selected 210 households for in-person interviews. Households were questioned about emergency planning and supplies, information sources during emergencies, and chronic health conditions. Interview teams completed 180 interviews (86%). Interviews revealed that 70% of households had an emergency evacuation plan, 67% had a 3-day supply of water for each member, and 77% had a first aid kit. Most households (65%) reported that the television was the primary source of information during an emergency. Heart disease (54%) and obesity (40%) were the most frequently reported chronic conditions. The Virginia Department of Health identified important gaps in local household preparedness. Data from the assessment have been used to inform community health partners, enhance disaster response planning, set community health priorities, and influence Portsmouth's Community Health Improvement Plan.
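CASPER assessments conventionally use a 30 x 7 two-stage cluster design (30 clusters of 7 households, giving the 210 households reported above). The sketch below illustrates that sampling logic only; the census-block names and housing-unit counts are invented for the example.

```python
import random

# Hypothetical sampling frame: census blocks with their housing-unit counts.
blocks = {f"block_{i:03d}": random.randint(50, 400) for i in range(120)}

def casper_sample(frame, n_clusters=30, households_per_cluster=7, seed=1):
    """Two-stage cluster sample: stage 1 selects clusters with probability
    proportional to housing units; stage 2 selects households within each cluster."""
    rng = random.Random(seed)
    names = list(frame)
    weights = [frame[b] for b in names]
    # Stage 1: probability-proportional-to-size selection of clusters.
    chosen = rng.choices(names, weights=weights, k=n_clusters)
    sample = []
    for block in chosen:
        # Stage 2: simple random sample of household indices within the block.
        units = rng.sample(range(frame[block]), households_per_cluster)
        sample.extend((block, u) for u in units)
    return sample

households = casper_sample(blocks)
print(len(households))  # 210 interview assignments (30 clusters x 7 households)
```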
Energy consumption analysis for various memristive networks under different learning strategies
NASA Astrophysics Data System (ADS)
Deng, Lei; Wang, Dong; Zhang, Ziyang; Tang, Pei; Li, Guoqi; Pei, Jing
2016-02-01
Recently, various memristive systems have emerged to emulate the efficient computing paradigm of the brain cortex; however, how to make them energy efficient remains unclear, especially from an overall perspective. Here, a systematic, bottom-up energy consumption analysis is demonstrated, covering both the memristor device level and the network learning level. We propose a methodology for estimating the energy consumed when modulating memristive synapses, and simulate it in three typical neural networks with different synaptic structures and learning strategies, for both offline and online learning. These results provide in-depth insight for creating energy-efficient brain-inspired neuromorphic devices in the future.
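The abstract does not give the estimator itself; the snippet below is a purely hypothetical sketch of the kind of bottom-up, device-level accounting it describes, in which the energy of each programming pulse applied to a memristive synapse is summed over training. All device parameters are invented.

```python
# Hypothetical bottom-up energy accounting for memristive synaptic updates.
# Each weight update is modelled as a train of programming pulses; per-pulse
# energy is approximated as V * I * t. Values below are illustrative only.
pulse_voltage = 1.2    # volts across the device during a pulse (assumed)
pulse_current = 5e-6   # amperes drawn by one device during a pulse (assumed)
pulse_width = 50e-9    # seconds per pulse (assumed)

def update_energy(n_pulses):
    """Energy (joules) to program one synapse with n_pulses pulses."""
    return n_pulses * pulse_voltage * pulse_current * pulse_width

# Network-level total: sum over all synaptic updates performed during learning.
pulses_per_synapse = [3, 1, 0, 7, 2]   # pulses needed by five example synapses
total = sum(update_energy(n) for n in pulses_per_synapse)
print(f"total programming energy ~ {total:.3e} J")
```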
Implications of Contingency Planning Support for Weather and Icing Information
NASA Technical Reports Server (NTRS)
Vigeant-Langlois, Laurence; Hansman, R. John, Jr.
2003-01-01
A human-centered systems analysis was applied to the adverse aircraft weather encounter problem in order to identify desirable functions of weather and icing information. The importance of contingency planning was identified as emerging from a system safety design methodology as well as from results of other aviation decision-making studies. The relationship between contingency planning support and information on regions clear of adverse weather was investigated in a scenario-based analysis. A rapid prototype example of the key elements in the depiction of icing conditions was developed in a case study, and the implications for the components of the icing information system were articulated.
Assaying gene function by growth competition experiment.
Merritt, Joshua; Edwards, Jeremy S
2004-07-01
High-throughput screening and analysis is one of the emerging paradigms in biotechnology. In particular, high-throughput methods are essential in the field of functional genomics because of the vast amount of data generated in recent and ongoing genome sequencing efforts. In this report we discuss integrated functional analysis methodologies that incorporate both a growth competition component and a highly parallel assay used to quantify the results of the growth competition. We present several applications of the two most widely used technologies in the field, transposon mutagenesis and deletion strain library growth competition, as well as individual applications of several developing or less widely reported technologies.
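The abstract does not specify a fitness estimator, but a common way to quantify a growth-competition result is a selection coefficient computed from the change in the mutant-to-reference abundance ratio over a known number of generations. The sketch below uses hypothetical read counts, not data from the report.

```python
import math

# Hypothetical abundance data for one mutant competed against a reference strain,
# e.g. barcode or microarray signal at the start and end of the competition.
mutant_start, reference_start = 1_000, 1_000
mutant_end, reference_end = 400, 1_600
generations = 10  # generations of growth during the competition (assumed)

# Per-generation selection coefficient from the change in log abundance ratio.
ratio_start = mutant_start / reference_start
ratio_end = mutant_end / reference_end
selection_coefficient = (math.log(ratio_end) - math.log(ratio_start)) / generations

print(f"s ~ {selection_coefficient:.3f}")  # negative => the mutation impairs growth
```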
Although important data and methodological challenges facing LCA and emerging materials exist, this LCA captures material and process changes that are important drivers of environmental impacts. LCA methods need to be amended to reflect properties of emerging materials that deter...
The Research Component in the Development of Emergency Medicine as a Specialty
ERIC Educational Resources Information Center
Anwar, Rebecca A. H.; Wagner, David K.
1977-01-01
An example of a methodology for programmatic research development in emergency medicine is identified through the establishment of a nonclinical health services research position, which contributes to building a body of knowledge specific to emergency medicine and creating a sense of professional identity among graduate trainees. (Author/LBH)
SAMCO: Society Adaptation for coping with Mountain risks in a global change COntext
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Bernardie, Severine; Malet, Jean-Philippe; Puissant, Anne; Houet, Thomas; Berger, Frederic; Fort, Monique; Pierre, Daniel
2013-04-01
The SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project will elaborate methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several steps: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation) and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform. The strength and originality of the SAMCO project will be to combine different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) and to bring together interdisciplinary expertise in earth sciences, environmental sciences, and social sciences. The multidisciplinary background of the members could potentially lead to the development of new concepts and emerging strategies for mountain hazard/risk adaptation. The research areas, characterized by a variety of environmental, economic and social settings, are severely affected by landslides, and have experienced significant land use modifications (reforestation, abandonment of traditional agricultural practices) and human interferences (urban expansion, ski resort construction) over the last century.
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments, and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, many papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, numerous methodological challenges must still be overcome in order to fully understand the behavior of these chemicals in the environment. The aim of this paper, therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples.
Heunis, Tosca-Marie; Aldrich, Chris; de Vries, Petrus J
2016-08-01
Electroencephalography (EEG) has been used for almost a century to identify seizure-related disorders in humans, typically through expert interpretation of multichannel recordings. Attempts have been made to quantify EEG through frequency analyses and graphic representations. These "traditional" quantitative EEG analysis methods were limited in their ability to analyze complex and multivariate data and have not been generally accepted in clinical settings. There has been growing interest in identification of novel EEG biomarkers to detect early risk of autism spectrum disorder, to identify clinically meaningful subgroups, and to monitor targeted intervention strategies. Most studies to date have, however, used quantitative EEG approaches, and little is known about the emerging multivariate analytical methods or the robustness of candidate biomarkers in the context of the variability of autism spectrum disorder. Here, we present a targeted review of methodological and clinical challenges in the search for novel resting-state EEG biomarkers for autism spectrum disorder. Three primary novel methodologies are discussed: (1) modified multiscale entropy, (2) coherence analysis, and (3) recurrence quantification analysis. Results suggest that these methods may be able to classify resting-state EEG as "autism spectrum disorder" or "typically developing", but many signal processing questions remain unanswered. We suggest that the move to novel EEG analysis methods is akin to the progress in neuroimaging from visual inspection, through region-of-interest analysis, to whole-brain computational analysis. Novel resting-state EEG biomarkers will have to evaluate a range of potential demographic, clinical, and technical confounders including age, gender, intellectual ability, comorbidity, and medication, before these approaches can be translated into the clinical setting. Copyright © 2016 Elsevier Inc. All rights reserved.
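Of the three methodologies reviewed in the abstract above, multiscale entropy is the most directly algorithmic. The sketch below shows the standard construction (coarse-graining followed by sample entropy) on synthetic data; it is a generic illustration, not the specific modification discussed in the paper.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: -log of the conditional probability that sequences
    matching for m points also match for m + 1 points (within tolerance r)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        dists = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (dists <= r).sum() - len(templates)  # exclude self-matches

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    """Coarse-grain the signal at increasing scales and compute sample entropy."""
    x = np.asarray(x, dtype=float)
    curve = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)  # non-overlapping averages
        curve.append(sample_entropy(coarse))
    return curve

# Toy usage on synthetic data standing in for one resting-state EEG channel.
rng = np.random.default_rng(0)
print(multiscale_entropy(rng.standard_normal(1000)))
```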
Using mixed methods in health research
Woodman, Jenny
2013-01-01
Summary Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-01-01
Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
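A minimal Python analogue of part of the filtering pipeline described above (the published tool itself combines C#.Net and R). The dose and complication arrays are synthetic, and only a subset of the statistical steps named in the abstract is reproduced.

```python
import numpy as np
from scipy import stats
from sklearn.metrics import roc_curve

rng = np.random.default_rng(42)

# Synthetic outcomes data: one dose metric and a binary complication flag per patient.
dose = rng.uniform(0, 60, 300)
complication = rng.random(300) < 1 / (1 + np.exp(-(dose - 35) / 5))  # dose-responsive

# 1. Receiver operating characteristic curve to pick a candidate threshold
#    (Youden index: maximize sensitivity + specificity - 1).
fpr, tpr, thresholds = roc_curve(complication, dose)
threshold = thresholds[np.argmax(tpr - fpr)]

# 2. Contingency table and Fisher exact test at that threshold.
above = dose >= threshold
table = [[np.sum(above & complication), np.sum(above & ~complication)],
         [np.sum(~above & complication), np.sum(~above & ~complication)]]
_, p_fisher = stats.fisher_exact(table)

# 3. Welch t-test and Kolmogorov-Smirnov test comparing dose distributions
#    between patients with and without the complication.
_, p_welch = stats.ttest_ind(dose[complication], dose[~complication], equal_var=False)
_, p_ks = stats.ks_2samp(dose[complication], dose[~complication])

print(f"threshold={threshold:.1f}, Fisher p={p_fisher:.3g}, "
      f"Welch p={p_welch:.3g}, KS p={p_ks:.3g}")
```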
Quantifying innovation in surgery.
Hughes-Hallett, Archie; Mayer, Erik K; Marcus, Hani J; Cundy, Thomas P; Pratt, Philip J; Parston, Greg; Vale, Justin A; Darzi, Ara W
2014-08-01
The objectives of this study were to assess the applicability of patents and publications as metrics of surgical technology and innovation; evaluate the historical relationship between patents and publications; develop a methodology that can be used to determine the rate of innovation growth in any given health care technology. The study of health care innovation represents an emerging academic field, yet it is limited by a lack of valid scientific methods for quantitative analysis. This article explores and cross-validates 2 innovation metrics using surgical technology as an exemplar. Electronic patenting databases and the MEDLINE database were searched between 1980 and 2010 for "surgeon" OR "surgical" OR "surgery." Resulting patent codes were grouped into technology clusters. Growth curves were plotted for these technology clusters to establish the rate and characteristics of growth. The initial search retrieved 52,046 patents and 1,801,075 publications. The top performing technology cluster of the last 30 years was minimally invasive surgery. Robotic surgery, surgical staplers, and image guidance were the most emergent technology clusters. When examining the growth curves for these clusters they were found to follow an S-shaped pattern of growth, with the emergent technologies lying on the exponential phases of their respective growth curves. In addition, publication and patent counts were closely correlated in areas of technology expansion. This article demonstrates the utility of publically available patent and publication data to quantify innovations within surgical technology and proposes a novel methodology for assessing and forecasting areas of technological innovation.
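To illustrate the S-shaped growth described above, the sketch below fits a logistic curve to a hypothetical series of annual patent counts for one technology cluster. The counts are invented, and no claim is made about the paper's actual fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """S-shaped growth: K = saturation level, r = growth rate, t0 = midpoint year."""
    return K / (1 + np.exp(-r * (t - t0)))

# Hypothetical cumulative patent counts for one technology cluster, 1980-2010.
years = np.arange(1980, 2011)
counts = np.array([5, 6, 8, 10, 13, 17, 22, 29, 38, 50, 65, 84, 108, 138, 175,
                   219, 270, 326, 385, 444, 500, 550, 592, 626, 652, 671, 685,
                   695, 702, 707, 710], dtype=float)

params, _ = curve_fit(logistic, years, counts, p0=[counts.max(), 0.3, 2000])
K, r, t0 = params
print(f"saturation ~ {K:.0f} patents, midpoint ~ {t0:.0f}")
# A cluster whose most recent counts still lie on the exponential part of the
# fitted curve (well before the midpoint) would be flagged as emergent.
```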
Reps, Jenna M; Aickelin, Uwe; Hubbard, Richard B
2016-02-01
To develop a framework for identifying and incorporating candidate confounding interaction terms into a regularised Cox regression analysis to refine adverse drug reaction signals obtained via longitudinal observational data. We considered six drug families that are commonly associated with myocardial infarction in observational healthcare data, but where the causal relationship ground truth is known (adverse drug reaction or not). We applied emergent pattern mining to find itemsets of drugs and medical events that are associated with the development of myocardial infarction; these are the candidate confounding interaction terms. We then implemented a cohort study design using regularised Cox regression that incorporated and accounted for the candidate confounding interaction terms. The methodology was able to account for signals generated due to confounding: a Cox regression with elastic net regularisation correctly ranked the drug families known to be true adverse drug reactions above those that are not. This was not the case without the inclusion of the candidate confounding interaction terms, where confounding led to a non-adverse drug reaction being ranked highest. The methodology is efficient, can identify high-order confounding interactions and does not require expert input to specify outcome-specific confounders, so it can be applied for any outcome of interest to quickly refine its signals. The proposed method shows excellent potential to overcome some forms of confounding and therefore reduce the false positive rate for signal analysis using longitudinal data.
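A hedged sketch of the regularised Cox step using the lifelines library; the paper's own implementation is not specified here, and the candidate confounding interaction terms from emergent pattern mining are represented by hypothetical indicator columns.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort table: follow-up time to myocardial infarction (or censoring),
# an event flag, the drug exposure of interest, and indicator columns standing in
# for candidate confounding interaction terms found by emergent pattern mining.
df = pd.DataFrame({
    "time": [120, 300, 45, 210, 365, 90, 180, 270, 30, 330],
    "mi_event": [1, 0, 1, 0, 0, 1, 0, 1, 1, 0],
    "drug_exposed": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "confounder_itemset_1": [1, 0, 1, 0, 0, 1, 0, 0, 1, 0],
    "confounder_itemset_2": [0, 0, 1, 1, 0, 0, 0, 1, 1, 0],
})

# Elastic-net penalised Cox regression: the penalty shrinks spurious terms while
# the confounder columns absorb association not attributable to the drug itself.
cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.5)
cph.fit(df, duration_col="time", event_col="mi_event")
print(cph.summary[["coef", "exp(coef)"]])
```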
In their own words: Success stories from The Great Lakes Native American Research Center for Health.
Dellinger, Matthew; Jackson, Brian; Poupart, Amy
2016-01-01
In 2009, the Great Lakes Native American Research Center for Health (GLNARCH) set out to generate a promotional video that highlights the successes of the program. Ten GLNARCH interns were interviewed and filmed for participation in the promotional video using a documentary production style. During the editing and transcription process, interviewer responses were noted for relevance to theoretical frameworks--specifically, tribal critical race theory, mentoring, and cultural compatibility--which guided GLNARCH program design. Quotations were transcribed to illustrate these themes. Though the interviews were not intended as a formal qualitative analysis, powerful narratives that are relevant to participatory research emerged. The emergence of narratives that align with relevant theoretical frameworks suggests a novel methodology for a culturally responsive, participatory reporting system.
Arnold, Bruce L; Lloyd, Linda S
2014-05-01
Terminally ill patients can have unexpected, enigmatic, and profound cognitive shifts that significantly alter their perception of themselves, thereby eliminating their fear of death and dying. However, there are no systematic studies into these remarkable yet ineffable transcendence experiences. They therefore remain easily overlooked or viewed as isolated anomalies and therefore excluded from quality-of-life patient considerations. We use a multimodal methodology for identifying the prevalence and thematic properties of complex emergent metaphors patients use to report these experiences. Although previous research has pioneered the importance of understanding conventional or primary metaphors at the end of life, our findings indicate the considerable potential of more complex metaphors for reducing barriers to effective communication in palliative care.
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
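The abstract above describes a semiquantitative QFD-style weighting of environmental, cost, safety, reliability, and programmatic factors. The sketch below shows only the generic weighted-matrix arithmetic; the weights, candidate names, and scores are invented placeholders, not values from the project.

```python
# Generic QFD-style prioritization matrix: criteria weights x candidate scores.
# All weights, candidate names, and scores below are illustrative placeholders.
weights = {"environmental": 5, "cost": 3, "safety": 5, "reliability": 4, "programmatic": 2}

candidates = {
    "replacement_A": {"environmental": 9, "cost": 3, "safety": 9, "reliability": 3, "programmatic": 3},
    "replacement_B": {"environmental": 3, "cost": 9, "safety": 3, "reliability": 9, "programmatic": 9},
    "replacement_C": {"environmental": 9, "cost": 1, "safety": 9, "reliability": 3, "programmatic": 1},
}

def weighted_score(scores):
    """Sum of criterion weight times candidate score across all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(candidates, key=lambda name: weighted_score(candidates[name]), reverse=True)
for name in ranking:
    print(f"{name}: {weighted_score(candidates[name])}")
```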
ERIC Educational Resources Information Center
Berkovich, Izhak; Eyal, Ori
2017-01-01
Purpose: The purpose of this paper is to do methodological review of the literature on educational leaders and emotions that includes 49 empirical studies published in peer-reviewed journals between 1992 and 2012. Design/methodology/approach: The work systematically analyzes descriptive information, methods, and designs in these studies, and their…
Building a Better Canine Warrior
2017-10-12
...without adversely affecting performance and to develop technical methodology that would dissipate metabolic heat without the expense of body water. Neither an increase in dietary salt nor decrease in ... from a methodological aspect as well as emerging regulatory issues related to research in working dogs. Data suggested that the effect of high ...
The meaning of work among Chinese university students: findings from prototype research methodology.
Zhou, Sili; Leung, S Alvin; Li, Xu
2012-07-01
This study examined Chinese university students' conceptualization of the meaning of work. One hundred and ninety students (93 male, 97 female) from Beijing, China, participated in the study. Prototype research methodology (J. Li, 2001) was used to explore the meaning of work and the associations among the identified meanings. Cluster analysis was used to organize the identified meanings into a structure consisting of lateral and hierarchical levels. The themes that emerged fell into 2 large categories named "ideal" and "reality." A series of superordinate-level and basic-level prototypes were found under each of these 2 categories. These prototypes reflected influences from both Chinese traditional and Western value orientations, as well as perceptions that are to be understood in the contemporary social and economic contexts of China. Implications for career development theory, research, and practice are discussed.
Adapting nurse competence to future patient needs using Checkland's Soft Systems Methodology.
Železnik, Danica; Kokol, Peter; Blažun Vošner, Helena
2017-01-01
New emerging technologies, health globalization, demographic change, new healthcare paradigms, advances in healthcare delivery and social networking will change the needs of patients in the future and consequently will require that new knowledge, competence and skill sets be acquired by nurses. Checkland's Soft Systems Methodology, focusing on the enriched CATWOE and PQR elements of the root definitions and combined with our own "Too much - Too little constraint" approach, was used to devise the impending knowledge, competence and skill sets. The analysis revealed ten needs among patients of the future, 63 constraints and 18 knowledge, competence and skill sets for the future nurse. The completed study showed that SSM is an appropriate tool for high-level structuring of a "messy" real-world problem situation to meet prospective nursing challenges.
The future is now: single-cell genomics of bacteria and archaea
Blainey, Paul C.
2013-01-01
Interest in the expanding catalog of uncultivated microorganisms, increasing recognition of heterogeneity among seemingly similar cells, and technological advances in whole-genome amplification and single-cell manipulation are driving considerable progress in single-cell genomics. Here, the spectrum of applications for single-cell genomics, key advances in the development of the field, and emerging methodology for single-cell genome sequencing are reviewed by example with attention to the diversity of approaches and their unique characteristics. Experimental strategies transcending specific methodologies are identified and organized as a road map for future studies in single-cell genomics of environmental microorganisms. Over the next decade, increasingly powerful tools for single-cell genome sequencing and analysis will play key roles in accessing the genomes of uncultivated organisms, determining the basis of microbial community functions, and fundamental aspects of microbial population biology. PMID:23298390
An evolving systems-based methodology for healthcare planning.
Warwick, Jon; Bell, Gary
2007-01-01
Healthcare planning seems beset with problems at all hierarchical levels. These are caused by the 'soft' nature of many of the issues present in healthcare planning and the high levels of complexity inherent in healthcare services. There has, in recent years, been a move to utilize systems thinking ideas in an effort to gain a better understanding of the forces at work within the healthcare environment and these have had some success. This paper argues that systems-based methodologies can be further enhanced by metrication and modeling which assist in exploring the changed emergent behavior of a system resulting from management intervention. The paper describes the Holon Framework as an evolving systems-based approach that has been used to help clients understand complex systems (in the education domain) that would have application in the analysis of healthcare problems.
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
Factors influencing societal response of nanotechnology: an expert stakeholder analysis
NASA Astrophysics Data System (ADS)
Gupta, Nidhi; Fischer, Arnout R. H.; van der Lans, Ivo A.; Frewer, Lynn J.
2012-05-01
Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an important role in how nanotechnology is developed and commercialised. This article aims to identify expert opinion on factors influencing societal response to applications of nanotechnology. Structured interviews with experts on nanotechnology from North West Europe were conducted using repertory grid methodology in conjunction with generalized Procrustes analysis to examine the psychological constructs underlying societal uptake of 15 key applications of nanotechnology drawn from different areas (e.g. medicine, agriculture and environment, chemical, food, military, sports, and cosmetics). Based on expert judgement, the main factors influencing societal response to different applications of nanotechnology will be the extent to which applications are perceived to be beneficial, useful, and necessary, and how 'real' and physically close to the end-user these applications are perceived to be by the public.
Wardley, Matt Nj; Flaxman, Paul E; Willig, Carla; Gillanders, David
2016-08-01
This empirical study investigates psychological practitioners' experience of worksite training in acceptance and commitment therapy using an interpretative phenomenological analysis methodology. Semi-structured interviews were conducted with eight participants, and three themes emerged from the interpretative phenomenological analysis: influence of previous experiences, self and others, and impact and application. The significance of the experiential nature of the acceptance and commitment therapy training is explored, as well as the dual aspects of developing participants' self-care while also considering their own clinical practice. Consistencies and inconsistencies across acceptance and commitment therapy processes are considered, as well as clinical implications, study limitations and future research suggestions.
Systematic Review Methodology for the Fatigue in Emergency Medical Services Project
DOT National Transportation Integrated Search
2018-01-11
Background: Guidance for managing fatigue in the Emergency Medical Services (EMS) setting is limited. The Fatigue in EMS Project sought to complete multiple systematic reviews guided by seven explicit research questions, assemble the best available e...
"It's a wild thing, waiting to get me": stance analysis of African Americans with diabetes.
Davis, Boyd H; Pope, Charlene; Mason, Peyton R; Magwood, Gayenell; Jenkins, Carolyn M
2011-01-01
This mixed methods study uses a unique approach drawn from social science and linguistics methodologies, a combination of positioning theory and stance analysis, to examine how 20 African Americans with type 2 diabetes make sense of the practices that led to recurrent emergency department visits, in order to identify needs for more effective intervention. In a purposive sample of post-emergency department visit interviews with a same-race interviewer, people responded to open-ended questions reflecting on the decision to seek emergency department care. As applied to diabetes education, positioning theory explains that people use their language to position themselves toward their disease, their medications, and the changes in their lives. Transcriptions were coded using discourse analysis to categorize themes. As a form of triangulation, stance analysis measured language patterns using factor analysis to see when and how speakers revealed affect, attitude, and agentive choices for action. The final analysis revealed that one third of the sample exhibited high scores for positive agency, or capacity for decision-making and self-management, while the rest expressed less control and more negative emotions and fears that may preclude self-management. This approach suggests a means of tailoring diabetes education, with alternative, communication-focused approaches for those facing barriers.
ERIC Educational Resources Information Center
Noor, Farukh; Hanafi, Zahyah
2017-01-01
Purpose: Academic achievement of students can be fostered and improved if they learn to apply emotional intelligence in their emerging adulthood. The core objective of this research is to test the relationship between emerging adulthood and academic achievement by taking emotional intelligence as a mediator. Methodology: The sample comprises 90…
ERIC Educational Resources Information Center
Mack, Frances L.
2012-01-01
This study examined how teachers design and implement instructional strategies to enhance students' emergent writing. A case study methodology was used to examine the elements of an emergent writing program of two kindergarten teachers. The study hoped to define a classroom environment that is conducive to literacy and writing using best…
Burkle, Frederick M
2018-02-01
Triage management remains a major challenge, especially in resource-poor settings such as war, complex humanitarian emergencies, and public health emergencies in developing countries. In triage it is often the disruption of physiology, not anatomy, that is critical, supporting triage methodology based on clinician-assessed physiological parameters as well as anatomy and mechanism of injury. In recent times, too many clinicians from developed countries have deployed to humanitarian emergencies without the physical exam skills needed to assess patients without the benefit of remotely fed electronic monitoring, laboratory, and imaging studies. In triage, inclusion of the once-widely accepted and collectively taught "art of decoding vital signs" with attention to their character and meaning may provide clues to a patient's physiological state, improving triage sensitivity. Attention to decoding vital signs is not a triage methodology of its own or a scoring system, but rather a skill set that supports existing triage methodologies. With unique triage management challenges being raised by an ever-changing variety of humanitarian crises, these once useful skill sets need to be revisited, understood, taught, and utilized by triage planners, triage officers, and teams as a necessary adjunct to physiologically based triage decision-making. (Disaster Med Public Health Preparedness. 2018;12:76-85).
Rodríguez-Prieto, V; Vicente-Rubiano, M; Sánchez-Matamoros, A; Rubio-Guerri, C; Melero, M; Martínez-López, B; Martínez-Avilés, M; Hoinville, L; Vergne, T; Comin, A; Schauer, B; Dórea, F; Pfeiffer, D U; Sánchez-Vizcaíno, J M
2015-07-01
In this globalized world, the spread of new, exotic and re-emerging diseases has become one of the most important threats to animal production and public health. This systematic review analyses conventional and novel early detection methods applied to surveillance. In all, 125 scientific documents were considered for this study. Exotic (n = 49) and re-emerging (n = 27) diseases constituted the most frequently represented health threats. In addition, the majority of studies were related to zoonoses (n = 66). The approaches found in the review could be divided into surveillance modalities, both active (n = 23) and passive (n = 5), and tools and methodologies that support surveillance activities (n = 57). Combinations of surveillance modalities and tools (n = 40) were also found. Risk-based approaches were very common (n = 60), especially in the papers describing tools and methodologies (n = 50). The main applications, benefits and limitations of each approach were extracted from the papers. This information will be very useful for informing the development of tools to facilitate the design of cost-effective surveillance strategies. Thus, the current literature review provides key information about the advantages, disadvantages, limitations and potential application of methodologies for the early detection of new, exotic and re-emerging diseases.
A Learner-led, Discussion-based Elective on Emerging Infectious Disease.
Mathias, Clinton
2015-08-25
Objective. To implement a learner-led, discussion-based course aimed at exposing second-year pharmacy learners to the study of emerging infectious diseases from a global health perspective and to assess the role and importance of pharmacists in the management of disease outbreaks. Design. Learners examined literature pertinent to an emerging infectious disease in a 3-credit, discussion-based course and participated in peer discussion led by a designated learner. Instructional materials included journal articles, audio-visual presentations, documentaries, book chapters, movies, newspaper/magazine articles, and other materials. Learning outcomes were measured based on the ability of learners to perform critical thinking and analysis, communicate with their peers, and participate in class discussions. Assessment. The course was offered to 2 consecutive cohorts consisting of 14 and 16 learners, respectively. Overall, every learner in the first cohort achieved a final grade of A for the course. In the second cohort, the overall grade distribution consisted of grades of A, B, and C for the course. Learner evaluations indicated that the active-learning, discussion-based environment significantly enhanced interest in the topic and overall performance in the course. Conclusion. The elective course on emerging infectious diseases provided in-depth exposure to disease topics normally not encountered in the pharmacy curriculum. Learners found the material and format valuable, and the course enhanced their appreciation of infectious diseases, research methodology, critical thinking and analysis, and their roles as pharmacists.
Ethical and methodological issues in research with Sami experiencing disability.
Melbøe, Line; Hansen, Ketil Lenert; Johnsen, Bjørn-Eirik; Fedreheim, Gunn Elin; Dinesen, Tone; Minde, Gunn-Tove; Rustad, Marit
2016-01-01
Background: A study of disability among the indigenous Sami people in Norway presented a number of ethical and methodological challenges rarely addressed in the literature. Objectives: The main study was designed to examine and understand the everyday life, transitions between life stages and democratic participation of Norwegian Sami people experiencing disability. Hence, the purpose of this article is to increase the understanding of possible ethical and methodological issues in research within this field. The article describes and discusses ethical and methodological issues that arose when conducting our study and identifies some strategies for addressing issues like these. Methods: The ethical and methodological issues addressed in the article are based on a qualitative study among indigenous Norwegian Sami people experiencing disability. The data in this study were collected through 31 semi-structured in-depth interviews with altogether 24 Sami people experiencing disability and 13 next of kin of Sami people experiencing disability (8 mothers, 2 fathers, 2 sisters and 1 guardian). Findings and discussion: The researchers identified 4 main areas of ethical and methodological issues. We present these issues chronologically as they emerged in the research process: 1) concept of knowledge when designing the study, 2) gaining access, 3) data collection and 4) analysis and accountability. Conclusion: The knowledge generated from this study has the potential to benefit future health research, specifically of Norwegian Sami people experiencing disability, as well as health research concerning indigenous people in general, providing scientific-based insight into important ethical and methodological issues in research with indigenous people experiencing disability.
Véliz, Pedro L; Berra, Esperanza M; Jorna, Ana R
2015-07-01
INTRODUCTION Medical specialties' core curricula should take into account functions to be carried out, positions to be filled and populations to be served. The functions in the professional profile for specialty training of Cuban intensive care and emergency medicine specialists do not include all the activities that they actually perform in professional practice. OBJECTIVE Define the specific functions and procedural skills required of Cuban specialists in intensive care and emergency medicine. METHODS The study was conducted from April 2011 to September 2013. A three-stage methodological strategy was designed using qualitative techniques. By purposive maximum variation sampling, 82 professionals were selected. Documentary analysis and key informant criteria were used in the first stage. Two expert groups were formed in the second stage: one used various group techniques (focus group, oral and written brainstorming) and the second used a three-round Delphi method. In the final stage, a third group of experts was questioned in semistructured in-depth interviews, and a two-round Delphi method was employed to assess priorities. RESULTS Ultimately, 78 specific functions were defined: 47 (60.3%) patient care, 16 (20.5%) managerial, 6 (7.7%) teaching, and 9 (11.5%) research. Thirty-one procedural skills were identified. The specific functions and procedural skills defined relate to the profession's requirements in clinical care of the critically ill, management of patient services, teaching and research at the specialist's different occupational levels. CONCLUSIONS The specific functions and procedural skills required of intensive care and emergency medicine specialists were precisely identified by a scientific method. This product is key to improving the quality of teaching, research, administration and patient care in this specialty in Cuba. The specific functions and procedural skills identified are theoretical, practical, methodological and social contributions to inform future curricular reform and to help intensive care specialists enhance their performance in comprehensive patient care. KEYWORDS Intensive care, urgent care, emergency medicine, continuing medical education, curriculum, diagnostic techniques and procedures, medical residency, Cuba.
Individuals and Their Employability
ERIC Educational Resources Information Center
McQuade, Eamonn; Maguire, Theresa
2005-01-01
Purpose: This paper aims to describe a research project that is addressing the employability of individuals in the higher-cost Irish economy. Design/methodology/approach: The Programme for University-Industry Interface (PUII) uses a community-of-practice methodology combined with academic research. Findings: A number of emerging enterprise models…
Latino Immigrants, Acculturation, and Health: Promising New Directions in Research.
Abraído-Lanza, Ana F; Echeverría, Sandra E; Flórez, Karen R
2016-01-01
This article provides an analysis of novel topics emerging in recent years in research on Latino immigrants, acculturation, and health. In the past ten years, the number of studies assessing new ways to conceptualize and understand how acculturation-related processes may influence health has grown. These new frameworks draw from integrative approaches testing new ground to acknowledge the fundamental role of context and policy. We classify the emerging body of evidence according to themes that we identify as promising directions--intrapersonal, interpersonal, social environmental, community, political, and global contexts, cross-cutting themes in life course and developmental approaches, and segmented assimilation--and discuss the challenges and opportunities each theme presents. This body of work, which considers acculturation in context, points to the emergence of a new wave of research that holds great promise in driving forward the study of Latino immigrants, acculturation, and health. We provide suggestions to further advance the ideologic and methodologic rigor of this new wave.
Analyzing 7000 texts on deep brain stimulation: what do they tell us?
Ineichen, Christian; Christen, Markus
2015-01-01
The enormous increase in numbers of scientific publications in the last decades requires quantitative methods for obtaining a better understanding of topics and developments in various fields. In this exploratory study, we investigate the emergence, trends, and connections of topics within the whole text corpus of the deep brain stimulation (DBS) literature based on more than 7000 papers (titles and abstracts) published between 1991 and 2014, using a network approach. Taking the co-occurrence of basic terms that represent important topics within DBS as a starting point, we outline the statistics of interconnections between DBS indications, anatomical targets, positive and negative effects, as well as methodological, technological, and economic issues. This quantitative approach confirms known trends within the literature (e.g., regarding the emergence of psychiatric indications). The data also reflect an increased discussion about complex issues such as personality connected tightly to the ethical context, as well as an apparent focus on depression as an important DBS indication, where the co-occurrence of terms related to negative effects is low both for the indication as well as the related anatomical targets. We also discuss consequences of the analysis from a bioethical perspective, i.e., how such a quantitative analysis could uncover hidden subject matters that have ethical relevance. For example, we find that hardware-related issues in DBS are far more robustly connected to an ethical context compared to impulsivity, concrete side-effects or death/suicide. Our contribution also outlines the methodology of quantitative text analysis that combines statistical approaches with expert knowledge. It thus serves as an example of how innovative quantitative tools can be made useful for gaining a better understanding in the field of DBS.
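A minimal sketch of the co-occurrence approach described above: count how often pairs of predefined terms appear together in the same abstract and build a weighted graph. The term list and abstracts are placeholders, and the networkx library is assumed to be available.

```python
from collections import Counter
from itertools import combinations
import networkx as nx

# Placeholder term lexicon and corpus standing in for the DBS title/abstract corpus.
terms = ["parkinson", "depression", "subthalamic", "personality", "infection", "ethics"]
abstracts = [
    "deep brain stimulation of the subthalamic nucleus in parkinson disease",
    "dbs for treatment resistant depression raises ethics questions about personality",
    "hardware infection after subthalamic stimulation in parkinson patients",
]

cooccurrence = Counter()
for text in abstracts:
    present = {t for t in terms if t in text.lower()}
    for a, b in combinations(sorted(present), 2):   # every unordered term pair in this abstract
        cooccurrence[(a, b)] += 1

graph = nx.Graph()
for (a, b), weight in cooccurrence.items():
    graph.add_edge(a, b, weight=weight)

# Edge weights now quantify how tightly topics are interconnected across the corpus.
print(sorted(graph.edges(data="weight"), key=lambda e: -e[2]))
```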
Figueiredo, L; Erny, G L; Santos, L; Alves, A
2016-01-01
Personal-care products (PCPs) involve a variety of chemicals whose persistence, along with their constant release into the environment, has raised concern about their potential impact on wildlife and human health. Regarded as emerging contaminants, PCPs have demonstrated estrogenic activity, leading to the need for new methodologies to detect and remove these compounds from the environment. Molecular imprinting starts with a complex between a template molecule and a functional monomer, which is then polymerized in the presence of a cross-linker. After template removal, the polymer contains specific cavities. Owing to their good selectivity towards the template, molecularly imprinted polymers (MIPs) have been investigated as efficient materials for the analysis and extraction of these emerging contaminants. Rather than lowering limits of detection, the key theoretical advantage of MIPs over existing methodologies is the potential to target specific chemicals. This unique feature, sometimes termed specificity (as a synonym for very high selectivity), allows the use of cheap, simple and/or rapid quantitative techniques such as fast separation with ultra-violet (UV) detection, sensors or even spectrometric techniques. When a high degree of selectivity is achieved, samples extracted with MIPs can be directly analyzed without the need for a separation step. However, while some papers clearly demonstrate the specificity of their MIP toward the targeted PCP, such proof is often lacking, especially with real matrices, making it difficult to assess the success of the different approaches. This review paper focuses on the latest developments in MIPs for the analysis of personal care products in the environment, with particular emphasis on the design, preparation and practical applications of MIPs.
Understanding palliative care on the heart failure care team: an innovative research methodology.
Lingard, Lorelei A; McDougall, Allan; Schulz, Valerie; Shadd, Joshua; Marshall, Denise; Strachan, Patricia H; Tait, Glendon R; Arnold, J Malcolm; Kimel, Gil
2013-05-01
There is a growing call to integrate palliative care for patients with advanced heart failure (HF). However, the knowledge to inform integration efforts comes largely from interview and survey research with individual patients and providers. This work has been critically important in raising awareness of the need for integration, but it is insufficient to inform solutions that must be enacted not by isolated individuals but by complex care teams. Research methods are urgently required to support systematic exploration of the experiences of patients with HF, family caregivers, and health care providers as they interact as a care team. To design a research methodology that can support systematic exploration of the experiences of patients with HF, caregivers, and health care providers as they interact as a care team. This article describes in detail a methodology that we have piloted and are currently using in a multisite study of HF care teams. We describe three aspects of the methodology: the theoretical framework, an innovative sampling strategy, and an iterative system of data collection and analysis that incorporates four data sources and four analytical steps. We anticipate that this innovative methodology will support groundbreaking research in both HF care and other team settings in which palliative integration efforts are emerging for patients with advanced nonmalignant disease. Copyright © 2013 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
Pilatti, Fernanda Kokowicz; Ramlov, Fernanda; Schmidt, Eder Carlos; Costa, Christopher; Oliveira, Eva Regina de; Bauer, Claudia M; Rocha, Miguel; Bouzon, Zenilda Laurita; Maraschin, Marcelo
2017-01-30
Fossil fuels, e.g. gasoline and diesel oil, account for a substantial share of the pollution that affects marine ecosystems. Environmental metabolomics is an emerging field that may help unravel the effect of these xenobiotics on seaweeds and provide methodologies for biomonitoring coastal ecosystems. In the present study, FTIR and multivariate analysis were used to discriminate metabolic profiles of Ulva lactuca after in vitro exposure to diesel oil and gasoline, in combinations of concentrations (0.001%, 0.01%, 0.1%, and 1.0% v/v) and times of exposure (30 min, 1 h, 12 h, and 24 h). PCA and HCA performed on the entire mid-infrared spectral window were able to discriminate diesel oil-exposed thalli from gasoline-exposed ones. HCA performed on the spectral window related to protein absorbance (1700-1500 cm-1) enabled the best discrimination between gasoline-exposed samples regarding the time of exposure, and between diesel oil-exposed samples according to the concentration. The results indicate that the combination of FTIR with multivariate analysis is a simple and efficient methodology for metabolic profiling, with potential use for biomonitoring strategies. Copyright © 2016 Elsevier Ltd. All rights reserved.
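As a rough illustration of the chemometric workflow described (PCA over the full mid-infrared window, HCA over the 1700-1500 cm-1 protein window), the sketch below runs on synthetic spectra; the matrix shapes, lack of preprocessing, and clustering settings are assumptions rather than the study's actual parameters.

```python
# Minimal PCA/HCA sketch on synthetic "FTIR" spectra (rows = samples).
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
wavenumbers = np.linspace(4000, 650, 800)           # cm-1, full mid-IR window
spectra = rng.normal(size=(16, wavenumbers.size))   # 16 hypothetical samples

# PCA on the entire mid-infrared window
scores = PCA(n_components=2).fit_transform(spectra)

# HCA restricted to the protein absorbance window (1700-1500 cm-1)
protein = (wavenumbers <= 1700) & (wavenumbers >= 1500)
Z = linkage(spectra[:, protein], method="ward")
clusters = fcluster(Z, t=2, criterion="maxclust")
print(scores[:3].round(2), clusters)
```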
Naunheim, Matthew R; Kozin, Elliot D; Sethi, Rosh K; Ota, H G; Gray, Stacey T; Shrime, Mark G
2017-03-01
Specialty emergency departments (EDs) provide a unique mechanism of health care delivery, but the value that they add to the medical system is not known. Evaluation of patient preferences to determine value can have a direct impact on resource allocation and direct-to-specialist care. To assess the feasibility of contingent valuation (CV) methodology using a willingness-to-pay (WTP) survey to evaluate specialty emergency services, in the context of an ophthalmology- and otolaryngology-specific ED. Contingent valuation analysis of a standalone otolaryngology and ophthalmology ED. Participants were English-speaking adults presenting to a dedicated otolaryngology and ophthalmology ED. The WTP questions were assessed using a payment card format, with reference to an alternative modality of treatment (ie, general ED), and were analyzed with multivariate regression. Validated WTP survey administered from October 14, 2014, through October 1, 2015. Sociodemographic data, level of distress, referral data, income, and WTP. A total of 327 of 423 (77.3%) ED patients responded to the WTP survey, with 116 ophthalmology and 211 otolaryngology patients included (52.3% female; mean [range] age, 46 [18-90] years). The most common reason for seeking care at this facility was a reputation for specialty care for both ear, nose, and throat (80 [37.9%]) and ophthalmology (43 [37.1%]). Mean WTP for specialty-specific ED services was $377 for ophthalmology patients, and $321 for otolaryngology patients ($340 overall; 95% CI, $294 to $386), without significant difference between groups (absolute difference, $56; 95% CI, $-156 to $43). Self-reported level of distress was higher among ear, nose, and throat vs ophthalmology patients (absolute difference, 0.47 on a Likert scale of 1-7; 95% CI, 0.10 to 0.84). Neither level of distress, income, nor demographic characteristics influenced WTP, but patients with higher estimates of total visit cost were more likely to have higher WTP (β coefficient, 0.27; SE, 0.05; adjusted R2 = 0.17 for model). Patients with eye and ear, nose, and throat complaints place a mean explicit value on specialty emergency services of $340 per visit, relative to general emergency care. Ultimately, CV data using WTP methodology are useful in valuing patient preferences in monetary terms and can help inform state-wide resource allocation and the availability of direct-to-specialist care.
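A minimal sketch of the kind of multivariate regression of WTP on respondent characteristics described above, fitted to synthetic payment-card responses; the variable names, distributions, and use of statsmodels are illustrative assumptions, not the study's dataset or model.

```python
# Regress stated willingness-to-pay on covariates (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 327
df = pd.DataFrame({
    "wtp": rng.gamma(shape=2.0, scale=170.0, size=n),   # stated WTP in dollars
    "distress": rng.integers(1, 8, size=n),             # 1-7 Likert
    "income": rng.normal(60_000, 20_000, size=n),
    "est_visit_cost": rng.normal(500, 150, size=n),     # patient's cost estimate
    "otolaryngology": rng.integers(0, 2, size=n),        # 1 = ENT, 0 = ophthalmology
})

X = sm.add_constant(df[["distress", "income", "est_visit_cost", "otolaryngology"]])
model = sm.OLS(df["wtp"], X).fit()
print(model.summary())
```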
Advocating for Additional Aspects of Quality
ERIC Educational Resources Information Center
Herbel-Eisenmann, Beth A.
2016-01-01
Across the set of articles in this Special Issue, various authors describe methodology and methodological decisions, illustrate analyses, and share emerging findings from The Evolution of the Discourse of School Mathematics (EDSM) project. In general, the investigators were interested in identifying "changes over time in the kind of…
Wolf, Lisa A; Perhats, Cydne; Clark, Paul R; Moon, Michael D; Zavotsky, Kathleen Evanovich
2017-09-22
The Institute of Medicine recognizes that the workplace environment is a crucial factor in the ability of nurses to provide safe and effective care, and thus interactions that affect the quality and safety of the work environment require exploration. The purpose of this study was to use situational analysis to develop a grounded theory of workplace bullying as it manifests specifically in the emergency care setting. This study used a grounded theory methodology called situational analysis. Forty-four emergency RNs were recruited to participate in one of four focus group sessions, which were transcribed in their entirety and, along with field notes, served as the dataset. This grounded theory describes the characteristics of human actors and their reactions to conditions in the practice environment that lead to greater or lesser levels of bullying, and the responses to bullying as it occurs in U.S. emergency departments. Workplace bullying is a significant factor in the dynamics of patient care, nursing work culture, and nursing retention. The impact on patient care cannot be overestimated, in terms of errors, substandard care, and the negative effects of the high turnover of experienced RNs who leave, compounded by the inexperience of newly hired RNs. An assessment of hospital work environments should include nurse perceptions of workplace bullying, and interventions should focus on effective managerial processes for handling workplace bullying. Future research should include testing of the theoretical coherence of the model and testing of bullying interventions to determine their effect on the workplace environment, nursing intent to leave/retention, and patient outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique
2009-04-01
A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This scoring system may also be used to appraise the methodological quality of qualitative, quantitative and mixed methods components of mixed methods research.
Hetrick, Sarah E; Parker, Alexandra G; Callahan, Patrick; Purcell, Rosemary
2010-12-01
Within the field of evidence-based practice, a process termed 'evidence mapping' is emerging as a less exhaustive yet systematic and replicable methodology that allows an understanding of the extent and distribution of evidence in a broad clinical area, highlighting both what is known and where gaps in evidence exist. This article describes the general principles of mapping methodology by using illustrations derived from our experience conducting an evidence map of interventions for youth mental-health disorders. Evidence maps are based on an explicit research question relating to the field of enquiry, which may vary in depth, but should be informed by end-users. The research question then drives the search for, and collection of, appropriate studies utilizing explicit and reproducible methods at each stage. This includes clear definition of components of the research question, development of a thorough and reproducible search strategy, development of explicit inclusion and exclusion criteria, and transparent decisions about the level of information to be obtained from each study. Evidence mapping is emerging as a rigorous methodology for gathering and disseminating up-to-date information to end-users. Thoughtful planning and assessment of available resources (e.g. staff, time, budget) are required by those applying this methodology to their particular field of clinical enquiry given the potential scope of the work. The needs of the end-user need to be balanced with available resources. Information derived needs to be effectively communicated, with the uptake of that evidence into clinical practice the ultimate aim. © 2010 The Authors. Journal compilation © 2010 Blackwell Publishing Ltd.
Chemical Synthesis of Complex Molecules Using Nanoparticle Catalysis
Cong, Huan; Porco, John A.
2011-01-01
Nanoparticle catalysis has emerged as an active topic in organic synthesis. Of particular interest is the development of enabling methodologies to efficiently assemble complex molecules using nanoparticle catalysis. This Viewpoint highlights recent developments and discusses future perspectives in this emerging field. PMID:22347681
RAPID ASSESSMENT OF POTENTIAL GROUND-WATER CONTAMINATION UNDER EMERGENCY RESPONSE CONDITIONS
Emergency response actions at chemical spills and abandoned hazardous waste sites often require rapid assessment of the potential for groundwater contamination by the chemical or waste compound. This manual provides a rapid assessment methodology for performing such an evaluation...
Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher
2013-01-01
Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged documenting that HIS can be implicated in patient harm and death. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to system release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902
Addressing the gap between public health emergency planning and incident response
Freedman, Ariela M; Mindlin, Michele; Morley, Christopher; Griffin, Meghan; Wooten, Wilma; Miner, Kathleen
2013-01-01
Objectives: Since 9/11, the Incident Command System (ICS) and the Emergency Operations Center (EOC) have been introduced to public health, where they remain relatively new concepts, as public health typically operates using less hierarchical and more collaborative approaches to organizing staff. This paper describes the 2009 H1N1 influenza outbreak in San Diego County to explore the use of ICS and EOC in public health emergency response. Methods: This study was conducted using critical case study methodology consisting of document review and 18 key-informant interviews with individuals who played key roles in planning and response. Thematic analysis was used to analyze data. Results: Several broad elements emerged as key to ensuring effective and efficient public health response: 1) developing a plan for emergency response; 2) establishing the framework for an ICS; 3) creating the infrastructure to support response; 4) supporting a workforce trained on emergency response roles, responsibilities, and equipment; and 5) conducting regular preparedness exercises. Conclusions: This research demonstrates the value of the investments made and shows that effective emergency preparedness requires sustained efforts to maintain personnel and material resources. By having the infrastructure and experience based on ICS and EOC, the public health system had the capability to surge up: to expand its day-to-day operation in a systematic and prolonged manner. None of these critical actions are possible without sustained funding for the public health infrastructure. Ultimately, this case study illustrates the importance of public health as a key leader in emergency response. PMID:28228983
“It’s a Wild Thing, Waiting to Get Me”
Davis, Boyd H.; Pope, Charlene; Mason, Peyton R.; Magwood, Gayenell; Jenkins, Carolyn M.
2016-01-01
Purpose This mixed methods study uses a unique approach drawn from social science and linguistics methodologies, a combination of positioning theory and stance analysis, to examine how 20 African Americans with type 2 diabetes make sense of the practices that led to recurrent emergency department visits, in order to identify needs for more effective intervention. Methods In a purposive sample of post-emergency-department-visit interviews with a same-race interviewer, people responded to open-ended questions reflecting on the decision to seek emergency department care. As applied to diabetes education, positioning theory explains that people use their language to position themselves toward their disease, their medications, and the changes in their lives. Transcriptions were coded using discourse analysis to categorize themes. As a form of triangulation, stance analysis measured language patterns using factor analysis to see when and how speakers revealed affect, attitude, and agentive choices for action. Conclusion Final analysis revealed that one third of the sample exhibited high scores for positive agency, or the capacity for decision-making and self-management, while the rest expressed less control and more negative emotions and fears that may preclude self-management. This approach suggests a means to tailor diabetes education, considering alternative, communication-focused approaches for those facing barriers. PMID:21515541
The Fungal Frontier: A Comparative Analysis of Methods Used in the Study of the Human Gut Mycobiome
Huseyin, Chloe E.; Rubio, Raul Cabrera; O’Sullivan, Orla; Cotter, Paul D.; Scanlan, Pauline D.
2017-01-01
The human gut is host to a diverse range of fungal species, collectively referred to as the gut “mycobiome”. The gut mycobiome is emerging as an area of considerable research interest due to the potential roles of these fungi in human health and disease. However, there is no consensus as to what the best or most suitable methodologies available are with respect to characterizing the human gut mycobiome. The aim of this study is to provide a comparative analysis of several previously published mycobiome-specific culture-dependent and -independent methodologies, including choice of culture media, incubation conditions (aerobic versus anaerobic), DNA extraction method, primer set and freezing of fecal samples, to assess their relative merits and suitability for gut mycobiome analysis. There was no significant effect of media type or aeration on culture-dependent results. However, freezing was found to have a significant effect on fungal viability, with significantly lower fungal numbers recovered from frozen samples. DNA extraction method had a significant effect on DNA yield and quality. However, freezing and extraction method did not have any impact on either α or β diversity. There was also considerable variation in the ability of different fungal-specific primer sets to generate PCR products for subsequent sequence analysis. Through this investigation, two DNA extraction methods and one primer set were identified that facilitated the analysis of the mycobiome for all samples in this study. Ultimately, a diverse range of fungal species were recovered using both approaches, with Candida and Saccharomyces identified as the most common fungal species recovered using culture-dependent and culture-independent methods, respectively. As has been apparent from ecological surveys of the bacterial fraction of the gut microbiota, the use of different methodologies can also impact our understanding of gut mycobiome composition and therefore requires careful consideration. Future research into the gut mycobiome needs to adopt a common strategy to minimize potentially confounding effects of methodological choice and to facilitate comparative analysis of datasets. PMID:28824566
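For readers unfamiliar with the α- and β-diversity comparisons mentioned, the sketch below computes Shannon diversity and Bray-Curtis dissimilarity on a hypothetical fungal OTU count table; real mycobiome analyses would typically rely on dedicated pipelines, and the counts here are synthetic.

```python
# Alpha (Shannon) and beta (Bray-Curtis) diversity on a toy OTU table.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(2)
counts = rng.integers(0, 200, size=(6, 40))  # 6 samples x 40 fungal OTUs

def shannon(row):
    p = row / row.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

alpha = np.array([shannon(r) for r in counts])          # one value per sample
beta = squareform(pdist(counts, metric="braycurtis"))   # sample x sample matrix
print(alpha.round(2))
print(beta.round(2))
```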
Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization
NASA Astrophysics Data System (ADS)
Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel
2013-05-01
The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled the implementation of full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact that emerges from the application of this technology. In this work we start looking into some interesting performance metrics for KVM on ARM processors, which can provide us with useful insight that may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that can, in the future, provide us with a deeper understanding of the performance footprint of KVM. We identify some of the most interesting approaches in this field and perform a tentative analysis of how these may be implemented in the KVM on ARM port. These take into consideration hardware- and software-based counters for profiling, and issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.
A Survey on Chinese Scholars' Adoption of Mixed Methods
ERIC Educational Resources Information Center
Zhou, Yuchun
2018-01-01
Since the 1980s, when mixed methods emerged as "the third research methodology", it has been widely adopted in Western countries. However, little literature has revealed how this methodology has been accepted by scholars in Asian countries, such as China. Therefore, this paper used a quantitative survey to investigate Chinese scholars' perceptions…
Making the Invisible Visible: A Methodological and a Substantive Issue
ERIC Educational Resources Information Center
Dagley, Valerie
2004-01-01
This article discusses the issue of "making the invisible visible" from a methodological and a substantive viewpoint. The ideas emerged from a doctoral research study into individual target setting with middle ability students in an English secondary school. The students involved had been identified by assessments as "average"…
ERIC Educational Resources Information Center
Armson, Genevieve; Whiteley, Alma
2010-01-01
Purpose: The purpose of this paper is to investigate employees' and managers' accounts of interactive learning and what might encourage or inhibit emergent learning. Design/methodology/approach: The approach taken was a constructivist/social constructivist ontology, interpretive epistemology and qualitative methodology, using grounded theory…
Duoethnography: A New Research Methodology for Mathematics Education
ERIC Educational Resources Information Center
Rapke, Tina Kathleen
2014-01-01
I have developed an adaptation of the emerging duoethnography methodology that allows me to draw on my processes of creating mathematics, interpret these processes for what they might mean for classrooms, and explore/reconceptualize my complementary and competing perspectives as a mathematician and an educator. This article includes a…
Molecular cytogenetics: an indispensable tool for cancer diagnosis.
Wan, Thomas Sk; Ma, Edmond Sk
2012-01-01
Cytogenetic aberrations may escape detection or recognition in traditional karyotyping. The past decade has seen an explosion of methodological advances in molecular cytogenetics technology. These cytogenetic techniques add color to the black and white world of conventional banding. Fluorescence in situ hybridization (FISH) has emerged as an indispensable tool for both basic and clinical research, as well as diagnostics, in leukemia and cancers. FISH can be used to identify chromosomal abnormalities through fluorescently labeled DNA probes that target specific DNA sequences. Subsequently, FISH-based tests such as multicolor karyotyping, comparative genomic hybridization (CGH) and array CGH have been used in emerging clinical applications, as they enable resolution of complex karyotypic aberrations and global scanning of genomic imbalances. More recently, cross-species array CGH analysis has also been employed in cancer gene identification. The clinical impact of FISH is pivotal, especially in the diagnosis, prognosis and treatment decisions for hematological diseases, all of which facilitate the practice of personalized medicine. This review summarizes the methodology and current utilization of these FISH techniques in unraveling chromosomal changes and highlights how the field is moving away from conventional methods towards molecular cytogenetics approaches. In addition, the potential of the more recently developed FISH tests in contributing information on genetic abnormalities is illustrated.
Joseph, Jacquleen; Jaswal, Surinder
2014-06-01
The field of "Public Health in Disasters and Complex Emergencies" is replete with either epidemiological studies or studies in the area of hospital preparedness and emergency care. The field is dominated by hospital-based or emergency phase-related literature, with very little attention on long-term health and mental health consequences. The social science, or the public mental health perspective, too, is largely missing. It is in this context that the case report of the November 26, 2008 Mumbai terror attack survivors is presented to bring forth the multi-dimensional and dynamic long-term impacts, and their consequences for psychological well-being, two years after the incident. Based on literature, the report formulates a theoretical framework through which the lived experiences of the survivors is analyzed and understood from a social science perspective. This report is an outcome of the ongoing work with the survivors over a period of two years. A mixed methodology was used. It quantitatively captures the experience of 231 families following the attack, and also uses a self-reporting questionnaire (SRQ), SRQ20, to understand the psychological distress. In-depth qualitative case studies constructed from the process records and in-depth interviews focus on lived experiences of the survivors and explain the patterns emerging from the quantitative analysis. This report outlines the basic profile of the survivors, the immediate consequences of the attack, the support received, psychological consequences, and the key factors contributing to psychological distress. Through analysis of the key factors and the processes emerging from the lived experiences that explain the progression of vulnerability to psychological distress, this report puts forth a psychosocial framework for understanding psychological distress among survivors of the November 26, 2008 Mumbai terror attack.
An Integrated Approach to Life Cycle Analysis
NASA Technical Reports Server (NTRS)
Chytka, T. M.; Brown, R. W.; Shih, A. T.; Reeves, J. D.; Dempsey, J. A.
2006-01-01
Life Cycle Analysis (LCA) is the evaluation of the impacts that design decisions have on a system and provides a framework for identifying and evaluating design benefits and burdens associated with the life cycles of space transportation systems from a "cradle-to-grave" approach. Sometimes called life cycle assessment, life cycle approach, or "cradle to grave analysis", it represents a rapidly emerging family of tools and techniques designed to be a decision support methodology and aid in the development of sustainable systems. The implementation of a Life Cycle Analysis can vary and may take many forms; from global system-level uncertainty-centered analysis to the assessment of individualized discriminatory metrics. This paper will focus on a proven LCA methodology developed by the Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center to quantify and assess key LCA discriminatory metrics, in particular affordability, reliability, maintainability, and operability. This paper will address issues inherent in Life Cycle Analysis including direct impacts, such as system development cost and crew safety, as well as indirect impacts, which often take the form of coupled metrics (i.e., the cost of system unreliability). Since LCA deals with the analysis of space vehicle system conceptual designs, it is imperative to stress that the goal of LCA is not to arrive at the answer but, rather, to provide important inputs to a broader strategic planning process, allowing the managers to make risk-informed decisions, and increase the likelihood of meeting mission success criteria.
Grek, Ami; Booth, Sandra; Festic, Emir; Maniaci, Michael; Shirazi, Ehsan; Thompson, Kristine; Starbuck, Angela; Mcree, Chad; Naessens, James M; Moreno Franco, Pablo
The Surviving Sepsis Campaign guidelines are designed to decrease mortality through consistent application of a 7-element bundle. This study evaluated the impact of improvement in bundle adherence using a time-series analysis of compliance with the bundle elements before and after interventions intended to improve the process, while also looking at hospital mortality. This article describes interventions used to improve bundle compliance and hospital mortality in patients admitted through the emergency department with sepsis, severe sepsis, or septic shock. Quality improvement methodology was used to develop high-impact interventions that led to dramatically improved adherence to the Surviving Sepsis Campaign guidelines bundle. Improved performance was associated with a significant decrease in the in-hospital mortality of severe sepsis patients presenting to the emergency department.
Dynamic functional connectivity: Promise, issues, and interpretations
Hutchison, R. Matthew; Womelsdorf, Thilo; Allen, Elena A.; Bandettini, Peter A.; Calhoun, Vince D.; Corbetta, Maurizio; Penna, Stefania Della; Duyn, Jeff H.; Glover, Gary H.; Gonzalez-Castillo, Javier; Handwerker, Daniel A.; Keilholz, Shella; Kiviniemi, Vesa; Leopold, David A.; de Pasquale, Francesco; Sporns, Olaf; Walter, Martin; Chang, Catie
2013-01-01
The brain must dynamically integrate, coordinate, and respond to internal and external stimuli across multiple time scales. Non-invasive measurements of brain activity with fMRI have greatly advanced our understanding of the large-scale functional organization supporting these fundamental features of brain function. Conclusions from previous resting-state fMRI investigations were based upon static descriptions of functional connectivity (FC), and only recently studies have begun to capitalize on the wealth of information contained within the temporal features of spontaneous BOLD FC. Emerging evidence suggests that dynamic FC metrics may index changes in macroscopic neural activity patterns underlying critical aspects of cognition and behavior, though limitations with regard to analysis and interpretation remain. Here, we review recent findings, methodological considerations, neural and behavioral correlates, and future directions in the emerging field of dynamic FC investigations. PMID:23707587
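One widely used dynamic FC metric alluded to here is the sliding-window correlation between regional BOLD time series. The sketch below illustrates the idea on synthetic data; the window length, step size, and number of regions are illustrative assumptions rather than recommended settings.

```python
# Sliding-window functional connectivity on synthetic BOLD time series.
import numpy as np

rng = np.random.default_rng(3)
n_tr, n_rois = 300, 10                     # time points (TRs) x brain regions
bold = rng.normal(size=(n_tr, n_rois))

win, step = 40, 5                          # window length and step, in TRs (assumed)
starts = range(0, n_tr - win + 1, step)
dfc = np.stack([np.corrcoef(bold[s:s + win].T) for s in starts])

print(dfc.shape)                           # (n_windows, n_rois, n_rois)
```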
Phylodynamic analysis of porcine circovirus type 2: Methodological approach and datasets.
Franzo, Giovanni; Cortey, Martì; Segalés, Joaquim; Hughes, Joseph; Drigo, Michele
2016-09-01
Since its first description, PCV2 has emerged as one of the most economically relevant diseases for the swine industry. Despite the introduction of vaccines effective in controlling clinical syndromes, PCV2 spread has not been prevented, and some potential evidence of vaccine immune escape has recently been reported ("Complete genome sequence of a novel porcine circovirus type 2b variant present in cases of vaccine failures in the United States" (Xiao and Halbur, 2012) [1]; "Genetic and antigenic characterization of a newly emerging porcine circovirus type 2b mutant first isolated in cases of vaccine failure in Korea" (Seo et al., 2014) [2]). In this article, we used a collection of PCV2 full genomes, provided in the present manuscript, and several phylogenetic, phylodynamic and bioinformatic methods to investigate different aspects of PCV2 epidemiology, history and evolution (more thoroughly described in "Phylodynamic analysis of porcine circovirus type 2 reveals global waves of emerging genotypes and the circulation of recombinant forms" [3]). The methodological approaches used to consistently detect recombination events and estimate population dynamics and spreading patterns of rapidly evolving ssDNA viruses are reported herein. The programs used are described and the original scripts have been provided. The assembled databases used are also made available. These consist of a broad collection of complete genome sequences (i.e., 843 sequences: 63 complete genomes of PCV2a, 310 of PCV2b, 4 of PCV2c, 217 of PCV2d, 64 of CRF01, 140 of CRF02 and 45 of CRF03), divided into different ORFs (i.e., ORF1, ORF2 and intergenic regions), of PCV2 genotypes and major circulating recombinant forms (CRFs), properly annotated with the respective collection date and country. Globally, all of these data can be used as a starting point for further studies and for classification purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loulou, Richard; Waaub, Jean-Philippe; Zaccour, Georges
2005-07-01
This volume on energy and environmental modeling describes a broad variety of modeling methodologies. It includes chapters covering: The Sustainability of Economic Growth by Cabo, Martin-Herran & Martinez-Garcia; Abatement Scenarios in the Swiss Housing Sector by L. Drouet and others; Support and Planning for Off-Site Emergency Management, by Geldermann and others; Hybrid Energy-Economy Models, by Jaccard; The World-MARKAL Model and Its Application, by Kanudia and others; Methodology for Evaluating a Market of Tradable CO2 Permits, by Kunsch and Springael; MERGE - A Model for Global Climate Change, by Manne and Richels; A Linear Programming Model for Capacity Expansion in an Autonomous Power Generation System, by Mavrotas and Diakoulaki; Transport and Climate Policy Modeling in the Transport Sector, by Paltsev and others; Analysis of Ontario Electricity Capacity Requirements and Emissions, by Pineau and Schott; Environmental Damage in Energy/Environmental Policy Evaluation, by Van Regemorter. 71 figs.
NASA Astrophysics Data System (ADS)
Giordan, Daniele; Hayakawa, Yuichi; Nex, Francesco; Remondino, Fabio; Tarolli, Paolo
2018-04-01
The number of scientific studies that consider possible applications of remotely piloted aircraft systems (RPASs) for managing the effects of natural hazards and identifying the resulting damage has strongly increased in the last decade. Nowadays, in the scientific community, the use of these systems is not a novelty, but a deeper analysis of the literature shows a lack of codified, complex methodologies that can be used not only for scientific experiments but also for routine emergency operations. RPASs can acquire on-demand ultra-high-resolution images that can be used for the identification of active processes such as landslides or volcanic activity, but can also define the effects of earthquakes, wildfires and floods. In this paper, we present a review of the published literature that describes experimental methodologies developed for the study and monitoring of natural hazards.
Gupta, B L
1991-06-01
This review surveys the emergence of electron probe X-ray microanalysis as a quantitative method for measuring the chemical elements in situ. The extension of the method to the biological sciences under the influence of Ted Hall is reviewed. Some classical experiments by Hall and his colleagues in Cambridge, UK, previously unpublished, are described; as are some of the earliest quantitative results from the cryo-sections obtained in Cambridge and elsewhere. The progress of the methodology is critically evaluated from the earliest starts to the present state of the art. Particular attention has been focused on the application of the method in providing fresh insights into the role of ions in cell and tissue physiology and pathology. A comprehensive list of references is included for a further pursuit of the topics by the interested reader.
Plue, J; Colas, F; Auffret, A G; Cousins, S A O
2017-03-01
Persistent seed banks are a key plant regeneration strategy, buffering environmental variation to allow population and species persistence. Understanding seed bank functioning within herb layer dynamics is therefore important. However, rather than assessing emergence from the seed bank in herb layer gaps, most studies evaluate the seed bank functioning via a greenhouse census. We hypothesise that greenhouse data may not reflect seed bank-driven emergence in disturbance gaps due to methodological differences. Failure in detecting (specialist) species may then introduce methodological bias into the ecological interpretation of seed bank functions using greenhouse data. The persistent seed bank was surveyed in 40 semi-natural grassland plots across a fragmented landscape, quantifying seedling emergence in both the greenhouse and in disturbance gaps. Given the suspected interpretational bias, we tested whether each census uncovers similar seed bank responses to fragmentation. Seed bank characteristics were similar between censuses. Census type affected seed bank composition, with >25% of species retrieved better by either census type, dependent on functional traits including seed longevity, production and size. Habitat specialists emerged more in disturbance gaps than in the greenhouse, while the opposite was true for ruderal species. Both censuses uncovered fragmentation-induced seed bank patterns. Low surface area sampling, larger depth of sampling and germination conditions cause underrepresentation of the habitat-specialised part of the persistent seed bank flora during greenhouse censuses. Methodological bias introduced in the recorded seed bank data may consequently have significant implications for the ecological interpretation of seed bank community functions based on greenhouse data. © 2016 German Botanical Society and The Royal Botanical Society of the Netherlands.
Willingness of Emerging Adults to Engage in Consensual Non-Monogamy: A Mixed-Methods Analysis.
Sizemore, Kayla M; Olmstead, Spencer B
2018-07-01
Over the past decade, research on consensual non-monogamy (CNM) has increased. However, willingness to engage in CNM is an understudied phenomenon within this field. Because qualitative methods are rarely used to study this phenomenon, little is known about why individuals may or may not be willing to engage in CNM. Further, research on CNM has devoted little attention to the period of emerging adulthood. The current study used a mixed-methods approach to examine a sample of emerging adults' (ages 18-29; N = 549) willingness to engage in CNM. Results from a qualitative content analysis revealed three distinct groups (Unwilling, Willing, and Open-Minded), and several subthemes emerged within each group that help explain why emerging adults are willing to engage in CNM. Quantitative analyses examined the relationship between group membership and demographic characteristics, finding that a greater proportion of women and heterosexual participants were Unwilling. Results also indicated that a greater proportion of men were Willing, and a greater proportion of sexual minorities were Open-Minded. Group mean differences were examined using quantitative measures of CNM attitudes and willingness. The Unwilling group reported more negative attitudes towards CNM compared to the Open-Minded and Willing groups. Additionally, the Open-Minded group reported more negative attitudes compared to the Willing group. On the willingness to engage in CNM Scale, the Unwilling group had lower mean scores compared to the Willing and Open-Minded groups. The Willing group had higher mean scores compared to the Open-Minded group. Implications for CNM research and methodology are discussed.
An alternative method to measure the likelihood of a financial crisis in an emerging market
NASA Astrophysics Data System (ADS)
Özlale, Ümit; Metin-Özcan, Kıvılcım
2007-07-01
This paper utilizes an early warning system in order to measure the likelihood of a financial crisis in an emerging market economy. We introduce a methodology with which we can both obtain a likelihood series and analyze the time-varying effects of several macroeconomic variables on this likelihood. Since the issue is analyzed in a non-linear state space framework, the extended Kalman filter emerges as the optimal estimation algorithm. Taking the Turkish economy as our laboratory, the results indicate that both the derived likelihood measure and the estimated time-varying parameters are meaningful and can successfully explain the path that the Turkish economy followed between 2000 and 2006. The estimated parameters also suggest that an overvalued domestic currency, a current account deficit and an increase in default risk raise the likelihood of an economic crisis. Overall, the findings suggest that the estimation methodology introduced in this paper can also be applied to other emerging market economies.
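Although the paper's exact state-space specification is not reproduced here, the sketch below shows the generic extended Kalman filter recursion that such a nonlinear early-warning model relies on, applied to a toy one-dimensional "crisis likelihood" state; the measurement function, noise levels, and observations are assumptions for illustration only.

```python
# Generic extended Kalman filter step applied to a toy crisis-likelihood state.
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    # Predict
    x_pred = f(x)
    P_pred = F(x) @ P @ F(x).T + Q
    # Update
    y = z - h(x_pred)
    S = H(x_pred) @ P_pred @ H(x_pred).T + R
    K = P_pred @ H(x_pred).T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H(x_pred)) @ P_pred
    return x_new, P_new

# Toy model: latent index follows a random walk, observed through a sigmoid link.
f = lambda x: x
F = lambda x: np.eye(1)
h = lambda x: 1.0 / (1.0 + np.exp(-x))
H = lambda x: np.array([[np.exp(-x[0]) / (1.0 + np.exp(-x[0])) ** 2]])
Q, R = np.eye(1) * 0.01, np.eye(1) * 0.1

x, P = np.zeros(1), np.eye(1)
for z in (np.array([0.2]), np.array([0.6]), np.array([0.9])):  # fabricated signals
    x, P = ekf_step(x, P, z, f, F, h, H, Q, R)
print(x, P)
```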
Collaboration and patient safety at an emergency department - a qualitative case study.
Pedersen, Anna Helene Meldgaard; Rasmussen, Kurt; Grytnes, Regine; Nielsen, Kent Jacob
2018-03-19
Purpose The purpose of this paper is to examine how conflicts about collaboration between staff at different departments arose during the establishment of a new emergency department and how these conflicts affected the daily work and ultimately patient safety at the emergency department. Design/methodology/approach This qualitative single case study draws on qualitative semi-structured interviews and participant observation. The theoretical concepts "availability" and "receptiveness" as antecedents for collaboration will be applied in the analysis. Findings Close collaboration between departments was an essential precondition for the functioning of the new emergency department. The study shows how a lack of antecedents for collaboration affected the working relation and communication between employees and departments, which spurred negative feelings and reproduced conflicts. This situation was seen as a potential threat for the safety of the emergency patients. Research limitations/implications This study presents a single case study, at a specific point in time, and should be used as an illustrative example of how contextual and situational factors affect the working environment and through that patient safety. Originality/value Few studies provide an in-depth investigation of what actually takes place when collaboration between professional groups goes wrong and escalates, and how problems in collaboration may affect patient safety.
Repp, Kimberly K; Hawes, Eva; Rees, Kathleen J; Vorderstrasse, Beth; Mohnkern, Sue
2018-06-07
Conducting a large-scale Community Assessment for Public Health Emergency Response (CASPER) in a geographically and linguistically diverse county presents significant methodological challenges that require advance planning. The Centers for Disease Control and Prevention (CDC) has adapted methodology and provided a toolkit for a rapid needs assessment after a disaster. The assessment provides representative data of the sampling frame to help guide effective distribution of resources. This article describes methodological considerations and lessons learned from a CASPER exercise conducted by Washington County Public Health in June 2016 to assess community emergency preparedness. The CDC's CASPER toolkit provides detailed guidance for exercises in urban areas where city blocks are well defined with many single family homes. Converting the exercise to include rural areas with challenging geographical terrain, including accessing homes without public roads, required considerable adjustments in planning. Adequate preparations for vulnerable populations with English linguistic barriers required additional significant resources. Lessons learned are presented from the first countywide CASPER exercise in Oregon. Approximately 61% of interviews were completed, and 85% of volunteers reported they would participate in another CASPER exercise. Results from the emergency preparedness survey will be presented elsewhere. This experience indicates the most important considerations for conducting a CASPER exercise are oversampling clusters, overrecruiting volunteers, anticipating the actual cost of staff time, and ensuring timely language services are available during the event.
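For context, the baseline CASPER design is a two-stage cluster sample, commonly 30 clusters selected with probability proportional to housing units and then 7 households interviewed per cluster. The sketch below illustrates that baseline sampling step with made-up block groups and housing-unit counts; Washington County's adapted design and sampling frame may have differed.

```python
# Two-stage cluster sampling sketch (30 clusters x 7 households), toy frame.
import numpy as np

rng = np.random.default_rng(7)
housing_units = {f"block_group_{i}": int(h)
                 for i, h in enumerate(rng.integers(200, 2000, size=120))}

names = list(housing_units)
weights = np.array([housing_units[n] for n in names], dtype=float)
weights /= weights.sum()

# Stage 1: select 30 clusters with probability proportional to housing units
clusters = rng.choice(names, size=30, replace=False, p=weights)

# Stage 2: sample 7 household indices within each selected cluster
sample = {c: rng.choice(housing_units[c], size=7, replace=False) for c in clusters}
print(clusters[:3], sample[clusters[0]])
```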
Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca
2016-10-01
There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments affect both patients and nurses negatively. To validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes, such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and structural equation modelling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r from 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three emergent factors predicted the outcomes. Of the initial 42 items, the meta-tool is composed of 20 items capable of predicting the same outcomes as the original tools. © 2016 John Wiley & Sons, Ltd.
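A minimal sketch of the exploratory factor analysis step used to collapse overlapping assessment items into a few latent factors; the item count, factor structure, and data below are synthetic and purely illustrative, not the Brass/Barthel/Conley/Braden items themselves.

```python
# Exploratory factor analysis on synthetic item-level data (20 items, 3 factors).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
latent = rng.normal(size=(500, 3))                  # 3 underlying risk dimensions
loadings = rng.normal(size=(3, 20))
items = latent @ loadings + rng.normal(scale=0.5, size=(500, 20))

fa = FactorAnalysis(n_components=3).fit(items)
print(np.round(fa.components_, 2))                  # estimated loadings (3 x 20)
```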
In silico reconstitution of Listeria propulsion exhibits nano-saltation.
Alberts, Jonathan B; Odell, Garrett M
2004-12-01
To understand how the actin-polymerization-mediated movements in cells emerge from myriad individual protein-protein interactions, we developed a computational model of Listeria monocytogenes propulsion that explicitly simulates a large number of monomer-scale biochemical and mechanical interactions. The literature on actin networks and L. monocytogenes motility provides the foundation for a realistic mathematical/computer simulation, because most of the key rate constants governing actin network dynamics have been measured. We use a cluster of 80 Linux processors and our own suite of simulation and analysis software to characterize salient features of bacterial motion. Our "in silico reconstitution" produces qualitatively realistic bacterial motion with regard to speed and persistence of motion and actin tail morphology. The model also produces smaller scale emergent behavior; we demonstrate how the observed nano-saltatory motion of L. monocytogenes, in which runs punctuate pauses, can emerge from a cooperative binding and breaking of attachments between actin filaments and the bacterium. We describe our modeling methodology in detail, as it is likely to be useful for understanding any subcellular system in which the dynamics of many simple interactions lead to complex emergent behavior, e.g., lamellipodia and filopodia extension, cellular organization, and cytokinesis.
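As a loose, hypothetical illustration (not the authors' monomer-scale model) of how cooperative breaking of bacterium-filament attachments can yield run-and-pause motion, the sketch below shares a fixed load across the remaining attachments so that each detachment accelerates the next; all rates, forces, and step sizes are assumptions.

```python
# Toy load-sharing attach/detach model producing saltatory (run/pause) motion.
import numpy as np

rng = np.random.default_rng(5)
F_TOTAL = 10.0     # total propulsive load shared by attachments (arbitrary units)
K_ON = 4.0         # attachment formation rate
K_OFF0 = 0.5       # unloaded detachment rate per attachment
F_SCALE = 2.0      # force scale for load-accelerated detachment (Bell-like law)
STEP = 1.0         # distance advanced per fully detached event (assumed)

n_att, pos, t = 5, 0.0, 0.0
for _ in range(20000):
    k_off = n_att * K_OFF0 * np.exp((F_TOTAL / n_att) / F_SCALE) if n_att else 0.0
    total = K_ON + k_off
    t += rng.exponential(1.0 / total)
    if rng.random() < K_ON / total:
        n_att += 1          # a new tether forms: motion stalls (pause)
    else:
        n_att -= 1          # a tether breaks: remaining tethers bear more load
    if n_att == 0:
        pos += STEP         # fully detached: the bacterium lurches forward (run)

print(f"time {t:.1f}, position {pos:.1f} (arbitrary units)")
```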
"Grey" Areas and "Organized Chaos" in Emergency Response
ERIC Educational Resources Information Center
Taber, Nancy; Plumb, Donovan; Jolemore, Shawn
2008-01-01
Purpose: The purpose of this research is to explore the interaction between organizational policies and daily work practices of paramedics and firefighters within two emergency response organizations. Design/methodology/approach: Data were collected in a case study consisting of interviews, focus groups, and observations. The theoretical grounding…
Almedom, Astier M.; Tesfamichael, Berhe; Yacob, Abdu; Debretsion, Zaïd; Teklehaimanot, Kidane; Beyene, Teshome; Kuhn, Kira; Alemu, Zemui
2003-01-01
OBJECTIVE: To establish the context in which maternal psychosocial well-being is understood in war-affected settings in Eritrea. METHOD: Pretested and validated participatory methods and tools of investigation and analysis were employed to allow participants to engage in processes of qualitative data collection, on-site analysis, and interpretation. FINDINGS: Maternal psychosocial well-being in Eritrea is maintained primarily by traditional systems of social support that are mostly outside the domain of statutory primary care. Traditional birth attendants provide a vital link between the two. Formal training and regular supplies of sterile delivery kits appear to be worthwhile options for health policy and practice in the face of the post-conflict challenges of ruined infrastructure and an overstretched and/or ill-mannered workforce in the maternity health service. CONCLUSION: Methodological advances in health research and the dearth of data on maternal psychosocial well-being in complex emergency settings call for scholars and practitioners to collaborate in creative searches for sound evidence on which to base maternity, mental health and social care policy and practice. Participatory methods facilitate the meaningful engagement of key stakeholders and enhance data quality, reliability and usability. PMID:12856054
1988-01-01
... ignored, but the Volkersen model is extended to include adherend deformations, as will be discussed. STATISTICAL METHODOLOGY FOR DESIGN ALLOWABLES [15-17] ... structure. In the certification methodology, the development test program and the calculation of composite design allowables are orchestrated to support ... Development of design methodology for thick composites and their test methods. (b) Role of the interface in emerging composite systems.
ERIC Educational Resources Information Center
Marr, Vanessa L.
2014-01-01
This essay explores the autoethnographic possibilities of critical service-learning research and the emerging realities of a community-centered womanist methodological response. Drawing from Alice Walker's definition of "womanist" as a commitment to "survival and wholeness of entire people, male 'and' female," the author argues…
IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning
ERIC Educational Resources Information Center
Winters, Niall; Mor, Yishay
2008-01-01
One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…
ERIC Educational Resources Information Center
Savvides, Nicola; Al-Youssef, Joanna; Colin, Mindy; Garrido, Cecilia
2014-01-01
This article highlights key theoretical and methodological issues and implications of being an insider/outsider when undertaking qualitative research in international educational settings. It first addresses discourses of "self" and "other," noting that identity and belonging emerge from fluid engagement between researchers and…
Qualitative Research: Emerging Opportunity in Business Education
ERIC Educational Resources Information Center
Gaytan, Jorge
2007-01-01
The purpose of this qualitative study was to examine the research methods used in articles published in "The Delta Pi Epsilon Journal" and the "NABTE Review" between 2001 and 2005 to determine the extent to which qualitative research methodologies have been employed by researchers and the extent to which these research methodologies were clearly…
Philosophical underpinnings of an emergent methodology for nursing as caring inquiry.
Schoenhofer, Savina O
2002-10-01
Two research approaches congruent with the theory of nursing as caring are described: group interpretive phenomenology and nursing as caring research as praxis. The idea of praxis and the theory of communicative action are explored for congruence as philosophical underpinnings of a possible eventual methodology for developing knowledge of nursing as caring.
ERIC Educational Resources Information Center
Gauthier, Benoit; And Others
1997-01-01
Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)
ERIC Educational Resources Information Center
Hare, Kathleen A.; Dubé, Anik; Marshall, Zack; Gahagan, Jacqueline; Harris, Gregory E.; Tucker, Maryanne; Dykeman, Margaret; MacDonald, Jo-Ann
2016-01-01
Policy scoping reviews are an effective method for generating evidence-informed policies. However, when applying guiding methodological frameworks to complex policy evidence, numerous, unexpected challenges can emerge. This paper details five challenges experienced and addressed by a policy trainee-led, multi-disciplinary research team, while…
Learning through Work: Emerging Perspectives and New Challenges
ERIC Educational Resources Information Center
Billett, Stephen; Choy, Sarojni
2013-01-01
Purpose: This paper aims to consider and appraise current developments and emerging perspectives on learning in the circumstances of work, to propose how some of the challenges for securing effective workplace learning may be redressed. Design/methodology/approach: First, new challenges and perspectives on learning in the circumstances of work are…
Channeling the Innovation Stream: A Decision Framework for Selecting Emerging Technologies
ERIC Educational Resources Information Center
Sauer, Philip S.
2010-01-01
The proliferation of emerging technologies offers opportunity but also presents challenges to defense acquisition decision makers seeking to incorporate those technologies as part of the acquisition process. Assessment frameworks and methodologies found in the literature typically address the primary focus of a sponsoring organization's interest…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, J J; Gallagher, D W; Modarres, M
Appendices are presented concerning isolation condenser makeup; vapor suppression system; station air system; reactor building closed cooling water system; turbine building secondary closed water system; service water system; emergency service water system; fire protection system; emergency ac power; dc power system; event probability estimation; methodology of accident sequence quantification; and assignment of dominant sequences to release categories.
Preparing Emerging Doctoral Scholars for Transdisciplinary Research: A Developmental Approach
ERIC Educational Resources Information Center
Kemp, Susan Patricia; Nurius, Paula S.
2015-01-01
Research models that bridge disciplinary, theoretical, and methodological boundaries are increasingly common as funders and the public push for effective responses to pressing social problems. Although social work is inherently an integrative discipline, there is growing recognition of the need to better prepare emerging scholars for sophisticated…
ERIC Educational Resources Information Center
Mathews, Sarah A.; Lovett, Maria K.
2017-01-01
Video participatory research (VPR) is an emergent methodology that bridges visual methods with the epistemology of participatory research. This approach is motivated by the "crisis of representation" or "reflective turn" (Gubrium & Harper, 2013) that promotes research conducted with or by participants, conceptualizing…
ERIC Educational Resources Information Center
Haegele, Justin A.; Hodge, Samuel R.
2015-01-01
Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…
ERIC Educational Resources Information Center
Lerman, Dorothea C.; Parten, Mandy; Addison, Laura R.; Vorndran, Christina M.; Volkert, Valerie M.; Kodak, Tiffany
2005-01-01
An approach based on Skinner's (1957) theory of verbal behavior has been developed to understand and teach elementary communication skills to children with autism and developmental disabilities (Sundberg & Partington, 1998). However, few studies have directly examined the characteristics of emerging language in children with developmental…
Bond, William F; Hui, Joshua; Fernandez, Rosemarie
2018-02-01
Over the past decade, emergency medicine (EM) took a lead role in healthcare simulation in part due to its demands for successful interprofessional and multidisciplinary collaboration, along with educational needs in a diverse array of cognitive and procedural skills. Simulation-based methodologies have the capacity to support training and research platforms that model micro-, meso-, and macrosystems of healthcare. To fully capitalize on the potential of simulation-based research to improve emergency healthcare delivery will require the application of rigorous methods from engineering, social science, and basic science disciplines. The Academic Emergency Medicine (AEM) Consensus Conference "Catalyzing System Change Through Healthcare Simulation: Systems, Competency, and Outcome" was conceived to foster discussion among experts in EM, engineering, and social sciences, focusing on key barriers and opportunities in simulation-based research. This executive summary describes the overall rationale for the conference, conference planning, and consensus-building approaches and outlines the focus of the eight breakout sessions. The consensus outcomes from each breakout session are summarized in proceedings papers published in this issue of Academic Emergency Medicine. Each paper provides an overview of methodologic and knowledge gaps in simulation research and identifies future research targets aimed at improving the safety and quality of healthcare. © 2017 by the Society for Academic Emergency Medicine.
Dehnavieh, Reza; Ebrahimipour, Hossein; Molavi-Taleghani, Yasamin; Vafaee-Najar, Ali; Noori Hekmat, Somayeh; Esmailzdeh, Hamid
2014-12-25
The pediatric emergency department is considered a high-risk area, and blood transfusion is a distinctive clinical intervention; this study was therefore conducted to perform a proactive risk assessment of the blood transfusion process in the Pediatric Emergency Department of the Qaem education-treatment center in Mashhad, using the Healthcare Failure Mode and Effects Analysis (HFMEA) methodology. This cross-sectional study analyzed the failure modes and effects of the blood transfusion process using a mixed quantitative-qualitative method. Proactive HFMEA was used to identify and analyze potential failures of the process. The information for the items in the HFMEA forms was collected after reaching a consensus of the expert panel's views through interviews and focus group discussion sessions. A total of 77 failure modes were identified across 24 sub-processes within 8 processes of blood transfusion. Thirteen failure modes were identified as posing non-acceptable risk (a hazard score above 8) and were transferred to the decision tree. Root causes of the high-risk modes were discussed in cause-and-effect meetings and were classified according to the UK National Health Service (NHS) approved classification model. Action types were classified as acceptance (11.6%), control (74.2%) and elimination (14.2%). Recommendations were placed in 7 categories using TRIZ (the Theory of Inventive Problem Solving). Re-engineering the process for the required changes, standardizing and updating the blood transfusion procedure, root cause analysis of catastrophic transfusion events, patient identification bracelets, training classes and educational pamphlets to raise staff awareness, and monthly meetings of the transfusion medicine committee were all adopted as executive strategies in the pediatric emergency work agenda.
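For readers unfamiliar with HFMEA scoring, a minimal sketch of the hazard-scoring step is shown below. The failure modes and the 1-4 severity/probability scales are illustrative assumptions, not data from the study; only the rule that a hazard score above 8 is sent to the decision tree mirrors the abstract.

```python
# Minimal sketch of HFMEA hazard scoring (assumed 1-4 severity/probability scales).
# The failure modes below are hypothetical placeholders, not data from the study.
failure_modes = [
    {"name": "mislabelled blood sample", "severity": 4, "probability": 3},
    {"name": "delayed crossmatch result", "severity": 3, "probability": 2},
    {"name": "wrong patient identification", "severity": 4, "probability": 2},
]

for fm in failure_modes:
    fm["hazard_score"] = fm["severity"] * fm["probability"]
    # Scores above 8 are treated as non-acceptable and sent to the decision tree.
    fm["to_decision_tree"] = fm["hazard_score"] > 8

for fm in failure_modes:
    print(f'{fm["name"]:32s} score={fm["hazard_score"]:2d} '
          f'decision_tree={fm["to_decision_tree"]}')
```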
Jennings, Natasha; Clifford, Stuart; Fox, Amanda R; O'Connell, Jane; Gardner, Glenn
2015-01-01
To provide the best available evidence to determine the impact of nurse practitioner services on cost, quality of care, satisfaction and waiting times in the emergency department for adult patients. The delivery of quality care in the emergency department is emerging as one of the most important service indicators in health delivery. Increasing service pressures in the emergency department have resulted in the adoption of service innovation models: the most common and rapidly expanding of these is emergency nurse practitioner services. The rapid uptake of emergency nurse practitioner service in Australia has outpaced the capacity to evaluate this service model in terms of outcomes related to safety and quality of patient care. Previous research is now outdated and not commensurate with the changing domain of delivering emergency care with nurse practitioner services. A comprehensive search of four electronic databases from 2006 to 2013 was conducted to identify research evaluating nurse practitioner service impact in the emergency department. English language articles were sought using MEDLINE, CINAHL, Embase and Cochrane and included two previous systematic reviews completed five and seven years ago. A three step approach was used. Following a comprehensive search, two reviewers assessed all identified studies against the inclusion criteria. From the original 1013 studies, 14 papers were retained for critical appraisal on methodological quality by two independent reviewers and data were extracted using standardised tools. Narrative synthesis was conducted to summarise and report the findings as insufficient data was available for meta-analysis of results. This systematic review has shown that emergency nurse practitioner service has a positive impact on quality of care, patient satisfaction and waiting times. There was insufficient evidence to draw conclusions regarding outcomes of a cost benefit analysis. Synthesis of the available research attempts to provide an evidence base for emergency nurse practitioner service to guide healthcare leaders, policy makers and clinicians in reform of emergency service provision. The findings suggest that further high quality research is required for comparative measures of clinical and service effectiveness of emergency nurse practitioner service. In the context of increased health service demand and the need to provide timely and effective care to patients, such measures will assist in evidence based health service planning. Copyright © 2014 Elsevier Ltd. All rights reserved.
Fajardo-Ortiz, David; Duran, Luis; Moreno, Laura; Ochoa, Hector; Castaño, Victor M
2014-09-03
We explored how the knowledge translation and innovation processes are structured when they result in innovations, as in the case of liposomal doxorubicin research. In order to map the processes, a literature network analysis was performed using Cytoscape, and semantic analysis was performed with GOPubMed, which is based on the controlled vocabularies MeSH (Medical Subject Headings) and GO (Gene Ontology). We found clusters related to different stages of the technological development (invention, innovation and imitation) and the knowledge translation process (preclinical, translational and clinical research), and we were able to map the historic emergence of Doxil as a paradigmatic nanodrug. This research could be a powerful methodological tool for decision-making and innovation management in drug delivery research.
Behavioral economics and empirical public policy.
Hursh, Steven R; Roma, Peter G
2013-01-01
The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively different reinforcers as well as quantifying the choice relations between concurrently available reinforcers. The potential of the behavioral economic approach to inform public policy is illustrated with examples from basic research, pre-clinical behavioral pharmacology, and clinical drug abuse research as well as emerging applications to public transportation and social behavior. Behavioral Economics can serve as a broadly applicable conceptual, methodological, and analytical framework for the development and evaluation of empirical public policy. © Society for the Experimental Analysis of Behavior.
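As a hedged illustration of the demand-curve methods mentioned above, the sketch below fits an exponential demand function of the form log10 Q = log10 Q0 + k(e^(-alpha*Q0*C) - 1), the form popularized by Hursh and Silberberg; whether the paper relies on exactly this equation is an assumption here, and the consumption data, starting values and fixed k are invented for illustration only.

```python
# Sketch: fitting an exponential demand curve to illustrative consumption data.
import numpy as np
from scipy.optimize import curve_fit

prices = np.array([0.1, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # unit price C
consumption = np.array([98, 92, 80, 61, 35, 12, 3.0])     # observed consumption Q

def log_demand(C, Q0, alpha, k=2.0):
    # log10 Q = log10 Q0 + k * (exp(-alpha * Q0 * C) - 1); k is fixed for simplicity.
    return np.log10(Q0) + k * (np.exp(-alpha * Q0 * C) - 1.0)

params, _ = curve_fit(log_demand, prices, np.log10(consumption),
                      p0=[100.0, 0.001], maxfev=10000)
Q0_hat, alpha_hat = params
print(f"Q0 ~ {Q0_hat:.1f}, alpha ~ {alpha_hat:.5f}")
# alpha indexes "essential value": smaller alpha means demand is less price sensitive,
# which is what allows qualitatively different reinforcers to be compared on one scale.
```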
Emerging technologies for the changing global market
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration in current NASA propulsion systems.
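A QFD-style prioritization of the kind described can be sketched as a weighted scoring matrix. The criteria follow the categories named in the abstract, but the weights, candidate technologies and scores below are illustrative assumptions, not project data.

```python
# Sketch of a QFD-style weighted prioritization matrix (illustrative values only).
import numpy as np

criteria = ["environmental", "cost", "safety", "reliability", "programmatic"]
weights = np.array([0.30, 0.20, 0.25, 0.15, 0.10])   # assumed relative importance

candidates = ["replacement solvent A", "replacement solvent B", "process change C"]
# Rows: candidates; columns: criteria, scored 1 (poor) to 9 (excellent) by the team.
scores = np.array([
    [9, 3, 7, 5, 5],
    [7, 7, 5, 7, 3],
    [5, 9, 3, 9, 7],
])

totals = scores @ weights
for name, total in sorted(zip(candidates, totals), key=lambda t: -t[1]):
    print(f"{name:22s} weighted score = {total:.2f}")
```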
Synthetic consciousness: the distributed adaptive control perspective.
Verschure, Paul F M J
2016-08-19
Understanding the nature of consciousness is one of the grand outstanding scientific challenges. The fundamental methodological problem is how phenomenal first person experience can be accounted for in a third person verifiable form, while the conceptual challenge is to both define its function and physical realization. The distributed adaptive control theory of consciousness (DACtoc) proposes answers to these three challenges. The methodological challenge is answered relative to the hard problem and DACtoc proposes that it can be addressed using a convergent synthetic methodology using the analysis of synthetic biologically grounded agents, or quale parsing. DACtoc hypothesizes that consciousness in both its primary and secondary forms serves the ability to deal with the hidden states of the world and emerged during the Cambrian period, affording stable multi-agent environments to emerge. The process of consciousness is an autonomous virtualization memory, which serializes and unifies the parallel and subconscious simulations of the hidden states of the world that are largely due to other agents and the self with the objective to extract norms. These norms are in turn projected as value onto the parallel simulation and control systems that are driving action. This functional hypothesis is mapped onto the brainstem, midbrain and the thalamo-cortical and cortico-cortical systems and analysed with respect to our understanding of deficits of consciousness. Subsequently, some of the implications and predictions of DACtoc are outlined, in particular, the prediction that normative bootstrapping of conscious agents is predicated on an intentionality prior. In the view advanced here, human consciousness constitutes the ultimate evolutionary transition by allowing agents to become autonomous with respect to their evolutionary priors leading to a post-biological Anthropocene. This article is part of the themed issue 'The major synthetic evolutionary transitions'. © 2016 The Author(s).
Harold, P D; de Souza, A S; Louchart, P; Russell, D; Brunt, H
2014-11-01
Hazardous and noxious chemicals are increasingly being transported by sea. Current estimates indicate some 2000 hazardous and noxious substances (HNS) are carried regularly by sea with bulk trade of 165 million tonnes per year worldwide. Over 100 incidents involving HNS have been reported in EU waters. Incidents occurring in a port or coastal area can have potential and actual public health implications. A methodology has been developed for prioritisation of HNS, based upon potential public health risks. The work, undertaken for the Atlantic Region Pollution Response programme (ARCOPOL), aims to provide information for incident planning and preparedness. HNS were assessed using conventional methodology based upon acute toxicity, behaviour and reactivity. Tonnage was used as a proxy for likelihood, although other factors such as shipping frequency and local navigation may also contribute. Analysis of 350 individual HNS identified the highest priority HNS as being those that present an inhalation risk. Limitations were identified around obtaining accurate data on HNS handled on a local and regional level due to a lack of port records and also political and commercial confidentiality issues. To account for this, the project also developed a software tool capable of combining chemical data from the study with user-defined shipping data to be used by operators to produce area-specific prioritisations. In conclusion, a risk prioritisation matrix has been developed to assess the acute risks to public health from the transportation of HNS. Its potential use in emergency planning and preparedness is discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
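The risk prioritisation can be illustrated as a simple matrix combining a hazard score (derived from acute toxicity, behaviour and reactivity) with annual tonnage as the likelihood proxy. The substances, scores and band cut-offs below are illustrative assumptions, not values from the ARCOPOL work.

```python
# Sketch of an HNS risk prioritisation matrix: hazard score x tonnage band.
# Substances, scores and band cut-offs are illustrative assumptions.

def tonnage_band(tonnes_per_year):
    """Map annual tonnage (used as a likelihood proxy) to a 1-3 band."""
    if tonnes_per_year >= 1_000_000:
        return 3
    if tonnes_per_year >= 100_000:
        return 2
    return 1

substances = [
    # name, hazard score 1 (low) to 3 (high, e.g. inhalation risk), tonnes per year
    ("ammonia (anhydrous)", 3, 2_500_000),
    ("sulphuric acid", 2, 4_000_000),
    ("vegetable oil", 1, 6_000_000),
]

ranked = sorted(
    ((name, hazard * tonnage_band(t)) for name, hazard, t in substances),
    key=lambda x: -x[1],
)
for name, priority in ranked:
    print(f"{name:22s} priority = {priority}")
```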
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
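A minimal sketch of the control-limit arithmetic for an individuals/moving-range (XmR) chart follows. The daily door-to-provider times are invented for illustration, and the conventional constants (2.66 for the individuals chart, 3.267 for the range chart) are used for the limits.

```python
# Sketch: individuals and moving-range (XmR) control limits for an ED metric.
# The daily median door-to-provider times (minutes) below are illustrative.
import numpy as np

x = np.array([34, 41, 38, 29, 45, 52, 36, 40, 33, 47, 39, 44, 31, 58, 37])

moving_range = np.abs(np.diff(x))
x_bar = x.mean()
mr_bar = moving_range.mean()

# Conventional XmR constants: 2.66 for individuals, 3.267 for the range chart.
ucl_x, lcl_x = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar

# Points outside the limits indicate special-cause variation (signal), not noise.
signals = np.where((x > ucl_x) | (x < lcl_x))[0]
print(f"mean={x_bar:.1f}, UCL={ucl_x:.1f}, LCL={lcl_x:.1f}, MR UCL={ucl_mr:.1f}")
print("special-cause points (indices):", signals.tolist())
```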
Twitmographics: Learning the Emergent Properties of the Twitter Community
NASA Astrophysics Data System (ADS)
Cheong, Marc; Lee, Vincent
This paper presents a framework for discovery of the emergent properties of users of the Twitter microblogging platform. The novelty of our methodology is the use of machine-learning methods to deduce user demographic information and online usage patterns and habits not readily apparent from the raw messages posted on Twitter. This is different from existing social network analysis performed on de facto social networks such as Facebook, in the sense that we use publicly available metadata from Twitter messages to explore the inherent characteristics about different segments of the Twitter community, in a simple yet effective manner. Our framework is coupled with the self-organizing map visualization method, and tested on a corpus of messages which deal with issues of socio-political and economic impact, to gain insight into the properties of human interaction via Twitter as a medium for computer-mediated self-expression.
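The self-organizing map step can be sketched with a small NumPy implementation. The feature matrix below (posting hour, message length, log follower count) is a hypothetical stand-in for Twitter metadata, and the tiny SOM is a generic implementation, not the authors' code.

```python
# Minimal self-organizing map (SOM) sketch over hypothetical Twitter metadata features.
import numpy as np

rng = np.random.default_rng(0)
# Assumed features per user/message: posting hour, message length, log10(followers).
X = np.column_stack([
    rng.integers(0, 24, 300),
    rng.normal(90, 30, 300),
    rng.normal(2.5, 0.8, 300),
]).astype(float)
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize features

gx, gy, dim = 6, 6, X.shape[1]
W = rng.normal(size=(gx, gy, dim))              # SOM weight grid
coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)

n_iter, sigma0, lr0 = 2000, 2.0, 0.5
for t in range(n_iter):
    x = X[rng.integers(len(X))]
    bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)), (gx, gy))
    frac = t / n_iter
    sigma = sigma0 * (1 - frac) + 0.5 * frac    # shrink neighbourhood over time
    lr = lr0 * (1 - frac) + 0.01 * frac         # decay learning rate over time
    dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
    W += lr * h * (x - W)                       # pull neighbourhood toward the sample

# Map each record to its best-matching unit; cells group similar usage patterns.
bmus = [np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)), (gx, gy)) for x in X]
print("first five best-matching units:", bmus[:5])
```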
Mitochondrial network complexity emerges from fission/fusion dynamics.
Zamponi, Nahuel; Zamponi, Emiliano; Cannas, Sergio A; Billoni, Orlando V; Helguera, Pablo R; Chialvo, Dante R
2018-01-10
Mitochondrial networks exhibit a variety of complex behaviors, including coordinated cell-wide oscillations of energy states as well as a phase transition (depolarization) in response to oxidative stress. Since functional and structural properties are often intertwined, here we characterized the structure of mitochondrial networks in mouse embryonic fibroblasts using network tools and percolation theory. Subsequently, we perturbed the system either by promoting the fusion of mitochondrial segments or by inducing mitochondrial fission. Quantitative analysis of mitochondrial clusters revealed that the structural parameters of healthy mitochondria lay between the extremes of highly fragmented and completely fused networks. We confirmed our results by contrasting our empirical findings with the predictions of a recently described computational model of mitochondrial network emergence based on fission-fusion kinetics. Altogether, these results offer not only an objective methodology to parametrize the complexity of this organelle but also support the idea that mitochondrial networks behave as critical systems and undergo structural phase transitions.
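The cluster-level quantification can be sketched with networkx: treat mitochondrial segments as nodes and contacts as edges, then track the component-size distribution and the fraction of nodes in the largest cluster, the usual percolation order parameter. The random geometric graph below stands in for a network extracted from segmented images.

```python
# Sketch: percolation-style cluster statistics on a mitochondrial-network stand-in.
# A random geometric graph substitutes for a graph extracted from segmented images.
import networkx as nx
import numpy as np

G = nx.random_geometric_graph(400, radius=0.08, seed=1)

components = sorted((len(c) for c in nx.connected_components(G)), reverse=True)
n_nodes = G.number_of_nodes()

largest_fraction = components[0] / n_nodes       # percolation order parameter
mean_cluster = np.mean(components)
print(f"clusters={len(components)}, mean size={mean_cluster:.1f}, "
      f"largest cluster fraction={largest_fraction:.2f}")

# Sweeping the connection radius (or fusion/fission rates in a model) and tracking
# the largest-cluster fraction shows whether the network sits near a critical point.
```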
Dowall, Stuart D; Graham, Victoria A; Tipton, Thomas R W; Hewson, Roger
2009-08-31
Work with highly pathogenic material mandates the use of biological containment facilities, involving microbiological safety cabinets and specialist laboratory engineering structures typified by containment level 3 (CL3) and CL4 laboratories. Consequences of working in high containment are the practical difficulties associated with containing specialist assays and equipment often essential for experimental analyses. In an era of increased interest in biodefence pathogens and emerging diseases, immunological analysis has developed rapidly alongside traditional techniques in virology and molecular biology. For example, in order to maximise the use of small sample volumes, multiplexing has become a more popular and widespread approach to quantify multiple analytes simultaneously, such as cytokines and chemokines. The Luminex microsphere system allows for the detection of many cytokines and chemokines in a single sample, but the detection method of using aligned lasers and fluidics means that samples often have to be analysed in low containment facilities. In order to perform cytokine analysis in materials from high containment (CL3 and CL4 laboratories), we have developed an appropriate inactivation methodology applied after the staining steps which, although it results in a reduction in median fluorescence intensity, produces statistically comparable outcomes when judged against non-inactivated samples. This methodology thus extends the use of Luminex technology for material that contains highly pathogenic biological agents.
ERIC Educational Resources Information Center
Edwards, Lisa M.; Pedrotti, Jennifer Teramoto
2008-01-01
This study describes a comprehensive content and methodological review of articles about multiracial issues in 6 journals related to counseling up to the year 2006. The authors summarize findings about the 18 articles that emerged from this review of the "Journal of Counseling Psychology," "Journal of Counseling & Development," "The Counseling…
ERIC Educational Resources Information Center
Stassart, Pierre Marie; Mathieu, Valerie; Melard, Francois
2011-01-01
This paper proposes a new way for sociology, through both methodology and theory, to understand the reality of social groups and their "minority practices." It is based on an experiment that concerns a very specific category of agriculturalists called "pluriactive" stock farmers. These stock farmers, who engage in raising livestock part-time…
Policy Mobilities and Methodology: A Proposition for Inventive Methods in Education Policy Studies
ERIC Educational Resources Information Center
Gulson, Kalervo N.; Lewis, Steven; Lingard, Bob; Lubienski, Christopher; Takayama, Keita; Webb, P. Taylor
2017-01-01
The argument of this paper is that new methodologies associated with the emerging field of "policy mobilities" can be applied, and are in fact required, to examine and research the networked and relational, or "topological", nature of globalised education policy, which cuts across the new spaces of policymaking and new modes of…
ERIC Educational Resources Information Center
Karpova, Natalia Konstantinovna; Uvarovsky, Alexander Pavlovich; Mareev, Vladimir Ivanovich; Petrova, Nina Petrovna; Borzilov, Yuri Petrovich
2016-01-01
The article is devoted to some methodological features of modernization in modern education in the context of "the knowledge economy" development which is aimed at shaping new mental potential of the modern state. Features and prerequisites for the emergence of the knowledge economy, the priorities of which include the development and…
Methodological Flaws in Corpus-Based Studies on Malaysian ESL Textbooks
ERIC Educational Resources Information Center
Zarifi, Abdolvahed; Mukundan, Jayakaran; Rezvani Kalajahi, Seyed Ali
2014-01-01
With the increasing interest among the pedagogy researchers in the use of corpus linguistics methodologies to study textbooks, there has emerged a similar enthusiasm among the materials developers to draw on empirical findings in the development of the state-of-the-art curricula and syllabi. In order for these research findings to have their…
ERIC Educational Resources Information Center
Bonometti, Patrizia
2012-01-01
Purpose: The aim of this contribution is to describe a new complexity-science-based approach for improving safety, quality and efficiency and the way it was implemented by TenarisDalmine. Design/methodology/approach: This methodology is called "a safety-building community". It consists of a safety-behaviour social self-construction…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-28
... You can search for the document by entering "Public Notice" in the Search bar. If necessary, use... the time and cost burden for this proposed collection, including the validity of the methodology and... Methodology: The Bureau of Consular Affairs will be posting this form on Department of State Web sites to give...
ERIC Educational Resources Information Center
Lincoln, Yvonna S.; Gonzalez y Gonzalez, Elsa M.
2008-01-01
Many non-Western and non-English-speaking scholars express the need for supporting a methodological approach that foregrounds the voices of nationals and locals (or indigenous peoples). Supporting this stance, Western scholars will reach out in democratic and liberatory ways that effect research collaboration, helping to foster social justice and…
Eastwood, John Graeme; Jalaludin, Bin Badrudin; Kemp, Lynn Ann; Phung, Hai Ngoc
2014-01-01
We have previously reported in this journal on an ecological study of perinatal depressive symptoms in South Western Sydney. In that article, we briefly reported on a factor analysis that was utilized to identify empirical indicators for analysis. In this article, we report on the mixed method approach that was used to identify those latent variables. Social epidemiology has been slow to embrace a latent variable approach to the study of social, political, economic, and cultural structures and mechanisms, partly for philosophical reasons. Critical realist ontology and epistemology have been advocated as an appropriate methodological approach to both theory building and theory testing in the health sciences. We describe here an emergent mixed method approach that uses qualitative methods to identify latent constructs followed by factor analysis using empirical indicators chosen to measure identified qualitative codes. Comparative analysis of the findings is reported together with a limited description of realist approaches to abstract reasoning.
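The quantitative arm, factor analysis over empirical indicators mapped to qualitative codes, might be sketched as below; the indicator matrix is simulated and the two latent constructs are hypothetical, not the study's data.

```python
# Sketch: exploratory factor analysis over area-level indicators (simulated data).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
n_areas = 200
# Two hypothetical latent constructs (e.g. "socioeconomic stress", "social capital").
latent = rng.normal(size=(n_areas, 2))
loadings = np.array([[0.8, 0.1], [0.7, 0.0], [0.6, 0.2],   # indicators 1-3
                     [0.1, 0.9], [0.0, 0.8], [0.2, 0.7]])  # indicators 4-6
X = latent @ loadings.T + 0.3 * rng.normal(size=(n_areas, 6))

fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(X)              # latent variable scores per area
print(np.round(fa.components_, 2))        # estimated loadings (factors x indicators)
```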
Improving ED specimen TAT using Lean Six Sigma.
Sanders, Janet H; Karr, Tedd
2015-01-01
Lean and Six Sigma are continuous improvement methodologies that have garnered international fame for improving manufacturing and service processes. Increasingly these methodologies are demonstrating their power to also improve healthcare processes. The purpose of this paper is to discuss a case study for the application of Lean and Six Sigma tools in the reduction of turnaround time (TAT) for Emergency Department (ED) specimens. This application of the scientific methodologies uncovered opportunities to improve the entire ED to lab system for the specimens. This case study provides details on the completion of a Lean Six Sigma project in a 1,000 bed tertiary care teaching hospital. Six Sigma's Define, Measure, Analyze, Improve, and Control methodology is very similar to good medical practice: first, relevant information is obtained and assembled; second, a careful and thorough diagnosis is completed; third, a treatment is proposed and implemented; and fourth, checks are made to determine if the treatment was effective. Lean's primary goal is to do more with less work and waste. The Lean methodology was used to identify and eliminate waste through rapid implementation of change. The initial focus of this project was the reduction of turn-around-times for ED specimens. However, the results led to better processes for both the internal and external customers of this and other processes. The project results included: a 50 percent decrease in vials used for testing, a 50 percent decrease in unused or extra specimens, a 90 percent decrease in ED specimens without orders, a 30 percent decrease in complete blood count analysis (CBCA) Median TAT, a 50 percent decrease in CBCA TAT Variation, a 10 percent decrease in Troponin TAT Variation, an 18.2 percent decrease in URPN TAT Variation, and a 2-5 minute decrease in ED registered nurses' rainbow draw time. This case study demonstrated how the quantitative power of Six Sigma and the speed of Lean worked in harmony to improve the blood draw process for a 1,000 bed tertiary care teaching hospital. The blood draw process is a standard process used in hospitals to collect blood chemistry and hematology information for clinicians. The methods used in this case study demonstrated valuable and practical applications of process improvement methodologies that can be used for any hospital process and/or service environment. While this is not the first case study that has demonstrated the use of continuous process improvement methodologies to improve a hospital process, it is unique in the way in which it utilizes the strength of the project-focussed approach that adheres more to the structure and rigor of Six Sigma and relies less on the speed of Lean. Additionally, the application of these methodologies in healthcare is emerging research.
ON IDENTIFIABILITY OF NONLINEAR ODE MODELS AND APPLICATIONS IN VIRAL DYNAMICS
MIAO, HONGYU; XIA, XIAOHUA; PERELSON, ALAN S.; WU, HULIN
2011-01-01
Ordinary differential equations (ODE) are a powerful tool for modeling dynamic processes with wide applications in a variety of scientific fields. Over the last 2 decades, ODEs have also emerged as a prevailing tool in various biomedical research fields, especially in infectious disease modeling. In practice, it is important and necessary to determine unknown parameters in ODE models based on experimental data. Identifiability analysis is the first step in determining unknown parameters in ODE models, and such analysis techniques for nonlinear ODE models are still under development. In this article, we review identifiability analysis methodologies for nonlinear ODE models developed in the past one to two decades, including structural identifiability analysis, practical identifiability analysis and sensitivity-based identifiability analysis. Some advanced topics and ongoing research are also briefly reviewed. Finally, some examples from modeling viral dynamics of HIV, influenza and hepatitis viruses are given to illustrate how to apply these identifiability analysis methods in practice. PMID:21785515
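As a concrete illustration of sensitivity-based practical identifiability, the sketch below uses a standard target-cell-limited viral dynamics model: numerical sensitivities of the observed log viral load with respect to each parameter are assembled into a sensitivity matrix, and near-zero singular values or high parameter correlations flag poorly identifiable parameters. The model form and parameter values are generic textbook choices, not taken from the paper.

```python
# Sketch: sensitivity-based identifiability check for a viral dynamics ODE model.
import numpy as np
from scipy.integrate import odeint

def model(y, t, beta, delta, p, c):
    T, I, V = y                       # target cells, infected cells, free virus
    return [-beta * T * V,
            beta * T * V - delta * I,
            p * I - c * V]

t = np.linspace(0, 10, 50)
y0 = [1e6, 0.0, 10.0]
theta = np.array([2e-7, 0.8, 50.0, 3.0])   # beta, delta, p, c (illustrative values)

def observed_logV(params):
    sol = odeint(model, y0, t, args=tuple(params))
    return np.log10(np.maximum(sol[:, 2], 1e-8))   # observable: log10 viral load

# Finite-difference sensitivities of the output with respect to each parameter.
S = np.empty((len(t), len(theta)))
base = observed_logV(theta)
for j, pj in enumerate(theta):
    step = 1e-4 * pj
    pert = theta.copy()
    pert[j] += step
    S[:, j] = (observed_logV(pert) - base) / step * pj   # scaled (relative) sensitivity

# Small singular values or high correlations suggest practically non-identifiable parameters.
print("singular values:", np.round(np.linalg.svd(S, compute_uv=False), 3))
print("parameter correlation matrix:\n", np.round(np.corrcoef(S.T), 2))
```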
A dictionary based informational genome analysis
2012-01-01
Background In the post-genomic era, several methods of computational genomics are emerging to understand how the information as a whole is structured within genomes. The literature of the last five years accounts for several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces local sequence analysis. A software prototype applying the methodology outlined here carried out computations on genomic data. We computed informational indexes and built genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. The validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of keen interest in biology (for example, to identify over-represented functional sequences, such as promoters), was discussed, and suggested a method to define synthetic genetic networks. Conclusions We introduced a methodology based on dictionaries, and an efficient motif-finding software application for comparative genomics. This approach could be extended along many lines of investigation, namely exported to other contexts of computational genomics, as a basis for the discrimination of genomic pathologies. PMID:22985068
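The dictionary-based indexes can be sketched in a few lines: extract all k-mers of a sequence, build the genomic dictionary with empirical frequencies, and compute an informational index such as the empirical k-mer entropy. The toy random sequence stands in for a real genome, and the entropy index is one simple example of the class of indexes the paper describes.

```python
# Sketch: genomic dictionary of k-mers and an entropy-style informational index.
from collections import Counter
import math
import random

random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(10_000))   # toy stand-in genome
k = 6

kmers = (genome[i:i + k] for i in range(len(genome) - k + 1))
dictionary = Counter(kmers)                  # genomic dictionary: word -> count
total = sum(dictionary.values())

# Empirical k-mer entropy (bits); the maximum is 2k bits for uniform use of all 4^k words.
entropy = -sum((c / total) * math.log2(c / total) for c in dictionary.values())
print(f"distinct {k}-mers: {len(dictionary)} of {4**k}")
print(f"empirical {k}-mer entropy: {entropy:.3f} bits (max {2*k})")
print("most frequent words:", dictionary.most_common(3))
```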
Wirihana, Lisa; Welch, Anthony; Williamson, Moira; Christensen, Martin; Bakon, Shannon; Craft, Judy
2018-03-16
Phenomenology is a useful methodological approach in qualitative nursing research. It enables researchers to put aside their perceptions of a phenomenon and give meaning to a participant's experiences. Exploring the experiences of others enables previously unavailable insights to be discovered. To delineate the implementation of Colaizzi's (1978) method of data analysis in descriptive phenomenological nursing research. The use of Colaizzi's method of data analysis enabled new knowledge to be revealed and provided insights into the experiences of nurse academics teaching on satellite campuses. Local adaptation of the nursing curriculum and additional unnoticed responsibilities had not been identified previously and warrant further research. Colaizzi's (1978) method of data analysis is rigorous and robust, and therefore a qualitative method that ensures the credibility and reliability of its results. It allows researchers to reveal emergent themes and their interwoven relationships. Researchers using a descriptive phenomenological approach should consider using this method as a clear and logical process through which the fundamental structure of an experience can be explored. Colaizzi's phenomenological methodology can be used reliably to understand people's experiences. This may prove beneficial in the development of therapeutic policy and the provision of patient-centred care. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.
Methodological aspects of EEG and body dynamics measurements during motion
Reis, Pedro M. R.; Hebenstreit, Felix; Gabsteiger, Florian; von Tscharner, Vinzenz; Lochmann, Matthias
2014-01-01
EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp that originate from brain gray matter. EEG is one of the most popular methods of studying and understanding the processes that underlie behavior. This is because EEG is relatively cheap, easy to wear, lightweight and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements that are performed in response to the environment. However, there are methodological difficulties which can occur when recording EEG during movement, such as movement artifacts. Thus, most studies about the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics, and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for determining real/custom electrode positions. In the end, we conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks. PMID:24715858
Frederiksen, Kirsten; Lomborg, Kirsten; Beedholm, Kirsten
2015-09-01
This study takes its point of departure in an oft-voiced critique that the French philosopher Michel Foucault gives discourse priority over practice, thereby being deterministic and leaving little space for the individual to act as an agent. Based on an interpretation of the latter part of Foucault's oeuvre, we argue against this critique and provide a methodological discussion of the perception that Foucault's method constitutes, primarily, discourse analysis. We argue that it is possible to overcome this critique of Foucault's work by the application of methodological tools adapted from Foucault's later writings and his diagnosis of his own work as studies of forms of problematization. To shed light on the possibilities that this approach offers to the researcher, we present a reading of aspects of Foucault's work, with a focus on his notion of forms of problematization. Furthermore, we elaborate on concepts from his so-called genealogical period, namely 'the dispositive', strategy and tactics. Our interpretation is supported by examples from a study of the emergence of Danish nursing education, which is based on an analytical framework that we developed in the light of an interpretation of aspects of Foucault's work. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi
2015-11-01
Large-scale regional evacuation is an important part of national security emergency response planning. The emergency evacuation of large commercial shopping areas, as typical service systems, is an active research topic. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation of a commercial shopping mall. Pedestrian movement is modeled using Cellular Automata and the event-driven model. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and an emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of pedestrians' movement routes, the model takes into account the purchase intentions of customers and the density of pedestrians. Based on the evacuation model combining Cellular Automata with a Dynamic Floor Field and the event-driven model, the behavioral characteristics of customers and clerks under normal conditions and during emergency evacuation can be reproduced. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of shopping malls.
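A minimal sketch of the static-floor-field component of such a model is given below: the floor field stores each cell's distance to the nearest exit, and at each time step every pedestrian moves to the free neighbouring cell with the lowest field value. The room layout and conflict rule are assumptions, and the dynamic floor field, event scheduling and customer/clerk layers of the full model are omitted.

```python
# Minimal static floor-field cellular automaton for evacuation (greatly simplified sketch).
import numpy as np
from collections import deque

H, W = 15, 25
exits = [(7, W - 1)]                                 # single exit on the right wall (assumed)

# Static floor field: BFS distance from every cell to the nearest exit.
field = np.full((H, W), np.inf)
queue = deque(exits)
for e in exits:
    field[e] = 0
while queue:
    r, c = queue.popleft()
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < H and 0 <= nc < W and field[nr, nc] == np.inf:
            field[nr, nc] = field[r, c] + 1
            queue.append((nr, nc))

rng = np.random.default_rng(3)
peds = {(int(r), int(c)) for r, c in
        zip(rng.integers(0, H, 40), rng.integers(0, W, 40))}   # random initial pedestrians

steps = 0
while peds:
    new_positions = set()
    for r, c in sorted(peds, key=lambda p: field[p]):   # cells nearer the exit move first
        neighbours = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= r + dr < H and 0 <= c + dc < W]
        # Candidates: stay put, or step onto a currently free, unclaimed neighbour.
        candidates = [(r, c)] + [p for p in neighbours
                                 if p not in peds and p not in new_positions]
        target = min(candidates, key=lambda p: field[p])
        if target in exits:
            continue                                     # pedestrian leaves the building
        new_positions.add(target)
    peds = new_positions
    steps += 1
print(f"all pedestrians evacuated after {steps} time steps")
```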
A general methodology for population analysis
NASA Astrophysics Data System (ADS)
Lazov, Petar; Lazov, Igor
2014-12-01
For a population with N current and M maximum entities, modeled by a Birth-Death Process (BDP) of size M+1, we introduce the utilization parameter ρ, the ratio of the primary birth and death rates in that BDP, which physically determines the (equilibrium) macrostates of the population, and the information parameter ν, which can be interpreted as the population's information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. Given these two key metrics, and applying the continuity law, the equilibrium balance equations for the probability distribution p_n = Prob{N=n}, n=0,1,…,M, of the quantity N, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; here, by definition, population entropy is the uncertainty associated with the population. In this approach, and this is its essential contribution, the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the population's Information Spectrum. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information-elastic, anti-elastic or inelastic regime. In an information-linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology: for a population of infinite size, most of the key quantities and results for finite-size populations that emerge in this methodology vanish.
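A small sketch of the underlying equilibrium calculation is given below under the simplifying assumption of constant birth and death rates (so ρ is a single ratio): detailed balance for a finite birth-death process gives p_n proportional to ρ^n after normalization over n = 0..M, and population entropy follows as the mean of -log p_n. The paper's framework is more general (state-dependent rates and the information decomposition), so this is only the constant-rate special case.

```python
# Sketch: equilibrium distribution and entropy of a finite birth-death population.
# Constant birth/death rates are assumed (rho = lambda/mu); the paper's methodology
# is more general (state-dependent rates, information spectrum decomposition).
import numpy as np

def bdp_equilibrium(birth, death, M):
    """Stationary distribution p_n, n = 0..M, from detailed balance:
    lambda_n * p_n = mu_{n+1} * p_{n+1}."""
    ratios = np.array([birth(n) / death(n + 1) for n in range(M)])
    unnorm = np.concatenate(([1.0], np.cumprod(ratios)))
    return unnorm / unnorm.sum()

M = 20
rho = 0.8                                   # utilization parameter
p = bdp_equilibrium(lambda n: rho, lambda n: 1.0, M)

entropy = -np.sum(p * np.log(p))            # population entropy (nats)
mean_N = np.sum(np.arange(M + 1) * p)
print(f"rho={rho}: E[N]={mean_N:.2f}, entropy={entropy:.3f} nats")
```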
The ALMA CONOPS project: the impact of funding decisions on observatory performance
NASA Astrophysics Data System (ADS)
Ibsen, Jorge; Hibbard, John; Filippi, Giorgio
2014-08-01
In a time when every penny counts, many organizations face the question of how much scientific impact a budget cut can have or, putting it in more general terms, what the science impact of alternative (less costly) operational modes is. In reply to such a question posed by the governing bodies, the ALMA project had to develop a methodology (ALMA Concepts for Operations, CONOPS) that attempts to measure the impact that alternative operational scenarios may have on the overall scientific production of the Observatory. Although the analysis and the results are ALMA-specific, the developed approach is rather general and provides a methodology for a cost-performance analysis of alternatives before any radical alterations to the operations model are adopted. This paper describes the key aspects of the methodology: a) the definition of Figures of Merit (FoMs) for the assessment of quantitative science performance impacts as well as qualitative impacts, together with a methodology using these FoMs to evaluate the cost and impact of the different operational scenarios; b) the definition of a REFERENCE operational baseline; c) the identification of Alternative Scenarios, each replacing one or more concepts in the REFERENCE with a different concept that has a lower cost and some level of scientific and/or operational impact; d) the use of a Cost-Performance plane to graphically combine the effects that the alternative scenarios can have in terms of cost reduction and affected performance. Although this is a first-order assessment, we believe this approach is useful for comparing different operational models and for understanding the cost-performance impact of these choices. It can be used to make decisions to meet budget cuts as well as to evaluate possible new emergent opportunities.
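The cost-performance comparison can be sketched as a weighted Figure-of-Merit score for each scenario relative to the REFERENCE baseline (normalized to 1.0), tabulated against relative cost. The FoMs, weights, scenarios and numbers below are illustrative assumptions, not ALMA values.

```python
# Sketch of a FoM-based cost-performance comparison against a REFERENCE baseline.
# All weights, scenarios and scores below are illustrative assumptions.
foms = {"hours on sky": 0.4, "receiver bands offered": 0.3,
        "data quality assurance": 0.2, "archive turnaround": 0.1}

# Per-FoM performance of each scenario relative to REFERENCE (= 1.0), plus relative cost.
scenarios = {
    "REFERENCE":           ({"hours on sky": 1.0, "receiver bands offered": 1.0,
                             "data quality assurance": 1.0, "archive turnaround": 1.0}, 1.00),
    "reduced night staff": ({"hours on sky": 0.9, "receiver bands offered": 1.0,
                             "data quality assurance": 0.95, "archive turnaround": 1.0}, 0.93),
    "fewer band swaps":    ({"hours on sky": 1.0, "receiver bands offered": 0.8,
                             "data quality assurance": 1.0, "archive turnaround": 1.0}, 0.90),
}

# Each (cost, performance) pair is a point on the Cost-Performance plane.
for name, (scores, cost) in scenarios.items():
    performance = sum(w * scores[f] for f, w in foms.items())
    print(f"{name:22s} relative cost={cost:.2f}  weighted performance={performance:.3f}")
```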
Gao, Su-qing; Wang, Zhen; Gao, Hong-wei; Liu, Peng; Wang, Ze-rui; Li, Yan-li; Zhu, Xu-guang; Li, Xin-lou; Xu, Bo; Li, Yin-jun; Yang, Hong; de Vlas, Sake J.; Shi, Tao-xing; Cao, Wu-chun
2013-01-01
Background For years, emerging infectious diseases have appeared worldwide and threatened the health of people. The emergence and spread of an infectious-disease outbreak are usually unforeseen, and have the features of suddenness and uncertainty. Timely understanding of basic information in the field, and the collection and analysis of epidemiological information, is helpful in making rapid decisions and responding to an infectious-disease emergency. Therefore, it is necessary to have an unobstructed channel and convenient tool for the collection and analysis of epidemiologic information in the field. Methodology/Principal Findings Baseline information for each county in mainland China was collected and a database was established by geo-coding information on a digital map of county boundaries throughout the country. Google Maps was used to display geographic information and to conduct calculations related to maps, and the 3G wireless network was used to transmit information collected in the field to the server. This study established a decision support system for the response to infectious-disease emergencies based on WebGIS and mobile services (DSSRIDE). The DSSRIDE provides functions including data collection, communication and analyses in real time, epidemiological detection, the provision of customized epidemiological questionnaires and guides for handling infectious disease emergencies, and the querying of professional knowledge in the field. These functions of the DSSRIDE could be helpful for epidemiological investigations in the field and the handling of infectious-disease emergencies. Conclusions/Significance The DSSRIDE provides a geographic information platform based on the Google Maps application programming interface to display information of infectious disease emergencies, and transfers information between workers in the field and decision makers through wireless transmission based on personal computers, mobile phones and personal digital assistants. After a 2-year practice and application in infectious disease emergencies, the DSSRIDE is becoming a useful platform and is a useful tool for investigations in the field carried out by response sections and individuals. The system is suitable for use in developing countries and low-income districts. PMID:23372780
On the Aesthetic Difficulties of Research on Sex Education: Toward a Methodology of Affect
ERIC Educational Resources Information Center
Sandlos, Karyn
2010-01-01
This paper emerges from an ongoing, three-year qualitative study of how adolescents, teachers, and peer sexual health educators interpret the language of abstinence and represent the emotional meanings that enliven sexuality and sexual health. The paper demonstrates how conflicts of thinking and relationality emerge from aesthetic narratives about…
ERIC Educational Resources Information Center
O'Donoghue, Rob; Russo, Vladimir
2004-01-01
This paper examines how emerging materials and associated methods became inscribed within and have shaped developing patterns of practice in environmental education. In so doing, it gives attention to how materials and methods have informed methodological narratives and shaped abstracted propositions used in professional development activities.…
Solar Power Generation for ICT and Sustainable Development in Emerging Economies
ERIC Educational Resources Information Center
Paul, Damasen I.; Uhomoibhi, James
2012-01-01
Purpose: The purpose of this paper is to systematically examine and draw attention to the potential benefits of solar power generation for access to and use of information and communication technologies (ICT) aimed at sustainable development in emerging economies. Design/methodology/approach: Electricity plays a crucial role in the development and…
Improvements in analytical methodology have allowed low-level detection of an ever increasing number of pharmaceuticals, personal care products, hormones, pathogens and other contaminants of emerging concern (CECs). The use of these improved analytical tools has allowed researche...
Emerging Trends in Science Education in a Dynamic Academic Environment
ERIC Educational Resources Information Center
Avwiri, H. E.
2016-01-01
Emerging Trends in Science Education in a Dynamic Academic Environment highlights the changes that have occurred in science education particularly in institutions of higher learning in southern Nigeria. Impelled by the fact that most Nigerian Universities and Colleges of Education still adhere to the practices and teaching methodologies of the…
A Methodology to Develop Ontologies for Emerging Domains
ERIC Educational Resources Information Center
Meenorngwar, Chai
2013-01-01
The characteristic of complex, dynamic domains, such as an emerging domain, is that the information necessary to describe them is not fully established. Standards are not yet established for these domains, and hence they are difficult to describe and present, and methods are needed that will reflect the changes that will occur as the domains…
Challenges in the estimation of Net SURvival: The CENSUR working survival group.
Giorgi, R
2016-10-01
Net survival, the survival probability that would be observed, in a hypothetical world, where the cancer of interest would be the only possible cause of death, is a key indicator in population-based cancer studies. Accounting for mortality due to other causes, it allows cross-country comparisons or trends analysis and provides a useful indicator for public health decision-making. The objective of this study was to show how the creation and formalization of a network comprising established research teams, which already had substantial and complementary experience in both cancer survival analysis and methodological development, make it possible to meet challenges and thus provide more adequate tools, to improve the quality and the comparability of cancer survival data, and to promote methodological transfers in areas of emerging interest. The Challenges in the Estimation of Net SURvival (CENSUR) working survival group is composed of international researchers highly skilled in biostatistics, methodology, and epidemiology, from different research organizations in France, the United Kingdom, Italy, Slovenia, and Canada, and involved in French (FRANCIM) and European (EUROCARE) cancer registry networks. The expected advantages are an interdisciplinary, international, synergistic network capable of addressing problems in public health, for decision-makers at different levels; tools for those in charge of net survival analyses; a common methodology that makes unbiased cross-national comparisons of cancer survival feasible; transfer of methods for net survival estimations to other specific applications (clinical research, occupational epidemiology); and dissemination of results during an international training course. The formalization of the international CENSUR working survival group was motivated by a need felt by scientists conducting population-based cancer research to discuss, develop, and monitor implementation of a common methodology to analyze net survival in order to provide useful information for cancer control and cancer policy. A "team science" approach is necessary to address new challenges concerning the estimation of net survival. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Emergency situations and deaf people in Israel: Communication obstacles and recommendations
Tannenbaum-Baruchi, Carolina; Feder-Bubis, Paula; Adini, Bruria; Aharonson-Daniel, Limor
2014-01-01
The absence of the ability to hear sounds in deaf people is an obstacle to optimal communication in a predominantly hearing world. Emergency situations harbor sufficient challenge for the hearing person and pose even greater barriers for the deaf and hard of hearing. During disasters and emergency situations, deaf people have great difficulty in obtaining and sharing information, increasing their dependence on others. This article focuses on the experience of deaf people during a period of security threat, when missiles from the Gaza strip were aimed at the civilian population in Southern Israel, in 2009. The aim of this article is to illustrate the complexities that deaf citizens experienced, and describe their coping mechanisms. A qualitative study was conducted with 15 Deaf participants of heterogeneous backgrounds, interviewed with a multiple-method facilitated questionnaire by a researcher who belongs to the deaf community. Data were analyzed using grounded theory methodology principles. Main categories that arose from data analysis were communication problems during emergencies, the pager as a questionable warning device about emergencies (due to timing and content/context issues of its use), and the implications of the location of deaf people at the time of an emergency. Various channels for conveying information should be examined and created in order to maximize the heterogeneous deaf community's ability to receive vital information during an emergency. Professional sign language interpreters are necessary during emergencies, helping to reduce both dependence on informal sources (such as family members, including minors, friends, neighbors, by-standers) and risk. The development of new technologies may offer potential help for deaf persons during emergencies. Because the deaf community is a socio-linguistic minority, it is recommended that these technologies be made accessible to the whole community. PMID:28229005
Messori, Stefano; Zilli, Romano; Mariano, Valeria; Bagni, Marina
2017-03-31
Diseases evolve constantly, and research is needed to face emerging new threats. Evidence suggests that the impact of such threats will peak in the Mediterranean area. The FORE‑Med, the Foresight project for the Mediterranean, aims at identifying future challenges in livestock health and aquaculture in this area, to ensure effective coordination of research activities and the delivery of timely solutions to emerging issues. One hundred experts with multidisciplinary backgrounds, coming from countries all around the Mediterranean basin, were gathered to participate in a think‑tank to develop a Strategic Research Agenda on animal health for the Mediterranean up to 2030. A tailored foresight methodology was implemented, merging the best fit-for-purpose techniques (e.g. '7 questions'; Social, Technological, Economic, Environmental, and Political (STEEP) analysis; scenario building; and backcasting). Both remote and face‑to‑face debates were held to ensure fruitful exchanges and participation among experts. Research needs were identified and prioritised, both by relevance and on a temporal scale. The participative approach allowed for the definition of a research priority list for animal health and aquaculture in the Mediterranean, which served as a basis to build a strategic research agenda. The latter is expected to satisfy the sectors' needs and guarantee much‑needed coordination of research activities in the Mediterranean area.
Critical reflections on methodological challenge in arts and dementia evaluation and research.
Gray, Karen; Evans, Simon Chester; Griffiths, Amanda; Schneider, Justine
2017-01-01
Methodological rigour, or its absence, is often a focus of concern for the emerging field of evaluation and research around arts and dementia. However, this paper suggests that critical attention should also be paid to the way in which individual perceptions, hidden assumptions and underlying social and political structures influence methodological work in the field. Such attention will be particularly important for addressing methodological challenges relating to contextual variability, ethics, value judgement and signification identified through a literature review on this topic. Understanding how, where and when evaluators and researchers experience such challenges may help to identify fruitful approaches for future evaluation.
Hess, Erik P; Wells, George A; Jaffe, Allan; Stiell, Ian G
2008-01-01
Background Chest pain is the second most common chief complaint in North American emergency departments. Data from the U.S. suggest that 2.1% of patients with acute myocardial infarction and 2.3% of patients with unstable angina are misdiagnosed, with slightly higher rates reported in a recent Canadian study (4.6% and 6.4%, respectively). Information obtained from the history, 12-lead ECG, and a single set of cardiac enzymes is unable to identify patients who are safe for early discharge with sufficient sensitivity. The 2007 ACC/AHA guidelines for UA/NSTEMI do not identify patients at low risk for adverse cardiac events who can be safely discharged without provocative testing. As a result large numbers of low risk patients are triaged to chest pain observation units and undergo provocative testing, at significant cost to the healthcare system. Clinical decision rules use clinical findings (history, physical exam, test results) to suggest a diagnostic or therapeutic course of action. Currently no methodologically robust clinical decision rule identifies patients safe for early discharge. Methods/design The goal of this study is to derive a clinical decision rule which will allow emergency physicians to accurately identify patients with chest pain who are safe for early discharge. The study will utilize a prospective cohort design. Standardized clinical variables will be collected on all patients at least 25 years of age complaining of chest pain prior to provocative testing. Variables strongly associated with the composite outcome acute myocardial infarction, revascularization, or death will be further analyzed with multivariable analysis to derive the clinical rule. Specific aims are to: i) apply standardized clinical assessments to patients with chest pain, incorporating results of early cardiac testing; ii) determine the inter-observer reliability of the clinical information; iii) determine the statistical association between the clinical findings and the composite outcome; and iv) use multivariable analysis to derive a highly sensitive clinical decision rule to guide triage decisions. Discussion The study will derive a highly sensitive clinical decision rule to identify low risk patients safe for early discharge. This will improve patient care, lower healthcare costs, and enhance flow in our busy and overcrowded emergency departments. PMID:18254973
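The multivariable derivation step can be sketched with a logistic model: candidate predictors associated with the composite outcome are entered into a regression, and the risk threshold is chosen to keep sensitivity at, or near, 100%, since a missed event is far costlier than an unnecessary work-up. The simulated predictors below are placeholders for the standardized clinical variables, not the study's data.

```python
# Sketch: deriving a high-sensitivity decision threshold from a multivariable model.
# Simulated predictors stand in for the standardized clinical variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000
age = rng.normal(55, 12, n)
ischemic_ecg = rng.integers(0, 2, n)
troponin_pos = rng.integers(0, 2, n)
logit = -6 + 0.05 * age + 1.5 * ischemic_ecg + 2.0 * troponin_pos
y = rng.random(n) < 1 / (1 + np.exp(-logit))        # simulated composite outcome

X = np.column_stack([age, ischemic_ecg, troponin_pos])
model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]

# Highest threshold that still classifies every outcome-positive patient as high risk.
threshold = risk[y].min()
low_risk = risk < threshold
sensitivity = 1 - (y & low_risk).sum() / y.sum()
print(f"threshold={threshold:.4f}, sensitivity={sensitivity:.3f}, "
      f"proportion eligible for early discharge={low_risk.mean():.2f}")
```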
Development of a virtual learning environment for cardiorespiratory arrest training.
Silva, Anazilda Carvalho da; Bernardes, Andrea; Évora, Yolanda Dora Martinez; Dalri, Maria Célia Barcellos; Silva, Alexandre Ribeiro da; Sampaio, Camila Santana Justo Cintra
2016-01-01
To develop a Virtual Learning Environment (VLE) for training nursing team workers and emergency vehicle drivers in Basic Life Support (BLS) for the management of cardiorespiratory arrest, and to evaluate the quality of its content with specialists in the area of Urgent and Emergency care. Applied research of technological development. The methodology was based on the Instructional Design Model (ADDIE), which structures teaching-learning planning in distinct stages (analysis, design, development, implementation and evaluation). The VLE was composed of texts elaborated from bibliographic research, links, a video edited from a laboratory simulation scenario, and questions to assess retention of the content, organized in modules. After its development, it was judged by eight expert reviewers to be adequate to meet the needs of the target audience, and it was made available for electronic access. The VLE has potential as a tool for training and qualification in BLS, as it can be easily integrated with other pedagogical approaches and strategies based on active methodologies.
Fox, Amanda; Gardner, Glenn; Osborne, Sonya
2018-02-01
This research aimed to explore factors that influence sustainability of health service innovation, specifically emergency nurse practitioner service. Planning for cost-effective provision of healthcare services is a concern globally. Reform initiatives are implemented, often incorporating an expanded scope of practice for health professionals and innovative service delivery models. Introducing new models is costly in both human and financial resources, and therefore understanding factors influencing sustainability is imperative to viable service provision. This research used case study methodology (Yin). Data were collected during 2014 from emergency nurse practitioners, emergency department multidisciplinary team members and documents related to nurse practitioner services. Collection methods included telephone and semi-structured interviews, survey and document analysis. Pattern matching techniques were used to compare findings with study propositions. In this study, emergency nurse practitioner services did not meet factors that support health service sustainability. Multidisciplinary team members were confident that emergency nurse practitioner services were safe and helped to meet population health needs. Organizational support for integration of nurse practitioner services was marginal and led to poor understanding of service capability and underuse. This research provides evidence informing sustainability of nursing service models but more importantly raises questions about this little-explored field. The findings highlight poor organizational support, excessive restrictions and underuse of the service. This is in direct contrast to contemporary expanding practice reform initiatives. Organizational support for integration is imperative to future service sustainability. © 2017 John Wiley & Sons Ltd.
Task analysis method for procedural training curriculum development.
Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan
2014-06-01
A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments.
Filip, Xenia; Borodi, Gheorghe; Filip, Claudiu
2011-10-28
A solid state structural investigation of ethoxzolamide is performed on microcrystalline powder by using a multi-technique approach that combines X-ray powder diffraction (XRPD) data analysis based on direct space methods with information from ¹³C(¹⁵N) solid-state Nuclear Magnetic Resonance (SS-NMR) and molecular modeling. Quantum chemical computations of the crystal were employed for geometry optimization and chemical shift calculations based on the Gauge Including Projector Augmented-Wave (GIPAW) method, whereas a systematic search in the conformational space was performed on the isolated molecule using a molecular mechanics (MM) approach. The applied methodology proved useful for: (i) removing ambiguities in the XRPD crystal structure determination process and further refining the derived structure solutions, and (ii) getting important insights into the relationship between the complex network of non-covalent interactions and the induced supra-molecular architectures/crystal packing patterns. It was found that ethoxzolamide provides an ideal case study for testing the accuracy with which this methodology allows one to distinguish between various structural features emerging from the analysis of the powder diffraction data. This journal is © the Owner Societies 2011
Citation Discovery Tools for Conducting Adaptive Meta-analyses to Update Systematic Reviews.
Bae, Jong-Myon; Kim, Eun Hee
2016-03-01
The systematic review (SR) is a research methodology that aims to synthesize related evidence. Updating previously conducted SRs is necessary when new evidence has been produced, but no consensus has yet emerged on the appropriate update methodology. The authors have developed a new SR update method called 'adaptive meta-analysis' (AMA) using the 'cited by', 'similar articles', and 'related articles' citation discovery tools in the PubMed and Scopus databases. This study evaluates the usefulness of these citation discovery tools for updating SRs. Lists were constructed by applying the citation discovery tools in the two databases to the articles analyzed by a published SR. The degree of overlap between the lists and distribution of excluded results were evaluated. The articles ultimately selected for the SR update meta-analysis were found in the lists obtained from the 'cited by' and 'similar' tools in PubMed. Most of the selected articles appeared in both the 'cited by' lists in Scopus and PubMed. The Scopus 'related' tool did not identify the appropriate articles. The AMA, which involves using both citation discovery tools in PubMed, and optionally, the 'related' tool in Scopus, was found to be useful for updating an SR.
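The 'cited by' and 'similar articles' lists that the AMA draws on can be retrieved programmatically from PubMed through the NCBI E-utilities elink service. The sketch below shows one way to do this; the PMID is a placeholder and the link names reflect common E-utilities usage rather than the authors' exact implementation.

```python
# Sketch: fetch "similar articles" and "cited by" lists for one PMID via the
# NCBI E-utilities elink endpoint. PMID and link names are assumptions based
# on common E-utilities usage, not the AMA authors' code.
import requests

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

def linked_pmids(pmid: str, linkname: str) -> list[str]:
    params = {
        "dbfrom": "pubmed", "db": "pubmed",
        "id": pmid, "linkname": linkname, "retmode": "json",
    }
    data = requests.get(BASE, params=params, timeout=30).json()
    links = data["linksets"][0].get("linksetdbs", [])
    return links[0]["links"] if links else []

similar = linked_pmids("26954202", "pubmed_pubmed")           # "similar articles"
cited_by = linked_pmids("26954202", "pubmed_pubmed_citedin")  # "cited by"
candidates = sorted(set(similar) | set(cited_by))             # union for screening
print(len(candidates))
```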
Consensus-based methodology for detecting communities in multilayered networks
NASA Astrophysics Data System (ADS)
Karimi-Majd, Amir-Mohsen; Fathian, Mohammad; Makrehchi, Masoud
2018-03-01
Finding groups of network users who are densely connected with each other has emerged as an interesting problem in the area of social network analysis. These groups, or so-called communities, are hidden behind the behavior of users. Most studies assume that such behavior can be understood by focusing on user interfaces, their behavioral attributes or a combination of these network layers (i.e., interfaces with their attributes). They also assume that all network layers refer to the same behavior. However, in real-life networks, users' behavior in one layer may differ from their behavior in another. In order to cope with these issues, this article proposes a consensus-based community detection approach (CBC). CBC finds communities among nodes at each layer, in parallel. Then, the results of the layers are aggregated using a consensus clustering method. This means that different behaviors can be detected and used in the analysis. Another significant advantage is that the methodology can handle missing values. Three experiments on real-life and computer-generated datasets have been conducted in order to evaluate the performance of CBC. The results indicate the superiority and stability of CBC in comparison to other approaches.
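A generic reconstruction of the consensus idea (per-layer detection, a co-association matrix, then clustering of that matrix) might look like the sketch below. It is an illustration under stated assumptions, not the authors' CBC code; the toy layers at the end are arbitrary.

```python
# Generic sketch of consensus community detection across layers: detect
# communities per layer, build a co-association matrix, keep stable pairs,
# then detect communities on the resulting consensus graph.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def consensus_communities(layers, threshold=0.5):
    nodes = sorted(set().union(*(g.nodes for g in layers)))
    idx = {n: i for i, n in enumerate(nodes)}
    co = np.zeros((len(nodes), len(nodes)))
    for g in layers:                                  # step 1: per-layer detection
        for community in greedy_modularity_communities(g):
            members = [idx[n] for n in community]
            for i in members:
                for j in members:
                    co[i, j] += 1
    co /= len(layers)                                 # step 2: co-association matrix
    consensus = nx.Graph()                            # step 3: keep stable pairs
    consensus.add_nodes_from(nodes)
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            if co[i, j] >= threshold:
                consensus.add_edge(nodes[i], nodes[j])
    if consensus.number_of_edges() == 0:
        return [{n} for n in nodes]
    return list(greedy_modularity_communities(consensus))

layers = [nx.karate_club_graph(), nx.cycle_graph(34)]  # two toy layers, same nodes
print(len(consensus_communities(layers)))
```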
Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.
Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís
2016-01-08
In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.
Macroeconomic effects on mortality revealed by panel analysis with nonlinear trends.
Ionides, Edward L; Wang, Zhen; Tapia Granados, José A
2013-10-03
Many investigations have used panel methods to study the relationships between fluctuations in economic activity and mortality. A broad consensus has emerged on the overall procyclical nature of mortality: perhaps counter-intuitively, mortality typically rises above its trend during expansions. This consensus has been tarnished by inconsistent reports on the specific age groups and mortality causes involved. We show that these inconsistencies result, in part, from the trend specifications used in previous panel models. Standard econometric panel analysis involves fitting regression models using ordinary least squares, employing standard errors which are robust to temporal autocorrelation. The model specifications include a fixed effect, and possibly a linear trend, for each time series in the panel. We propose alternative methodology based on nonlinear detrending. Applying our methodology on data for the 50 US states from 1980 to 2006, we obtain more precise and consistent results than previous studies. We find procyclical mortality in all age groups. We find clear procyclical mortality due to respiratory disease and traffic injuries. Predominantly procyclical cardiovascular disease mortality and countercyclical suicide are subject to substantial state-to-state variation. Neither cancer nor homicide have significant macroeconomic association.
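The core detrending step can be illustrated with a small sketch: fit a smooth nonlinear trend to each state's series, then relate the mortality deviations to the economic deviations. The data, the 50-state loop and the cubic-trend choice below are illustrative assumptions, not the authors' estimator or results.

```python
# Sketch of the detrending idea: remove a smooth nonlinear trend from each
# state's mortality and unemployment series, then relate the deviations.
# The simulated panels and the cubic trend are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1980, 2007)

def detrend(series, degree=3):
    """Return deviations from a fitted low-order polynomial trend."""
    coeffs = np.polyfit(years, series, degree)
    return series - np.polyval(coeffs, years)

slopes = []
for _ in range(50):  # 50 hypothetical state panels
    unemployment = 6 + np.cumsum(rng.normal(0, 0.3, years.size))
    mortality = 900 - 5 * (years - 1980) - 2.0 * unemployment + rng.normal(0, 5, years.size)
    m_dev, u_dev = detrend(mortality), detrend(unemployment)
    slopes.append(np.polyfit(u_dev, m_dev, 1)[0])   # within-state association

# A negative mean slope (mortality falls as unemployment rises) corresponds to
# the procyclical mortality discussed above.
print("mean association of mortality with unemployment deviations:", np.mean(slopes))
```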
Sequence analysis by iterated maps, a review.
Almeida, Jonas S
2014-05-01
Among alignment-free methods, Iterated Maps (IMs) are on a particular extreme: they are also scale free (order free). The use of IMs for sequence analysis is also distinct from other alignment-free methodologies in being rooted in statistical mechanics instead of computational linguistics. Both of these roots go back over two decades to the use of fractal geometry in the characterization of phase-space representations. The time series analysis origin of the field is betrayed by the title of the manuscript that started this alignment-free subdomain in 1990, 'Chaos Game Representation'. The clash between the analysis of sequences as continuous series and the better-established use of Markovian approaches to discrete series was almost immediate, with a defining critique published in the same journal two years later. The rest of that decade would go by before the scale-free nature of the IM space was uncovered. The ensuing decade saw this scalability generalized for non-genomic alphabets as well as an interest in its use for graphic representation of biological sequences. Finally, in the past couple of years, in step with the emergence of Big Data and MapReduce as a new computational paradigm, there is a surprising third act in the IM story. Multiple reports have described gains in computational efficiency of multiple orders of magnitude over more conventional sequence analysis methodologies. The stage now appears to be set for a recasting of IMs with a central role in processing next-generation sequencing results.
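The iterated map behind the Chaos Game Representation is short enough to sketch directly: each base maps to a corner of the unit square and the point moves halfway toward that corner at every step. The example sequence below is arbitrary.

```python
# Minimal sketch of the iterated-map ("Chaos Game Representation") idea for a
# DNA sequence: each base is a corner of the unit square and the point moves
# halfway toward that corner at every step.
import numpy as np

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(sequence: str) -> np.ndarray:
    point = np.array([0.5, 0.5])
    points = []
    for base in sequence.upper():
        corner = np.array(CORNERS[base])
        point = (point + corner) / 2.0   # the iterated map
        points.append(point.copy())
    return np.array(points)

coords = cgr_points("ACGTACGGTTAACCGT")
print(coords[:3])
# A 2-D histogram of these coordinates at resolution 2**k recovers k-mer
# frequencies, which is the scale-free (order-free) property discussed above.
```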
Pal Choudhury, Pabitra
2017-01-01
Periplasmic c7 type cytochrome A (PpcA) protein is found in Geobacter sulfurreducens along with its four other homologs (PpcB-E). From the crystal structure viewpoint, the observation emerges that the PpcA protein can bind deoxycholate (DXCA), while its other homologs do not. However, the reason for this has yet to be established with certainty from primary protein sequence information. This study is primarily based on primary protein sequence analysis through the chemical basis of the embedded amino acids. Firstly, we look for the chemical-group-specific score of amino acids. Along with this, we have developed a new methodology for phylogenetic analysis based on chemical group dissimilarities of amino acids. This new methodology is applied to the cytochrome c7 family members to pinpoint how a particular sequence differs from the others. Secondly, we build a graph theoretic model using amino acid sequences, which is also applied to the cytochrome c7 family members, and some unique characteristics and their domains are highlighted. Thirdly, we search for unique patterns, as subsequences, that are common to the group or specific to individual members. In all the cases, we are able to show some distinct features of PpcA that make it stand out compared to its other homologs, consistent with its binding to deoxycholate. Similarly, some notable features of the structurally dissimilar protein PpcD compared to the other homologs are also brought out. Further, since the five members of the cytochrome family are homologous proteins, they must share some significant common features, which are also enumerated in this study. PMID:28362850
Amy L. Sheaffer; Jay Beaman; Joseph T. O' Leary; Rebecca L. Williams; Doran M. Mason
2001-01-01
Sampling for research in recreation settings is an ongoing challenge. Often certain groups of users are more likely to be sampled. It is important, in measuring public support for resource conservation and in understanding the use of natural resources for recreation, to evaluate issues of bias in survey methodologies. Important methodological issues emerged from a statewide...
An Effective Methodology for Processing and Analyzing Large, Complex Spacecraft Data Streams
ERIC Educational Resources Information Center
Teymourlouei, Haydar
2013-01-01
The emerging large datasets have made efficient data processing a much more difficult task for the traditional methodologies. Invariably, datasets continue to increase rapidly in size with time. The purpose of this research is to give an overview of some of the tools and techniques that can be utilized to manage and analyze large datasets. We…
ERIC Educational Resources Information Center
Costa, Rejane Pinto
2011-01-01
This study emerged from broader research completed during my Masters course. Theory and methodology were guided by critical multiculturalism as seen in McLaren (1997, 2000). In my doctoral thesis, this concept was deepened by and linked to the peace studies of Galtung (1990, 2005, 2006), to empower multicultural peace…
ERIC Educational Resources Information Center
Roberts, Amanda; Riley, Howard
2014-01-01
The paper proposes the activity of drawing as a methodological strategy within a university research context. It is illustrated with examples from one of the authors' (Roberts) practice-based PhD research. The paper argues that drawing as a research method can be validated in a manner akin to the more established research methods associated…
From Theory to Practice: Concept-Based Inquiry in a High School Art Classroom
ERIC Educational Resources Information Center
Walker, Margaret A.
2014-01-01
This study examines what an emerging educational theory looks like when put into practice in an art classroom. It explores the teaching methodology of a high school art teacher who has utilized concept-based inquiry in the classroom to engage his students in artmaking and analyzes the influence this methodology has had on his adolescent students.…
NASA Astrophysics Data System (ADS)
Albano, R.; Sole, A.; Adamowski, J.; Mancusi, L.
2014-11-01
Efficient decision-making regarding flood risk reduction has become a priority for authorities and stakeholders in many European countries. Risk analysis methods and techniques are a useful tool for evaluating costs and benefits of possible interventions. Within this context, this paper develops a GIS-based methodology to estimate flood consequences, integrated with a model that estimates the degree of accessibility and operability of strategic emergency response structures in an urban area. The majority of the currently available approaches do not properly analyse road network connections and dependencies within systems, and as such a loss of roads could cause significant damage and problems for emergency services in cases of flooding. The proposed model is unique in that it provides a maximum-impact estimation of flood consequences on the basis of the operability of the strategic emergency structures in an urban area, their accessibility, and their connections within the urban system of a city (i.e. connections between aid centres and buildings at risk) during the emergency phase. The results of a case study in the Puglia region in southern Italy are described to illustrate the practical applications of this newly proposed approach. The main advantage of the proposed approach is that it allows for defining a hierarchy among different infrastructures in the urban area through the identification of particular components whose operation and efficiency are critical for emergency management. This information can be used by decision-makers to prioritize risk reduction interventions in flood emergencies in urban areas, given limited financial resources.
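The accessibility component of such an approach can be pictured with a minimal sketch: on a road graph with some links lost to flooding, check which at-risk buildings remain reachable from an aid centre and at what travel time. The graph, node names and weights below are illustrative assumptions, not the case-study network.

```python
# Generic sketch of accessibility analysis on a road graph after a flood:
# remove flooded links, then test reachability and travel time from an aid
# centre to buildings at risk. All names and weights are illustrative.
import networkx as nx

roads = nx.Graph()
roads.add_weighted_edges_from([
    ("hospital", "j1", 2), ("j1", "j2", 3), ("j2", "blockA", 2),
    ("j1", "j3", 4), ("j3", "blockB", 1), ("j2", "j3", 2),
])
flooded_links = [("j1", "j2")]          # links lost in the flood scenario
roads.remove_edges_from(flooded_links)

for building in ["blockA", "blockB"]:
    try:
        t = nx.shortest_path_length(roads, "hospital", building, weight="weight")
        print(building, "reachable, travel time", t)
    except nx.NetworkXNoPath:
        print(building, "cut off from the aid centre")
```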
Sriram, V; Gururaj, G; Razzak, J A; Naseer, R; Hyder, A A
2016-08-01
Strengthened emergency medical services (EMS) are urgently required in South Asia to reduce needless death and disability. Several EMS models have been introduced in India and Pakistan, and research on these models can facilitate improvements to EMS in the region. Our objective was to conduct a cross-case comparative analysis of three EMS organizations in India and Pakistan - GVK EMRI, Aman Foundation and Rescue 1122 - in order to draw out similarities and differences in their models. Case study methodology was used to systematically explore the organizational models of GVK EMRI (Karnataka, India), Aman Foundation (Karachi, Pakistan), and Rescue 1122 (Punjab, Pakistan). Qualitative methods - interviews, document review and non-participant observation - were utilized, and using a process of constant comparison, data were analysed across cases according to the WHO health system 'building blocks'. Emergent themes under each health system 'building block' of service delivery, health workforce, medical products and technology, health information systems, leadership and governance, and financing were described. Cross-cutting issues not applicable to any single building block were further identified. This cross-case comparison, the first of its kind in low- and middle-income countries, highlights key innovations and lessons, and areas of further research across EMS organizations in India, Pakistan and other resource-poor settings. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Cenek, Martin; Dahl, Spencer K.
2016-11-01
Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
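The transition-probability element of such a framework can be illustrated compactly: given a behavior-pattern label per agent per time step, count how often agents move from one pattern to another. The labels below are invented for illustration; the statistical pattern-identification step itself is not reproduced.

```python
# Sketch of estimating agent behavior-pattern transition probabilities from a
# matrix of pattern labels (rows = agents, columns = time steps). Labels here
# are invented for illustration.
import numpy as np

labels = np.array([
    [0, 0, 1, 1, 2],
    [0, 1, 1, 2, 2],
    [1, 1, 1, 2, 0],
])
n_patterns = labels.max() + 1
counts = np.zeros((n_patterns, n_patterns))
for agent in labels:
    for a, b in zip(agent[:-1], agent[1:]):
        counts[a, b] += 1
transition_probs = counts / counts.sum(axis=1, keepdims=True)
print(transition_probs)   # P(pattern j at t+1 | pattern i at t)
```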
Henden, Lyndal; Lee, Stuart; Mueller, Ivo; Barry, Alyssa; Bahlo, Melanie
2018-05-01
Identification of genomic regions that are identical by descent (IBD) has proven useful for human genetic studies where analyses have led to the discovery of familial relatedness and fine-mapping of disease critical regions. Unfortunately however, IBD analyses have been underutilized in analysis of other organisms, including human pathogens. This is in part due to the lack of statistical methodologies for non-diploid genomes in addition to the added complexity of multiclonal infections. As such, we have developed an IBD methodology, called isoRelate, for analysis of haploid recombining microorganisms in the presence of multiclonal infections. Using the inferred IBD status at genomic locations, we have also developed a novel statistic for identifying loci under positive selection and propose relatedness networks as a means of exploring shared haplotypes within populations. We evaluate the performance of our methodologies for detecting IBD and selection, including comparisons with existing tools, then perform an exploratory analysis of whole genome sequencing data from a global Plasmodium falciparum dataset of more than 2500 genomes. This analysis identifies Southeast Asia as having many highly related isolates, possibly as a result of both reduced transmission from intensified control efforts and population bottlenecks following the emergence of antimalarial drug resistance. Many signals of selection are also identified, most of which overlap genes that are known to be associated with drug resistance, in addition to two novel signals observed in multiple countries that have yet to be explored in detail. Additionally, we investigate relatedness networks over the selected loci and determine that one of these sweeps has spread between continents while the other has arisen independently in different countries. IBD analysis of microorganisms using isoRelate can be used for exploring population structure, positive selection and haplotype distributions, and will be a valuable tool for monitoring disease control and elimination efforts of many diseases.
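The relatedness-network idea can be pictured with a small sketch: isolates become nodes and an edge joins any pair whose IBD sharing over the locus of interest exceeds a cutoff, so connected components correspond to groups carrying a shared haplotype. The pairwise values, isolate names and cutoff below are invented; this is not the isoRelate implementation.

```python
# Generic sketch of a relatedness network: nodes are isolates, edges connect
# pairs whose estimated IBD sharing at a locus exceeds a cutoff. Values are
# made up for illustration only.
import networkx as nx

pairwise_ibd = {                      # fraction of the locus shared IBD
    ("KH001", "KH002"): 0.92,
    ("KH001", "TH010"): 0.41,
    ("KH002", "TH010"): 0.38,
    ("PG005", "PG006"): 0.15,
}
cutoff = 0.9                          # "highly related" threshold (assumption)

net = nx.Graph()
for (a, b), share in pairwise_ibd.items():
    net.add_node(a)
    net.add_node(b)
    if share >= cutoff:
        net.add_edge(a, b, weight=share)

clusters = list(nx.connected_components(net))
print(clusters)   # groups of highly related isolates sharing the haplotype
```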
NASA Astrophysics Data System (ADS)
Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin
2016-03-01
Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to vastly expand with advanced developments across the entire spectrum of biomedical applications ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial community and the FDA face the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated due to the lack of effective means and standard test methods to precisely analyze the safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate fundamental characteristics, performance quality and safety of these technologies and devices. Here, we overview our recent development of novel test methodologies for safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contamination of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.
Chambers, Lori A; Jackson, Randy; Worthington, Catherine; Wilson, Ciann L; Tharao, Wangari; Greenspan, Nicole R; Masching, Renee; Pierre-Pierre, Valérie; Mbulaheni, Tola; Amirault, Marni; Brownlee, Patrick
2018-01-01
This article summarizes our deepened understanding of decolonizing research with, for, and by Indigenous peoples and peoples of African descent that emerged from conducting a scoping review of the methodological literature and reflecting on our review process. Although our review identified decolonizing methodologies as a promising approach, we questioned whether our scoping review process engaged in decolonizing knowing. To unpack the epistemological tensions between decolonizing knowing and Western ways of doing scoping reviews, we engaged in individual and collective reflective processes (dialoguing with the tensions), moving from individual immersion in the literature to transformative dialogues among the team. In reflecting upon our tensions with the scoping review process, themes that emerged included (a) ontological/epistemological disjunctures, (b) tensions with concepts and language, and (c) relationships with the literature and beyond. This reflexive process provides valuable insight into ways in which review methods might be made a decolonizing research experience.
Conducting systematic reviews of association (etiology): The Joanna Briggs Institute's approach.
Moola, Sandeep; Munn, Zachary; Sears, Kim; Sfetcu, Raluca; Currie, Marian; Lisy, Karolina; Tufanaru, Catalin; Qureshi, Rubab; Mattis, Patrick; Mu, Peifan
2015-09-01
The systematic review of evidence is the research method which underpins the traditional approach to evidence-based healthcare. There is currently no uniform methodology for conducting a systematic review of association (etiology). This study outlines and describes the Joanna Briggs Institute's approach and guidance for synthesizing evidence related to association with a predominant focus on etiology and contributes to the emerging field of systematic review methodologies. It should be noted that questions of association typically address etiological or prognostic issues. The systematic review of studies to answer questions of etiology follows the same basic principles of systematic review of other types of data. An a priori protocol must inform the conduct of the systematic review, comprehensive searching must be performed and critical appraisal of retrieved studies must be carried out. The overarching objective of systematic reviews of etiology is to identify and synthesize the best available evidence on the factors of interest that are associated with a particular disease or outcome. The traditional PICO (population, interventions, comparators and outcomes) format for systematic reviews of effects does not align with questions relating to etiology. A systematic review of etiology should include the following aspects: population, exposure of interest (independent variable) and outcome (dependent variable). Studies of etiology are predominantly explanatory or predictive. The objective of reviews of explanatory or predictive studies is to contribute to, and improve our understanding of, the relationship of health-related events or outcomes by examining the association between variables. When interpreting possible associations between variables based on observational study data, caution must be exercised due to the likely presence of confounding variables or moderators that may impact on the results. As with all systematic reviews, there are various approaches to present the results, including a narrative, graphical or tabular summary, or meta-analysis. When meta-analysis is not possible, a set of alternative methods for synthesizing research is available. On the basis of the research question and objectives, narrative, tabular and/or visual approaches can be used for data synthesis. There are some special considerations when conducting meta-analysis for questions related to risk and correlation. These include, but are not limited to, causal inference. Systematic review and meta-analysis of studies related to etiology is an emerging methodology in the field of evidence synthesis. These reviews can provide useful information for healthcare professionals and policymakers on the burden of disease. The standardized Joanna Briggs Institute approach offers a rigorous and transparent method to conduct reviews of etiology.
Velloso, Isabela; Ceci, Christine; Alves, Marilia
2013-09-01
In this paper, we make explicit the changing configurations of power relations that currently characterize the Brazilian Emergency Care System (SAMU) team in Belo Horizonte, Brazil. The SAMU is a recent innovation in Brazilian healthcare service delivery. A qualitative case study methodology was used to explore SAMU's current organizational arrangements, specifically the power relations that have developed and that demonstrate internal team struggles over space and defense of particular occupational interests. The argument advanced in this paper is that these professionals are developing their work in conditions of exposure, that is, they are always being observed by someone, and that such observational exposure provides the conditions whereby everyday emergency care practices are enacted such that practice is shaped by, as well as shapes, particular, yet recognizable power relationships. Data were collected through the observation of the SAMU's work processes and through semi-structured interviews. Research materials were analyzed using discourse analysis. In the emergency care process of work, visibility is actually embedded in the disciplinary context and can thus be analyzed as a technique applied to produce disciplined individuals through the simple mechanisms elaborated by Foucault such as hierarchical surveillance, normalizing judgment, and the examination. © 2012 John Wiley & Sons Ltd.
de Laat, Sonya; Schwartz, Lisa
2016-01-01
Introduction: Prospective informed consent is required for most research involving human participants; however, this is impracticable under some circumstances. The Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (TCPS) outlines the requirements for research involving human participants in Canada. The need for an exception to consent (deferred consent) is recognised and endorsed in the TCPS for research in individual medical emergencies; however, little is known about substitute decision-maker (SDM) experiences. A paediatric resuscitation trial (SQUEEZE) (NCT01973907) using an exception to consent process began enrolling at McMaster Children's Hospital in January 2014. This qualitative research study aims to generate new knowledge on SDM experiences with the exception to consent process as implemented in a randomised controlled trial. Methods and analysis: The SDMs of children enrolled into the SQUEEZE pilot trial will be the sampling frame from which ethics study participants will be derived. Design: Qualitative research study involving individual interviews and grounded theory methodology. Participants: SDMs for children enrolled into the SQUEEZE pilot trial. Sample size: Up to 25 SDMs. Qualitative methodology: SDMs will be invited to participate in the qualitative ethics study. Interviews with consenting SDMs will be conducted in person or by telephone, taped and professionally transcribed. Participants will be encouraged to elaborate on their experience of being asked to consent after the fact and how this process occurred. Analysis: Data gathering and analysis will be undertaken simultaneously. The investigators will collaborate in developing the coding scheme, and data will be coded using NVivo. Emerging themes will be identified. Ethics and dissemination: This research represents a rare opportunity to interview parents/guardians of critically ill children enrolled into a resuscitation trial without their knowledge or prior consent. Findings will inform implementation of the exception to consent process in the planned definitive SQUEEZE trial and support development of evidence-based ethics guidelines. PMID:27625066
Knowledge Representation Standards and Interchange Formats for Causal Graphs
NASA Technical Reports Server (NTRS)
Throop, David R.; Malin, Jane T.; Fleming, Land
2005-01-01
In many domains, automated reasoning tools must represent graphs of causally linked events. These include fault-tree analysis, probabilistic risk assessment (PRA), planning, procedures, medical reasoning about disease progression, and functional architectures. Each of these fields has its own requirements for the representation of causation, events, actors and conditions. The representations include ontologies of function and cause, data dictionaries for causal dependency, failure and hazard, and interchange formats between some existing tools. In none of the domains has a generally accepted interchange format emerged. The paper makes progress towards interoperability across the wide range of causal analysis methodologies. We survey existing practice and emerging interchange formats in each of these fields. Setting forth a set of terms and concepts that are broadly shared across the domains, we examine the several ways in which current practice represents them. Some phenomena are difficult to represent or to analyze in several domains. These include mode transitions, reachability analysis, positive and negative feedback loops, conditions correlated but not causally linked and bimodal probability distributions. We work through examples and contrast the differing methods for addressing them. We detail recent work in knowledge interchange formats for causal trees in aerospace analysis applications in early design, safety and reliability. Several examples are discussed, with a particular focus on reachability analysis and mode transitions. We generalize the aerospace analysis work across the several other domains. We also recommend features and capabilities for the next generation of causal knowledge representation standards.
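The reachability question discussed above can be made concrete with a minimal sketch: represent causally linked events as a directed graph and ask which downstream events a given fault can reach. The event names are illustrative.

```python
# Small sketch of reachability analysis on a causal graph: events are nodes,
# directed edges are causal links, and descendants() answers "what can this
# fault eventually cause?". Event names are illustrative.
import networkx as nx

causal = nx.DiGraph([
    ("valve_stuck", "pressure_rise"),
    ("pressure_rise", "relief_valve_opens"),
    ("pressure_rise", "line_rupture"),
    ("line_rupture", "loss_of_coolant"),
])
downstream = nx.descendants(causal, "valve_stuck")   # reachability analysis
print(sorted(downstream))
# A shared interchange format would need to carry exactly this structure
# (events and directed causal links) plus conditions, actors and probabilities.
```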
Concepts of Infidelity among African American Emerging Adults: Implications for HIV/STI Prevention
ERIC Educational Resources Information Center
Eyre, Stephen L.; Flythe, Michelle; Hoffman, Valerie; Fraser, Ashley E.
2012-01-01
In this study, we used an exploratory methodology to determine what cultural models African American emerging adults use to understand infidelity/cheating. Cultural models are defined as "cognitive schema[s] that [are] intersubjectively shared by a social group" (D'Andrade, 1987, p. 112). We interviewed 144 participants ages 19-22 from three…
The Emergence of Public Health Open Educational Resources
ERIC Educational Resources Information Center
Angell, C.; Hartwell, H.; Hemingway, A.
2011-01-01
Purpose: The purpose of this paper is to identify key concepts in the literature relating to the release of open educational resources (OER), with specific reference to the emergence of public health OER. Design/methodology/approach: A review of the literature relating to the development of OER was followed by an online search for OER literature…
Race and Emotion in Computer-Based HIV Prevention Videos for Emergency Department Patients
ERIC Educational Resources Information Center
Aronson, Ian David; Bania, Theodore C.
2011-01-01
Computer-based video provides a valuable tool for HIV prevention in hospital emergency departments. However, the type of video content and protocol that will be most effective remain underexplored and the subject of debate. This study employs a new and highly replicable methodology that enables comparisons of multiple video segments, each based on…
Workshop on Emergent Literacy in Childhood Report (Suva, Fiji, February 5-23, 1996).
ERIC Educational Resources Information Center
Golda Meir Mount Carmel International Training Centre, Haifa (Israel).
This document reports on a workshop on emergent literacy in early childhood held in Fiji. The workshop was sponsored by UNICEF and the Fiji Ministry of Education. The course objectives were to: (1) review education philosophies and methodologies currently influencing early childhood education in Fiji; (2) gain extended knowledge on language…
ERIC Educational Resources Information Center
Goastellec, Gaele
2010-01-01
What do the shared norms emerging in the regulation of access reveal about the higher education internationalisation process? The history of access norms brings to light two characteristics of this process: the spreading of sociotechnic tools and the emergence of moral entrepreneurs. Based on case studies carried out in France, the US, South…
Project #138. Coronary Care Education of Health Care Team. Final Report.
ERIC Educational Resources Information Center
Saint Joseph Hospital, MO.
The goal of this project was to develop, establish, and implement a system for the educational development of health care team members of the St. Joseph region in emergency and coronary care. Programs, curricula, and evaluation methodology were devised for four levels of critical care personnel: R.N.s emphasizing emergency and coronary care;…
ERIC Educational Resources Information Center
Reese-Weber, Marla
2008-01-01
The present study provides experimental data comparing emerging adults' attitudes toward dating and sibling violence in adolescence using a new methodology in which participants observe a violent interaction between adolescents. The reported amount of violence experienced in dating and sibling relationships among emerging adults is also compared.…
ERIC Educational Resources Information Center
Côté, James E.
2014-01-01
This article examines the theory of emerging adulthood, introduced into the literature by Arnett (2000), in terms of its methodological and evidential basis, and finds it to be unsubstantiated on numerous grounds. Other, more convincing, formulations of variations in the transition to adulthood are examined. Most flawed academic theories are…
ERIC Educational Resources Information Center
Godina, Heriberto; Soto-Ramirez, Cynthia
2017-01-01
This study examines fifth-grade Mexican American students' beliefs about emergent gender roles. We used participant-observation methodology to conduct research on six focal-student participants selected from the general fifth-grade population at an elementary school located in the Southwestern United States. Collected data included focal-student…
ERIC Educational Resources Information Center
Guterman, Neil B.
2004-01-01
Prevention research on the related problems of child abuse, youth violence, and domestic violence has grown at an accelerating pace in recent years. In this context, a set of shared methodological issues has emerged as investigators seek to advance the interpersonal violence prevention knowledge base. This article considers some of the persistent…
Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri
2017-06-01
Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data on whether the geospatial and qualitative methods were supported by a specified methodology, on the methods of data analysis, and on the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Validating competence: a new credential for clinical documentation improvement practitioners.
Ryan, Jessica; Patena, Karen; Judd, Wallace; Niederpruem, Mike
2013-01-01
As the health information management (HIM) profession continues to expand and become more specialized, there is an ever-increasing need to identify emerging HIM workforce roles that require a codified level of proficiency and professional standards. The Commission on Certification for Health Informatics and Information Management (CCHIIM) explored one such role, the clinical documentation improvement (CDI) practitioner, to define the tasks and responsibilities of the job as well as the knowledge required to perform them effectively. Subject-matter experts (SMEs) defined the CDI specialty by following best practices for job analysis methodology. A random sample of 4,923 CDI-related professionals was surveyed regarding the tasks and knowledge required for the job. The survey data were used to create a weighted blueprint of the six major domains that make up the CDI practitioner role, which later formed the foundation for the clinical documentation improvement practitioner (CDIP) credential. As a result, healthcare organizations can be assured that their certified documentation improvement practitioners have demonstrated excellence in clinical care, treatment, coding guidelines, and reimbursement methodologies.
A Counter-IED Preparedness Methodology for Large Event Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Patricia W; Koch, Daniel B
Since 2009, Oak Ridge National Laboratory (ORNL) has been involved in a project sponsored by the Department of Homeland Security Science and Technology Directorate aimed at improving preparedness against Improvised Explosive Devices (IED) at large sporting events. Led by the University of Southern Mississippi (USM) as part of the Southeast Region Research Initiative, the project partners have been developing tools and methodologies for use by security personnel and first responders at sports stadiums. ORNL's contribution has been to develop an automated process to gather and organize disparate data that is usually part of an organization's security plan. The organized data informs a table-top exercise (TTX) conducted by USM using additional tools developed by them and their subcontractors. After participating in several pilot TTXs, patterns are beginning to emerge that would enable improvements to be formulated to increase the level of counter-IED preparedness. This paper focuses on the data collection and analysis process and shares insights gained to date.
Kanfiszer, Lucie; Davies, Fran; Collins, Suzanne
2017-08-01
Existing literature exploring autism spectrum disorders within female populations predominantly utilises quantitative methodology. A limited number of small-scale, qualitative studies have explored the experiences of adolescent girls with autism spectrum disorder, but adult women have remained largely unheard. This study aims to broaden the stories told within autobiographical literature and empower those within the wider community of women with autism spectrum disorder. In doing so, it seeks to extend existing conceptualisations of experience to include socially and culturally located factors. A qualitative methodology was adopted, utilising multi-stage narrative analysis. Seven semi-structured interviews with women who received a diagnosis in adulthood were conducted. Recruitment spanned community mental health services, an inpatient service and a community support group. From the women's diverse experiences and stories emerged two broad categories related to gender identity and social relationships. The findings are discussed in relation to existing constructs of autism in women.
Online monitoring of seismic damage in water distribution systems
NASA Astrophysics Data System (ADS)
Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei
2004-07-01
It is shown that water distribution systems can be damaged by earthquakes, and the seismic damage cannot easily be located, especially immediately after the events. Earthquake experience shows that accurate and quick location of seismic damage is critical to the emergency response of water distribution systems. This paper develops a methodology to locate seismic damage (multiple breaks) in a water distribution system by monitoring water pressure online at a limited number of positions in the water distribution system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can be used effectively. A neural network-based inverse analysis method is constructed for locating the seismic damage based on the variation of water pressure. The neural network is trained by using analytically simulated data from the water distribution system, and validated by using a set of data that have never been used in the training. It is found that the methodology provides an effective and practical way in which seismic damage in a water distribution system can be accurately and quickly located.
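The inverse-analysis idea can be sketched as a small supervised-learning problem: simulate pressure changes at monitored nodes for each candidate break location, then train a network to map observed pressure changes back to the damaged pipe. The random "hydraulic model" below is a stand-in for the analytical simulations, so the sketch shows the workflow rather than the paper's network design.

```python
# Illustrative sketch of the inverse-analysis idea: train a neural network to
# map pressure readings at a few monitored nodes to the damaged pipe. The
# random pipe signatures are a stand-in for analytically simulated data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_pipes, n_sensors, n_samples = 20, 5, 2000

pipe_signatures = rng.normal(0, 1, (n_pipes, n_sensors))    # stand-in simulator
broken = rng.integers(0, n_pipes, n_samples)
pressure_drop = pipe_signatures[broken] + rng.normal(0, 0.2, (n_samples, n_sensors))

model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(pressure_drop[:1500], broken[:1500])               # train on simulated data
print("validation accuracy:", model.score(pressure_drop[1500:], broken[1500:]))
```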
Vernon, Donald D; Bolte, Robert G; Scaife, Eric; Hansen, Kristine W
2005-01-01
Freestanding children's hospitals may lack resources, especially surgical manpower, to meet American College of Surgeons trauma center criteria, and may organize trauma care in alternative ways. At a tertiary care children's hospital, attending trauma surgeons and anesthesiologists took out-of-hospital call and directed initial care for only the most severely injured patients, whereas pediatric emergency physicians directed care for patients with less severe injuries. Survival data were analyzed using TRISS methodology. A total of 903 trauma patients were seen by the system during the period 10/1/96-6/30/01. The median Injury Severity Score was 16, and 508 patients had an Injury Severity Score ≥ 15. There were 83 deaths, 21 unexpected survivors, and 13 unexpected deaths. TRISS analysis showed that the z-score was 4.39 and the W-statistic was 3.07. Mortality outcome from trauma in a pediatric hospital using this alternative approach to trauma care was significantly better than predicted by TRISS methodology.
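A TRISS-style comparison can be sketched in a few lines: each patient receives a predicted survival probability Ps from a logistic model of the Revised Trauma Score, Injury Severity Score and an age indicator, and the cohort is then summarized with the z-score and W-statistic. The coefficients and patients below are placeholders, not the published MTOS coefficients or the study's data.

```python
# Sketch of a TRISS-style survival comparison. Coefficients and patient values
# are placeholders for illustration, not the published MTOS coefficients.
import numpy as np

def prob_survival(rts, iss, age_over_54, b=(-1.0, 0.9, -0.08, -1.7)):
    b0, b_rts, b_iss, b_age = b
    x = b0 + b_rts * rts + b_iss * iss + b_age * age_over_54
    return 1.0 / (1.0 + np.exp(-x))

rts = np.array([7.84, 5.97, 7.84, 4.09])       # illustrative patients
iss = np.array([9, 25, 16, 41])
age_flag = np.array([0, 0, 0, 1])              # 1 if age > 54
survived = np.array([1, 1, 1, 0])

ps = prob_survival(rts, iss, age_flag)
expected = ps.sum()
z = (survived.sum() - expected) / np.sqrt(np.sum(ps * (1 - ps)))
w = (survived.sum() - expected) / (len(ps) / 100.0)   # excess survivors per 100
print(round(z, 2), round(w, 2))
```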
Shi, Li; Wei, Dong; Ngo, Huu Hao; Guo, Wenshan; Du, Bin; Wei, Qin
2015-10-01
This study assessed the biosorption of anaerobic granular sludge (AGS) and its capacity as a biosorbent to remove Pb(II) and methylene blue (MB) from a multi-component aqueous solution. It emerged that the biosorption data fitted well to the pseudo-second-order kinetic and Langmuir adsorption isotherm models in both single and binary systems. In competitive biosorption systems, Pb(II) and MB suppress each other's biosorption capacity. Spectroscopic analysis, including Fourier transform infrared spectroscopy (FTIR) and fluorescence spectroscopy, was integrated to explain this interaction. Hydroxyl and amine groups in AGS were the key functional groups for sorption. Three-dimensional excitation-emission matrix (3D-EEM) spectra implied that two main protein-like substances were identified and quenched when Pb(II) or MB was present. Response surface methodology (RSM) confirmed that the removal efficiency of Pb(II) and MB reached its peak when the concentration ratio of Pb(II) to MB reached a constant value of 1. Copyright © 2015 Elsevier Ltd. All rights reserved.
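For reference, the two models named above are commonly written in the following standard forms, with q_t and q_e the sorbed amounts at time t and at equilibrium, C_e the equilibrium concentration, k_2 the pseudo-second-order rate constant, and q_m and K_L the Langmuir capacity and affinity constant:

```latex
% Standard (linearized) pseudo-second-order kinetic model
\frac{t}{q_t} = \frac{1}{k_2\, q_e^{2}} + \frac{t}{q_e}

% Langmuir adsorption isotherm
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
```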
Molinos-Senante, M; Hernández-Sancho, F; Sala-Garrido, R
2011-12-01
Water reuse is an emerging and promising non-conventional water resource. Feasibility studies are essential tools in the decision-making process for the implementation of water-reuse projects. However, the methods used to assess economic feasibility tend to focus on internal costs, while external impacts are relegated to unsubstantiated statements about the advantages of water reuse. Using the concept of shadow prices for undesirable outputs of water reclamation, the current study developed a theoretical methodology to assess internal and external economic impacts. The proposed methodological approach is applied to 13 wastewater treatment plants in the Valencia region of Spain that reuse effluent for environmental purposes. Internal benefit analyses indicated that only a proportion of the projects were economically viable, whereas when external benefits were incorporated, all projects were economically viable. In conclusion, economic feasibility assessments of water-reuse projects should quantitatively evaluate economic, environmental and resource-availability impacts. Copyright © 2011 Elsevier Ltd. All rights reserved.
Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.
2018-01-01
Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. PMID:28787673
High-frequency health data and spline functions.
Martín-Rodríguez, Gloria; Murillo-Fort, Carlos
2005-03-30
Seasonal variations are highly relevant for health service organization. In general, short-run movements of medical magnitudes are important features for managers in this field to make adequate decisions. Thus, the analysis of the seasonal pattern in high-frequency health data is an appealing task. The aim of this paper is to propose procedures that allow the analysis of the seasonal component in this kind of data by means of spline functions embedded in a structural model. In the proposed method, useful adaptations of the traditional spline formulation are developed, and the resulting procedures are capable of capturing periodic variations, whether deterministic or stochastic, in a parsimonious way. Finally, these methodological tools are applied to a daily series of emergency service demand in order to capture simultaneous seasonal variations whose periods differ.
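The toy sketch below is a much simpler stand-in for the structural spline model described above: it smooths an annual profile of simulated daily emergency demand with a scipy smoothing spline. The simulated data, the smoothing factor and the fixed 365-day period are assumptions made only for illustration.

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(0)
    days = np.arange(3 * 365)                      # three years of daily demand
    true_level = 40 + 8 * np.sin(2 * np.pi * days / 365) + 5 * (days % 7 >= 5)
    demand = rng.poisson(true_level)

    # Collapse to an average day-of-year profile, then smooth it with a spline
    doy = days % 365
    profile = np.array([demand[doy == d].mean() for d in range(365)])
    spline = UnivariateSpline(np.arange(365), profile, k=3, s=2.0 * 365)
    annual_pattern = spline(np.arange(365))        # smoothed seasonal curve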
Automated analysis of clonal cancer cells by intravital imaging
Coffey, Sarah Earley; Giedt, Randy J; Weissleder, Ralph
2013-01-01
Longitudinal analyses of single cell lineages over prolonged periods have been challenging, particularly in processes characterized by high cell turnover such as inflammation, proliferation, or cancer. RGB marking has emerged as an elegant approach for enabling such investigations. However, methods for automated image analysis continue to be lacking. Here, to address this, we created a number of different multicolored poly- and monoclonal cancer cell lines for in vitro and in vivo use. To classify these cells in large-scale data sets, we subsequently developed and tested an automated algorithm based on hue selection. Our results showed that this method allows accurate analyses at a fraction of the computational time required by more complex color classification methods. Moreover, the methodology should be broadly applicable to both in vitro and in vivo analyses. PMID:24349895
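A minimal sketch of hue-based selection in the spirit of, but not identical to, the algorithm described above: pixels are converted to HSV and assigned to a fixed number of hue bins, with dim or unsaturated pixels left unclassified. The bin count and thresholds are arbitrary illustrative choices.

    import numpy as np
    from matplotlib.colors import rgb_to_hsv

    def classify_by_hue(rgb_image, n_bins=6, sat_min=0.2, val_min=0.1):
        """Assign each pixel of an 8-bit RGB image to a hue bin; return -1 for
        pixels too dark or too unsaturated to carry reliable color information."""
        hsv = rgb_to_hsv(rgb_image.astype(float) / 255.0)
        hue, sat, val = hsv[..., 0], hsv[..., 1], hsv[..., 2]
        labels = np.minimum((hue * n_bins).astype(int), n_bins - 1)
        labels[(sat < sat_min) | (val < val_min)] = -1
        return labels

    # toy 2x2 image: red, green, blue and near-black pixels
    img = np.array([[[255, 0, 0], [0, 255, 0]],
                    [[0, 0, 255], [10, 10, 10]]], dtype=np.uint8)
    print(classify_by_hue(img))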
Spatial and temporal epidemiological analysis in the Big Data era.
Pfeiffer, Dirk U; Stevens, Kim B
2015-11-01
Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and the emergence of new infectious pathogens have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things, which highlights the increasing presence of a wide range of sensors, interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. This includes relational databases, geographical information systems and, most recently, cloud-based data storage such as the Hadoop distributed file system. While the development of analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis, where the spectrum of analytical methods ranges from visualisation and exploratory analysis to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science is a reflection of the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. Machine learning regression methods, which are more robust and can handle large datasets faster than classical regression approaches, are now also used to analyse spatial and spatio-temporal data. Multi-criteria decision analysis methods have gained greater acceptance, due in part to the need to increasingly combine data from diverse sources, including published scientific information and expert opinion, in an attempt to fill important knowledge gaps. The opportunities for more effective prevention, detection and control of animal health threats arising from these developments are immense, but not without risks given the different types, and much higher frequency, of biases associated with these data. Copyright © 2015 Elsevier B.V. All rights reserved.
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how landslide susceptibility and satellite-derived rainfall estimates can be combined to forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, the resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher-resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
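The toy sketch below shows the general shape of a susceptibility-plus-rainfall decision rule of the kind discussed above, not the algorithm evaluated in this research: a grid cell is flagged when mapped susceptibility exceeds a cutoff and rainfall exceeds an intensity-duration threshold of the common form I = a*D^b; the cutoff and the a, b values are placeholders.

    import numpy as np

    def landslide_flags(susceptibility, rain_intensity_mmh, duration_h,
                        suscept_cut=0.6, a=10.0, b=-0.6):
        """Flag grid cells where mapped susceptibility is high AND the observed
        rainfall intensity exceeds an intensity-duration threshold I = a * D**b.
        The cutoff and (a, b) are placeholders, not the algorithm's values."""
        threshold_mmh = a * duration_h ** b
        return (susceptibility >= suscept_cut) & (rain_intensity_mmh >= threshold_mmh)

    susceptibility = np.array([[0.2, 0.8], [0.7, 0.9]])   # toy 2x2 grid, 0-1 scale
    rainfall = np.array([[5.0, 12.0], [2.0, 20.0]])       # mm/h over the last day
    print(landslide_flags(susceptibility, rainfall, duration_h=24.0))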
Chronic care management for patients with COPD: a critical review of available evidence.
Lemmens, Karin M M; Lemmens, Lidwien C; Boom, José H C; Drewes, Hanneke W; Meeuwissen, Jolanda A C; Steuten, Lotte M G; Vrijhoef, Hubertus J M; Baan, Caroline A
2013-10-01
Clinical diversity and methodological heterogeneity exist between studies on chronic care management. This study aimed to examine the effectiveness of chronic care management in chronic obstructive pulmonary disease (COPD) while taking heterogeneity into account, enabling understanding of, and decision making about, such programmes. Three investigated sources of heterogeneity were study quality, length of follow-up, and number of intervention components. We performed a review of previously published reviews and meta-analyses on COPD chronic care management. Their primary studies were analyzed separately, as statistical, clinical and methodological heterogeneity was present. Meta-regression analyses were performed to explain the variance among the primary studies. Generally, the included reviews showed positive results on quality of life and hospitalizations. Inconclusive effects were found on emergency department visits and no effects on mortality. Pooled effects on hospitalizations, emergency department visits and quality of life of the primary studies did not reach significant improvement. No effects were found on mortality. Meta-regression showed that the number of components of chronic care management programmes explained the present heterogeneity for hospitalizations and emergency department visits. Four components showed significant effects on hospitalizations, whereas two components had significant effects on emergency department visits. Methodological study quality and length of follow-up did not significantly explain heterogeneity. This study demonstrated that COPD chronic care management has the potential to improve outcomes of care; heterogeneity in outcomes was explained. Further research is needed to elucidate the diversity between COPD chronic care management studies in terms of the effects measured and to strengthen the support for chronic care management. © 2011 John Wiley & Sons Ltd.
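A compact sketch of the kind of weighted meta-regression referred to above, using statsmodels; the effect sizes, variances and component counts are invented, and the fixed-effect (inverse-variance) weighting is a simplification of the methods such reviews typically apply.

    import numpy as np
    import statsmodels.api as sm

    # Invented per-study inputs: effect size (log risk ratio of hospitalization),
    # its sampling variance, and the number of intervention components
    effect = np.array([-0.30, -0.10, -0.45, 0.05, -0.20, -0.35])
    variance = np.array([0.04, 0.02, 0.06, 0.03, 0.05, 0.02])
    n_components = np.array([4, 2, 5, 1, 3, 4])

    # Inverse-variance weighted meta-regression of effect size on components
    X = sm.add_constant(n_components)
    fit = sm.WLS(effect, X, weights=1.0 / variance).fit()
    print(fit.params)     # intercept and slope per additional component
    print(fit.pvalues)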
Strudwick, Kirsten; Nelson, Mark; Martin-Khan, Melinda; Bourke, Michael; Bell, Anthony; Russell, Trevor
2015-02-01
There is increasing importance placed on the quality of health care for musculoskeletal injuries in emergency departments (EDs). This systematic review aimed to identify existing musculoskeletal quality indicators (QIs) developed for ED use and to critically evaluate their methodological quality. MEDLINE, EMBASE, CINAHL, and the gray literature, including relevant organizational websites, were searched in 2013. English-language articles were included that described the development of at least one QI related to the ED care of musculoskeletal injuries. Data extraction of each included article was conducted. A quality assessment was then performed by rating each relevant QI against the Appraisal of Indicators through Research and Evaluation (AIRE) Instrument. QIs with similar definitions were grouped together and categorized according to the health care quality frameworks of Donabedian and the Institute of Medicine. The search revealed 1,805 potentially relevant articles, of which 15 were finally included in the review. The number of relevant QIs per article ranged from one to 11, resulting in a total of 71 QIs overall. Pain (n = 17) and fracture management (n = 13) QIs were predominant. Ten QIs scored at least 50% across all AIRE Instrument domains, and these related to pain management and appropriate imaging of the spine. The methodological quality of the development of most QIs is poor. Recommending a core set of QIs that addresses the complete spectrum of musculoskeletal injury management in emergency medicine is not yet possible, and more work is needed. Currently, the QIs with the highest methodological quality are in the areas of pain management and medical imaging. © 2015 by the Society for Academic Emergency Medicine.
The Commander’s Emergency Response Program: A Model for Future Implementation
2010-04-07
The INVEST-E methodology serves as a tool for commanders and their designated practitioners to properly select projects, increasing the effectiveness of CERP funds.
TU-D-201-07: Severity Indication in High Dose Rate Brachytherapy Emergency Response Procedure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, K; Rustad, F
Purpose: Understanding the corresponding dose to different staff during the High Dose Rate (HDR) brachytherapy emergency response procedure could help to develop an efficient and effective action strategy. In this study, a variation and risk analysis methodology was developed to simulate the HDR emergency response procedure based on severity indicators. Methods: A GammaMedplus iX HDR unit from Varian Medical Systems was used for this simulation. The emergency response procedure was decomposed based on risk management methods. Severity indexes were used to identify the impact of a risk occurrence at each step, including dose to the patient and dose to the operating staff, by varying the time, HDR source activity, distance from the source to patient and staff, and the actions taken. The actions in the 7 steps were to press the interrupt button, press the emergency shutoff switch, press the emergency button on the afterloader keypad, turn the emergency hand-crank, remove the applicator from the patient, disconnect the transfer tube and move the afterloader from the patient, and execute emergency surgical recovery. Results: Given step times in seconds of 15, 5, 30, 15, 180, 120 and 1800 for the assumed 7 steps and an HDR source activity of 10 Ci, the accumulated dose in cGy to the patient at 1 cm distance was 188, 250, 625, 813, 3063, 4563 and 27063, and the accumulated exposure in rem to the operator (outside the vault, then at 1 m and at 10 cm distance) was 0.0, 0.0, 0.1, 0.1, 22.6, 37.6 and 262.6. The variation was determined by the operators' actions at different times and distances from the HDR source. Conclusion: The time and dose were estimated for an HDR unit emergency response procedure. This provides information for making optimal decisions during the emergency procedure. Further investigation would be to optimize and standardize the responses for other emergency procedures using a time-spatial-dose severity function.
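A back-of-the-envelope point-source calculation that approximately reproduces the cumulative figures above; the Ir-192 exposure-rate constant, the per-step operator positions and the treatment of 1 R as roughly 1 cGy or 1 rem are assumptions, and shielding, scatter and source anisotropy are ignored.

    # Point-source estimate of accumulated exposure during the 7-step response.
    GAMMA = 0.46          # assumed Ir-192 exposure-rate constant, R*m^2/(Ci*h)
    ACTIVITY_CI = 10.0    # source activity from the record
    STEP_S = [15, 5, 30, 15, 180, 120, 1800]                 # per-step durations (s)
    PATIENT_M = [0.01] * 7                                    # patient stays at 1 cm
    OPERATOR_M = [None, None, 1.0, 1.0, 0.10, 0.10, 0.10]     # None = outside vault

    def cumulative_exposure(durations_s, distances_m):
        """Accumulate exposure step by step; 1 R is treated as ~1 cGy / ~1 rem."""
        total, trace = 0.0, []
        for dt, d in zip(durations_s, distances_m):
            if d is not None:
                total += GAMMA * ACTIVITY_CI / d**2 * dt / 3600.0
            trace.append(round(total, 1))
        return trace

    print(cumulative_exposure(STEP_S, PATIENT_M))    # ~ the cGy column in the record
    print(cumulative_exposure(STEP_S, OPERATOR_M))   # ~ the rem column in the record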
Income inequality: A complex network analysis of US states
NASA Astrophysics Data System (ADS)
Gogas, Periklis; Gupta, Rangan; Miller, Stephen M.; Papadimitriou, Theophilos; Sarantitis, Georgios Antonios
2017-10-01
This study performs a long-run, inter-temporal analysis of income inequality in the US spanning the period 1916-2012. We employ both descriptive analysis and the Threshold-Minimum Dominating Set methodology from Graph Theory to examine the evolution of inequality through time. In doing so, we use two alternative measures of inequality: the Top 1% share of income and the Gini coefficient. This provides new insight into the literature on income inequality across the US states. Several empirical findings emerge. First, a heterogeneous evolution of inequality exists across the four focal sub-periods. Second, the results differ between the inequality measures examined. Finally, we identify groups of similarly behaving states in terms of inequality. The US authorities can use these findings to identify inequality trends and innovations and/or examples to investigate the causes of inequality within the US and implement appropriate policies.
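A minimal sketch of the threshold-graph idea behind the Threshold-Minimum Dominating Set approach, assuming invented state-level series; networkx's greedy dominating_set stands in for the exact minimum dominating set, which is NP-hard, and the correlation threshold is arbitrary.

    import numpy as np
    import networkx as nx

    # Invented inequality series for a handful of states
    rng = np.random.default_rng(1)
    states = ["CA", "NY", "TX", "FL", "OH", "PA"]
    series = np.cumsum(rng.normal(size=(len(states), 60)), axis=1)
    corr = np.corrcoef(series)

    # Threshold graph: connect states whose inequality series correlate strongly
    theta = 0.6
    G = nx.Graph()
    G.add_nodes_from(states)
    for i in range(len(states)):
        for j in range(i + 1, len(states)):
            if corr[i, j] >= theta:
                G.add_edge(states[i], states[j])

    # Greedy dominating set as a stand-in for the minimum dominating set
    print(nx.dominating_set(G))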
Social Network Analysis for Assessing College-Aged Adults' Health: A Systematic Review.
Patterson, Megan S; Goodson, Patricia
2018-04-13
Social network analysis (SNA) is a useful, emerging method for studying health. College students are especially prone to social influence when it comes to health. This review aimed to identify network variables related to college student health and determine how SNA was used in the literature. A systematic review of relevant literature was conducted in October 2015. Studies employing egocentric or whole network analysis to study college student health were included. We used Garrard's Matrix Method to extract data from reviewed articles (n = 15). Drinking, smoking, aggression, homesickness, and stress were predicted by network variables in the reviewed literature. Methodological inconsistencies concerning boundary specification, data collection, nomination limits, and statistical analyses were revealed across studies. Results show the consistent relationship between network variables and college health outcomes, justifying further use of SNA to research college health. Suggestions and considerations for future use of SNA are provided.
Maternal cigarette smoking during pregnancy and criminal/deviant behavior: a meta-analysis.
Pratt, Travis C; McGloin, Jean Marie; Fearn, Noelle E
2006-12-01
A growing body of empirical literature has emerged examining the somewhat inconsistent relationship between maternal cigarette smoking (MCS) during pregnancy and children's subsequent antisocial behavior. To systematically assess what existing studies reveal regarding MCS as a criminogenic risk factor for offspring, the authors subjected this body of literature to a meta-analysis. The analysis reveals a statistically significant--yet rather small--overall mean "effect size" of the relationship between MCS and the likelihood children will engage in deviant/criminal behavior. In addition to being rather moderate in size, the MCS-crime/deviance relationship is sensitive to a number of methodological specifications across empirical studies--particularly those associated with sample characteristics. The implications of this modest, and somewhat unstable, relationship are discussed in terms of guidelines for future research on this subject and how existing theoretical perspectives may be integrated to explain the MCS-crime/deviance link.
Inaugural Genomics Automation Congress and the coming deluge of sequencing data.
Creighton, Chad J
2010-10-01
Presentations at Select Biosciences's first 'Genomics Automation Congress' (Boston, MA, USA) in 2010 focused on next-generation sequencing and the platforms and methodology around them. The meeting provided an overview of sequencing technologies, both new and emerging. Speakers shared their recent work on applying sequencing to profile cells for various levels of biomolecular complexity, including DNA sequences, DNA copy, DNA methylation, mRNA and microRNA. With sequencing time and costs continuing to drop dramatically, a virtual explosion of very large sequencing datasets is at hand, which will probably present challenges and opportunities for high-level data analysis and interpretation, as well as for information technology infrastructure.
Crowdsourcing biomedical research: leveraging communities as innovation engines
Saez-Rodriguez, Julio; Costello, James C.; Friend, Stephen H.; Kellen, Michael R.; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo
2018-01-01
The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories. PMID:27418159
1979-07-01
African scenario.) The training analysis revealed some discrepancies between the list of tasks taught in FAOBC and the list of tasks emerging from the... population density. (Refer to Figure 3-2.) The African combat scenario, closely followed by the Middle Eastern scenario, was rated as being the most...
Stenner, Paul H D; Bianchi, Gabriel; Popper, Miroslav; Supeková, Marianna; Luksík, Ivan; Pujol, Joan
2006-09-01
Q methodology was applied to investigate the views of young people from Catalonia, England and Slovakia regarding sexual relationships and their health implications. The Q sorts of 188 16-18-year-olds from these three diverse European regions were reduced by Q factor analysis to six clear accounts. These accounts are presented in relation to three emergent themes: (a) traditionalism/liberalism; (b) locus of responsibility; and (c) the relationship between sex and love. These discursive themes are discussed in relation to health-salient criteria such as awareness of sex-related risk and the corresponding implications for conduct.
Cultural analysis of communication behaviors among juveniles in a correctional facility.
Sanger, D D; Creswell, J W; Dworak, J; Schultz, L
2000-01-01
This study addressed the communication behaviors of female juvenile delinquents in a correctional facility. Qualitative methodology was used to study 78 participants, ranging in age from 13;1 to 18;9 (years;months), over a five-month period. Data collection consisted of observations, participant observation, interviews, and a review of documents. Additionally, participants were tested on the Clinical Evaluation of Language Fundamentals-3. Listening and following rules, utterance types, topics of conversation, politeness, and conversational management emerged as themes. Findings indicated that as many as 22% of participants were potential candidates for language services. Implications for speech-language pathologists (SLPs) providing communication services are provided.
[Studies of bacterial typing with MALDI-TOF].
Culebras, Esther; Alvarez-Buylla, Adela; Jose Artacho Reinoso, M; Antonio Lepe, Jose
2016-06-01
MALDI-TOF (matrix-assisted laser desorption ionization time-of-flight) mass spectrometry has emerged as a potential tool for microbial characterization and identification in many microbiology departments. The technology is rapid, sensitive, and relatively inexpensive in terms of both the labour and costs involved. This review provides an overview on its utility for strain typing and epidemiological studies and explains the methodological approaches that can be used both for the performance of the technique and for the analysis of results. Finally, the review summarizes studies on the characterization of distinct bacterial species. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
Understanding parenting in Manitoba First Nations: implications for program development.
Eni, Rachel; Rowe, Gladys
2011-01-01
This qualitative study introduced the "Manitoba First Nation Strengthening Families Maternal Child Health Pilot Project" program and evaluation methodologies. The study provided a knowledge base for programmers, evaluators, and communities to develop relevant health promotion, prevention, and intervention programming to assist in meeting health needs of pregnant women and young families. Sixty-five open-ended, semistructured interviews were completed in 13 communities. Data analysis was through grounded theory. Three major themes emerged from the data: interpersonal support and relationships; socioeconomic factors; and community initiatives. Complex structural, historical events compromise parenting; capacity and resilience are supported through informal and formal health and social supports.
ERIC Educational Resources Information Center
Delale, Feridun; Liaw, Benjamin M.; Jiji, Latif M.; Voiculescu, Ioana; Yu, Honghui
2011-01-01
From October 2003 to April 2008 a systemic reform of the Mechanical Engineering program at The City College of New York was undertaken with the goal of incorporating emerging technologies (such as nanotechnology, biotechnology, Micro-Electro-Mechanical Systems (MEMS), intelligent systems) and new teaching methodologies (such as project based…
ERIC Educational Resources Information Center
Hambleton, Ronald K., Ed.; Zaal, Jac N., Ed.
The 14 chapters of this book focus on the technical advances, advances in applied settings, and emerging topics in the testing field. Part 1 discusses methodological advances, Part 2 considers developments in applied settings, and Part 3 reviews emerging topics in the field of testing. Part 1 papers include: (1) "Advances in…
Factors influencing societal response of nanotechnology: an expert stakeholder analysis.
Gupta, Nidhi; Fischer, Arnout R H; van der Lans, Ivo A; Frewer, Lynn J
2012-05-01
Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an important role in how nanotechnology is developed and commercialised. This article aims to identify expert opinion on factors influencing societal response to applications of nanotechnology. Structured interviews with experts on nanotechnology from North West Europe were conducted using repertory grid methodology in conjunction with generalized Procrustes analysis to examine the psychological constructs underlying societal uptake of 15 key applications of nanotechnology drawn from different areas (e.g. medicine, agriculture and environment, chemical, food, military, sports, and cosmetics). Based on expert judgement, the main factors influencing societal response to different applications of nanotechnology will be the extent to which applications are perceived to be beneficial, useful, and necessary, and how 'real' and physically close to the end-user these applications are perceived to be by the public. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s11051-012-0857-x) contains supplementary material, which is available to authorized users.
Gálvez, Carmen
2016-12-01
Identifying research lines is essential to understand the knowledge structure of a scientific domain. The aim of this study was to identify the main research topics within the domain of public health in the Revista Española de Salud Pública during 2006-2015. Original articles included in the Social Sciences Citation Index (SSCI) database, available online through the Web of Science (WoS), were selected. The analysis units used were the keywords, KeyWords Plus (KW+), extracted automatically by SSCI. Bibliometric maps were created from the KW+ using a methodology based on the combination of co-word analysis, clustering techniques and visualization techniques. We analyzed 512 documents, from which 176 KW+ with a frequency greater than or equal to 3 were obtained. The results were bidimensional bibliometric maps with thematic groupings of KW+, representing the main research fronts: i) epidemiology, disease risk control programs and, in general, service organization and health policies; ii) infectious diseases, principally HIV; iii) a progressive increase in several interrelated lines on cardiovascular diseases (CVD); iv) a multidimensional line dedicated to different aspects associated with health-related quality of life (HRQoL); and v) an emerging line linked to binge drinking. Given the multidisciplinary and multidimensional nature of public health, the construction of bibliometric maps is an appropriate methodology for understanding the knowledge structure of this scientific domain.
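A small sketch of the co-word step described above: build a keyword co-occurrence matrix and cluster it hierarchically. The keyword sets, the distance transform and the number of clusters are invented for illustration and do not reproduce the study's procedure.

    import numpy as np
    from itertools import combinations
    from scipy.cluster.hierarchy import linkage, fcluster

    # Invented KeyWords Plus sets, one per article
    docs = [
        {"epidemiology", "surveillance", "health-policy"},
        {"hiv", "epidemiology", "prevention"},
        {"cardiovascular-disease", "risk-factors", "epidemiology"},
        {"quality-of-life", "cardiovascular-disease"},
        {"binge-drinking", "prevention", "quality-of-life"},
    ]
    terms = sorted(set().union(*docs))
    idx = {t: i for i, t in enumerate(terms)}

    # Symmetric keyword co-occurrence matrix
    C = np.zeros((len(terms), len(terms)))
    for d in docs:
        for a, b in combinations(sorted(d), 2):
            C[idx[a], idx[b]] += 1
            C[idx[b], idx[a]] += 1

    # Turn co-occurrence into a distance and cluster hierarchically
    D = 1.0 / (1.0 + C)
    condensed = D[np.triu_indices(len(terms), k=1)]
    groups = fcluster(linkage(condensed, method="average"), t=3, criterion="maxclust")
    print(dict(zip(terms, groups)))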
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
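For readers unfamiliar with p-charts, the sketch below computes the centre line and 3-sigma control limits used in this kind of analysis; the monthly adverse-event counts and caseloads are fabricated, not the study's data.

    import numpy as np

    def p_chart(events, n_cases):
        """Centre line and 3-sigma limits for monthly adverse-event proportions."""
        events, n_cases = np.asarray(events, float), np.asarray(n_cases, float)
        p = events / n_cases
        p_bar = events.sum() / n_cases.sum()
        sigma = np.sqrt(p_bar * (1.0 - p_bar) / n_cases)
        ucl = p_bar + 3.0 * sigma
        lcl = np.clip(p_bar - 3.0 * sigma, 0.0, None)
        return p, p_bar, lcl, ucl, (p > ucl) | (p < lcl)

    # Fabricated monthly counts of one adverse event and anesthetics performed
    events = [4, 6, 3, 9, 5, 2]
    cases = [900, 1100, 950, 1000, 1050, 980]
    p, centre, lcl, ucl, out_of_control = p_chart(events, cases)
    print(centre, out_of_control)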
A methodology for evacuation design for urban areas: theoretical aspects and experimentation
NASA Astrophysics Data System (ADS)
Russo, F.; Vitetta, A.
2009-04-01
This paper proposes a unifying approach for the simulation and design of a transportation system under conditions of incoming safety and/or security threats. Safety and security are concerned with threats generated by very different factors which, in turn, generate emergency conditions, such as the 9/11, Madrid and London attacks, the Asian tsunami, and Hurricane Katrina, considering just the last five years. In transportation systems, when exogenous events happen and there is a sufficient interval between the instant when the event happens and the instant when the event affects the population, it is possible to reduce the negative effects by evacuating the population. In every such case the evacuation can be prepared over the short and long term; for other events it is also possible to plan real-time evacuation within the general risk methodology. The development of models for emergency conditions in transportation systems has not received much attention in the literature. The main findings in this area are limited to only a few public research centres and private companies. In general, there is no systematic analysis of risk theory applied to transportation systems. Very often, in practice, vulnerability and exposure in the transportation system are considered similar variables or, in other worse cases, exposure variables are treated as vulnerability variables. Models and algorithms specified and calibrated in ordinary conditions cannot be directly applied in emergency conditions under the usual hypotheses considered. This paper is developed with the following main objectives: (a) to formalize the risk problem with a clear distinction, in terms of consequences, between the definitions of vulnerability and exposure in a transportation system, thus offering improvements over consolidated quantitative risk analysis models, especially transportation risk analysis models (risk assessment); (b) to formalize a system of models for evacuation simulation; (c) to calibrate and validate the system of models for evacuation simulation with a real experiment. In relation to the proposed objectives: (a) a general framework for risk analysis is reported in the first part, with specific methods and models to analyze urban transportation system performance in emergency conditions when exogenous phenomena occur and to specify the risk function; (b) a formulation of the general evacuation problem in the standard "what if" simulation context is specified in the second part, with reference to the model considered for the simulation of the transportation system in ordinary conditions; (c) the set of models specified in the second part is calibrated and validated with a real experiment in the third part. The experiment was carried out in the central business district of an Italian village, where about 1000 inhabitants were evacuated in order to construct a complete database. It required that socioeconomic information (population, number employed, public buildings, schools, etc.) and transport supply characteristics (infrastructures, etc.) be measured before and during the experiment. The real evacuation data were recorded with 30 video cameras for laboratory analysis.
The results are divided into six strictly connected tasks: Demand models; Supply and supply-demand interaction models for users; Simulation of refuge areas for users; Design of path choice models for emergency vehicles; Pedestrian outflow models in a building; Planning process and guidelines.
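A minimal sketch of the classical risk product that such frameworks build on, expressing risk for a single network element as event probability times vulnerability times exposure; the paper's actual risk function may be specified differently, and the numbers are purely illustrative.

    def link_risk(p_event, vulnerability, exposure):
        """Classical risk product R = P * V * E for a single network element:
        P = occurrence probability of the exogenous event,
        V = probability the element loses its function given the event (0-1),
        E = people (or value) present on the element when that happens."""
        return p_event * vulnerability * exposure

    # Purely illustrative numbers: a link carrying 400 evacuees with a 30%
    # chance of losing function, for an event with 1% annual probability
    print(link_risk(p_event=0.01, vulnerability=0.3, exposure=400))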
Critical appraisal of emergency medicine education research: the best publications of 2012.
Lin, Michelle; Fisher, Jonathan; Coates, Wendy C; Farrell, Susan E; Shayne, Philip; Maggio, Lauren; Kuhn, Gloria
2014-03-01
The objective was to critically appraise and highlight medical education research published in 2012 that was methodologically superior and whose outcomes were pertinent to teaching and education in emergency medicine (EM). A search of the English language literature in 2012 querying Education Resources Information Center (ERIC), PsychInfo, PubMed, and Scopus identified EM studies using hypothesis-testing or observational investigations of educational interventions. Two reviewers independently screened all of the publications and removed articles using established exclusion criteria. This year, publications limited to a single-site survey design that measured satisfaction or self-assessment on unvalidated instruments were not formally reviewed. Six reviewers then independently ranked all remaining publications using one of two scoring systems depending on whether the study methodology was primarily qualitative or quantitative. Each scoring system had nine criteria, including four related to methodology, that were chosen a priori, to standardize evaluation by reviewers. The quantitative study scoring system was used previously to appraise medical education published annually in 2008 through 2011, while a separate, new qualitative study scoring system was derived and implemented consisting of parallel metrics. Forty-eight medical education research papers met the a priori criteria for inclusion, and 33 (30 quantitative and three qualitative studies) were reviewed. Seven quantitative and two qualitative studies met the criteria for inclusion as exemplary and are summarized in this article. This critical appraisal series aims to promote superior education research by reviewing and highlighting nine of the 48 major education research studies with relevance to EM published in 2012. Current trends and common methodologic pitfalls in the 2012 papers are noted. © 2014 by the Society for Academic Emergency Medicine.
Evaluation of complex community-based childhood obesity prevention interventions.
Karacabeyli, D; Allender, S; Pinkney, S; Amed, S
2018-05-16
Multi-setting, multi-component community-based interventions have shown promise in preventing childhood obesity; however, evaluation of these complex interventions remains a challenge. The objective of the study is to systematically review published methodological approaches to outcome evaluation for multi-setting community-based childhood obesity prevention interventions and synthesize a set of pragmatic recommendations. MEDLINE, CINAHL and PsycINFO were searched from inception to 6 July 2017. Papers were included if the intervention targeted children ≤18 years, engaged at least two community sectors and described their outcome evaluation methodology. A single reviewer conducted title and abstract scans, full article review and data abstraction. Directed content analysis was performed by three reviewers to identify prevailing themes. Thirty-three studies were included, and of these, 26 employed a quasi-experimental design; the remaining were randomized control trials. Body mass index was the most commonly measured outcome, followed by health behaviour change and psychosocial outcomes. Six themes emerged, highlighting advantages and disadvantages of active vs. passive consent, quasi-experimental vs. randomized control trials, longitudinal vs. repeat cross-sectional designs and the roles of process evaluation and methodological flexibility in evaluating complex interventions. Selection of study designs and outcome measures compatible with community infrastructure, accompanied by process evaluation, may facilitate successful outcome evaluation. © 2018 World Obesity Federation.
Wild, Verina; Carina, Fourie; Frouzakis, Regula; Clarinval, Caroline; Fässler, Margrit; Elger, Bernice; Gächter, Thomas; Leu, Agnes; Spirig, Rebecca; Kleinknecht, Michael; Radovanovic, Dragana; Mouton Dorey, Corine; Burnand, Bernard; Vader, John-Paul; Januel, Jean-Marie; Biller-Andorno, Nikola; The IDoC Group
2015-01-01
The starting point of the interdisciplinary project "Assessing the impact of diagnosis related groups (DRGs) on patient care and professional practice" (IDoC) was the lack of a systematic ethical assessment for the introduction of cost containment measures in healthcare. Our aim was to contribute to the methodological and empirical basis of such an assessment. Five sub-groups conducted separate but related research within the fields of biomedical ethics, law, nursing sciences and health services, applying a number of complementary methodological approaches. The individual research projects were framed within an overall ethical matrix. Workshops and bilateral meetings were held to identify and elaborate joint research themes. Four common, ethically relevant themes emerged in the results of the studies across sub-groups: (1.) the quality and safety of patient care, (2.) the state of professional practice of physicians and nurses, (3.) changes in incentives structure, (4.) vulnerable groups and access to healthcare services. Furthermore, much-needed data for future comparative research has been collected and some early insights into the potential impact of DRGs are outlined. Based on the joint results we developed preliminary recommendations related to conceptual analysis, methodological refinement, monitoring and implementation.
Reducing hospital associated infection: a role for social marketing.
Conway, Tony; Langley, Sue
2013-01-01
Although hand hygiene is seen as the most important method to prevent the transmission of hospital associated infection in the UK, hand hygiene compliance rates appear to remain poor. This research aims to assess the degree to which social marketing methodology can be adopted by a particular organisation to promote hand hygiene compliance. The research design is based on a conceptual framework developed from analysis of the social marketing literature. Data collection involved taped interviews given by nursing staff working within a specific Hospital Directorate in Manchester, England. Supplementary data were obtained from archival records of hand hygiene compliance rates. Findings highlighted gaps in the Directorate's approach to the promotion of hand hygiene compared with what could be achieved using social marketing methodology. Respondents highlighted how the Directorate failed to fully optimise the resources required to endorse hand hygiene practice, and this resulted in poorer compliance. From the experiences and events documented, the study suggests how the emergent phenomena could be utilised by the Directorate to apply a social marketing approach that could positively influence hand hygiene compliance. The paper seeks to explore the use of social marketing in nursing to promote hand hygiene compliance and offers a conceptual framework that provides a way of measuring the strength of the impact that social marketing methodology could have.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and, ultimately, collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.
Nurse-Led Competency Model for Emergency Physicians: A Qualitative Study.
Daouk-Öyry, Lina; Mufarrij, Afif; Khalil, Maya; Sahakian, Tina; Saliba, Miriam; Jabbour, Rima; Hitti, Eveline
2017-09-01
To develop a competency model for emergency physicians from the perspective of nurses, juxtapose this model with the widely adopted Accreditation Council for Graduate Medical Education (ACGME) model, and identify competencies that might be unique to the nurses' perspective. The study relied on secondary data originally collected as part of nurses' assessment of emergency physicians' nonclinical skills in the emergency department (ED) of an academic medical center in the Middle East. Participants were 36 registered nurses who had worked in the ED for at least 2 years and had worked for at least 2 shifts per month with the physician being evaluated. Through content analysis, a nurse-led competency model was identified, including 8 core competencies encompassing 33 subcompetencies. The 8 core competencies were emotional intelligence; problem-solving and decisionmaking skills; operations management; patient focus; patient care, procedural skills, and medical knowledge; professionalism; communication skills; and team leadership and management. When the developed model was compared with the ACGME model, the 2 models diverged more than they converged. The nurses' perspective offered distinctive insight into the competencies needed for physicians in an emergency medicine environment, indicating the value of nurses' perspective and shedding light on the need for more systematic and more methodologically sound studies to examine the issue further. The differences between the models highlighted the competencies that were unique to the nurse perspective, and the similarities were indicative of the influence of different perspectives and organizational context on how competencies manifest. Copyright © 2016 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Abdallah, Mahmoud M. S.; Wegerif, Rupert B.
2014-01-01
This article discusses educational design-based research (DBR) as an emerging paradigm/methodology in educational enquiry that can be used as a mixed-method, problem-oriented research framework, and thus can act as an alternative to other traditional paradigms/methodologies prominent within the Egyptian context of educational enquiry. DBR is often…
Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.
Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha
2017-07-01
As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as well as pharmaceutical companies that seek to understand the impact of these value frameworks on each stakeholder, as they model the value and financial threshold of innovative, high-cost drugs.
Social media and outbreaks of emerging infectious diseases: A systematic review of literature.
Tang, Lu; Bie, Bijie; Park, Sung-Eun; Zhi, Degui
2018-04-05
The public often turn to social media for information during emerging infectious diseases (EIDs) outbreaks. This study identified the major approaches and assessed the rigors in published research articles on EIDs and social media. We searched 5 databases for published journal articles on EIDs and social media. We then evaluated these articles in terms of EIDs studied, social media examined, theoretical frameworks, methodologic approaches, and research findings. Thirty articles were included in the analysis (published between January 1, 2010, and March 1, 2016). EIDs that received most scholarly attention were H1N1 (or swine flu, n = 15), Ebola virus (n = 10), and H7N9 (or avian flu/bird flu, n = 2). Twitter was the most often studied social media (n = 17), followed by YouTube (n = 6), Facebook (n = 6), and blogs (n = 6). Three major approaches in this area of inquiry are identified: (1) assessment of the public's interest in and responses to EIDs, (2) examination of organizations' use of social media in communicating EIDs, and (3) evaluation of the accuracy of EID-related medical information on social media. Although academic studies of EID communication on social media are on the rise, they still suffer from a lack of theorization and a need for more methodologic rigor. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
McLaughlin, Danielle M; Cutts, Bethany B
2018-05-08
The expansion of unconventional sources of natural gas across the world has generated public controversy surrounding fracking drilling methods. Public debates continue to reverberate through policy domains despite very inconclusive biophysical evidence of net harm. As a consequence, there is a need to test the hypothesis that resistance to fracking is due to the way it redistributes economic and environmental risks. As in many other communities, opposition to fracking is common in central Westmoreland County, Pennsylvania (USA), but the rationale underpinning opposition is poorly understood. We test the prevailing assumption in the environmental management literature that fracking opposition is motivated by knowledge deficits and/or not-in-my-backyard (NIMBY) politics. This study uses Q methodology to examine emergent perspectives and sub-discourses within the fracking opposition debate in central Westmoreland County, PA. Q methodology offers a systematic and iterative use of both quantitative and qualitative research techniques to explore frequently overlooked marginal viewpoints that are critical to understanding the fracking problem. The analysis reveals four different narratives, or factors, amongst people actively involved in locally opposing fracking, labeled (1) Future Fears; (2) NIMBY; (3) Community Concerns; and (4) Distrust Stakeholders. The conflicts that emerge across these four factors are indicative of a deeper discourse within the fracking debate that signifies diversity in motivations, values, and convictions, and suggests the inadequacy of relying on knowledge deficit and/or NIMBY explanations of fracking politics.
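A bare-bones numeric sketch of the factor-extraction step in Q methodology, assuming invented Q-sort data; real Q analyses (presumably including this one) add rotation and careful flagging of defining sorts, which this principal-components shortcut omits.

    import numpy as np

    def q_factor_loadings(q_sorts, n_factors=4):
        """Principal components of the person-by-person correlation matrix.
        q_sorts: array of shape (n_statements, n_participants)."""
        R = np.corrcoef(q_sorts, rowvar=False)        # correlations between sorters
        eigvals, eigvecs = np.linalg.eigh(R)
        top = np.argsort(eigvals)[::-1][:n_factors]
        loadings = eigvecs[:, top] * np.sqrt(eigvals[top])
        assignment = np.abs(loadings).argmax(axis=1)  # dominant factor per person
        return loadings, assignment

    rng = np.random.default_rng(0)
    sorts = rng.integers(-4, 5, size=(40, 24))        # 40 statements, 24 participants
    loadings, groups = q_factor_loadings(sorts, n_factors=4)
    print(groups)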
Ocké, Marga C
2013-05-01
This paper aims to describe different approaches for studying the overall diet with advantages and limitations. Studies of the overall diet have emerged because the relationship between dietary intake and health is very complex with all kinds of interactions. These cannot be captured well by studying single dietary components. Three main approaches to study the overall diet can be distinguished. The first method is researcher-defined scores or indices of diet quality. These are usually based on guidelines for a healthy diet or on diets known to be healthy. The second approach, using principal component or cluster analysis, is driven by the underlying dietary data. In principal component analysis, scales are derived based on the underlying relationships between food groups, whereas in cluster analysis, subgroups of the population are created with people that cluster together based on their dietary intake. A third approach includes methods that are driven by a combination of biological pathways and the underlying dietary data. Reduced rank regression defines linear combinations of food intakes that maximally explain nutrient intakes or intermediate markers of disease. Decision tree analysis identifies subgroups of a population whose members share dietary characteristics that influence (intermediate markers of) disease. It is concluded that all approaches have advantages and limitations and essentially answer different questions. The third approach is still more in an exploration phase, but seems to have great potential with complementary value. More insight into the utility of conducting studies on the overall diet can be gained if more attention is given to methodological issues.
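A short sketch of the second, data-driven approach described above (principal component analysis of food-group intakes); the food groups and intake values are invented, and standardization plus two retained components are illustrative choices.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Invented food-group intakes (g/day): 200 participants x 6 food groups
    rng = np.random.default_rng(42)
    food_groups = ["vegetables", "fruit", "red_meat", "fish", "whole_grains", "sweets"]
    intakes = rng.gamma(shape=2.0, scale=50.0, size=(200, len(food_groups)))

    # Data-driven dietary patterns: principal components of standardized intakes
    Z = StandardScaler().fit_transform(intakes)
    pca = PCA(n_components=2)
    pattern_scores = pca.fit_transform(Z)   # each person's score on each pattern
    pattern_loadings = pca.components_      # which food groups define each pattern
    print(dict(zip(food_groups, pattern_loadings[0].round(2))))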
Boller, Manuel; Fletcher, Daniel J
2012-06-01
To describe the methodology used by the Reassessment Campaign on Veterinary Resuscitation (RECOVER) to evaluate the scientific evidence relevant to small animal CPR and to compose consensus-based clinical CPR guidelines for dogs and cats. This report is part of a series of 7 articles on the RECOVER evidence and knowledge gap analysis and consensus-based small animal CPR guidelines. It describes the organizational structure of RECOVER, the evaluation process employed, consisting of standardized literature searches, the analysis of relevant articles according to study design, species and predefined quality markers, and the drafting of clinical CPR guidelines based on these data. Therefore, this article serves as the methodology section for the subsequent 6 RECOVER articles. Academia, referral practice. RECOVER is a collaborative initiative that systematically evaluated the evidence on 74 topics relevant to small animal CPR and generated 101 clinical CPR guidelines from this analysis. All primary contributors were veterinary specialists, approximately evenly split between academic institutions and private referral practices. The evidence evaluation and guideline drafting processes were conducted according to a predefined sequence of steps designed to reduce bias and increase the repeatability of the findings, including multiple levels of review, culminating in a consensus process. Many knowledge gaps were identified that will allow prioritization of research efforts in veterinary CPR. Collaborative systematic evidence review is organizationally challenging but feasible and effective in veterinary medicine. More experience is needed to refine the process. © Veterinary Emergency and Critical Care Society 2012.
NASA Technical Reports Server (NTRS)
Sanchez, Merri J.
2000-01-01
This project aimed to develop a methodology for evaluating the performance and acceptability characteristics of pressurized crew module volume suitability for zero-gravity (zero-g) ingress of a spacecraft, and to evaluate the operational acceptability of the NASA crew return vehicle (CRV) for zero-g ingress of the astronaut crew, volume for crew tasks, and general crew module and seat layout. No standard or methodology has been established for evaluating volume acceptability in human spaceflight vehicles. Volume affects astronauts' ability to ingress and egress the vehicle, and to maneuver in and perform critical operational tasks inside the vehicle. Much research has been conducted on aircraft ingress, egress, and rescue in order to establish military and civil aircraft standards. However, due to the extremely limited number of human-rated spacecraft, this topic has been unaddressed. The NASA CRV was used for this study. The prototype vehicle can return a 7-member crew from the International Space Station in an emergency. The vehicle's internal arrangement must be designed to facilitate rapid zero-g ingress, zero-g maneuverability, ease of one-g egress and rescue, and ease of operational tasks in multiple acceleration environments. A full-scale crew module mockup was built and outfitted with representative adjustable seats, crew equipment, and a volumetrically equivalent hatch. Human factors testing was conducted in three acceleration environments using ground-based facilities and the KC-135 aircraft. Performance and acceptability measurements were collected. Data analysis was conducted using analysis of variance and nonparametric techniques.
Improved Vehicle Occupancy Data Collection Methods
DOT National Transportation Integrated Search
1997-04-14
This report evaluates current and emerging vehicle occupancy data collection methodologies. Five primary methods for collecting vehicle occupancy data were identified: the traditional roadside/windshield observation method, a recently developed...
Support-vector-based emergent self-organising approach for emotional understanding
NASA Astrophysics Data System (ADS)
Nguwi, Yok-Yen; Cho, Siu-Yeung
2010-12-01
This study discusses the computational analysis of general emotion understanding based on the questionnaire methodology. The questionnaire method approaches the subject by investigating the real experiences that accompany the emotions, whereas laboratory approaches are generally associated with exaggerated elements. We adopted a connectionist model called the support-vector-based emergent self-organising map (SVESOM) to analyse emotion profiling from the questionnaire method. The SVESOM first identifies the important variables by giving discriminative features a high ranking. The classifier then performs the classification based on the selected features. Experimental results show that the top-ranked features are in line with the work of Scherer and Wallbott [(1994), 'Evidence for Universality and Cultural Variation of Differential Emotion Response Patterning', Journal of Personality and Social Psychology, 66, 310-328], which approached the emotions physiologically. The performance measures show that using the full feature set can degrade classification performance, whereas the selected features provide superior results in terms of accuracy and generalisation.
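As an illustration of the select-then-classify pipeline described above (not the authors' SVESOM itself), the following sketch ranks features and then trains a support vector classifier on the top-ranked subset; the synthetic data set and the choice of an F-test ranking are assumptions made for the example.

```python
# Minimal sketch (not the authors' SVESOM): rank features, then classify on the
# top-ranked subset, mirroring the "select first, classify second" pipeline.
# The synthetic data, feature count k, and the F-test ranking are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=40, n_informative=8, random_state=0)

def score_with_k_features(k):
    """Cross-validated accuracy using only the k top-ranked features."""
    pipe = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=k), SVC(kernel="rbf"))
    return cross_val_score(pipe, X, y, cv=5).mean()

print("all features  :", round(score_with_k_features(40), 3))
print("top 8 features:", round(score_with_k_features(8), 3))
```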
[Measurement of the importance of user satisfaction dimensions in healthcare provision].
Murillo, Carles; Saurina, Carme
2013-01-01
Identifying users' perceptions of the quality of care is essential to improve health services delivery. The main objective of this article was to describe the application of a methodology to identify factors that facilitate the identification of areas for improvement. A questionnaire was applied in three health areas in Catalonia (Spain) (primary care [n=332], outpatient specialty care [n=410] and hospital emergency care [n=413]) to measure user satisfaction and assess the importance given to the aspects analyzed. The main areas for improvement in primary care identified by an importance-performance analysis involved the time devoted to patients as well as health professionals' willingness to listen to their views. In hospital emergency care, the main area of improvement was related to the hospital's physical conditions. The tools designed and implemented by the Catalan Health Service (Spain) have proved to be valid for the detection of priority areas to improve service delivery and promote regional equity. Copyright © 2012 SESPAS. Published by Elsevier Espana. All rights reserved.
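The importance-performance analysis mentioned above can be illustrated with a minimal sketch: items rated as highly important but scoring below average on performance are flagged as improvement priorities. The item names and scores below are invented placeholders, not the study's survey data.

```python
# Illustrative importance-performance analysis (IPA): items whose importance is
# above average but whose performance score is below average are flagged as
# priorities for improvement. Item names and scores are invented for the sketch.
import pandas as pd

items = pd.DataFrame(
    {"importance":  [4.6, 4.2, 3.8, 4.5],
     "performance": [3.1, 4.0, 3.9, 3.4]},
    index=["time devoted to patients", "willingness to listen",
           "signposting", "physical conditions"],
)

imp_mean, perf_mean = items["importance"].mean(), items["performance"].mean()
items["priority"] = (items["importance"] > imp_mean) & (items["performance"] < perf_mean)
print(items.sort_values("priority", ascending=False))
```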
Systems Biology Approaches for Discovering Biomarkers for Traumatic Brain Injury
Feala, Jacob D.; AbdulHameed, Mohamed Diwan M.; Yu, Chenggang; Dutta, Bhaskar; Yu, Xueping; Schmid, Kara; Dave, Jitendra; Tortella, Frank
2013-01-01
The rate of traumatic brain injury (TBI) in service members with wartime injuries has risen rapidly in recent years, and complex, variable links have emerged between TBI and long-term neurological disorders. The multifactorial nature of TBI secondary cellular response has confounded attempts to find cellular biomarkers for its diagnosis and prognosis or for guiding therapy for brain injury. One possibility is to apply emerging systems biology strategies to holistically probe and analyze the complex interweaving molecular pathways and networks that mediate the secondary cellular response through computational models that integrate these diverse data sets. Here, we review available systems biology strategies, databases, and tools. In addition, we describe opportunities for applying this methodology to existing TBI data sets to identify new biomarker candidates and gain insights about the underlying molecular mechanisms of TBI response. As an exemplar, we apply network and pathway analysis to a manually compiled list of 32 protein biomarker candidates from the literature, recover known TBI-related mechanisms, and generate hypothetical new biomarker candidates. PMID:23510232
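As a hedged illustration of the network-analysis step (not the authors' actual pipeline or data), the sketch below ranks candidate biomarker proteins by betweenness centrality in a toy interaction graph; the listed proteins are well-known TBI biomarker candidates, but the edges are placeholders rather than a curated interaction set.

```python
# Illustrative sketch: given a list of candidate biomarker proteins and a set of
# assumed interactions among them, rank candidates by network centrality.
# The edges below are placeholders, not a curated TBI interaction network.
import networkx as nx

candidates = ["GFAP", "S100B", "UCHL1", "NSE", "MBP"]
interactions = [("GFAP", "S100B"), ("S100B", "NSE"), ("UCHL1", "NSE"), ("GFAP", "MBP")]

g = nx.Graph()
g.add_nodes_from(candidates)
g.add_edges_from(interactions)

centrality = nx.betweenness_centrality(g)          # proteins bridging many paths rank highly
for protein, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{protein:6s} betweenness={score:.2f}")
```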
NASA Astrophysics Data System (ADS)
Drewes, Andrea; Henderson, Joseph; Mouza, Chrystalla
2018-01-01
Climate change is one of the most pressing challenges facing society, and climate change educational models are emerging in response. This study investigates the implementation and enactment of a climate change professional development (PD) model for science educators and its impact on student learning. Using an intrinsic case study methodology, we focused analytic attention on how one teacher made particular pedagogical and content decisions, and the implications for students' conceptual learning. Using anthropological theories of conceptual travel, we traced salient ideas through instructional delivery and into student reasoning. Analysis showed that students gained an increased understanding of the enhanced greenhouse effect and the implications of human activity for this enhanced effect, at statistically significant levels and with moderate effect sizes. However, students demonstrated a limited, non-significant gain on the likely effects of climate change. Student reasoning on tangible actions to deal with these problems also remained underdeveloped, reflecting omissions in both the PD and the teacher's enactment. We discuss implications for the emerging field of climate change education.
Wong, Rene; Breiner, Petra; Mylopoulos, Maria
2014-09-01
This article reports on research into the relationships that emerged between hospital-based and community-based interprofessional diabetes programs involved in inter-agency care. Using constructivist grounded theory methodology we interviewed a purposive theoretical sample of 21 clinicians and administrators from both types of programs. Emergent themes were identified through a process of constant comparative analysis. Initial boundaries were constructed based on contrasts in beliefs, practices and expertise. In response to bureaucratic and social pressures, boundaries were redefined in a way that created role uncertainty and disempowered community programs, ultimately preventing collaboration. We illustrate the dynamic and multi-dimensional nature of social and symbolic boundaries in inter-agency diabetes care and the tacit ways in which hospitals can maintain a power position at the expense of other actors in the field. As efforts continue in Canada and elsewhere to move knowledge and resources into community sectors, we highlight the importance of hospitals seeing beyond their own interests and adopting more altruistic models of inter-agency integration.
Forbes, A; Wainwright, S P
2001-09-01
The integration of survey data with psycho-social theories is an important and emerging theme within the field of health inequalities research. This paper critically examines this approach, arguing that the respective models of health inequality which these approaches promote, and the related concepts of 'social cohesion' and 'social capital', suffer from serious methodological, theoretical and philosophical flaws. The critique draws particular attention to the limitations of survey-derived data and the dangers of using such data to develop complex social explanations for health inequalities. The paper discusses wider epistemological issues which emerge from the critique, addressing the fundamental but neglected question of 'what is inequality?'. The paper concludes by introducing a structure for questions regarding health inequalities, emphasising the need for those questions to be attached to real communities.
Curran, Patrick J.; Hussong, Andrea M.; Cai, Li; Huang, Wenjing; Chassin, Laurie; Sher, Kenneth J.; Zucker, Robert A.
2010-01-01
There are a number of significant challenges encountered when studying development over an extended period of time, including subject attrition, changing measurement structures across groups and developmental periods, and the need to invest substantial time and money. Integrative data analysis is an emerging set of methodologies that overcomes many of the challenges of single-sample designs through the pooling of data drawn from multiple existing developmental studies. This approach is characterized by a host of advantages, but it also introduces several new complexities that must be addressed prior to broad adoption by developmental researchers. In this paper we focus on methods for fitting measurement models and creating scale scores using data drawn from multiple longitudinal studies. We present findings from the analysis of repeated measures of internalizing symptomatology that were pooled from three existing developmental studies. We describe and demonstrate each step in the analysis and we conclude with a discussion of potential limitations and directions for future research. PMID:18331129
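A minimal sketch of the pooling step only, assuming the studies already share comparably scaled items: records from three hypothetical studies are stacked, tagged by source, and turned into a simple standardized scale score. The actual paper fits formal measurement models rather than a mean score, and the item columns and values here are invented.

```python
# Minimal pooling sketch (the paper fits formal measurement models; a simple
# standardized mean score stands in here). Item columns and values are invented,
# and the three DataFrames play the role of three developmental studies.
import pandas as pd

study_a = pd.DataFrame({"item1": [1, 2, 3], "item2": [2, 2, 4], "item3": [0, 1, 3]})
study_b = pd.DataFrame({"item1": [0, 1, 1], "item2": [1, 0, 2], "item3": [1, 1, 2]})
study_c = pd.DataFrame({"item1": [3, 2, 4], "item2": [4, 3, 4], "item3": [2, 3, 4]})

frames = []
for name, df in {"A": study_a, "B": study_b, "C": study_c}.items():
    df = df.copy()
    df["study"] = name                     # keep track of which study each record came from
    frames.append(df)

pooled = pd.concat(frames, ignore_index=True)
raw = pooled[["item1", "item2", "item3"]].mean(axis=1)        # simple mean score
pooled["internalizing_z"] = (raw - raw.mean()) / raw.std()    # standardized scale score
print(pooled.groupby("study")["internalizing_z"].mean().round(2))
```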
Beyond 'flood hotspots': Modelling emergency service accessibility during flooding in York, UK
NASA Astrophysics Data System (ADS)
Coles, Daniel; Yu, Dapeng; Wilby, Robert L.; Green, Daniel; Herring, Zara
2017-03-01
This paper describes the development of a method that couples flood modelling with network analysis to evaluate the accessibility of city districts by emergency responders during flood events. We integrate numerical modelling of flood inundation with geographical analysis of service areas for the Ambulance Service and the Fire & Rescue Service. The method was demonstrated for two flood events in the City of York, UK to assess the vulnerability of care homes and sheltered accommodation. We determine the feasibility of emergency services gaining access within the statutory 8- and 10-min targets for high-priority, life-threatening incidents 75% of the time, during flood episodes. A hydrodynamic flood inundation model (FloodMap) simulates the 2014 pluvial and 2015 fluvial flood events. Predicted floods (with depth >25 cm and areas >100 m2) were overlain on the road network to identify sites with potentially restricted access. Accessibility of the city to emergency responders during flooding was quantified and mapped using: (i) spatial coverage from individual emergency nodes within the legislated timeframes; and (ii) response times from individual emergency service nodes to vulnerable care homes and sheltered accommodation under flood and non-flood conditions. Results show that, during the 2015 fluvial flood, the area covered by two of the three Fire & Rescue Service stations reduced by 14% and 39% respectively, while the remaining station needed to increase its coverage by 39%. This amounts to an overall reduction of 6% and 20% for modelled and observed floods respectively. During the 2014 surface water flood, 7 out of 22 care homes (32%) and 15 out of 43 sheltered accommodation nodes (35%) had modelled response times above the 8-min threshold from any ambulance station. Overall, modelled surface water flooding has a larger spatial footprint than fluvial flood events. Hence, accessibility of emergency services may be impacted differently depending on flood mechanism. Moreover, we expect emergency services to face greater challenges under a changing climate with a growing, more vulnerable population. The methodology developed in this study could be applied to other cities, as well as for scenario-based evaluation of emergency preparedness to support strategic decision making, and in real-time forecasting to guide operational decisions where heavy rainfall lead time and spatial resolution are sufficient.
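A hedged sketch of the coupling idea described above: predicted flood depths disable road links, and network shortest paths then test whether each vulnerable site is still reachable within the 8-minute target. The graph, depths, and travel times are toy values, not the York network or FloodMap output.

```python
# Sketch of the coupling step: remove flooded road links (depth > 0.25 m) from a
# network, then test whether each care home can still be reached from a station
# within the 8-minute target. Graph, depths, and travel times are toy values.
import networkx as nx

roads = nx.Graph()
roads.add_edge("station", "A", travel_min=3.0)
roads.add_edge("A", "care_home_1", travel_min=4.0)
roads.add_edge("A", "B", travel_min=2.0)
roads.add_edge("B", "care_home_2", travel_min=5.0)

flood_depth_m = {("A", "B"): 0.40}   # toy stand-in for predicted depths on each link

flooded = roads.copy()
flooded.remove_edges_from(e for e, d in flood_depth_m.items() if d > 0.25)

for home in ["care_home_1", "care_home_2"]:
    try:
        t = nx.shortest_path_length(flooded, "station", home, weight="travel_min")
        print(home, "reached in", t, "min", "(within target)" if t <= 8 else "(misses 8-min target)")
    except nx.NetworkXNoPath:
        print(home, "unreachable during flood")
```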
Environment, genes, and experience: lessons from behavior genetics.
Barsky, Philipp I
2010-11-01
The article reviews the theoretical analysis of the problems inherent in studying the environment within behavior genetics across several periods in the development of environmental studies in behavior genetics and proposes some possible alternatives to traditional approaches to studying the environment in behavior genetics. The first period (from the end of the 1920s to the end of the 1970s), when the environment was not actually studied, is called pre-environmental; during this time, the basic principles and theoretical models of understanding environmental effects in behavior genetics were developed. The second period is characterized by the development of studies on environmental influences within the traditional behavior genetics paradigm; several approaches to studying the environment emerged in behavior genetics during this period, from the beginning of the 1980s until today. At the present time, the field is undergoing paradigmatic changes, concerned with methodology, theory, and mathematical models of genotype-environment interplay; this might be the beginning of a third period of development of environmental studies in behavior genetics. In another part, the methodological problems related to environmental studies in behavior genetics are discussed. Although the methodology used in differential psychology is applicable for assessment of differences between individuals, it is insufficient to explain the sources of these differences. In addition, we stress that psychoanalytic studies of twins and their experiences, initiated in the 1930s and continued episodically until the 1980s, could bring an interesting methodology and contribute to the explanation of puzzling findings from environmental studies of behavior genetics. Finally, we will conclude with implications from the results of environmental studies in behavior genetics, including methodological issues. Copyright © 2010 Elsevier Ltd. All rights reserved.
2016-06-01
characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA...rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira
Correlation analysis on real-time tab-delimited network monitoring data
Pan, Aditya; Majumdar, Jahin; Bansal, Abhay; ...
2016-01-01
End-to-end performance monitoring in the Internet, also called PingER, is part of SLAC National Accelerator Laboratory's research project. It was created to answer the growing need to monitor the network, both to analyze current performance and to designate resources to optimize execution between research centers and the universities and institutes co-operating on present and future operations. The monitoring support reflects the broad geographical area of the collaborations and requires a comprehensive number of research and financial channels. The data retrieval architecture and the interpretation methodology have emerged over numerous years. Analyzing these data is the main challenge due to their high volume. Finally, by using correlation analysis, we can draw crucial conclusions about how the network data affect the performance of the hosts and how this varies from country to country.
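A minimal sketch of the correlation step on tab-delimited monitoring records; the column names and values are illustrative, not the actual PingER schema.

```python
# Minimal correlation sketch on tab-delimited monitoring records.
# Column names and values are illustrative assumptions, not the PingER schema.
import io
import pandas as pd

tsv = io.StringIO(
    "rtt_ms\tloss_pct\tthroughput_mbps\n"
    "120\t0.5\t8.1\n"
    "300\t2.0\t2.4\n"
    "90\t0.1\t9.5\n"
    "210\t1.2\t4.0\n"
)
df = pd.read_csv(tsv, sep="\t")                      # the same call would read a real .tsv file

corr = df.corr(method="pearson")                     # pairwise Pearson correlations
print(corr.round(2))
print(corr["rtt_ms"].drop("rtt_ms").sort_values())   # metrics most (anti)correlated with RTT
```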
Reflections on Plant and Soil Nematode Ecology: Past, Present and Future
Ferris, Howard; Griffiths, Bryan S.; Porazinska, Dorota L.; Powers, Thomas O.; Wang, Koon-Hui; Tenuta, Mario
2012-01-01
The purpose of this review is to highlight key developments in nematode ecology from its beginnings to where it stands today as a discipline within nematology. Emerging areas of research appear to be driven by crop production constraints, environmental health concerns, and advances in technology. In contrast to past ecological studies, which mainly focused on management of plant-parasitic nematodes, current studies reflect the differential sensitivity of nematode faunae. These differences, identified in both aquatic and terrestrial environments, include responses to stressors, environmental conditions, and management practices. Methodological advances will continue to influence the role nematodes have in addressing the nature of interactions between organisms, and of organisms with their environments. In particular, the C. elegans genetic model, nematode faunal analysis and nematode metagenetic analysis can be used by ecologists generally and are not restricted to nematologists. PMID:23482864
Value-centric design architecture based on analysis of space system characteristics
NASA Astrophysics Data System (ADS)
Xu, Q.; Hollingsworth, P.; Smith, K.
2018-03-01
Emerging design concepts such as miniaturisation, modularity, and standardisation have contributed to the rapid development of small and inexpensive platforms, particularly cubesats. This is stimulating a revolution in space design and development, leading satellites into the era of "smaller, faster, and cheaper". However, the current requirement-centric design philosophy, focused on bespoke monolithic systems, along with the associated development and production process, does not inherently fit with innovative modular, standardised, and mass-produced technologies. This paper presents a new categorisation, characterisation, and value-centric design architecture to address this need for both traditional and novel system designs. Based on the categorisation of system configurations, a characterisation of space systems, comprising duplication, fractionation, and derivation, is proposed to capture the overall system configuration characteristics and promote potential hybrid designs. Complying with the definitions of the system characterisation, mathematical mapping relations between the system characterisation and the system properties are described to establish the mathematical foundation of the proposed value-centric design methodology. To illustrate the methodology, subsystem reliability relationships are analysed to explore potential system configurations in the design space. The results of applying the system characteristic analysis clearly show that the effects of different configuration characteristics on the system properties can be effectively analysed and evaluated, enabling the optimization of system configurations.
Liu, Yi; Chen, Jining; He, Weiqi; Tong, Qingyuan; Li, Wangfeng
2010-04-15
Urban planning has been widely applied as a regulatory measure to guide a city's construction and management. It represents official expectations on future population and economic growth and land use over the urban area. Significant variations nevertheless often occur between planning schemes and actual development, in particular in China, the world's largest developing country, which is experiencing rapid urbanization and industrialization. This in turn leads to difficulty in estimating the environmental consequences of the urban plan. Aiming to quantitatively analyze the uncertain environmental impacts of the urban plan's implementation, this article developed an integrated methodology combining a scenario analysis approach and a stochastic simulation technique for strategic environmental assessment (SEA). Based on industrial development scenarios, Monte Carlo sampling is applied to generate all possibilities for the spatial distribution of newly established industries. All related environmental consequences can be further estimated given the industrial distributions as input to environmental quality models. By applying an HSY algorithm, environmentally unacceptable urban growth, regarding both economic development and land use spatial layout, can be systematically identified, providing valuable information to urban planners and decision makers. A case study in Dalian Municipality, Northeast China, is used to illustrate the applicability of this methodology. The impacts of the Urban Development Plan for Dalian Municipality (2003-2020) (UDP) on the atmospheric environment are also discussed in this article.
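The Monte Carlo scenario generation and HSY-style screening can be illustrated with a toy sketch: random spatial allocations of new industrial output are sampled, a simple emission total is computed for each, and layouts are split into acceptable and unacceptable sets. Zone capacities, emission factors, and the acceptability threshold are invented for the example.

```python
# Sketch of the scenario-screening idea (not the authors' model): Monte Carlo
# sampling of where new industrial output lands across zones, a toy emission
# calculation per sample, and an HSY-style split into acceptable vs. unacceptable
# layouts. Zone data, emission factors, and the threshold are invented.
import numpy as np

rng = np.random.default_rng(0)
n_zones, n_samples = 5, 10_000
emission_factor = np.array([1.0, 0.8, 1.5, 0.6, 1.2])   # t pollutant per unit output, per zone
capacity = np.array([40, 60, 30, 80, 50])               # upper bound on output per zone
total_new_output = 150.0
threshold = 160.0                                        # acceptable total emission

# Randomly split the planned output across zones, then enforce zone capacities.
shares = rng.dirichlet(np.ones(n_zones), size=n_samples) * total_new_output
feasible = (shares <= capacity).all(axis=1)
emissions = shares @ emission_factor

acceptable = feasible & (emissions <= threshold)
print(f"{acceptable.mean():.1%} of sampled layouts are environmentally acceptable")
```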
Sociological theory and Jungian psychology.
Walker, Gavin
2012-01-01
[Keywords: disenchantment, Carl Jung, psychoanalysis, sociology, Max Weber] In this article I seek to relate the psychology of Carl Jung to sociological theory, specifically Weber. I first present an outline of Jungian psychology. I then seek to relate this as psychology to Weber’s interpretivism. I point to basic methodological compatibilities within a Kantian frame, from which emerge central concerns with the factors limiting rationality. These generate the conceptual frameworks for parallel enquiries into the development and fate of rationality in cultural history. Religion is a major theme here: contrasts of eastern and western religion; the rise of prophetic religion and the disenchantment of modernity. Weber’s categories ‘ascetic’ and ‘mystic’ seem applicable to his own and Jung’s approaches and indeed temperaments, while a shared ironic view of rationality leads to similar visions of the disenchanted modern world. I conclude that Jung is sociologically coherent, but in an entirely different sense from Freud: rather than a constellation of family, socialization, ideology, social continuity, there is an analysis of cultural history against a background of adult normal psychology. I conclude that sociology should acknowledge Jung, but not in terms of over-arching theory. Rather Jungian insights might be used to orient new enquiries, and for reflexive analysis of sociology’s methodological debates.
Emerging Tools for Synthetic Genome Design
Lee, Bo-Rahm; Cho, Suhyung; Song, Yoseb; Kim, Sun Chang; Cho, Byung-Kwan
2013-01-01
Synthetic biology is an emerging discipline for designing and synthesizing predictable, measurable, controllable, and transformable biological systems. These newly designed biological systems have great potential for the development of cheaper drugs, green fuels, biodegradable plastics, and targeted cancer therapies over the coming years. Fortunately, our ability to quickly and accurately engineer biological systems that behave predictably has been dramatically expanded by significant advances in DNA-sequencing, DNA-synthesis, and DNA-editing technologies. Here, we review emerging technologies and methodologies in the field of building designed biological systems, and we discuss their future perspectives. PMID:23708771
Optimized maritime emergency resource allocation under dynamic demand.
Zhang, Wenfen; Yan, Xinping; Yang, Jiaqi
2017-01-01
Emergency resources are important for evacuating people and rescuing property when an accident occurs. The relief effort can be improved by preparing a reasonable emergency resource allocation schedule in advance. As the marine environment is complicated and changeable, the place, type, and severity of a maritime accident are uncertain and stochastic, giving rise to dynamic demand for emergency resources. Considering dynamic demand, making a reasonable emergency resource allocation schedule is challenging. The key problem is to determine the optimal stock of emergency resources at supplier centers to improve relief efforts. This paper studies this dynamic demand, which is defined as a set. A maritime emergency resource allocation model with uncertain data is then presented. Afterwards, a robust approach is developed and used to ensure that the resource allocation schedule performs well under dynamic demand. Finally, a case study shows that the proposed methodology is feasible for maritime emergency resource allocation. The findings could help emergency managers schedule emergency resource allocation more flexibly in terms of dynamic demand.
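A hedged sketch of the robust-allocation idea, assuming a simple transportation-style model sized against the worst case of a small demand set; the times, stocks, and demand scenarios are toy numbers, and the paper's actual formulation is richer.

```python
# Robust-allocation sketch: size the allocation against the worst case (the
# maximum of each demand point over the scenario set) so the schedule stays
# feasible for every realisation in the set. All numbers are toy values.
import numpy as np
from scipy.optimize import linprog

time = np.array([[2.0, 5.0, 4.0],      # response time (h), supplier i -> demand point j
                 [6.0, 1.5, 3.0]])
stock = np.array([120.0, 100.0])       # units held at each supply centre
demand_set = np.array([[40, 50, 30],   # possible demand realisations (dynamic demand)
                       [60, 30, 45],
                       [55, 55, 20]])
worst_demand = demand_set.max(axis=0)  # robust (worst-case) requirement per point

m, n = time.shape
c = time.ravel()                                        # minimise total time-weighted shipments
A_supply = np.kron(np.eye(m), np.ones(n))               # sum_j x_ij <= stock_i
A_demand = -np.kron(np.ones(m), np.eye(n))              # sum_i x_ij >= worst_demand_j
res = linprog(c, A_ub=np.vstack([A_supply, A_demand]),
              b_ub=np.concatenate([stock, -worst_demand]), bounds=(0, None))
print(res.x.reshape(m, n).round(1))
print("total time-weighted effort:", round(res.fun, 1))
```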
Boyle, Adrian; Abel, Gary; Raut, Pramin; Austin, Richard; Dhakshinamoorthy, Vijayasankar; Ayyamuthu, Ravi; Murdoch, Iona; Burton, Joel
2016-05-01
There is uncertainty about the best way to measure emergency department crowding. We have previously developed a consensus-based measure of crowding, the International Crowding Measure in Emergency Departments (ICMED). We aimed to obtain pilot data to evaluate the ability of a shortened form of the ICMED, the sICMED, to predict senior emergency department clinicians' concerns about crowding and danger compared with a very well-studied measure of emergency department crowding, the National Emergency Department Overcrowding Score (NEDOCS). We collected real-time observations of the sICMED and NEDOCS and compared these with clinicians' perceptions of crowding and danger on a visual analogue scale. Data were collected in four emergency departments in the East of England. Associations were explored using simple regression, random intercept models and models accounting for correlation between adjacent time points. We conducted 82 h of observation in 10 observation sets. Naive modelling suggested strong associations between sICMED and NEDOCS and clinician perceptions of crowding and danger. Further modelling showed that, due to clustering, the association between sICMED and danger persisted, but the association between these two measures and perception of crowding was no longer statistically significant. Both sICMED and NEDOCS can be collected easily in a variety of English hospitals. Further studies are required but initial results suggest both scores may have potential use for assessing crowding variation at long timescales, but are less sensitive to hour-by-hour variation. Correlation in time is an important methodological consideration which, if ignored, may lead to erroneous conclusions. Future studies should account for such correlation in both design and analysis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
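The methodological point about correlation in time can be illustrated with a small simulation: a naive OLS regression treats all observations as independent, while a random-intercept model groups repeated observations from the same observation set. The variable names and simulated data are assumptions for illustration, not the study's data.

```python
# Sketch of the clustering point: compare a naive OLS fit with a random-intercept
# (mixed) model that accounts for repeated observations within the same set.
# Simulated data and variable names are illustrative assumptions only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_sets, n_per_set = 10, 20
site_effect = rng.normal(0, 2.0, n_sets)                 # shared level within each observation set
rows = []
for s in range(n_sets):
    score = rng.uniform(0, 10, n_per_set)                # crowding score (sICMED-like, toy)
    danger = 1.0 * score + site_effect[s] + rng.normal(0, 1.5, n_per_set)
    rows += [{"set": s, "score": x, "danger": y} for x, y in zip(score, danger)]
df = pd.DataFrame(rows)

naive = smf.ols("danger ~ score", df).fit()
mixed = smf.mixedlm("danger ~ score", df, groups=df["set"]).fit()
print("naive OLS slope  :", round(naive.params["score"], 2))
print("random-intercept :", round(mixed.params["score"], 2), "(standard errors differ most)")
```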
76 FR 53414 - Pacific Fishery Management Council; Public Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
... Trailing Actions 7. Consider Inseason Adjustments--Part I 8. Emerging Issues Under Trawl Rationalization... Run Chinook Management Issues 2. 2011 Methodology Review I. Pacific Halibut Management 1. 2012 Pacific...
Alemayehu, Demissie; Cappelleri, Joseph C
2012-07-01
Patient-reported outcomes (PROs) can play an important role in personalized medicine. PROs can be viewed as an important fundamental tool to measure the extent of disease and the effect of treatment at the individual level, because they reflect the self-reported health state of the patient directly. However, their effective integration in personalized medicine requires addressing certain conceptual and methodological challenges, including instrument development and analytical issues. We evaluate methodological issues, such as multiple comparisons, missing data, and modeling approaches, associated with the analysis of data related to PROs and personalized medicine, to further our understanding of the role of PRO data in personalized medicine. There is a growing recognition of the role of PROs in medical research, but their potential use in customizing healthcare is not widely appreciated. Emerging insights into the genetic basis of PROs could potentially lead to new pathways that may improve patient care. Knowledge of the biologic pathways through which the various genetic predispositions propel people toward negative or away from positive health experiences may ultimately transform healthcare. Understanding and addressing the conceptual and methodological issues in PROs and personalized medicine are expected to enhance the emerging area of personalized medicine and to improve patient care. This article addresses relevant concerns that need to be considered for effective integration of PROs in personalized medicine, with particular reference to conceptual and analytical issues that routinely arise with personalized medicine and PRO data. Some of these issues, including multiplicity problems, handling of missing values, and modeling approaches, are common to both areas. It is hoped that this article will help to stimulate further research to advance our understanding of the role of PRO data in personalized medicine. A robust conceptual framework to incorporate PROs into personalized medicine can provide fertile opportunity to bring these two areas even closer and to enhance the way a specific treatment is attuned and delivered to address patient care and patient needs.
2012-01-01
Background In recent years, biological event extraction has emerged as a key natural language processing task, aiming to address the information overload problem in accessing the molecular biology literature. The BioNLP shared task competitions have contributed to this recent interest considerably. The first competition (BioNLP'09) focused on extracting biological events from Medline abstracts from a narrow domain, while the theme of the latest competition (BioNLP-ST'11) was generalization and a wider range of text types, event types, and subject domains were considered. We view event extraction as a building block in larger discourse interpretation and propose a two-phase, linguistically-grounded, rule-based methodology. In the first phase, a general, underspecified semantic interpretation is composed from syntactic dependency relations in a bottom-up manner. The notion of embedding underpins this phase and it is informed by a trigger dictionary and argument identification rules. Coreference resolution is also performed at this step, allowing extraction of inter-sentential relations. The second phase is concerned with constraining the resulting semantic interpretation by shared task specifications. We evaluated our general methodology on core biological event extraction and speculation/negation tasks in three main tracks of BioNLP-ST'11 (GENIA, EPI, and ID). Results We achieved competitive results in GENIA and ID tracks, while our results in the EPI track leave room for improvement. One notable feature of our system is that its performance across abstracts and article bodies is stable. Coreference resolution results in minor improvement in system performance. Due to our interest in discourse-level elements, such as speculation/negation and coreference, we provide a more detailed analysis of our system performance in these subtasks. Conclusions The results demonstrate the viability of a robust, linguistically-oriented methodology, which clearly distinguishes general semantic interpretation from shared task specific aspects, for biological event extraction. Our error analysis pinpoints some shortcomings, which we plan to address in future work within our incremental system development methodology. PMID:22759461
Novel optoelectronic methodology for testing of MOEMS
NASA Astrophysics Data System (ADS)
Pryputniewicz, Ryszard J.; Furlong, Cosme
2003-01-01
Continued demands for delivery of high performance micro-optoelectromechanical systems (MOEMS) place unprecedented requirements on the methods used in their development and operation. Metrology is a major and inseparable part of these methods. Optoelectronic methodology is an essential field of metrology. Due to its scalability, optoelectronic methodology is particularly suitable for testing of MOEMS, where measurements must be made with ever increasing accuracy and precision. This was particularly evident during the last few years, characterized by miniaturization of devices, when requirements for measurements rapidly increased as the emerging technologies introduced new products, especially optical MEMS. In this paper, a novel optoelectronic methodology for testing of MOEMS is described and its applications are illustrated with representative examples. These examples demonstrate the capability to measure submicron deformations of various components of the micromirror device under operating conditions, and show the viability of the optoelectronic methodology for testing of MOEMS.
Mohammed, Mohammed A.; Rudge, Gavin; Watson, Duncan; Wood, Gordon; Smith, Gary B.; Prytherch, David R.; Girling, Alan; Stevens, Andrew
2013-01-01
Background We explored the use of routine blood tests and national early warning scores (NEWS) reported within ±24 hours of admission to predict in-hospital mortality in emergency admissions, using empirical decision Tree models because they are intuitive and may ultimately be used to support clinical decision making. Methodology A retrospective analysis of adult emergency admissions to a large acute hospital during April 2009 to March 2010 in the West Midlands, England, with a full set of index blood tests results (albumin, creatinine, haemoglobin, potassium, sodium, urea, white cell count and an index NEWS undertaken within ±24 hours of admission). We developed a Tree model by randomly splitting the admissions into a training (50%) and validation dataset (50%) and assessed its accuracy using the concordance (c-) statistic. Emergency admissions (about 30%) did not have a full set of index blood tests and/or NEWS and so were not included in our analysis. Results There were 23248 emergency admissions with a full set of blood tests and NEWS with an in-hospital mortality of 5.69%. The Tree model identified age, NEWS, albumin, sodium, white cell count and urea as significant (p<0.001) predictors of death, which described 17 homogeneous subgroups of admissions with mortality ranging from 0.2% to 60%. The c-statistic for the training model was 0.864 (95%CI 0.852 to 0.87) and when applied to the testing data set this was 0.853 (95%CI 0.840 to 0.866). Conclusions An easy to interpret validated risk adjustment Tree model using blood test and NEWS taken within ±24 hours of admission provides good discrimination and offers a novel approach to risk adjustment which may potentially support clinical decision making. Given the nature of the clinical data, the results are likely to be generalisable but further research is required to investigate this promising approach. PMID:23734195
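A hedged sketch of the modelling step: a decision tree is fitted on a 50/50 split and its discrimination summarised by the c-statistic (area under the ROC curve) on the held-out half. Synthetic, class-imbalanced data stand in for the blood-test and NEWS variables.

```python
# Sketch of the Tree-model step: fit a decision tree on a 50/50 split and report
# the held-out c-statistic. Synthetic imbalanced data stand in for the blood-test
# and NEWS predictors; hyperparameters are illustrative choices.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=20_000, n_features=8, weights=[0.94], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

tree = DecisionTreeClassifier(max_depth=5, min_samples_leaf=200, random_state=0)
tree.fit(X_train, y_train)

c_stat = roc_auc_score(y_test, tree.predict_proba(X_test)[:, 1])
print(f"held-out c-statistic: {c_stat:.3f}")
```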
Nikathil, Shradha; Olaussen, Alexander; Gocentas, Robert A; Symons, Evan; Mitra, Biswadev
2017-06-01
Patient- or visitor-perpetrated workplace violence (WPV) has been reported to be a common occurrence within the ED. No universal definition of violence or recording of such events exists. In addition, ED staff are often reluctant to report violent incidents. The true incidence of WPV is therefore unclear. This systematic review aimed to quantify WPV in EDs. The association of WPV with drug and alcohol exposure was also explored. The databases MEDLINE, Embase, PsycInfo and the Cochrane Library were searched from their commencement to 10 March 2016. MeSH terms and text words for ED, violence and aggression were combined. A meta-analysis was conducted on the primary outcome variable: the proportion of violent patients among total ED presentations. A secondary meta-analysis used studies reporting the proportion of drug- and alcohol-affected patients within the violent population. The search yielded a total of 8720 records. A total of 7235 were unique and underwent abstract screening. A total of 22 studies were deemed relevant according to inclusion and exclusion criteria. Retrospective study designs predominated, analysing mainly security records and incident reports. The rates of violence from individual studies ranged from 1 incident to 172 incidents per 10 000 presentations. The pooled incidence suggests there are 36 violent patients for every 10 000 presentations to the ED (95% confidence interval 0.0030-0.0043). WPV in the ED was commonly reported. There is wide heterogeneity across the study methodology, definitions and rates. More standardised recording and reporting may inform preventive measures and highlight effective management strategies. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
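A minimal DerSimonian-Laird sketch for pooling incidence proportions on the logit scale, showing how a pooled rate of the "per 10,000 presentations" kind can be computed; the event counts and denominators below are invented example studies, not the 22 included papers.

```python
# Minimal DerSimonian-Laird random-effects sketch for pooling incidence
# proportions on the logit scale. Events/presentations pairs are invented.
import numpy as np

events = np.array([12, 80, 5, 150])                # violent incidents per study (toy)
totals = np.array([30_000, 200_000, 40_000, 900_000])

y = np.log(events / (totals - events))             # logit-transformed proportions
v = 1 / events + 1 / (totals - events)             # approximate within-study variances

w = 1 / v                                          # fixed-effect weights
q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2) # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1 / (v + tau2)                              # random-effects weights
pooled_logit = np.sum(w_re * y) / w_re.sum()
pooled = 1 / (1 + np.exp(-pooled_logit))           # back-transform to a proportion
print(f"pooled incidence: {pooled * 1e4:.1f} per 10,000 presentations")
```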
Cell-selective metabolic labeling of biomolecules with bioorthogonal functionalities.
Xie, Ran; Hong, Senlian; Chen, Xing
2013-10-01
Metabolic labeling of biomolecules with bioorthogonal functionalities enables visualization, enrichment, and analysis of the biomolecules of interest in their physiological environments. This versatile strategy has found utility in probing various classes of biomolecules in a broad range of biological processes. On the other hand, metabolic labeling is nonselective with respect to cell type, which imposes limitations for studies performed in complex biological systems. Herein, we review the recent methodological developments aiming to endow metabolic labeling strategies with cell-type selectivity. The cell-selective metabolic labeling strategies have emerged from protein and glycan labeling. We envision that these strategies can be readily extended to labeling of other classes of biomolecules. Copyright © 2013 Elsevier Ltd. All rights reserved.
Deciding to Come Out to Parents: Toward a Model of Sexual Orientation Disclosure Decisions.
Grafsky, Erika L
2017-08-16
The purpose of this study was to understand nonheterosexual youths' decisions to disclose their sexual orientation to their parents. The sample for this study included 22 youth between the ages of 14 and 21. Constructivist grounded theory guided the qualitative methodology and data analysis. The findings from this study posit an emerging model of sexual orientation disclosure decisions comprising four interrelated factors that influence the decision to disclose or not disclose, as well as a description of the mechanism through which disclosure either does or does not occur. Clinical implications and recommendations for further research are provided. © 2017 Family Process Institute.
Lane, Tonisha B.
2016-01-01
The current study used a case study methodological approach, including document analysis, semistructured interviews, and participant observations, to investigate how a science, technology, engineering, and mathematics (STEM) enrichment program supported retention and degree attainment of underrepresented students at a large, public, predominantly white institution. From this study, a model emerged that encompassed four components: proactive care, holistic support, community building, and catalysts for STEM identity development. These components encompassed a number of strategies and practices that were instrumental in the outcomes of program participants. This paper concludes with implications for practice, such as using models to inform program planning, assessment, and evaluation. PMID:27543638
Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A
2017-12-01
Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.
The Neighbourhood Effects on Health and Well-being (NEHW) study.
O'Campo, Patricia; Wheaton, Blair; Nisenbaum, Rosane; Glazier, Richard H; Dunn, James R; Chambers, Catharine
2015-01-01
Many cross-sectional studies of neighbourhood effects on health do not employ strong study design elements. The Neighbourhood Effects on Health and Well-being (NEHW) study, a random sample of 2412 English-speaking Toronto residents (age 25-64), utilises strong design features for sampling neighbourhoods and individuals, characterising neighbourhoods using a variety of data sources, measuring a wide range of health outcomes, and for analysing cross-level interactions. We describe here methodological issues that shaped the design and analysis features of the NEHW study to ensure that, while a cross-sectional sample, it will advance the quality of evidence emerging from observational studies. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.
Energy efficiency of high-rise buildings
NASA Astrophysics Data System (ADS)
Zhigulina, Anna Yu.; Ponomarenko, Alla M.
2018-03-01
The article is devoted to an analysis of tendencies and advanced technologies in the field of energy supply and energy efficiency of tall buildings, and to the history of the emergence of the concept of "efficiency" and its current interpretation. The article also shows the differences in the evaluation criteria of the leading rating systems, LEED and BREEAM. The authors reviewed the latest technologies applied in the construction of energy-efficient buildings. A methodological approach to the design of tall buildings that takes energy efficiency into account needs to include primary energy saving; the production and accumulation of alternative electric energy by converting energy from the sun and wind with the help of special technical devices; and the application of regenerative technologies.
An intelligent algorithm for optimizing emergency department job and patient satisfaction.
Azadeh, Ali; Yazdanparast, Reza; Abdolhossein Zadeh, Saeed; Keramati, Abbas
2018-06-11
Purpose Resilience engineering, job satisfaction and patient satisfaction were evaluated and analyzed in one Tehran emergency department (ED) to determine ED strengths, weaknesses and opportunities to improve safety, performance, and staff and patient satisfaction. The paper aims to discuss these issues. Design/methodology/approach The algorithm included data envelopment analysis (DEA) and two artificial neural networks: a multilayer perceptron and a radial basis function network. Data were based on integrated resilience engineering (IRE) and satisfaction indicators. IRE indicators are considered inputs and job and patient satisfaction indicators are considered output variables. Methods were based on mean absolute percentage error analysis. Subsequently, the algorithm was employed for measuring staff and patient satisfaction separately. Each indicator is also identified through sensitivity analysis. Findings The results showed that salary, wage, patient admission and discharge are the crucial factors influencing job and patient satisfaction. The results obtained by the algorithm were validated by comparing them with DEA. Practical implications The approach is a decision-making tool that helps health managers to assess and improve performance and take corrective action. Originality/value This study presents an IRE-based intelligent algorithm for analyzing ED job and patient satisfaction - the first study to present an integrated IRE, neural network and mathematical programming approach that simultaneously optimizes job satisfaction, patient satisfaction, and IRE. The results are validated against DEA through statistical methods.
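One component of such an algorithm, a multilayer perceptron evaluated by mean absolute percentage error (MAPE), can be sketched as follows; the six indicator columns and the simulated satisfaction scores are placeholders, not the Tehran ED data.

```python
# Sketch of one algorithm component: a multilayer perceptron mapping
# resilience-engineering indicators to a satisfaction score, evaluated by MAPE.
# Indicator data and the link to satisfaction are simulated placeholders.
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(1, 5, size=(300, 6))                                     # six IRE indicators, 1-5 scale
y = 2.0 + 0.5 * X[:, 0] + 0.3 * X[:, 3] + rng.normal(0, 0.2, 300)        # satisfaction (toy link)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print(f"MAPE: {mean_absolute_percentage_error(y_te, model.predict(X_te)):.3f}")
```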
Mess management in microbial ecology: Rhetorical processes of disciplinary integration
NASA Astrophysics Data System (ADS)
McCracken, Christopher W.
As interdisciplinary work becomes more common in the sciences, research into the rhetorical processes mediating disciplinary integration becomes more vital. This dissertation, which takes as its subject the integration of microbiology and ecology, combines a postplural approach to rhetoric of science research with Victor Turner's "social drama" analysis and a third-generation activity theory methodological framework to identify conceptual and practical conflicts in interdisciplinary work and describe how, through visual and verbal communication, scientists negotiate these conflicts. First, to understand the conflicting disciplinary principles that might impede integration, the author conducts a Turnerian analysis of a disciplinary conflict that took place in the 1960s and 70s, during which American ecologists and biologists debated whether they should participate in the International Biological Program (IBP). Participation in the IBP ultimately contributed to the emergence of ecology as a discipline distinct from biology, and Turnerian social drama analysis of the debate surrounding participation lays bare the conflicting principles separating biology and ecology. Second, to answer the question of how these conflicting principles are negotiated in practice, the author reports on a yearlong qualitative study of scientists working in a microbial ecology laboratory. Focusing specifically on two case studies from this fieldwork that illustrate the key concept of textually mediated disciplinary integration, the author's analysis demonstrates how scientific objects emerge in differently situated practices, and how these objects manage to cohere despite their multiplicity through textually mediated rhetorical processes of calibration and alignment.
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
A thesis by Amanda Donnelly. ...work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies including net assessment, scenarios and...
A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.
In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. Studied in this paper was the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers and eigenvalue calculations.
Study Designs and Evaluation Models for Emergency Department Public Health Research
Broderick, Kerry B.; Ranney, Megan L.; Vaca, Federico E.; D’Onofrio, Gail; Rothman, Richard E.; Rhodes, Karin V.; Becker, Bruce; Haukoos, Jason S.
2011-01-01
Public health research requires sound design and thoughtful consideration of potential biases that may influence the validity of results. It also requires careful implementation of protocols and procedures that are likely to translate from the research environment to actual clinical practice. This article is the product of a breakout session from the 2009 Academic Emergency Medicine consensus conference entitled “Public Health in the ED: Screening, Surveillance, and Intervention” and describes in detail aspects of performing emergency department (ED)-based public health research, while also providing a resource for current and future researchers. In doing so, the authors describe methodologic features of study design, participant selection and retention, and measurements and analyses pertinent to public health research. In addition, a number of recommendations related to research methods and future investigations related to public health work in the ED are provided. Public health investigators are poised to make substantial contributions to this important area of research, but this will only be accomplished by employing sound research methodology in the context of rigorous program evaluation. PMID:20053232
Emerging technologies and corporate culture at Microsoft: a methodological note.
Klein, David; Schmeling, James; Blanck, Peter
2005-01-01
This article explores factors important in the study and examination of corporate culture and change. The particular focus is on the technological methods used to conduct a study of accessible technology and corporate culture at Microsoft Corporation. Reasons for particular approaches are explained. Advantages and challenges of emerging technologies that store and retrieve information in the study of corporate culture are reviewed. 2005 John Wiley & Sons, Ltd.
Discourse Analysis and the Study of Educational Leadership
ERIC Educational Resources Information Center
Anderson, Gary; Mungal, Angus Shiva
2015-01-01
Purpose: The purpose of this paper is to provide an overview of the current and past work using discourse analysis in the field of educational administration and of discourse analysis as a methodology. Design/Methodology/Approach: Authors reviewed research in educational leadership that uses discourse analysis as a methodology. Findings: While…
76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2012-2013 Award Year AGENCY: Federal Student Aid, Department of Education. ACTION: Notice of revision of the Federal Need Analysis...; 84.268; 84.379]. Federal Need Analysis Methodology for the 2012-2013 award year; Federal Pell Grant...
Orkin, Aaron M; Curran, Jeffrey D; Fortune, Melanie K; McArthur, Allison; Mew, Emma J; Ritchie, Stephen D; Van de Velde, Stijn; VanderBurgh, David
2016-05-18
The Disease Control Priorities Project recommends emergency care training for laypersons in low-resource settings, but evidence for these interventions has not yet been systematically reviewed. This review will identify the individual and community health effects of educating laypeople to deliver prehospital emergency care interventions in low-resource settings. This systematic review addresses the following question: in underserviced populations and low-resource settings (P), does first aid or emergency care training or education for laypeople (I) confer any individual or community health benefit for emergency health conditions (O), in comparison with no training or other forms of education (C)? We restrict this review to studies reporting quantitatively measurable outcomes, and search 12 electronic bibliographic databases and grey literature sources. A team of expert content and methodology reviewers will conduct title and abstract screening and full-text review, using a custom-built online platform. Two investigators will independently extract methodological variables and outcomes related to patient-level morbidity and mortality and community-level effects on resilience or emergency care capacity. Two investigators will independently assess external validity, selection bias, performance bias, measurement bias, attrition bias and confounding. We will summarise the findings using a narrative approach to highlight similarities and differences between the gathered studies. Formal ethical approval is not required. The results will be disseminated through a peer-reviewed publication and knowledge translation strategy. CRD42014009685. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Hospital-based emergency nursing in rural settings.
Brown, Jennifer F
2008-01-01
In 2006, the Institute of Medicine (IOM) released a series of reports that highlighted the urgent need for improvements in the nation's emergency health services. This news has provided new energy to a growing body of research about the development and implementation of best practices in emergency care. Despite evidence of geographical disparities in health services, relatively little attention has been focused on rural emergency services to identify environmental differences. The purpose of this chapter is to summarize the contributions of nursing research to the rural emergency services literature. The research resembles a so-called shotgun effect as the exploratory and interventional studies cover a wide range of topics without consistency or justification. Emergency nursing research has been conducted primarily in urban settings, with small samples and insufficient methodological rigor. This chapter will discuss the limitations of the research and set forth an agenda of critical topics that need to be explored related to emergency nursing in rural settings.
Antibiotic use and resistance in emerging economies: a situation analysis for Viet Nam.
Nguyen, Kinh Van; Thi Do, Nga Thuy; Chandna, Arjun; Nguyen, Trung Vu; Pham, Ca Van; Doan, Phuong Mai; Nguyen, An Quoc; Thi Nguyen, Chuc Kim; Larsson, Mattias; Escalante, Socorro; Olowokure, Babatunde; Laxminarayan, Ramanan; Gelband, Hellen; Horby, Peter; Thi Ngo, Ha Bich; Hoang, Mai Thanh; Farrar, Jeremy; Hien, Tran Tinh; Wertheim, Heiman F L
2013-12-10
Antimicrobial resistance is a major contemporary public health threat. Strategies to contain antimicrobial resistance have been comprehensively set forth; however, in developing countries, where the need for effective antimicrobials is greatest, implementation has proved problematic. A better understanding of patterns and determinants of antibiotic use and resistance in emerging economies may permit more appropriately targeted interventions. Viet Nam, with a large population, a high burden of infectious disease and relatively unrestricted access to medication, is an excellent case study of the difficulties faced by emerging economies in controlling antimicrobial resistance. Our working group conducted a situation analysis of the current patterns and determinants of antibiotic use and resistance in Viet Nam. International publications and local reports published between 1-1-1990 and 31-8-2012 were reviewed. All stakeholders analyzed the findings at a policy workshop and feasible recommendations were suggested to improve antibiotic use in Viet Nam. Here we report the results of our situation analysis focusing on: the healthcare system, drug regulation and supply; antibiotic resistance and infection control; and agricultural antibiotic use. Market reforms have improved healthcare access in Viet Nam and contributed to better health outcomes. However, increased accessibility has been accompanied by injudicious antibiotic use in hospitals and the community, with predictable escalation in bacterial resistance. Prescribing practices are poor and self-medication is common - often being the most affordable way to access healthcare. Many policies exist to regulate antibiotic use but enforcement is insufficient or lacking. Pneumococcal penicillin-resistance rates are the highest in Asia and carbapenem-resistant bacteria (notably NDM-1) have recently emerged. Hospital-acquired infections, predominantly with multi-drug resistant Gram-negative organisms, place additional strain on limited resources. Widespread agricultural antibiotic use further propagates antimicrobial resistance. Future legislation regarding antibiotic access must alter incentives for purchasers and providers and ensure effective enforcement. The Ministry of Health recently initiated a national action plan and approved a multicenter health improvement project to strengthen national capacity for antimicrobial stewardship in Viet Nam. This analysis provided important input to these initiatives. Our methodologies and findings may be of use to others across the world tackling the growing threat of antibiotic resistance.
NASA Astrophysics Data System (ADS)
García-Santos, Glenda; Madruga de Brito, Mariana; Höllermann, Britta; Taft, Linda; Almoradie, Adrian; Evers, Mariele
2018-06-01
Understanding the interactions between water resources and their social dimensions is crucial for effective and sustainable water management. The identification of sensitive control variables and feedback loops of a specific human-hydro-scape can enhance knowledge about the potential factors and/or agents leading to the current state of water resources and ecosystems, which in turn supports decision-making towards desirable futures. Our study presents the utility of a system dynamics modeling approach for water management and decision-making in the case of a forest ecosystem under risk of wildfires. We use the pluralistic water research concept to explore different scenarios and simulate the emergent behaviour of water interception and net precipitation after a wildfire in a forest ecosystem. Through a case study, we illustrate the applicability of this new methodology.
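An illustrative system-dynamics-style sketch (not the authors' model): canopy cover is treated as a stock that recovers after a wildfire, interception depends on the current cover, and net precipitation emerges as rainfall minus interception. All rates and parameters are invented.

```python
# System-dynamics-style sketch (not the authors' model): a canopy-cover stock
# recovering after fire, with interception and net precipitation as flows.
# All rates and parameters are invented for illustration.
import numpy as np

years = 20
rain = np.full(years, 800.0)        # annual rainfall, mm
cover = 0.1                         # canopy cover fraction just after the fire
recovery_rate = 0.15                # fraction of the remaining gap closed per year
max_interception = 0.25             # fraction of rainfall intercepted at full cover

for t in range(years):
    interception = rain[t] * max_interception * cover
    net_precip = rain[t] - interception
    if t % 5 == 0:
        print(f"year {t:2d}: cover={cover:.2f}  interception={interception:5.1f} mm  net={net_precip:5.1f} mm")
    cover += recovery_rate * (1.0 - cover)   # stock update: gradual post-fire recovery
```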
Emergent Aerospace Designs Using Negotiating Autonomous Agents
NASA Technical Reports Server (NTRS)
Deshmukh, Abhijit; Middelkoop, Timothy; Krothapalli, Anjaneyulu; Smith, Charles
2000-01-01
This paper presents a distributed design methodology where designs emerge as a result of the negotiations between different stakeholders in the process, such as cost, performance, reliability, etc. The proposed methodology uses autonomous agents to represent design decision makers. Each agent influences specific design parameters in order to maximize its utility. Since the design parameters depend on the aggregate demand of all the agents in the system, design agents need to negotiate with others in the market economy in order to reach an acceptable utility value. This paper addresses several interesting research issues related to distributed design architectures. First, we present a flexible framework which facilitates decomposition of the design problem. Second, we present an overview of a market mechanism for generating acceptable design configurations. Finally, we integrate learning mechanisms into the design process to reduce the computational overhead.
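The paper does not give an algorithm listing, but a toy sketch of the kind of iterative negotiation it describes might look like the following. The agents, their target values, weights and the relaxation step are hypothetical and only illustrate how repeated marginal-utility "pulls" on a shared design parameter can settle to a negotiated value.

```python
# Hypothetical sketch: cost, performance and reliability agents each pull a shared
# design parameter toward the value that maximizes their own utility; repeated
# weighted relaxation approximates a negotiated equilibrium.
agents = {
    "cost":        {"target": 30.0, "weight": 1.0},
    "performance": {"target": 45.0, "weight": 2.0},
    "reliability": {"target": 38.0, "weight": 1.5},
}

design = 40.0                      # initial shared design parameter
for _ in range(100):               # negotiation rounds
    # each agent's pull is proportional to its (quadratic-utility) marginal gain
    pull = sum(a["weight"] * (a["target"] - design) for a in agents.values())
    design += 0.1 * pull / sum(a["weight"] for a in agents.values())

print(f"negotiated design parameter: {design:.2f}")
# converges to the weight-averaged target (about 39.3 with these numbers)
```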
Critical appraisal of emergency medicine education research: the best publications of 2013.
Farrell, Susan E; Kuhn, Gloria J; Coates, Wendy C; Shayne, Phillip H; Fisher, Jonathan; Maggio, Lauren A; Lin, Michelle
2014-11-01
The objective was to critically appraise and highlight methodologically superior medical education research articles published in 2013 whose outcomes are pertinent to teaching and education in emergency medicine (EM). A search of the English-language literature in 2013 querying Education Resources Information Center (ERIC), PsycINFO, PubMed, and Scopus identified 251 EM-related studies using hypothesis-testing or observational investigations of educational interventions. Two reviewers independently screened all of the publications and removed articles using established exclusion criteria. Six reviewers then independently scored the remaining 43 publications using either a qualitative or a quantitative scoring system, based on the research methodology of each article. Each scoring system consisted of nine criteria. Selected criteria were based on accepted educational review literature and chosen a priori. Both scoring systems used parallel scoring metrics and have been used previously within this annual review. Forty-three medical education research papers (37 quantitative and six qualitative studies) met the a priori criteria for inclusion and were reviewed. Six quantitative and one qualitative study were scored and ranked most highly by the reviewers as exemplary and are summarized in this article. This annual critical appraisal aims to promote superior research in EM-related education by reviewing and highlighting seven of the 43 major education research studies that met a priori criteria and were published in 2013. Common methodologic pitfalls in the 2013 papers are noted, and current trends in medical education research in EM are discussed. © 2014 by the Society for Academic Emergency Medicine.
Liu, Jie; Guo, Liang; Jiang, Jiping; Jiang, Dexun; Wang, Peng
2018-04-13
To minimize the damage caused by river chemical spills, efficient emergency material allocation is critical for rapid emergency rescue decision-making. In this study, an emergency material allocation framework based on time-varying supply-demand constraints is developed to allocate emergency material, minimize the emergency response time, and satisfy the dynamic emergency material requirements in the post-accident phases of river chemical spills. First, the theoretically critical emergency response time is obtained for the emergency material allocation system to select a series of appropriate emergency material warehouses as potential supportive centers. Then, an enumeration method is applied to identify the practically critical emergency response time and the optimum emergency material allocation and replenishment scheme. Finally, the developed framework is applied to a computational experiment based on the South-to-North Water Transfer Project in China. The results illustrate that the proposed methodology is a simple and flexible tool for appropriately allocating emergency material to satisfy time-dynamic demands during emergency decision-making. Therefore, decision-makers can identify an appropriate emergency material allocation scheme that balances time-effectiveness and cost-effectiveness objectives under different emergency pollution conditions.
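As an illustration of the allocation idea (not the authors' exact formulation), a greedy sketch that draws from the nearest warehouses until the demand of a response phase is met, and reports the resulting critical response time, could look like this. The warehouse list and demand figure are invented.

```python
# Hypothetical data: (warehouse, travel time to spill site in hours, stock of sorbent in tonnes)
warehouses = [("W1", 2.0, 15.0), ("W2", 3.5, 40.0), ("W3", 5.0, 60.0), ("W4", 8.0, 120.0)]
demand = 70.0    # tonnes required in the first response phase (assumed)

def allocate(warehouses, demand):
    """Greedily draw from the closest warehouses until demand is satisfied."""
    plan, remaining = [], demand
    for name, travel_time, stock in sorted(warehouses, key=lambda w: w[1]):
        if remaining <= 0:
            break
        shipped = min(stock, remaining)
        plan.append((name, shipped))
        remaining -= shipped
    selected = {p[0] for p in plan}
    response_time = max(t for name, t, stock in warehouses if name in selected)
    return plan, response_time

plan, response_time = allocate(warehouses, demand)
print(plan)            # [('W1', 15.0), ('W2', 40.0), ('W3', 15.0)]
print(response_time)   # 5.0 hours: the critical emergency response time for this phase
```

The framework described in the abstract goes further by letting demand vary over post-accident phases and by enumerating replenishment schemes, but the supply-versus-response-time trade-off it balances is of this form.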
Does the magnetic expansion factor play a role in solar wind acceleration?
NASA Astrophysics Data System (ADS)
Wallace, S.; Arge, C. N.; Pihlstrom, Y.
2017-12-01
For the past 25+ years, the magnetic expansion factor (fs) has been a parameter used in the calculation of terminal solar wind speed (vsw) in the Wang-Sheeley-Arge (WSA) coronal and solar wind model. The magnetic expansion factor measures the rate of flux tube expansion in cross section between the photosphere out to 2.5 solar radii (i.e., source surface), and is inversely related to vsw (Wang & Sheeley, 1990). Since the discovery of this inverse relationship, the physical role that fs plays in solar wind acceleration has been debated. In this study, we investigate whether fs plays a causal role in determining terminal solar wind speed or merely serves as a proxy. To do so, we study pseudostreamers, which occur when coronal holes of the same polarity are near enough to one another to limit field line expansion. Pseudostreamers are of particular interest because despite having low fs, spacecraft observations show that solar wind emerging from these regions has slow to intermediate speeds of 350-550 km/s (Wang et al., 2012). In this work, we develop a methodology to identify pseudostreamers that are magnetically connected to satellites using WSA output produced with ADAPT input maps. We utilize this methodology to obtain the spacecraft-observed solar wind speed from the exact parcel of solar wind that left the pseudostreamer. We then compare the pseudostreamer's magnetic expansion factor with the observed solar wind speed from multiple spacecraft (i.e., ACE, STEREO-A & B, Ulysses) magnetically connected to the region. We will use this methodology to identify several cases (~20) where spacecraft are magnetically connected to pseudostreamers, and perform a statistical analysis to determine the correlation between fs within pseudostreamers and the terminal speed of the solar wind emerging from them. This work will help determine if fs plays a physical role in the speed of solar wind originating from regions that typically produce slow wind. This work complements previous case studies of solar wind originating from pseudostreamers (Riley et al., 2015, Riley & Luhmann 2012) and will contribute to identifying the physical properties of solar wind from these regions. Future work will explore the role of fs in modulating the fast solar wind and will involve a similar analysis for cases where spacecraft are deep within coronal holes.
Identifying Key Words in 9-1-1 Calls for Stroke: A Mixed Methods Approach.
Richards, Christopher T; Wang, Baiyang; Markul, Eddie; Albarran, Frank; Rottman, Doreen; Aggarwal, Neelum T; Lindeman, Patricia; Stein-Spencer, Leslee; Weber, Joseph M; Pearlman, Kenneth S; Tataris, Katie L; Holl, Jane L; Klabjan, Diego; Prabhakaran, Shyam
2017-01-01
Identifying stroke during a 9-1-1 call is critical to timely prehospital care. However, emergency medical dispatchers (EMDs) recognize stroke in less than half of 9-1-1 calls, potentially due to the words used by callers to communicate stroke signs and symptoms. We hypothesized that callers do not typically use words and phrases considered to be classical descriptors of stroke, such as focal neurologic deficits, but that a mixed-methods approach can identify words and phrases commonly used by 9-1-1 callers to describe acute stroke victims. We performed a mixed-methods, retrospective study of 9-1-1 call audio recordings for adult patients with confirmed stroke who were transported by ambulance in a large urban city. Content analysis, a qualitative methodology, and computational linguistics, a quantitative methodology, were used to identify key words and phrases used by 9-1-1 callers to describe acute stroke victims. Because a caller's level of emotional distress contributes to the communication during a 9-1-1 call, the Emotional Content and Cooperation Score was rated by a multidisciplinary team. A total of 110 9-1-1 calls, received between June and September 2013, were analyzed. EMDs recognized stroke in 48% of calls, and the emotional state of most callers (95%) was calm. In 77% of calls in which EMDs recognized stroke, callers specifically used the word "stroke"; however, the word "stroke" was used in only 38% of calls. Vague, non-specific words and phrases were used to describe stroke victims' symptoms in 55% of calls, and 45% of callers used distractor words and phrases suggestive of non-stroke emergencies. Focal neurologic symptoms were described in 39% of calls. Computational linguistics identified 9 key words that were more commonly used in calls where the EMD identified stroke. These words were concordant with terms identified through qualitative content analysis. Most 9-1-1 callers used vague, non-specific, or distractor words and phrases and infrequently provided classic stroke descriptions during 9-1-1 calls for stroke. Both qualitative and quantitative methodologies identified similar key words and phrases associated with accurate EMD stroke recognition. This study suggests that tools incorporating commonly used words and phrases could potentially improve EMD stroke recognition.
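A minimal sketch of the quantitative side of such an analysis — comparing word frequencies between calls where stroke was and was not recognized and ranking the words that separate the two groups — might look like this. The transcripts and labels below are invented for illustration and the ranking heuristic is a simplification of proper computational-linguistics methods.

```python
from collections import Counter

# Invented mini-corpus: (transcript, EMD recognized stroke?)
calls = [
    ("she cant move her arm and her face is drooping", True),
    ("he is just not acting right and feels weak", False),
    ("i think she is having a stroke her speech is slurred", True),
    ("he fell down and seems confused maybe dizzy", False),
]

recognized = Counter(w for text, ok in calls if ok for w in text.split())
missed     = Counter(w for text, ok in calls if not ok for w in text.split())

def rel_freq(counter):
    total = sum(counter.values())
    return {w: c / total for w, c in counter.items()}

fr, fm = rel_freq(recognized), rel_freq(missed)
# rank words by how much more often they appear in recognized calls
key_words = sorted(set(fr) | set(fm),
                   key=lambda w: fr.get(w, 0) - fm.get(w, 0), reverse=True)
print(key_words[:5])
```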
Kellogg, Joshua J; Wallace, Emily D; Graf, Tyler N; Oberlies, Nicholas H; Cech, Nadja B
2017-10-25
Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of the sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates that accelerated solvent extraction is an efficient methodology for preparing samples for metabolomics studies. Copyright © 2017. Published by Elsevier B.V.
Validating Competence: A New Credential for Clinical Documentation Improvement Practitioners
Ryan, Jessica; Patena, Karen; Judd, Wallace; Niederpruem, Mike
2013-01-01
As the health information management (HIM) profession continues to expand and become more specialized, there is an ever-increasing need to identify emerging HIM workforce roles that require a codified level of proficiency and professional standards. The Commission on Certification for Health Informatics and Information Management (CCHIIM) explored one such role—clinical documentation improvement (CDI) practitioner—to define the tasks and responsibilities of the job as well as the knowledge required to perform them effectively. Subject-matter experts (SMEs) defined the CDI specialty by following best practices for job analysis methodology. A random sample of 4,923 CDI-related professionals was surveyed regarding the tasks and knowledge required for the job. The survey data were used to create a weighted blueprint of the six major domains that make up the CDI practitioner role, which later formed the foundation for the clinical documentation improvement practitioner (CDIP) credential. As a result, healthcare organizations can be assured that their certified documentation improvement practitioners have demonstrated excellence in clinical care, treatment, coding guidelines, and reimbursement methodologies. PMID:23843769
Attitudes towards rotating shift work in clinical nurses: a Q-methodology study.
Ha, Eun-Ho
2015-09-01
To identify clinical nurses' attitudes towards rotating shift work. Many hospitals worldwide employ rotating shift work patterns to staff their facilities. Attitudes of clinical nurses towards rotating shift work vary. To understand clinical nurses' attitudes towards rotating shift work, Q-methodology, a method for the analysis of subjective viewpoints with the strengths of both qualitative and quantitative methods, was used. Forty-six selected Q-statements from each of the 39 participants were classified into a normal distribution using an 11-point bipolar scale. The collected data were analysed using the PC-QUANL program. Three discrete factors emerged as follows: factor I (rotating shift work is frustrating: objectionable perspective), factor II (rotating shift work is satisfactory: constructive perspective) and factor III (rotating shift work is problematic, but necessary: ambivalent perspective). The subjective viewpoints of the three identified factors can be applied in developing various roster designs for nurses engaging in rotating shift work. The findings provide a baseline for nurse leaders in helping nurses adjust to and deal with rotating shift work. © 2015 John Wiley & Sons Ltd.
Anthropology and Epidemiology: learning epistemological lessons through a collaborative venture
Béhague, Dominique Pareja; Gonçalves, Helen; Victora, Cesar Gomes
2009-01-01
Collaboration between anthropology and epidemiology has a long and tumultuous history. Based on empirical examples, this paper describes a number of epistemological lessons we have learned through our experience of cross-disciplinary collaboration. Although critical of both mainstream epidemiology and medical anthropology, our analysis focuses on the implications of addressing each discipline’s main epistemological differences, while addressing the goal of adopting a broader social approach to health improvement. We believe it is important to push the boundaries of research collaborations from the more standard forms of “multidisciplinarity,” to the adoption of theoretically imbued “interdisciplinarity.” The more we challenge epistemological limitations and modify ways of knowing, the more we will be able to provide in-depth explanations for the emergence of disease-patterns and thus, to problem-solve. In our experience, both institutional support and the adoption of a relativistic attitude are necessary conditions for sustained theoretical interdisciplinarity. Until researchers acknowledge that methodology is merely a human-designed tool to interpret reality, unnecessary methodological hyper-specialization will continue to alienate one field of knowledge from the other. PMID:18833344
López-Bolaños, Lizbeth; Campos-Rivera, Marisol; Villanueva-Borbolla, María Ángeles
2018-01-01
Objective. To reflect on the process of committing to participation in the implementation of a health strategic plan, using Participative Systematization of Social Experiences as a tool. Our study was a qualitative research-intervention study, based on the Dialectical Methodological Conception approach. We designed and implemented a two-day workshop, six hours daily, using Systematization methodology with a Community Work Group (CWG). During the workshop, the women systematized their experience, with commitment as the axis of the process. Using Grounded Theory techniques, we applied micro-analysis to the data in order to identify and strengthen categories that emerged during the systematization process. We completed open and axial coding. The CWG identified that commitment and participation are influenced by group dynamics and structural determinants. They also reconsidered the way they understood and exercised commitment and participation, and generated knowledge, empowering them to improve their future practice. Commitment and participation were determined by group dynamics and structural factors such as socioeconomic conditions and gender roles. These determinants must be made visible and understood in order to generate proposals that are aimed at strengthening the participation and organization of groups.
Neill, Carly; Leipert, Beverly D; Garcia, Alicia C; Kloseck, Marita
2011-01-01
This research investigates facilitators and barriers that rural women aged 65 to 75 years in Southwestern Ontario experience in acquiring and preparing food through the use of photovoice methodology. Eighteen participants in five rural communities used a camera and log book to document their experiences and perspectives relating to the acquisition and preparation of food, and they each participated in two focus groups to engage in critical dialogue and knowledge sharing regarding the meaning and significance of the pictures they took. Analysis of photographs, log books, and focus group data revealed 13 themes, 3 emerging as facilitators to food acquisition and preparation (availability of food, social networks and values, personal values and resources), 5 as barriers (adjusting to changing family size, winter weather, food labeling issues, grocery shopper resources, limited physical capacity), and 5 as both facilitators and barriers (economics, valuing a healthy diet, technology changes, transportation, location and nature of grocery stores). Data also revealed rurality, age, and gender as foundationally influential factors affecting rural older women's food acquisition and preparation.
Georgakis, D. Christine; Trace, David A.; Naeymi-Rad, Frank; Evens, Martha
1990-01-01
Medical expert systems require comprehensive evaluation of their diagnostic accuracy. The usefulness of these systems is limited without established evaluation methods. We propose a new methodology for evaluating the diagnostic accuracy and the predictive capacity of a medical expert system. We have adapted to the medical domain measures that have been used in the social sciences to examine the performance of human experts in the decision making process. Thus, in addition to the standard summary measures, we use measures of agreement and disagreement, and Goodman and Kruskal's λ and τ measures of predictive association. This methodology is illustrated by a detailed retrospective evaluation of the diagnostic accuracy of the MEDAS system. In a study using 270 patients admitted to the North Chicago Veterans Administration Hospital, diagnoses produced by MEDAS are compared with the discharge diagnoses of the attending physicians. The results of the analysis confirm the high diagnostic accuracy and predictive capacity of the MEDAS system. Overall, the agreement of the MEDAS system with the “gold standard” diagnosis of the attending physician has reached a 90% level.
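For readers unfamiliar with Goodman and Kruskal's λ, a small sketch of its computation from a contingency table of expert-system versus physician diagnoses is given below. The counts are invented for illustration, not MEDAS data.

```python
import numpy as np

# Rows: MEDAS top diagnosis; columns: physician discharge diagnosis (invented counts)
table = np.array([
    [40,  3,  2],
    [ 4, 35,  5],
    [ 1,  6, 30],
])

def goodman_kruskal_lambda(table):
    """Proportional reduction in error when predicting the column variable from the row variable."""
    n = table.sum()
    errors_without = n - table.sum(axis=0).max()   # always guess the modal column category
    errors_with = n - table.max(axis=1).sum()      # guess the modal column within each row
    return (errors_without - errors_with) / errors_without

print(f"lambda = {goodman_kruskal_lambda(table):.2f}")   # about 0.74 for this table
```

A λ near 1 means the expert system's output almost eliminates prediction errors relative to always guessing the most common physician diagnosis, which is the sense of "predictive association" used in the abstract.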
Best practices in ranking communicable disease threats: a literature review, 2015.
O'Brien, Eleanor Charlotte; Taft, Rachel; Geary, Katie; Ciotti, Massimo; Suk, Jonathan E
2016-04-28
The threat of serious, cross-border communicable disease outbreaks in Europe poses a significant challenge to public health and emergency preparedness because the relative likelihood of these threats and the pathogens involved are constantly shifting in response to a range of changing disease drivers. To inform strategic planning by enabling effective resource allocation to manage the consequences of communicable disease outbreaks, it is useful to be able to rank and prioritise pathogens. This paper reports on a literature review which identifies and evaluates the range of methods used for risk ranking. Searches were performed across biomedical and grey literature databases, supplemented by reference harvesting and citation tracking. Studies were selected using transparent inclusion criteria and underwent quality appraisal using a bespoke checklist based on the AGREE II criteria. Seventeen studies were included in the review, covering five methodologies. A narrative analysis of the selected studies suggests that no single methodology was superior. However, many of the methods shared common components, around which a 'best-practice' framework was formulated. This approach is intended to help inform decision makers' choice of an appropriate risk-ranking study design.
Volcanic hazard assessment in western Europe
NASA Astrophysics Data System (ADS)
Chester, David K.; Dibben, Christopher J. L.; Duncan, Angus M.
2002-06-01
Volcanology has been in the past, and in many respects remains, a subject dominated by pure research grounded in the earth sciences. Over the past 30 years a paradigm shift has occurred in hazard assessment which has been aided by significant changes in the social theory of natural hazards and the first-hand experience gained in the 1990s by volcanologists working on projects conceived during the International Decade for Natural Disaster Reduction (IDNDR). Today much greater stress is placed on human vulnerability, the potential for marginalisation of disadvantaged individuals and social groups, and the requirement to make applied volcanology sensitive to the characteristics of local demography, economy, culture and politics. During the IDNDR a methodology, broadly similar to environmental impact analysis, has emerged as the preferred method for studying human vulnerability and risk assessment in volcanically active regions. The characteristics of this new methodology are discussed, and the progress made in applying it to the European Union laboratory volcanoes located in western Europe is reviewed. Furnas (São Miguel, Azores) and Vesuvius in Italy are used as detailed case studies.
Meta-Analysis of the Effects of Xingnaojing Injection on Consciousness Disturbance
Wu, Lijun; Zhang, Hua; Xing, Yanwei; Gao, Yonghong; Li, Yanda; Ren, Xiaomeng; Li, Jie; Nie, Bo; Zhu, Lingqun; Shang, Hongcai; Gao, Ying
2016-01-01
Xingnaojing (XNJ) is commonly extracted from Angongniuhuang, a classic Chinese emergency prescription, and widely used in the treatment of nervous system disorders, including consciousness disturbance, in China. The aim was to evaluate the beneficial and adverse effects of XNJ injection on consciousness disturbance. Seven major electronic databases were searched to retrieve randomized controlled trials designed to evaluate the clinical efficacy of XNJ alone or combined with Western medicine in treating consciousness disturbance caused by conditions such as high fever, poisoning, and stroke. The methodological quality of the included studies was assessed using criteria from the Cochrane Handbook for Systematic Reviews of Interventions, and the data were analyzed using RevMan 5.3.0 software. Seventeen randomized controlled trials on XNJ were included in this study, and the trials generally showed low methodological quality. The results revealed that XNJ alone or in combination with other medicines and adjuvant methods had a positive effect on patients with fever-, poisoning-, and stroke-induced coma. XNJ effectively treated consciousness disturbances that were caused by high fever, poisoning, or stroke. PMID:26886655
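As background on the pooling step such a meta-analysis performs, a minimal inverse-variance fixed-effect sketch for combining odds ratios is shown below. The two-by-two counts are invented and do not come from the included trials; RevMan performs this kind of calculation (among others) internally.

```python
import numpy as np

# Invented 2x2 counts per trial: (events_treatment, n_treatment, events_control, n_control)
trials = [(45, 60, 30, 58), (38, 50, 25, 52), (50, 70, 35, 68)]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = np.log((a * d) / (b * c))
    var = 1/a + 1/b + 1/c + 1/d          # variance of the log odds ratio
    log_ors.append(log_or)
    weights.append(1 / var)              # inverse-variance weight

pooled = np.average(log_ors, weights=weights)
se = 1 / np.sqrt(sum(weights))
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f} to {np.exp(pooled + 1.96*se):.2f})")
```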
Data and methodological problems in establishing state gasoline-conservation targets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, D.L.; Walton, G.H.
The Emergency Energy Conservation Act of 1979 gives the President the authority to set gasoline-conservation targets for states in the event of a supply shortage. This paper examines data and methodological problems associated with setting state gasoline-conservation targets. The target-setting method currently used is examined and found to have some flaws. Ways of correcting these deficiencies through the use of Box-Jenkins time-series analysis are investigated. A successful estimation of Box-Jenkins models for all states included the estimation of the magnitude of the supply shortages of 1979 in each state and a preliminary estimation of state short-run price elasticities, which were found to vary about a median value of -0.16. The time-series models identified were very simple in structure and lent support to the simple consumption growth model assumed by the current target method. The authors conclude that the flaws in the current method can be remedied either by replacing the current procedures with time-series models or by using the models in conjunction with minor modifications of the current method.
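A hedged sketch of the Box-Jenkins approach described here — fitting an ARIMA model to pre-shortage consumption and reading the shortfall off the forecast errors — might look like the following. The series is synthetic and the (1,1,0) order is only an example, not the specification used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Synthetic monthly state gasoline consumption with mild growth, then a shortage
months = pd.date_range("1975-01", periods=60, freq="MS")
consumption = pd.Series(100 + 0.3*np.arange(60) + rng.normal(0, 2, 60), index=months)
consumption.iloc[-6:] -= 8          # assume the last six months are the shortage period

pre = consumption.iloc[:-6]         # fit only on pre-shortage data
model = ARIMA(pre, order=(1, 1, 0)).fit()
forecast = model.forecast(steps=6)  # expected consumption had there been no shortage

shortfall = forecast.values - consumption.iloc[-6:].values
print(f"estimated shortage magnitude: {shortfall.sum():.1f} units over six months")
```

The same fitted model, extended with price terms, could yield the kind of short-run price elasticity estimates the abstract reports.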
Brown, Alison; Santilli, Mario; Scott, Belinda
2015-12-01
Governing bodies of health services need assurance that major risks to achieving the health service objectives are being controlled. Currently, the main assurance mechanisms generated within the organization are through the review of implementation of policies and procedures and review of clinical audits and quality data. The governing bodies of health services need more robust, objective data to inform their understanding of the control of clinical risks. Internal audit provides a methodological framework that provides independent and objective assurance to the governing body on the control of significant risks. The article describes the pilot of the internal audit methodology in an emergency unit in a health service. An internal auditor was partnered with a clinical expert to assess the application of clinical criteria based on best practice guidelines. The pilot of the internal audit of a clinical area was successful in identifying significant clinical risks that required further management. The application of an internal audit methodology to a clinical area is a promising mechanism to gain robust assurance at the governance level regarding the management of significant clinical risks. This approach needs further exploration and trial in a range of health care settings. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
Handling Emergency Management in [an] Object Oriented Modeling Environment
NASA Technical Reports Server (NTRS)
Tokgoz, Berna Eren; Cakir, Volkan; Gheorghe, Adrian V.
2010-01-01
It has been understood that protection of a nation from extreme disasters is a challenging task. Impacts of extreme disasters on a nation's critical infrastructures, economy and society could be devastating. A protection plan itself would not be sufficient when a disaster strikes. Hence, there is a need for a holistic approach to establish more resilient infrastructures to withstand extreme disasters. A resilient infrastructure can be defined as a system or facility that is able to withstand damage, but if affected, can be readily and cost-effectively restored. The key issue in establishing resilient infrastructures is to incorporate existing protection plans with comprehensive preparedness actions to respond, recover and restore as quickly as possible, and to minimize extreme disaster impacts. Although national organizations will respond to a disaster, extreme disasters need to be handled mostly by local emergency management departments. Since emergency management departments have to deal with complex systems, they have to have a manageable plan and efficient organizational structures to coordinate all these systems. A strong organizational structure is the key to responding fast before and during disasters, and recovering quickly after disasters. In this study, the entire emergency management function is viewed as an enterprise and modelled through an enterprise management approach. Managing an enterprise or a large complex system is a very challenging task. It is critical for an enterprise to respond to challenges in a timely manner with quick decision making. This study addresses the problem of handling emergency management at the regional level in an object-oriented modelling environment developed using the TopEase software. The Emergency Operation Plan of the City of Hampton, Virginia, has been incorporated into TopEase for analysis. The methodology used in this study has been supported by a case study on critical infrastructure resiliency in Hampton Roads.
Opportunities in SMR Emergency Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moe, Wayne L.
2014-10-01
Using year 2014 cost information gathered from twenty different locations within the current commercial nuclear power station fleet, an assessment was performed concerning compliance costs associated with the offsite emergency Planning Standards contained in 10 CFR 50.47(b). The study was conducted to quantitatively determine the potential cost benefits realized if an emergency planning zone (EPZ) were reduced in size according to the lowered risks expected to accompany small modular reactors (SMR). Licensees are required to provide a technical basis when proposing to reduce the surrounding EPZ size to less than the 10 mile plume exposure and 50 mile ingestion pathway distances currently being used. To assist licensees in assessing the savings that might be associated with such an action, this study established offsite emergency planning costs in connection with four discrete EPZ boundary distances, i.e., site boundary, 2 miles, 5 miles and 10 miles. The boundary selected by the licensee would be based on where EPA Protective Action Guidelines are no longer likely to be exceeded. Additional consideration was directed towards costs associated with reducing the 50 mile ingestion pathway EPZ. The assessment methodology consisted of gathering actual capital costs and annual operating and maintenance costs for offsite emergency planning programs at the surveyed sites, partitioning them according to key predictive factors, and allocating those portions to individual emergency Planning Standards as a function of EPZ size. Two techniques, an offsite population-based approach and an area-based approach, were then employed to calculate the scaling factors which enabled cost projections as a function of EPZ size. Site-specific factors that influenced source data costs, such as the effects of supplemental funding to external state and local agencies for offsite response organization activities, were incorporated into the analysis to the extent those factors could be representatively apportioned.
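To make the scaling idea concrete, a small sketch of a population-based scaling calculation is shown below. The cumulative population figures, baseline cost and the split between fixed and population-driven cost are invented assumptions, not the surveyed data or the report's actual scaling factors.

```python
# Invented inputs: cumulative resident population within each candidate EPZ radius
population_within = {"site boundary": 0, "2 miles": 4_000, "5 miles": 22_000, "10 miles": 90_000}
baseline_cost_10mi = 1_500_000     # assumed annual offsite EP cost for the 10-mile EPZ ($)

def scaled_cost(radius, fixed_fraction=0.3):
    """Population-based scaling: a fixed cost portion plus a portion proportional to population covered."""
    scale = population_within[radius] / population_within["10 miles"]
    return baseline_cost_10mi * (fixed_fraction + (1 - fixed_fraction) * scale)

for r in population_within:
    print(f"{r:>13}: ${scaled_cost(r):,.0f} per year")
```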
Ho, Jonathan Ka-Ming; Chau, Janita Pak-Chun; Cheung, Nancy Man-Ching
2016-11-01
The Ottawa Ankle Rules provide guidelines for clinicians on the recommendation of radiographic tests to verify fractures in patients with ankle injuries. The use of the Ottawa Ankle Rules by emergency nurses has been suggested to minimise unnecessary radiographic-test requests and reduce patients' length of stay in emergency departments. However, the findings of studies in this area are inconsistent. A systematic review was conducted to synthesise the most accurate evidence available on the extent to which emergency nurses' use of the Ottawa Ankle Rules to initiate radiographic tests improves healthcare outcomes for patients with ankle injuries. The systematic review attempted to identify all relevant published and unpublished studies in English and Chinese from databases such as Ovid MEDLINE, EMBASE, ProQuest Health and Medical Complete, EBM Reviews, SPORTDiscus, CINAHL Plus, the British Nursing Index, Scopus, the Chinese Biomedical Literature Database, China Journal Net, WanFang Data, the National Central Library Periodical Literature System, HyRead, the Digital Dissertation Consortium, MedNar and Google Scholar. Two reviewers independently assessed the eligibility of all of the studies identified during the search, based on their titles and abstracts. If a study met the criteria for inclusion, or inconclusive information was available in its title and abstract, the full text was retrieved for further analysis. The methodological quality of all of the eligible studies was assessed independently by the two reviewers. The search of databases and other sources yielded 1603 records. The eligibility of 17 full-text articles was assessed, and nine studies met the inclusion criteria. All nine studies were subjected to narrative analysis, and five were meta-analysed. All of the studies investigated the use of the refined Ottawa Ankle Rules. The results indicated that emergency nurses' use of the refined Ottawa Ankle Rules minimised unnecessary radiographic-test requests and reduced patients' length of stay in emergency departments. However, the use of these rules in urgent-care departments did not reduce unnecessary radiographic-test requests or patients' length of stay. The implementation of the refined Ottawa Ankle Rules by emergency nurses with different backgrounds, including nurse practitioners and general emergency nurses, was found to reduce patients' length of stay in emergency departments. The results of the systematic review suggested that a nurse-initiated radiographic test protocol should be introduced as standard practice in emergency departments. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wright, Mark Mba
There are significant technological and systemic challenges faced by today's advanced biofuel industry. These challenges stem from the current state-of-technology and from the system (consumer market, infrastructure, environment...) in which this emerging industry is being developed. The state-of-technology will improve with continued efforts in technology development, but novel approaches are required to investigate the systemic challenges that limit the adoption of advanced biofuels. The motivation of this dissertation is to address the question of how to find cost-effective, sustainable, and environmentally responsible pathways for the production of biofuels. Economic competitiveness, long-term viability, and benign environmental impact are key for biofuels to be embraced by industry, government, and consumers. Techno-economic, location, and carbon emission analysis are research methodologies that help address each of these issues. The research approach presented in this dissertation is to combine these three methodologies into a holistic study of advanced biofuel technologies. The value of techno-economic, location, and carbon emission analysis is limited when conducted in isolation because of current public perception towards energy technologies. Energy technologies are evaluated based on multiple criteria with a significant emphasis on the three areas investigated in this study. There are important aspects within each of these fields that could significantly limit the value of advances in other fields of study. Therefore, it is necessary that future research in advanced biofuels always consider the systemic challenges faced by novel developments.
Flame filtering and perimeter localization of wildfires using aerial thermal imagery
NASA Astrophysics Data System (ADS)
Valero, Mario M.; Verstockt, Steven; Rios, Oriol; Pastor, Elsa; Vandecasteele, Florian; Planas, Eulàlia
2017-05-01
Airborne thermal infrared (TIR) imaging systems are being increasingly used for wildfire tactical monitoring since they show important advantages over spaceborne platforms and visible sensors while becoming much more affordable and much lighter than multispectral cameras. However, the analysis of aerial TIR images entails a number of difficulties which have thus far prevented monitoring tasks from being totally automated. One of these issues that needs to be addressed is the appearance of flame projections during the geo-correction of off-nadir images. Filtering these flames is essential in order to accurately estimate the geographical location of the fuel burning interface. Therefore, we present a methodology which allows the automatic localisation of the active fire contour free of flame projections. The actively burning area is detected in TIR georeferenced images through a combination of intensity thresholding techniques, morphological processing and active contours. Subsequently, flame projections are filtered out by temporal frequency analysis of appropriate contour descriptors. The proposed algorithm was tested on footage acquired during three large-scale field experimental burns. Results suggest this methodology may be suitable for automating the acquisition of quantitative data about fire evolution. As future work, a revision of the low-pass filter implemented for the temporal analysis (currently a median filter) was recommended. The availability of up-to-date information about the fire state would improve situational awareness during an emergency response and may be used to calibrate data-driven simulators capable of issuing accurate short-term forecasts of the subsequent fire evolution.
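A simplified sketch of the image-processing chain described (intensity thresholding, morphological cleaning, external contour extraction, and a median filter over a per-frame contour descriptor) is shown below. It assumes OpenCV 4 and a stack of georeferenced TIR frames already loaded as 8-bit grayscale arrays, and it is not the authors' exact pipeline; Otsu thresholding and contour area are stand-ins for their specific choices.

```python
import cv2
import numpy as np
from scipy.signal import medfilt

def fire_contour(frame_8bit):
    """Segment the actively burning area in one TIR frame and return its largest contour."""
    _, mask = cv2.threshold(frame_8bit, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop small hot speckles
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def filtered_areas(frames):
    """Median-filter the per-frame contour area to suppress transient flame projections."""
    areas = []
    for frame in frames:
        contour = fire_contour(frame)
        areas.append(cv2.contourArea(contour) if contour is not None else 0.0)
    return medfilt(areas, kernel_size=5)

# usage (hypothetical paths):
# frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in sorted(glob.glob("tir/*.png"))]
# print(filtered_areas(frames))
```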
Measuring persistence: A literature review focusing on methodological issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, A.K.; Brown, M.A.; Trumble, D.
1995-03-01
This literature review was conducted as part of a larger project to produce a handbook on the measurement of persistence. The past decade has marked the development of the concept of persistence and a growing recognition that the long-term impacts of demand-side management (DSM) programs warrant careful assessment. Although increasing attention has been paid to the topic of persistence, no clear consensus has emerged either about its definition or about the methods most appropriate for its measurement and analysis. This project strives to fill that gap by reviewing the goals, terminology, and methods of past persistence studies. It was conducted from the perspective of a utility that seeks to acquire demand-side resources and is interested in their long-term durability; it was not conducted from the perspective of the individual consumer. Over 30 persistence studies, articles, and protocols were examined for this report. The review begins by discussing the underpinnings of persistence studies: namely, the definitions of persistence and the purposes of persistence studies. Then, it describes issues relevant to both the collection and analysis of data on the persistence of energy and demand savings. Findings from persistence studies also are summarized. Throughout the review, four studies are used repeatedly to illustrate different methodological and analytical approaches to persistence so that readers can track the data collection, data analysis, and findings of a set of comprehensive studies that represent alternative approaches.
Parker, R David; Regier, Michael; Brown, Zachary; Davis, Stephen
2015-02-01
Homelessness is a primary concern for community health. Scientific literature on homelessness is wide ranging and diverse. One opportunity to add to the existing literature is the development and testing of affordable, easily implemented methods for measuring the impact of homelessness on the healthcare system. Such methodological approaches rely on the strengths of a multidisciplinary approach, including healthcare and homeless-service providers and applied clinical researchers. This paper is a proof of concept for a methodology that is easily adaptable nationwide, given the mandated implementation of homeless management information systems in the United States and other countries, the medical billing systems used by hospitals, and established research methods. Adaptation is independent of geographic region, budget constraints, specific agency skill sets, and many other factors that impact the application of a consistent, methodological, science-based approach to assess and address homelessness. We conducted a secondary data analysis merging homeless-service utilization data with hospital case-based data. These data detailed care utilization among homeless persons in a small Appalachian city in the United States. In our sample of 269 persons who received at least one hospital-based service and one homeless service between July 1, 2012 and June 30, 2013, the total billed costs were $5,979,463, with 10 people accounting for more than one-third ($1,957,469) of the total. Those persons were primarily men, living in an emergency shelter, with pre-existing disabling conditions. We theorize that targeted services, including Housing First, would be an effective intervention. This is proposed in a future study.
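The merge-and-summarize step this proof of concept relies on can be sketched in a few lines. The column names and records below are hypothetical stand-ins for a homeless management information system extract and a hospital billing extract sharing a de-identified person identifier; they are not the study's data.

```python
import pandas as pd

# Hypothetical stand-ins for an HMIS extract and a hospital billing extract
hmis = pd.DataFrame({
    "person_id": [1, 2, 3, 4],
    "service_type": ["emergency shelter", "outreach", "emergency shelter", "transitional housing"],
})
billing = pd.DataFrame({
    "person_id": [1, 1, 2, 3, 3, 3, 5],
    "billed_cost": [1200.0, 450.0, 300.0, 8000.0, 2500.0, 700.0, 900.0],
})

# Keep only hospital encounters for people who also used a homeless service
merged = billing.merge(hmis.drop_duplicates("person_id"), on="person_id", how="inner")

costs = merged.groupby("person_id")["billed_cost"].sum().sort_values(ascending=False)
print(costs)
print(f"share of total billed cost from the single costliest person: {costs.iloc[0] / costs.sum():.0%}")
```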
Use of a trigger tool to detect adverse drug reactions in an emergency department.
de Almeida, Silvana Maria; Romualdo, Aruana; de Abreu Ferraresi, Andressa; Zelezoglo, Giovana Roberta; Marra, Alexandre R; Edmond, Michael B
2017-11-15
Although there are systems for reporting adverse drug reactions (ADR), these safety events remain underreported. The low-cost, low-tech trigger tool method is based on the detection of events through clues, and it appears to increase the detection of adverse events compared to traditional methodologies. This study seeks to estimate the prevalence of adverse reactions to drugs in patients seeking care in the emergency department. This was a retrospective study from January to December 2014, applying the Institute for Healthcare Improvement (IHI) trigger tool methodology to patients treated at the emergency room of a tertiary care hospital. The estimated prevalence of adverse reactions in patients presenting to the emergency department was 2.3% [95% CI 1.3% to 3.3%]; 28.6% of cases required hospitalization at an average cost of US$ 5698.44. The most common triggers were hydrocortisone (57% of the cases), diphenhydramine (14%) and fexofenadine (14%). Anti-infectives (19%), cardiovascular agents (14%), and musculoskeletal drugs (14%) were the most common causes of adverse reactions. According to the Naranjo Scale, 71% were classified as possible and 29% as probable. There was no association between adverse reactions and age or sex in the present study. Use of the trigger tool in the emergency department made it possible to identify an adverse reaction prevalence of 2.3%. It proved to be a viable method that can provide a better understanding of adverse drug reactions in this patient population.
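For reference, a point estimate and interval of this shape follow from a standard normal-approximation prevalence calculation, sketched here with an assumed denominator; the actual number of screened patients is not restated in this abstract, so the counts below are illustrative only.

```python
import math

cases, n = 20, 870          # assumed counts for illustration only
p = cases / n
se = math.sqrt(p * (1 - p) / n)                     # standard error of a proportion
print(f"prevalence {p:.1%} (95% CI {p - 1.96*se:.1%} to {p + 1.96*se:.1%})")
# with these assumed counts: prevalence 2.3% (95% CI 1.3% to 3.3%)
```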
Hammonds, Rachel; Ooms, Gorik
2014-02-27
The global response to HIV suggests the potential of an emergent global right to health norm, embracing shared global responsibility for health, to assist policy communities in framing the obligations of the domestic state and the international community. Our research explores the extent to which this global right to health norm has influenced the global policy process around maternal health rights, with a focus on universal access to emergency obstetric care. To examine the extent to which arguments stemming from a global right to health norm have been successful in advancing international policy on universal access to emergency obstetric care, we looked at the period from 1985 to 2013. We adopted a qualitative case study approach applying a process-tracing methodology using multiple data sources, including an extensive literature review and limited key informant interviews, to analyse the international policy agenda setting process surrounding maternal health rights, focusing on emergency obstetric care. We applied John Kingdon's public policy agenda setting streams model to analyse our data. Kingdon's model suggests that, to succeed as a mobilising norm, the right to health would need to help bring the problem, policy and political streams together, as it did with access to AIDS treatment. Our analysis suggests that despite a normative grounding in the right to health, prioritisation of the specific maternal health entitlements remains fragmented. Despite United Nations recognition of maternal mortality as a human rights issue, the relevant policy communities have not yet managed to shift the policy agenda to prioritise the global right to health norm of shared responsibility for realising access to emergency obstetric care. The experience of HIV advocates in pushing for global solutions based on right to health principles, including participation, solidarity and accountability, suggests potential avenues for utilising right to health based arguments to push for policy priority for universal access to emergency obstetric care in the post-2015 global agenda.
NASA Astrophysics Data System (ADS)
Amenda, Lisa; Pfurtscheller, Clemens
2013-04-01
Owing to increased settlement in hazardous areas and rising asset values, natural disasters such as floods, landslides and rockfalls cause high economic losses in Alpine lateral valleys. Especially in small municipalities, indirect losses, mainly stemming from a breakdown of transport networks, and emergency response costs can reach critical levels. A quantification of these losses is necessary to estimate the worthiness of mitigation measures, to determine the appropriate level of disaster assistance and to improve risk management strategies. There are comprehensive approaches available for assessing direct losses. However, indirect losses and emergency response costs are largely not assessed, and the empirical basis for estimating these costs is weak. To address the resulting uncertainties of project appraisals, a standardized methodology has been developed that addresses local economic effects and the emergency efforts needed. In our approach, the cost-benefit analysis for technical mitigation used by the Austrian Torrent and Avalanche Control (TAC) is optimized and extended using the 2005 debris flow that struck a small town in the upper Inn valley in southwest Tyrol (Austria) as a design event. In that event, 84 buildings were affected and 430 people were evacuated, and in response the TAC implemented protection measures costing 3.75 million euros. Upgrading the TAC method and analyzing to what extent the cost-benefit ratio changes is one of the main objectives of this study. To estimate short-run indirect effects and emergency response costs at the local level, data were collected via questionnaires, field mapping and guided interviews, as well as an intensive literature review. On this basis, updated calculation methods were developed and the TAC cost-benefit analysis was recalculated with the new results. The cost-benefit ratio becomes more precise and specific, and so does the decision on which mitigation alternative to carry out. Based on this, the worthiness of the mitigation measures can be determined in more detail and the proper level of emergency assistance can be calculated more adequately. Through this study, a better data basis is created for evaluating technical and non-technical mitigation measures, which is useful for government agencies, insurance companies and researchers.
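The extended cost-benefit ratio described here can be illustrated with a toy computation in which avoided indirect losses and emergency response costs are added to avoided direct losses. All figures below are invented and only show the structure of the comparison, not the study's values.

```python
# Invented annualized figures in euros
avoided_direct    = 180_000   # expected annual avoided building/content damage
avoided_indirect  = 40_000    # avoided losses from road closures, business interruption
avoided_emergency = 25_000    # avoided evacuation and emergency-response costs
annual_cost_of_mitigation = 150_000   # assumed annualized cost of the protection measures

bcr_direct_only = avoided_direct / annual_cost_of_mitigation
bcr_extended = (avoided_direct + avoided_indirect + avoided_emergency) / annual_cost_of_mitigation

print(f"benefit-cost ratio, direct losses only: {bcr_direct_only:.2f}")
print(f"benefit-cost ratio, incl. indirect and emergency costs: {bcr_extended:.2f}")
```

Including the indirect and emergency terms can move a measure across the break-even threshold of 1.0, which is why the authors argue these components change the appraisal outcome.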
Alexander, Paul E; Li, Shelly-Anne; Gionfriddo, Michael R; Stoltzfus, Rebecca J; Neumann, Ignacio; Brito, Juan P; Djulbegovic, Benjamin; Montori, Victor M; Schünemann, Holger J; Guyatt, Gordon H
2016-02-01
The World Health Organization (WHO) classifies a substantial proportion of their recommendations as strong despite low or very low confidence (certainty) in estimates of effect. Such discordant recommendations are often inconsistent with Grading of Recommendations Assessment, Development and Evaluation (GRADE) guidance. To gain the perspective of senior WHO methodology chairs regarding panels' use of GRADE, particularly regarding discordant recommendations. Senior active GRADE methodologists who had served on at least two WHO panels and were an author on at least one peer-reviewed published article on GRADE methodology. Five eligible methodologists participated in detailed semistructured interviews. Respondents answered questions regarding how they were viewed by other panelists and WHO leadership, and how they handled situations when panelists made discordant recommendations they felt were inappropriate. They also provided information on how the process can be improved. Interviews were recorded and transcribed, and inductive content analysis was used to derive codes, categories, and emergent themes. Three themes emerged from the interviews of five methodologists: (1) The perceived role of methodologists in the process, (2) Contributors to discordant recommendations, and (3) Strategies for improvement. Salient findings included (1) a perceived tension between methodologists and WHO panels as a result of panel members' resistance to adhering to GRADE guidance; (2) both financial and nonfinancial conflicts of interest among panel members as an explanation for discordant recommendations; and (3) the need for greater clarity of, and support for, the role of methodologists as co-chairs of panels. These findings suggest that the role of the GRADE methodologist as a co-chair needs to be clarified by the WHO leadership. They further suggest the need for additional training for panelists, quality monitoring, and feedback to ensure optimal use of GRADE in guideline development at WHO. Copyright © 2016 Elsevier Inc. All rights reserved.
Ward, Marie; McAuliffe, Eilish; Wakai, Abel; Geary, Una; Browne, John; Deasy, Conor; Schull, Michael; Boland, Fiona; McDaid, Fiona; Coughlan, Eoin; O'Sullivan, Ronan
2017-01-23
Early detection of patient deterioration is a key element of patient safety as it allows timely clinical intervention and potential rescue, thus reducing the risks of serious patient safety incidents. Longitudinal patient monitoring systems have been widely recommended for use to detect clinical deterioration. However, there is conflicting evidence on whether they improve patient outcomes. This may in part be related to variation in the rigour with which they are implemented and evaluated. This study aims to evaluate the implementation and effectiveness of a longitudinal patient monitoring system designed for adult patients in the unique environment of the Emergency Department (ED). A novel participatory action research (PAR) approach is taken where socio-technical systems (STS) theory and analysis informs the implementation through the improvement methodology of 'Plan Do Study Act' (PDSA) cycles. We hypothesise that conducting an STS analysis of the ED before beginning the PDSA cycles will provide for a much richer understanding of the current situation and possible challenges to implementing the ED-specific longitudinal patient monitoring system. This methodology will enable both a process and an outcome evaluation of implementing the ED-specific longitudinal patient monitoring system. Process evaluations can help distinguish between interventions that have inherent faults and those that are badly executed. Over 1.2 million patients attend EDs annually in Ireland; the successful implementation of an ED-specific longitudinal patient monitoring system has the potential to affect the care of a significant number of such patients. To the best of our knowledge, this is the first study combining PAR, STS and multiple PDSA cycles to evaluate the implementation of an ED-specific longitudinal patient monitoring system and to determine (through process and outcome evaluation) whether this system can significantly improve patient outcomes by early detection and appropriate intervention for patients at risk of clinical deterioration.
Cruz, Roberto de la; Guerrero, Pilar; Spill, Fabian; Alarcón, Tomás
2016-10-21
We propose a modelling framework to analyse the stochastic behaviour of heterogeneous, multi-scale cellular populations. We illustrate our methodology with a particular example in which we study a population with an oxygen-regulated proliferation rate. Our formulation is based on an age-dependent stochastic process. Cells within the population are characterised by their age (i.e. time elapsed since they were born). The age-dependent (oxygen-regulated) birth rate is given by a stochastic model of oxygen-dependent cell cycle progression. Once the birth rate is determined, we formulate an age-dependent birth-and-death process, which dictates the time evolution of the cell population. The population is under a feedback loop which controls its steady state size (carrying capacity): cells consume oxygen which in turn fuels cell proliferation. We show that our stochastic model of cell cycle progression allows for heterogeneity within the cell population induced by stochastic effects. Such heterogeneous behaviour is reflected in variations in the proliferation rate. Within this set-up, we have established three main results. First, we have shown that the age to the G1/S transition, which essentially determines the birth rate, exhibits a remarkably simple scaling behaviour. Besides the fact that this simple behaviour emerges from a rather complex model, this allows for a huge simplification of our numerical methodology. A further result is the observation that heterogeneous populations undergo an internal process of quasi-neutral competition. Finally, we investigated the effects of cell-cycle-phase dependent therapies (such as radiation therapy) on heterogeneous populations. In particular, we have studied the case in which the population contains a quiescent sub-population. Our mean-field analysis and numerical simulations confirm that, if the survival fraction of the therapy is too high, rescue of the quiescent population occurs. This gives rise to emergence of resistance to therapy since the rescued population is less sensitive to therapy. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
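A highly simplified sketch of an age-dependent birth-and-death simulation with an oxygen-mediated carrying-capacity feedback is given below. The rate constants, the form of the oxygen dependence, and the discrete-time scheme are assumptions for illustration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, steps = 0.1, 2000
death_rate = 0.02
oxygen_supply, consumption = 50.0, 1.0    # assumed feedback parameters

ages = list(np.zeros(50))                 # start with 50 newborn cells
for _ in range(steps):
    oxygen = oxygen_supply / (1 + consumption * len(ages))   # more cells -> less oxygen
    g1s_age = 2.0 / oxygen                # age of the G1/S transition lengthens as oxygen falls
    new_ages = []
    for a in ages:
        if rng.random() < death_rate * dt:
            continue                                      # cell dies
        if a > g1s_age and rng.random() < 0.1 * dt:
            new_ages += [0.0, 0.0]                        # division: two newborn daughters
        else:
            new_ages.append(a + dt)                       # cell simply ages
    ages = new_ages

print(f"population size after the simulated period: {len(ages)}")
```

The feedback loop is the one the abstract describes: the population consumes oxygen, lowered oxygen delays the G1/S transition and hence birth, and the population settles near a carrying capacity set by the balance of birth and death.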
Atun, Rifat A.; McKee, Martin; Drobniewski, Francis; Coker, Richard
2005-01-01
OBJECTIVE: To develop a methodology and an instrument that allow the simultaneous rapid and systematic examination of the broad public health context, the health care systems, and the features of disease-specific programmes. METHODS: Drawing on methodologies used for rapid situational assessments of vertical programmes for tackling communicable disease, we analysed programmes for the control of human immunodeficiency virus (HIV) and their health systems context in three regions in the Russian Federation. The analysis was conducted in three phases: first, analysis of published literature, documents and routine data from the regions; second, interviews with key informants; and third, further data collection and analysis. Synthesis of findings through exploration of emergent themes, with iteration, resulted in the identification of the key systems issues that influenced programme delivery. FINDINGS: We observed a complex political economy within which efforts to control HIV sit, an intricate legal environment, and a high degree of decentralization of financing and operational responsibility. Although each region displays some commonalities arising from the Soviet traditions of public health control, there are considerable variations in the epidemiological trajectories, cultural responses, the political environment, financing, organization and service delivery, and the extent of multisectoral work in response to HIV epidemics. CONCLUSION: Within a centralized, post-Soviet health system, centrally directed measures to enhance HIV control may have varying degrees of impact at the regional level. Although the central tenets of effective vertical HIV programmes may be present, local imperatives substantially influence their interpretation, operationalization and effectiveness. Systematic analysis of the context within which vertical programmes are embedded is necessary to enhance understanding of how the relevant policies are prioritized and translated to action. PMID:16283049
Amponsah, Isaac; Harrison, Kenneth W; Rizos, Dimitris C; Ziehl, Paul H
2008-01-01
There is a net emissions change when adopting new materials for use in civil infrastructure design. To evaluate the total net emissions change, one must consider changes in manufacture and associated life-cycle emissions, as well as changes in the quantity of material required. In addition, in principle one should also consider any differences in costs of the two designs because cost savings can be applied to other economic activities with associated environmental impacts. In this paper, a method is presented that combines these considerations to permit an evaluation of the net change in emissions when considering the adoption of emerging technologies/materials for civil infrastructure. The method factors in data on differences between a standard and new material for civil infrastructure, material requirements as specified in designs using both materials, and price information. The life-cycle assessment approach known as economic input-output life-cycle assessment (EIO-LCA) is utilized. A brief background on EIO-LCA is provided because its use is central to the method. The methodology is demonstrated with analysis of a switch from carbon steel to high-performance steel in military bridge design. The results are compared with a simplistic analysis that accounts for the weight reduction afforded by use of the high-performance steel but assuming no differences in manufacture.
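A toy version of the net-emissions comparison (not the paper's EIO-LCA coefficients) that combines the material substitution, the change in required tonnage, and a cost-difference rebound term is sketched below; every figure is invented for illustration.

```python
# Invented illustrative figures for a bridge design comparison
conventional = {"tonnes": 500.0, "emission_factor": 1.9, "cost_per_tonne": 800.0}   # t CO2e per t steel
high_perf    = {"tonnes": 420.0, "emission_factor": 2.1, "cost_per_tonne": 950.0}

# Direct manufacturing/life-cycle emissions of each design
e_conventional = conventional["tonnes"] * conventional["emission_factor"]
e_high_perf = high_perf["tonnes"] * high_perf["emission_factor"]

# Any cost savings are assumed to be re-spent elsewhere in the economy,
# with an assumed economy-wide emission intensity per dollar
cost_diff = (conventional["tonnes"] * conventional["cost_per_tonne"]
             - high_perf["tonnes"] * high_perf["cost_per_tonne"])
economy_intensity = 0.0004        # assumed t CO2e per dollar of general spending

net_change = (e_high_perf - e_conventional) + cost_diff * economy_intensity
print(f"net emissions change from switching: {net_change:+.1f} t CO2e")
```

In the paper the per-dollar emission intensities come from EIO-LCA sector data rather than a single assumed coefficient, but the accounting structure is the same: material manufacture, quantity required, and the fate of the cost difference all enter the net change.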
Fall prevention strategy in an emergency department.
Muray, Mwali; Bélanger, Charles H; Razmak, Jamil
2018-02-12
Purpose The purpose of this paper is to document the need for implementing a fall prevention strategy in an emergency department (ED). The paper also spells out the research process that led to approving an assessment tool for use in hospital outpatient services. Design/methodology/approach The fall risk assessment tool was based on the Morse Fall Scale. Gender mix and age above 65 and 80 years were assessed on six risk assessment variables using χ² analyses. A logistic regression analysis and model were used to test predictor strength and relationships among variables. Findings In total, 5,371 (56.5 percent) geriatric outpatients were deemed to be at fall risk during the study. Women had a higher incidence of falls in both the younger and older age categories. For patients above 80 years, being on medications exposed both genders to equal fall risk. The regression analysis explained 73-98 percent of the variance in the six-variable tool. Originality/value Canadian quality and safe healthcare accreditation standards require that hospital staff develop and adhere to fall prevention policies. Anticipated physiological falls can be prevented by healthcare interventions, particularly with older people known to bear higher risk factors. An aging population is increasing healthcare volumes and medical challenges. Precautionary measures for patients with a vulnerable cognitive and physical status are essential for quality care.
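As an illustration of the kind of modelling described, the sketch below fits a logistic regression of fall risk on six Morse-style assessment variables. The variable names and the simulated data are hypothetical placeholders, not the study's dataset.

```python
# Hedged sketch: logistic regression of fall risk on six Morse-style assessment
# variables. Variables and data are invented placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "history_of_falls":       rng.integers(0, 2, n),
    "secondary_diagnosis":    rng.integers(0, 2, n),
    "ambulatory_aid":         rng.integers(0, 2, n),
    "iv_therapy":             rng.integers(0, 2, n),
    "gait_impaired":          rng.integers(0, 2, n),
    "mental_status_impaired": rng.integers(0, 2, n),
})
# Simulated binary outcome for illustration only.
logit = -1.0 + 1.2 * df["history_of_falls"] + 0.8 * df["gait_impaired"]
df["at_risk"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df.drop(columns="at_risk"))
model = sm.Logit(df["at_risk"], X).fit(disp=False)
print(model.summary())
print("Odds ratios:\n", np.exp(model.params))   # predictor strength on the odds scale
```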
Lorenz, Alyson; Dhingra, Radhika; Chang, Howard H; Bisanzio, Donal; Liu, Yang; Remais, Justin V
2014-01-01
Extrapolating landscape regression models for use in assessing vector-borne disease risk and other applications requires thoughtful evaluation of fundamental model choice issues. To examine the implications of such choices, an analysis was conducted to explore the extent to which disparate landscape models agree in their epidemiological and entomological risk predictions when extrapolated to new regions. Agreement between six literature-drawn landscape models was examined by comparing predicted county-level distributions of either Lyme disease or the Ixodes scapularis vector using Spearman rank correlation. AUC analyses and multinomial logistic regression were used to assess the ability of these extrapolated landscape models to predict observed national data. Three models based on measures of vegetation, habitat patch characteristics, and herbaceous landcover emerged as effective predictors of observed disease and vector distribution. An ensemble model containing these three models improved precision and predictive ability over the individual models. A priori assessment of qualitative model characteristics effectively identified the models that subsequently emerged as better predictors in the quantitative analysis. Both a methodology for quantitative model comparison and a checklist for qualitative assessment of candidate models for extrapolation are provided; both tools aim to improve collaboration between those producing models and those interested in applying them to new areas and research questions.
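The sketch below illustrates the comparison workflow described: pairwise Spearman rank correlations between extrapolated model predictions, AUC against observed data, and a simple averaging ensemble. The three model names and all predictions are simulated placeholders, not the literature-drawn models themselves.

```python
# Hedged sketch: comparing county-level risk predictions from several landscape
# models, then forming a simple averaging ensemble. Data are simulated.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_counties = 300
true_risk = rng.random(n_counties)
observed = (true_risk > 0.6).astype(int)        # stand-in for observed presence data

# Three hypothetical models with varying predictive skill.
preds = {
    "vegetation": true_risk + rng.normal(0, 0.2, n_counties),
    "patch":      true_risk + rng.normal(0, 0.3, n_counties),
    "herbaceous": true_risk + rng.normal(0, 0.4, n_counties),
}

# Pairwise agreement between extrapolated models.
names = list(preds)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        rho, _ = spearmanr(preds[names[i]], preds[names[j]])
        print(f"Spearman rho {names[i]} vs {names[j]}: {rho:.2f}")

# Simple ensemble: average of the individual model predictions.
ensemble = np.mean(np.column_stack(list(preds.values())), axis=1)
for name, p in {**preds, "ensemble": ensemble}.items():
    print(f"AUC {name}: {roc_auc_score(observed, p):.2f}")
```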
Ciudad, Antonio; Gutiérrez, Miguel; Cañas, Fernando; Gibert, Juan; Gascón, Josep; Carrasco, José-Luis; Bobes, Julio; Gómez, Juan-Carlos; Alvarez, Enrique
2005-07-01
This study investigated the safety and effectiveness of olanzapine monotherapy compared with conventional antipsychotics in the treatment of acute inpatients with schizophrenia. This was a prospective, comparative, nonrandomized, open-label, multisite, observational study of Spanish inpatients with an acute episode of schizophrenia. Data included safety assessments with an extrapyramidal symptoms (EPS) questionnaire and the report of spontaneous adverse events, plus clinical assessments with the Brief Psychiatric Rating Scale (BPRS) and the Clinical Global Impressions-Severity of Illness (CGI-S) scale. A multivariate methodology was used to determine more adequately which factors can influence the safety and effectiveness of olanzapine monotherapy. In total, 339 patients treated with olanzapine monotherapy (OGm) and 385 patients treated with conventional antipsychotics (CG) were included in the analysis. Treatment-emergent EPS were significantly higher in the CG (p<0.0001). The response rate was significantly higher in the OGm (p=0.005). Logistic regression analyses revealed that the only variable significantly correlated with treatment-emergent EPS and clinical response was treatment strategy, with patients in the OGm having 1.5 times the probability of obtaining a clinical response and patients in the CG having 5 times the risk of developing EPS. In this naturalistic study olanzapine monotherapy was better tolerated than, and at least as effective as, conventional antipsychotics.
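For readers unfamiliar with how such group comparisons translate into the reported effect sizes, the sketch below computes odds ratios for response and for treatment-emergent EPS from 2×2 tables. The counts are invented for illustration and are not the study's data.

```python
# Hedged sketch: odds ratios for clinical response and for treatment-emergent
# EPS by treatment group, computed from 2x2 tables. Counts are invented.
import numpy as np
from scipy.stats import chi2_contingency

def odds_ratio(table):
    """table = [[exposed_event, exposed_no_event],
                [unexposed_event, unexposed_no_event]]"""
    (a, b), (c, d) = table
    return (a * d) / (b * c)

# Rows: olanzapine monotherapy (OGm) and conventional antipsychotics (CG);
# placeholder counts consistent only with the reported group sizes.
response = np.array([[220, 119],    # OGm: responders / non-responders
                     [210, 175]])   # CG:  responders / non-responders
eps = np.array([[200, 185],         # CG:  EPS / no EPS
                [ 60, 279]])        # OGm: EPS / no EPS

print("OR (response, OGm vs CG):", round(odds_ratio(response), 2))
print("OR (EPS, CG vs OGm):", round(odds_ratio(eps), 2))
chi2, p, _, _ = chi2_contingency(response)
print("Chi-square p-value for response:", round(p, 4))
```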
Introduction to the Special Issue on Advancing Methods for Analyzing Dialect Variation.
Clopper, Cynthia G
2017-07-01
Documenting and analyzing dialect variation is traditionally the domain of dialectology and sociolinguistics. However, modern approaches to acoustic analysis of dialect variation have their roots in Peterson and Barney's [(1952). J. Acoust. Soc. Am. 24, 175-184] foundational work on the acoustic analysis of vowels that was published in the Journal of the Acoustical Society of America (JASA) over 6 decades ago. Although Peterson and Barney (1952) were not primarily concerned with dialect variation, their methods laid the groundwork for the acoustic methods that are still used by scholars today to analyze vowel variation within and across languages. In more recent decades, a number of methodological advances in the study of vowel variation have been published in JASA, including work on acoustic vowel overlap and vowel normalization. The goal of this special issue was to honor that tradition by bringing together a set of papers describing the application of emerging acoustic, articulatory, and computational methods to the analysis of dialect variation in vowels and beyond.
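As one concrete example of the normalization methods alluded to, the sketch below applies Lobanov (speaker-intrinsic z-score) normalization to formant values, a common step before comparing vowels across speakers and dialects. The speakers, vowels and formant values are invented placeholders.

```python
# Hedged sketch: Lobanov (z-score by speaker) vowel normalization, one common
# method for cross-speaker vowel comparison. Formant values are placeholders (Hz).
import pandas as pd

data = pd.DataFrame({
    "speaker": ["s1", "s1", "s1", "s2", "s2", "s2"],
    "vowel":   ["i",  "a",  "u",  "i",  "a",  "u"],
    "F1":      [310,  750,  350,  380,  850,  420],
    "F2":      [2200, 1300, 900,  2500, 1450, 1000],
})

# Z-score F1 and F2 within each speaker so that cross-speaker (and cross-dialect)
# comparisons are not dominated by vocal-tract length differences.
for formant in ["F1", "F2"]:
    grouped = data.groupby("speaker")[formant]
    data[formant + "_lobanov"] = (data[formant] - grouped.transform("mean")) / grouped.transform("std")

print(data)
```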
Fassin, D
2003-09-01
In recent years, social capital has emerged in epidemiological studies as a new concept, improving our understanding of the relationships between social inequalities and health inequalities. This concept, borrowed from social sciences, has three distinct sociological sources. However, only the most recent theory, which emphasizes the role of civic trust and is useful for analysis at community level, has been used in epidemiological studies. Social capital poses three kinds of problem: i) theoretical problems, because it is defined by its effects rather than by its causes, and because it is presumed that these effects are positive, although they can in fact be negative; ii) methodological problems, because of the heterogeneity of empirical scales, from micro to macro, and because of the diversity of its semantic content, including contradictions; iii) political problems, because of the emphasis placed on individual responsibility and due to the imposition of a model of civic virtue, to the detriment of structural analysis.
Multivariate longitudinal data analysis with censored and intermittent missing responses.
Lin, Tsung-I; Lachos, Victor H; Wang, Wan-Lun
2018-05-08
The multivariate linear mixed model (MLMM) has emerged as an important analytical tool for longitudinal data with multiple outcomes. However, the analysis of multivariate longitudinal data can be complicated by the presence of censored measurements, arising from a detection limit of the assay, in combination with unavoidable missing values that occur when subjects intermittently miss some of their scheduled visits. This paper presents a generalization of the MLMM approach, called the MLMM-CM, for a joint analysis of multivariate longitudinal data with censored and intermittent missing responses. A computationally feasible expectation-maximization (EM) based procedure is developed to carry out maximum likelihood estimation within the MLMM-CM framework. Moreover, the asymptotic standard errors of the fixed effects are obtained explicitly via the information-based method. We illustrate the methodology using simulated data and a case study from an AIDS clinical trial. Experimental results reveal that the proposed method provides more satisfactory performance than the traditional MLMM approach.
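To give a flavour of how an EM-type procedure handles a detection limit, the following deliberately simplified, univariate sketch estimates a normal mean and variance from left-censored data by replacing censored observations with their conditional moments at each E-step. It is a stand-in illustration under simplified assumptions, not the MLMM-CM algorithm itself.

```python
# Hedged sketch: EM iterations for the mean and variance of a normal outcome
# when some measurements are left-censored at a detection limit.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
true_mu, true_sigma, limit = 2.0, 1.0, 1.5
x = rng.normal(true_mu, true_sigma, 400)
censored = x < limit                   # values below the assay's detection limit
y = np.where(censored, limit, x)       # censored values are recorded at the limit

mu, sigma2 = y.mean(), y.var()
for _ in range(200):
    sigma = np.sqrt(sigma2)
    beta = (limit - mu) / sigma
    lam = norm.pdf(beta) / norm.cdf(beta)                 # inverse Mills ratio for X <= limit
    # E-step: conditional moments of the censored observations (truncated normal).
    e_x = np.where(censored, mu - sigma * lam, y)
    var_trunc = sigma2 * (1 - beta * lam - lam**2)
    e_x2 = np.where(censored, var_trunc + (mu - sigma * lam) ** 2, y**2)
    # M-step: update parameters from the expected sufficient statistics.
    mu = e_x.mean()
    sigma2 = e_x2.mean() - mu**2

print(f"EM estimates: mu={mu:.2f}, sigma={np.sqrt(sigma2):.2f} (truth: {true_mu}, {true_sigma})")
```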
Evaluation of health information systems research in information systems research: A meta-analysis.
Haried, Peter; Claybaugh, Craig; Dai, Hua
2017-04-01
Given the importance of the health-care industry and the promise of health information systems, researchers are encouraged to build on the shoulders of giants, as the saying goes. The health information systems field has a unique opportunity to learn from and extend the work that has already been done by the closely related information systems field. As a result, this research article presents a past, present and future meta-analysis of health information systems research in information systems journals over the 2000-2015 time period. Our analysis reviewed 126 articles on a variety of topics related to health information systems research published in the "Senior Scholars" list of the top eight ranked information systems academic journals. Across the selected information systems academic journals, our findings compare the research methodologies applied, the health information systems topic areas investigated and the research trends. Interesting results emerge concerning the range and evolution of health information systems research, along with opportunities for health information systems researchers and practitioners to consider moving forward.
NASA Astrophysics Data System (ADS)
Forte, F.; Strobl, R. O.; Pennetta, L.
2006-07-01
The impact of calamitous meteorological events and their interaction with the geological and geomorphological environment represents a current problem of the Supersano-Ruffano-Nociglia Graben in southern Italy. Indeed, severe floods take place frequently, not only in autumn and winter but also in summer. These calamities are not only triggered by exceptional events, but are also amplified by the peculiar geological and morpho-structural characteristics of the Graben. Flooding often affects vast agricultural areas and, consequently, water-scooping machines cannot remove the rainwater. These events cause warnings and emergency states, involving people as well as socio-economic goods. This study applies an advanced technique for loss estimation and flood vulnerability analysis, integrating a geographic information system (GIS) with aerial photos and remote sensing methods. The analysis results clearly show that the Graben area is potentially at the greatest flood vulnerability, while flood vulnerability is lower along the Horsts.
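The sketch below shows a generic weighted-overlay vulnerability index of the kind such a GIS analysis might compute; the raster layers, weights and class breaks are invented placeholders rather than the study's actual data or workflow.

```python
# Hedged sketch: a simple weighted-overlay flood-vulnerability index over raster
# layers (e.g., elevation, slope, land use). Layers and weights are placeholders.
import numpy as np

rng = np.random.default_rng(3)
shape = (100, 100)                       # hypothetical raster grid

# Normalized factor layers in [0, 1], where 1 = more flood-prone.
low_elevation = rng.random(shape)        # e.g., an inverted, rescaled DEM
flat_terrain  = rng.random(shape)        # e.g., an inverted, rescaled slope raster
impermeable   = rng.random(shape)        # e.g., a land-use/permeability score

weights = {"low_elevation": 0.5, "flat_terrain": 0.3, "impermeable": 0.2}
vulnerability = (weights["low_elevation"] * low_elevation
                 + weights["flat_terrain"] * flat_terrain
                 + weights["impermeable"] * impermeable)

# Classify into three vulnerability classes for mapping.
classes = np.digitize(vulnerability, bins=[0.33, 0.66])   # 0=low, 1=medium, 2=high
print("Share of high-vulnerability cells:", (classes == 2).mean())
```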
NASA Technical Reports Server (NTRS)
Sandlin, Doral R.; Bauer, Brent Alan
1993-01-01
This paper discusses the development of a FORTRAN computer code to perform agility analysis on aircraft configurations. This code is to be part of the NASA-Ames ACSYNT (AirCraft SYNThesis) design code. The paper begins with a discussion of contemporary agility research in the aircraft industry and a survey of a few agility metrics. The methodology, techniques and models developed for the code are then presented. Finally, example trade studies using the agility module along with ACSYNT are illustrated. These trade studies were conducted using a Northrop F-20 Tigershark aircraft model. The studies show that the agility module is effective in analyzing the influence of common parameters such as thrust-to-weight ratio and wing loading on agility criteria. The module can compare the agility potential between different configurations. In addition, one study illustrates the module's ability to optimize a configuration's agility performance.
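As a hedged illustration of the kind of trade study described, the sketch below evaluates one conventional agility-related metric, sustained turn rate, as a function of thrust-to-weight ratio and wing loading. It is a generic textbook relation with placeholder aerodynamic constants, not the ACSYNT agility module or its metrics.

```python
# Hedged sketch: sustained turn rate as a simple agility-related metric driven by
# thrust-to-weight ratio and wing loading, assuming thrust = drag in a level turn.
# Aerodynamic constants are placeholders; structural and lift limits are ignored.
import math

def sustained_turn_rate_deg_s(tw, wing_loading_n_m2,
                              v=250.0, rho=1.225, cd0=0.02, k=0.1, g=9.81):
    """Sustained turn rate (deg/s) at speed v for a given T/W and W/S."""
    q = 0.5 * rho * v**2                                   # dynamic pressure (Pa)
    n_sq = (tw - q * cd0 / wing_loading_n_m2) * q / (k * wing_loading_n_m2)
    if n_sq <= 1.0:
        return 0.0                                         # cannot sustain a level turn
    return math.degrees(g * math.sqrt(n_sq - 1.0) / v)

# Mini trade study: vary thrust-to-weight ratio at two wing loadings (N/m^2).
for ws in (3000.0, 4500.0):
    rates = [round(sustained_turn_rate_deg_s(tw, ws), 1) for tw in (0.6, 0.8, 1.0)]
    print(f"W/S={ws:.0f} N/m^2 -> turn rate (deg/s) at T/W=0.6/0.8/1.0: {rates}")
```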
INTERIM ANALYSIS OF THE CONTRIBUTION OF HIGH-LEVEL EVIDENCE FOR DENGUE VECTOR CONTROL.
Horstick, Olaf; Ranzinger, Silvia Runge
2015-01-01
This interim analysis reviews the available systematic literature on dengue vector control at three levels: 1) single and combined vector control methods, with existing work on peridomestic space spraying and on Bacillus thuringiensis israelensis, and further work on the use of temephos, copepods and larvivorous fish becoming available soon; 2) vector control for a specific purpose, such as outbreak control; and 3) the strategic level, for example decentralization versus centralization, with a systematic review on vector control organization. Clear best-practice guidelines for the methodology of entomological studies are needed, and there is also a need to include data on dengue transmission. The following recommendations emerge: although vector control can be effective, implementation remains an issue; single interventions are probably not useful; combinations of interventions have mixed results; careful implementation of vector control measures may be most important; and outbreak interventions are often applied with questionable effectiveness.
Identity: a complex structure for researching students' academic behavior in science and mathematics
NASA Astrophysics Data System (ADS)
Aydeniz, Mehmet; Hodge, Lynn Liao
2011-06-01
This article is a response to Pike and Dunne's research. The focus of their analysis is on reflections on studying science post-16. Pike and Dunne draw attention to under-enrollment in science, technology, engineering, and mathematics (STEM) fields in the United Kingdom, in particular in physics, chemistry and biology. We provide an analysis of how the authors conceptualize the problem of scientific career choices, the theoretical framework through which they study the problem, and the methodology they use to collect and analyze data. In addition, we examine the perspective they provide in light of new developments in the field of students' attitudes towards science and mathematics. More precisely, we draw attention to and explicate the authors' use of identity from the perspective of emerging theories that explore the relationships between the learner and culture in the context of science and mathematics.