Opinion: Clarifying Two Controversies about Information Mapping's Method.
ERIC Educational Resources Information Center
Horn, Robert E.
1992-01-01
Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information, and explains the three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…
Information security system quality assessment through the intelligent tools
NASA Astrophysics Data System (ADS)
Trapeznikov, E. V.
2018-04-01
The development of technology has shown the necessity of comprehensive analysis of automated system information security. Analysis of the subject area indicates the relevance of the study. The research objective is to develop a methodology for assessing information security system quality based on intelligent tools. The basis of the methodology is a model that assesses information security in the information system through a neural network. The paper presents the security assessment model and its algorithm. The practical implementation of the methodology is represented in the form of a software flow diagram. The practical significance of the model being developed is noted in the conclusions.
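The abstract does not specify the network architecture, so the following is only a minimal sketch of the idea, assuming a small feedforward regressor that maps invented audit metrics to a quality score; the feature names and training targets are placeholders, not the paper's data.

```python
# Minimal sketch (assumption: a feedforward network maps security metrics
# to a quality score; the metrics and scores below are invented).
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data: rows are audited systems, columns are
# security metrics (e.g., patch latency, auth strength, logging coverage).
X = np.random.rand(200, 3)
# Hypothetical expert-assigned quality scores in [0, 1].
y = X @ np.array([0.5, 0.3, 0.2]) + 0.05 * np.random.randn(200)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X, y)
print(model.predict(X[:3]))  # predicted security-quality scores
```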
Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey
2014-01-01
We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understanding how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis," a term and…
Chase: Control of Heterogeneous Autonomous Sensors for Situational Awareness
2016-08-03
remained the discovery and analysis of new foundational methodology for information collection and fusion that exercises rigorous feedback control over...simultaneously achieve quantified information and physical objectives. In the general area of novel stochastic systems analysis, it seems appropriate to mention the pioneering work on non-Bayesian distributed learning
The methodology of semantic analysis for extracting physical effects
NASA Astrophysics Data System (ADS)
Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.
2017-01-01
The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. In this paper, semantic patterns are described to extract structural physical information in the form of physical effects. A new algorithm of text analysis is also described.
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information from traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as they change throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel to the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and required to use the TDS methodology to assess the dominant sensations for each sample as they changed over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's constraint of ignoring all attributes other than the dominant ones, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating. The QDA™ results, however, provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products. However, this methodology should not be employed as a replacement for traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Joseph Daniel; Anderson, Robert Stephen
Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary's capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator's knowledge. These threats can bypass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.
NASA Astrophysics Data System (ADS)
Tene, Yair; Tene, Noam; Tene, G.
1993-08-01
An interactive data fusion methodology of video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of a heavy transportable bridge collapse during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks, and rupture of high-performance structural materials. Videotape recording by a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.
Depressed Mothers as Informants on Child Behavior: Methodological Issues
Ordway, Monica Roosa
2011-01-01
Mothers with depressive symptoms more frequently report behavioral problems among their children than non-depressed mothers, leading to a debate regarding the accuracy of depressed mothers as informants on their children's behavior. The purpose of this integrative review was to identify methodological challenges in research related to the debate. Data were extracted from 43 papers (6 theoretical, 36 research reports, and 1 instrument scoring manual). The analysis focused on the methodologies considered when using depressed mothers as informants. Nine key themes were identified, and I concluded that researchers should incorporate multiple informants, identify the characteristics of maternal depression, and incorporate advanced statistical methodology. The use of a conceptual framework to understand informant discrepancies within child behavior evaluations is suggested for future research. PMID:21964958
A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak
2015-07-01
This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) systematic review methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
1981-02-01
Technical Report 528, COST AND INFORMATION EFFECTIVENESS ANALYSIS (CIEA): A METHODOLOGY FOR EVALUATING A TRAINING DEVICE OPERATIONAL READINESS ASSESSMENT. Within a military setting, the uses of training devices in performance evaluation have generally mirrored civilian uses and primarily...
An Approach for Implementation of Project Management Information Systems
NASA Astrophysics Data System (ADS)
Běrziša, Solvita; Grabis, Jānis
Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and process and existing XML-based representations of project management. A demonstration example of project management information system's configuration is provided.
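The chapter describes automatic configuration of a project management information system from an XML-based methodology specification; the sketch below illustrates the transformation idea only, and the schema (`<methodology>`/`<phase>`) and the configuration mapping are invented assumptions, not the authors' framework.

```python
# Sketch under assumptions: an XML methodology spec is transformed into a
# PMIS configuration; the schema and mapping here are invented.
import xml.etree.ElementTree as ET

spec = """
<methodology name="PRINCE2-like">
  <phase id="initiation"><artifact>project_brief</artifact></phase>
  <phase id="execution"><artifact>status_report</artifact></phase>
</methodology>
"""

def configure_pmis(xml_spec: str) -> dict:
    """Transform a methodology specification into a PMIS configuration."""
    root = ET.fromstring(xml_spec)
    return {
        "workflow": [p.get("id") for p in root.findall("phase")],
        "templates": [a.text for a in root.iter("artifact")],
    }

print(configure_pmis(spec))
```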
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large scale software projects includes synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provide for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
ERIC Educational Resources Information Center
Evan-Wong, Sue; de Freitas, Claudette
1995-01-01
Presents a methodology for marketing an information service which focuses on including information users in the strategic marketing planning process. Identifies the following stages of a marketing planning process: analysis of the environment, information audit, information needs assessment, market opportunity analysis, tactical marketing program,…
Methodologies for the Statistical Analysis of Memory Response to Radiation
NASA Astrophysics Data System (ADS)
Bosser, Alexandre L.; Gupta, Viyas; Tsiligiannis, Georgios; Frost, Christopher D.; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigné, Frédéric; Virtanen, Ari; Wrobel, Frédéric; Dilillo, Luigi
2016-08-01
Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... methodological studies conducted during the Vanguard phase will inform the implementation and analysis plan for... Research Methodology Studies for the National Children's Study SUMMARY: In compliance with the requirement... Collection: Title: Environmental Science Formative Research Methodology Studies for the National Children's...
On the Application of Syntactic Methodologies in Automatic Text Analysis.
ERIC Educational Resources Information Center
Salton, Gerard; And Others
1990-01-01
Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…
ERIC Educational Resources Information Center
Wright, Gerald P.
2013-01-01
Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…
Unmanned Tactical Autonomous Control and Collaboration Situation Awareness
2017-06-01
methodology framework using interdependence analysis (IA) tables for informing design requirements based on SA requirements. Future research should seek...requirements of UTACC. The authors then apply SA principles to Coactive Design in order to inform robotic design. The result is a methodology framework using...
GetReal in network meta-analysis: a review of the methodology.
Efthimiou, Orestis; Debray, Thomas P A; van Valkenhoef, Gert; Trelle, Sven; Panayidou, Klea; Moons, Karel G M; Reitsma, Johannes B; Shang, Aijing; Salanti, Georgia
2016-09-01
Pairwise meta-analysis is an established statistical tool for synthesizing evidence from multiple trials, but it is informative only about the relative efficacy of two specific interventions. The usefulness of pairwise meta-analysis is thus limited in real-life medical practice, where many competing interventions may be available for a certain condition and studies informing some of the pairwise comparisons may be lacking. This commonly encountered scenario has led to the development of network meta-analysis (NMA). In the last decade, several applications, methodological developments, and empirical studies in NMA have been published, and the area is thriving as its relevance to public health is increasingly recognized. This article presents a review of the relevant literature on NMA methodology aiming to pinpoint the developments that have appeared in the field. Copyright © 2016 John Wiley & Sons, Ltd.
The RAAF Logistics Study. Volume 4,
1986-10-01
Use of Issue-Based Root Definitions; Application of Soft Systems Methodology to Information Systems Analysis; Conclusion; List of Abbreviations...'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The soft systems methodology was developed to tackle...although the soft systems methodology has many advantages which recommend it to this type of study area, it does not model the time evolution of a system
Methodology for nonwork travel analysis in suburban communities.
DOT National Transportation Integrated Search
1994-01-01
The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...
Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F
2014-11-27
Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information gathered in this study to adapt an existing framework for impact of clinical research for use in methodological research. Gathering evidence on research impact of methodological research from a variety of sources has enabled us to obtain multiple indicators and thus to demonstrate broad impacts of methodological research. The adapted framework developed can be applied to future methodological research and thus provides a tool for methodologists to better assess and report research impacts.
A methodology to estimate uncertainty for emission projections through sensitivity analysis.
Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación
2015-04-01
Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, concretely over the 12 highest-emitting sectors regarding greenhouse gas and air pollutant emissions. Examples of methodology application for two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and official figures for 2010, showing very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provides useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
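As a rough illustration of deriving nonstatistical uncertainty bands from sensitivity analysis: perturb the driving factors of a projection within plausible bounds and take the envelope of the resulting trajectories. The growth rates and baseline below are invented placeholders, not the paper's data.

```python
# Illustrative sketch only: driving factors and elasticities are placeholders.
import numpy as np

years = np.arange(2015, 2021)
base_emissions = 100.0 * 1.02 ** (years - years[0])  # baseline projection

# Perturb a driving factor (e.g., activity growth) within plausible bounds
# and recompute the projection; the envelope forms the uncertainty band.
growth_variants = [0.99, 1.00, 1.02, 1.03, 1.04]
scenarios = np.array([100.0 * g ** (years - years[0]) for g in growth_variants])

lower, upper = scenarios.min(axis=0), scenarios.max(axis=0)
for y, lo, hi in zip(years, lower, upper):
    print(f"{y}: [{lo:.1f}, {hi:.1f}]")
```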
ERIC Educational Resources Information Center
Mason, Robert M.; And Others
This document presents a research effort intended to improve the economic information available for formulating policies and making decisions related to Information Analysis Centers (IACs) and IAC services. The project used a system of IAC information activities to analyze the functional aspects of IAC services, calculate the present value of net…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-20
..., Risk Management and Analysis (RAM) ACTION: Notice of request for public comments. SUMMARY: The... of 1995. Title of Information Collection: Risk Analysis and Management. OMB Control Number: None.... Methodology: The State Department, is implementing a Risk Analysis and Management Program to vet potential...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bri Rolston
2005-06-01
Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist and are easily accessible to anyone with access to an Internet connection, minimal technical skills, and a significantly reduced motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, within the black hat and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible use of multiple objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
Peláez-Coca, M. D.; Orini, M.; Lázaro, J.; Bailón, R.; Gil, E.
2013-01-01
A methodology that combines information from several nonstationary biological signals is presented. This methodology is based on time-frequency coherence, that quantifies the similarity of two signals in the time-frequency domain. A cross time-frequency analysis method, based on quadratic time-frequency distribution, has been used for combining information of several nonstationary biomedical signals. In order to evaluate this methodology, the respiratory rate from the photoplethysmographic (PPG) signal is estimated. The respiration provokes simultaneous changes in the pulse interval, amplitude, and width of the PPG signal. This suggests that the combination of information from these sources will improve the accuracy of the estimation of the respiratory rate. Another target of this paper is to implement an algorithm which provides a robust estimation. Therefore, respiratory rate was estimated only in those intervals where the features extracted from the PPG signals are linearly coupled. In 38 spontaneous breathing subjects, among which 7 were characterized by a respiratory rate lower than 0.15 Hz, this methodology provided accurate estimates, with the median error {0.00; 0.98} mHz ({0.00; 0.31}%) and the interquartile range error {4.88; 6.59} mHz ({1.60; 1.92}%). The estimation error of the presented methodology was largely lower than the estimation error obtained without combining different PPG features related to respiration. PMID:24363777
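The paper uses quadratic time-frequency distributions; the sketch below substitutes ordinary spectral coherence as a much simpler stand-in for the coupling criterion, applied to synthetic pulse-amplitude and pulse-width series. The signals, sampling rate, and 0.5 coherence threshold are all invented for illustration.

```python
# Simplified stand-in for the paper's cross time-frequency method: spectral
# coherence between two synthetic PPG-derived series, estimating respiratory
# rate only where the series are coupled.
import numpy as np
from scipy.signal import coherence, welch

fs, resp_hz = 4.0, 0.25          # 4 Hz feature sampling, 0.25 Hz respiration
t = np.arange(0, 300, 1 / fs)
resp = np.sin(2 * np.pi * resp_hz * t)
amp = resp + 0.5 * np.random.randn(t.size)    # pulse-amplitude series
width = resp + 0.5 * np.random.randn(t.size)  # pulse-width series

f, cxy = coherence(amp, width, fs=fs, nperseg=256)
fw, pxx = welch(amp, fs=fs, nperseg=256)
coupled = cxy > 0.5                            # keep only coupled bands
est = fw[coupled][np.argmax(pxx[coupled])]
print(f"estimated respiratory rate: {est:.3f} Hz")
```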
Cognitive Task Analysis of Business Jet Pilots' Weather Flying Behaviors: Preliminary Results
NASA Technical Reports Server (NTRS)
Latorella, Kara; Pliske, Rebecca; Hutton, Robert; Chrenka, Jason
2001-01-01
This report presents preliminary findings from a cognitive task analysis (CTA) of business aviation piloting. Results describe challenging weather-related aviation decisions and the information and cues used to support these decisions. Further, these results demonstrate the role of expertise in business aviation decision-making in weather flying, and how weather information is acquired and assessed for reliability. The challenging weather scenarios and novice errors identified in the results provide the basis for experimental scenarios and dependent measures to be used in future flight simulation evaluations of candidate aviation weather information systems. Finally, we analyzed these preliminary results to recommend design and training interventions to improve business aviation decision-making with weather information. The primary objective of this report is to present these preliminary findings and to document the extended CTA methodology used to elicit and represent expert business aviator decision-making with weather information. These preliminary findings will be augmented with results from additional subjects using this methodology. A summary of the complete results, absent the detailed treatment of methodology provided in this report, will be documented in a separate publication.
Landslide hazard analysis for pipelines: The case of the Simonette river crossing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grivas, D.A.; Schultz, B.C.; O`Neil, G.
1995-12-31
The overall objective of this study is to develop a probabilistic methodology to analyze landslide hazards and their effects on the safety of buried pipelines. The methodology incorporates a range of models that can accommodate differences in the ground movement modes and the amount and type of information available at various site locations. Two movement modes are considered, namely (a) instantaneous (catastrophic) slides, and (b) gradual ground movement, which may result in cumulative displacements over the pipeline design life (30-40 years) that are in excess of allowable values. Probabilistic analysis is applied in each case to address the uncertainties associated with important factors that control slope stability. Availability of information ranges from relatively well-studied, instrumented installations to cases where data is limited to what can be derived from topographic and geologic maps. The methodology distinguishes between procedures applied where there is little information and those that can be used when relatively extensive data is available. Important aspects of the methodology are illustrated in a case study involving a pipeline located in Northern Alberta, Canada, in the Simonette river valley.
ERIC Educational Resources Information Center
Batallan, Graciela; Dente, Liliana; Ritta, Loreley
2017-01-01
This article aims to open up a debate on methodological aspects of ethnographic research, arguing for the legitimacy of the information produced in a research "taller" or workshop using a participatory methodology and video production as a methodological tool. Based on the theoretical foundations and analysis of a "taller"…
RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh
This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.
NASA Technical Reports Server (NTRS)
Vigeant-Langlois, Laurence; Hansman, R. John, Jr.
2003-01-01
The objective of this project was to propose means to improve aviation weather information and training procedures based on a human-centered systems approach. Methodology: cognitive analysis of pilots' tasks; trajectory-based approach to weather information; contingency planning support; and implications for improving weather information.
NASA Astrophysics Data System (ADS)
Vugteveen, Pim; van Katwijk, Marieke M.; Rouwette, Etiënne; Hanssen, Lucien
2014-02-01
Integrated Coastal Management cannot operate effectively without reliable information and knowledge on changes in the environment and on the causes of those changes. Monitoring is essential to provide data needed for a real understanding of socio-economic and ecological functioning in multi-user nature areas. We present a web-based and comprehensive assessment methodology to articulate, structure and prioritize information needs and ensuing monitoring needs. We applied this methodology in the Dutch Wadden Sea Region, which includes a designated UNESCO World Heritage nature reserve. The methodology consists of the following steps: i) exploring social-ecological issues of concern and defining the monitoring scope; ii) articulating information needs expressed as tractable questions; iii) elaborating monitoring needs; iv) grounding in scientific models and current monitoring; v) synthesizing assessment findings into target entities, i.e. analysis variables for monitoring. In this paper we focus on the first three steps. As part of our methodology we performed two online surveys amongst a broad range of stakeholders and amongst monitoring professionals. In the case of the Dutch Wadden Sea Region, main monitoring questions were related to biodiversity and food web relations; effects of fisheries and its pressures on the ecosystem; channel and port dredging; spatial planning and multifunctional use; sustainable energy production; and effects of changing storm regimes due to climate change. Subsequently we elaborated these general issues into analysis variables within five themes. The presented methodology enables large scale and unbiased involvement of stakeholders in articulating information needs in a multi-user nature reserve like the Wadden Sea. In addition the methodology facilitates the input and feedback of monitoring professionals by providing a detailed elaboration of monitoring needs.
Igras, Susan; Diakité, Mariam; Lundgren, Rebecka
2017-07-01
In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.
Protocol Analysis: A Methodology for Exploring the Information Processing of Gifted Students.
ERIC Educational Resources Information Center
Anderson, Margaret A.
1986-01-01
Protocol analysis techniques, in which subjects are taught to think aloud, can provide information on the mental operations used by gifted learners. Concerns over the use of such data are described and new directions for the technique are proposed. (CL)
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
A general methodology for population analysis
NASA Astrophysics Data System (ADS)
Lazov, Petar; Lazov, Igor
2014-12-01
For a given population with N - current and M - maximum number of entities, modeled by a Birth-Death Process (BDP) with size M+1, we introduce a utilization parameter ρ, the ratio of the primary birth and death rates in that BDP, which, physically, determines the (equilibrium) macrostates of the population, and an information parameter ν, which has an interpretation as population information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. In presence of these two key metrics, applying the continuity law, equilibrium balance equations concerning the probability distribution pn, n=0,1,…,M, of the quantity N, pn=Prob{N=n}, in equilibrium, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; thereto, by definition, population entropy is the uncertainty related to the population. In this approach, and this is its essential contribution, the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic, or inelastic regime. In an information linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology. Namely, if one supposes a population of infinite size, most of the key quantities and results for populations with finite size that emerge in this methodology vanish.
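For reference, the standard birth-death equilibrium relations consistent with this setup are sketched below; the constant-ratio form is an assumption, since the abstract defines ρ only as the ratio of the primary birth and death rates.

```latex
% Standard BDP equilibrium; the constant-ratio case is an assumption.
\begin{align}
\lambda_{n-1}\,p_{n-1} &= \mu_n\,p_n, \qquad n = 1,\dots,M
  && \text{(equilibrium balance)} \\
p_n &= p_0 \prod_{k=1}^{n} \frac{\lambda_{k-1}}{\mu_k}
     = p_0\,\rho^{\,n} \quad \text{if } \lambda_{k-1}/\mu_k \equiv \rho \\
p_0 &= \Big(\sum_{n=0}^{M} \rho^{\,n}\Big)^{-1}, \qquad
H = -\sum_{n=0}^{M} p_n \ln p_n
  && \text{(entropy = mean information)}
\end{align}
```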
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
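A hedged sketch of the PFA concept follows: propagate parameter uncertainties through an engineering failure model by Monte Carlo to estimate a failure probability. The lognormal priors and the Basquin-style fatigue-life model are illustrative assumptions, not the documented NASA models.

```python
# Hedged sketch of PFA-style Monte Carlo: invented priors and a simple
# fatigue-life model stand in for the documented engineering analyses.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Uncertain inputs: stress amplitude and material fatigue coefficient.
stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa
coeff = rng.lognormal(mean=np.log(1e12), sigma=0.30, size=n)

cycles_to_failure = coeff * stress ** -3.0   # simple fatigue-life model
mission_cycles = 2e4
p_fail = np.mean(cycles_to_failure < mission_cycles)
print(f"estimated failure probability: {p_fail:.2e}")
```

In the full methodology, such distributions would then be updated with test and flight experience via the PFA statistical procedures.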
LSU: The Library Space Utilization Methodology.
ERIC Educational Resources Information Center
Hall, Richard B.
A computerized research technique for measuring the space utilization of public library facilities provides a behavioral activity and occupancy analysis for library planning purposes. The library space utilization (LSU) methodology demonstrates that significant information about the functional requirements of a library can be measured and…
Heymann, Michael; Degani, Asaf
2007-04-01
We present a formal approach and methodology for the analysis and generation of user interfaces, with special emphasis on human-automation interaction. A conceptual approach for modeling, analyzing, and verifying the information content of user interfaces is discussed. The proposed methodology is based on two criteria: First, the interface must be correct--that is, given the interface indications and all related information (user manuals, training material, etc.), the user must be able to successfully perform the specified tasks. Second, the interface and related information must be succinct--that is, the amount of information (mode indications, mode buttons, parameter settings, etc.) presented to the user must be reduced (abstracted) to the minimum necessary. A step-by-step procedure for generating the information content of the interface that is both correct and succinct is presented and then explained and illustrated via two examples. Every user interface is an abstract description of the underlying system. The correspondence between the abstracted information presented to the user and the underlying behavior of a given machine can be analyzed and addressed formally. The procedure for generating the information content of user interfaces can be automated, and a software tool for its implementation has been developed. Potential application areas include adaptive interface systems and customized/personalized interfaces.
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2010 CFR
2010-07-01
... contain information on water availability and alternative water sources, including the suitability of...) flooding or streamflow alteration; (D) ground water and surface water availability; and (E) other... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet...
Raut, Savita V; Yadav, Dinkar M
2018-03-28
This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual information-based voxel selection framework. Previously, fMRI signal analysis has been conducted using an empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former methodology loses frequency content, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology, in which the frequency component is considered by decomposing the raw fMRI signal using the geometric mean rather than the arithmetic mean, and voxels are selected from the EMCD signal using GMCD components rather than the raw fMRI signal. The proposed methodologies are adopted for predicting the neural response. Experiments are conducted on the openly available fMRI data of six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effect of the degree of selected voxels and the selection constraints is analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.
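A sketch of the voxel-selection step alone (the GMCD decomposition itself is not shown): rank voxels by mutual information with a stimulus regressor. The data, the stimulus construction, and the top-20 cutoff are invented for illustration.

```python
# Sketch of mutual-information voxel selection on synthetic data; the
# decomposition step described in the paper is omitted here.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
n_scans, n_voxels = 120, 500
X = rng.standard_normal((n_scans, n_voxels))   # voxel time courses
stimulus = X[:, 7] * 0.8 + rng.standard_normal(n_scans) * 0.5  # hypothetical

mi = mutual_info_regression(X, stimulus, random_state=1)
top_voxels = np.argsort(mi)[::-1][:20]          # 20 most informative voxels
print(top_voxels[:5])
```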
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
Analysis of methods. [information systems evolution environment
NASA Technical Reports Server (NTRS)
Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.
1991-01-01
Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.
ERIC Educational Resources Information Center
Faggella-Luby, Michael; Lombardi, Allison; Lalor, Adam R.; Dukes, Lyman, III
2014-01-01
In order to assess the status of the research base that informs "what works" for students with disabilities in higher education, it is necessary to conduct an examination of the methodologies used in the literature. The authors of the current study analyzed the methodological trends across the thirty-year lifespan of the "Journal of…
Evaluation of the Field Test of Project Information Packages: Volume III--Resource Cost Analysis.
ERIC Educational Resources Information Center
Al-Salam, Nabeel; And Others
The third of three volumes evaluating the first year field test of the Project Information Packages (PIPs) provides a cost analysis study as a key element in the total evaluation. The resource approach to cost analysis is explained and the specific resource methodology used in the main cost analysis of the 19 PIP field-test projects detailed. The…
Execution of a self-directed risk assessment methodology to address HIPAA data security requirements
NASA Astrophysics Data System (ADS)
Coleman, Johnathan
2003-05-01
This paper analyzes the method and training of a self-directed risk assessment methodology entitled OCTAVE (Operationally Critical Threat, Asset, and Vulnerability Evaluation) at over 170 DOD medical treatment facilities. It focuses specifically on how OCTAVE built interdisciplinary, inter-hierarchical consensus and enhanced local capabilities to perform Health Information Assurance. The Risk Assessment Methodology was developed by the Software Engineering Institute at Carnegie Mellon University as part of the Defense Health Information Assurance Program (DHIAP). The basis for its success is the combination of analysis of organizational practices and technological vulnerabilities. Together, these areas address the core implications behind the HIPAA Security Rule and can be used to develop Organizational Protection Strategies and Technological Mitigation Plans. A key component of OCTAVE is the interdisciplinary composition of the analysis team (patient administration, IT staff, and clinicians). It is this unique composition of analysis team members, along with organizational and technical analysis of business practices, assets, and threats, that enables facilities to create sound and effective security policies. The Risk Assessment is conducted in-house, and therefore the process, results, and knowledge remain within the organization, helping to build consensus in an environment of differing organizational and disciplinary perspectives on Health Information Assurance.
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Hydrologic information. 780.21 Section 780.21... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet... information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov...
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 3 2012-07-01 2012-07-01 false Hydrologic information. 780.21 Section 780.21... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet... information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov...
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
Treves-Kagan, Sarah; Naidoo, Evasen; Gilvydis, Jennifer M; Raphela, Elsie; Barnhart, Scott; Lippman, Sheri A
2017-09-01
Successful HIV prevention programming requires engaging communities in the planning process and responding to the social environmental factors that shape health and behaviour in a specific local context. We conducted two community-based situational analyses to inform a large, comprehensive HIV prevention programme in two rural districts of North West Province, South Africa, in 2012. The methodology includes: initial partnership building, goal setting and background research; 1 week of field work; in-field and subsequent data analysis; and community dissemination and programmatic incorporation of results. We describe the methodology and a case study of the approach in rural South Africa; assess whether the methodology generated data with sufficient saturation, breadth and utility for programming purposes; and evaluate whether this process successfully engaged the community. Between the two sites, 87 men and 105 women consented to in-depth interviews; 17 focus groups were conducted; and 13 health facilities and 7 NGOs were assessed. The methodology succeeded in quickly collecting high-quality data relevant to tailoring a comprehensive HIV programme and created a strong foundation for community engagement and integration with local health services. This methodology can be an accessible tool in guiding community engagement and tailoring future combination HIV prevention and care programmes.
A quality evaluation methodology of health web-pages for non-professionals.
Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro
2004-06-01
We propose an evaluation methodology for determining the quality of healthcare web sites that disseminate medical information to non-professionals. Three macro factors are considered for the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature, the question of whether or not to introduce a weighting function has been investigated. This methodology has been validated on a specialized information content, i.e., sore throats, a topic of broad interest to target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub-factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions derived from the literature allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score for medical web sites oriented to non-professionals.
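To make the aggregation step concrete, the following minimal Python sketch contrasts the two families of aggregation functions the study compares: a non-weighted average of the three macro-factor scores versus a weighted sum. All scores and weights below are invented for illustration and are not taken from the paper.

    # Hypothetical macro-factor scores for one web page, each scaled 0-1.
    scores = {"medical_contents": 0.8, "accountability": 0.6, "usability": 0.9}

    # Non-weighted average (the aggregation adopted in the proposed methodology).
    unweighted = sum(scores.values()) / len(scores)

    # Weighted alternative derived from the literature (weights are assumptions).
    weights = {"medical_contents": 0.5, "accountability": 0.3, "usability": 0.2}
    weighted = sum(weights[k] * scores[k] for k in scores)

    print(f"unweighted={unweighted:.2f}, weighted={weighted:.2f}")

If the two aggregations produce substantially similar rankings across pages, as the study reports, the simpler non-weighted form is preferable.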
Three-Dimensional Extension of a Digital Library Service System
ERIC Educational Resources Information Center
Xiao, Long
2010-01-01
Purpose: The paper aims to provide an overall methodology and case study for the innovation and extension of a digital library, especially the service system. Design/methodology/approach: Based on the three-dimensional structure theory of the information service industry, this paper combines a comprehensive analysis with the practical experiences…
DOT National Transportation Integrated Search
1999-08-15
The Traffic Survey Unit plans to establish a methodology in which it can assign each Portable Traffic Counter (PTC) station a seasonal group profile through a means of statistical and geographical analysis. An ArcView Geographic Information Systems a...
ERIC Educational Resources Information Center
Lewis, Wiley B.
A review and analysis of Educational Resources Information Center (ERIC) publications and non-ERIC publications was made to assess availability and identify major findings, promising developments, strategies, and methodological strengths and weaknesses which exist in curricula designed for preparing food industry workers. Project national figures…
ERIC Educational Resources Information Center
Speakman, Sheree T.; And Others
1997-01-01
Examines the need for new financial reporting and analysis, starting with rethinking the school finance field, retooling the management information systems for school finance, and re-evaluating knowledge about school-site management, accounting, and reporting. Demonstrates a new reporting methodology, the Financial Analysis Model, that traces…
Review and Analysis of Curricula For Occupations in Public Services. Information Series No. 29.
ERIC Educational Resources Information Center
Lewis, Wiley B.
To identify major findings, promising developments, strategies, and methodological strengths and weaknesses which exist in curricula designed for training paraprofessional public service workers in the fields of education, social work, law enforcement, and fire fighting, a review and analysis was made of Educational Resources Information Center…
Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parisi, Carlo; Prescott, Steve; Ma, Zhegang
This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements achieved during the previous FYs (toolkits identification, models development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular the following objectives were achieved: calculation of buildings pounding and their effects on components seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 3 2013-07-01 2013-07-01 false Hydrologic information. 780.21 Section 780.21... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet... Eastern Technical Service Center, U.S. Department of the Interior, Building 10, Parkway Center, Pittsburgh...
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 3 2014-07-01 2014-07-01 false Hydrologic information. 780.21 Section 780.21... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet... Eastern Technical Service Center, U.S. Department of the Interior, Building 10, Parkway Center, Pittsburgh...
A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale
Pérez Sánchez, Carlos Javier
2014-01-01
Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but providing more information on the measures of agreement. For the informative case, some guidelines are presented to elicitate the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
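A minimal sketch of this kind of Monte Carlo-based Bayesian agreement analysis for two raters and a two-category scale; the contingency counts and the uniform Dirichlet prior are assumptions for illustration, not values from the paper. Posterior cell probabilities are drawn from a Dirichlet distribution and Cohen's kappa is computed for each draw, yielding a full posterior for the agreement measure.

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical 2x2 table of counts (rows: rater A, columns: rater B).
    counts = np.array([[40, 5], [8, 47]]).ravel()

    # Dirichlet posterior over cell probabilities under a uniform prior (+1).
    p = rng.dirichlet(counts + 1, size=20000).reshape(-1, 2, 2)

    po = p[:, 0, 0] + p[:, 1, 1]                      # observed agreement
    pe = (p.sum(axis=2) * p.sum(axis=1)).sum(axis=1)  # chance agreement
    kappa = (po - pe) / (1 - pe)                      # Cohen's kappa per draw

    print(kappa.mean(), np.percentile(kappa, [2.5, 97.5]))

With a flat prior the posterior mean tracks the classical kappa estimate, consistent with the paper's observation that the non-informative case is in line with classical methodology while providing a full distribution rather than a point estimate.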
DB4US: A Decision Support System for Laboratory Information Management.
Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-11-14
Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Our objective was to develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
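As a rough illustration of one precalculated, ready-to-use indicator of the kind DB4US exposes, the sketch below summarizes turn-around times per test with pandas; the column names and data are assumptions, not the application's actual schema.

    import pandas as pd

    # Hypothetical request log (schema is an assumption for illustration).
    df = pd.DataFrame({
        "test": ["glucose", "glucose", "tsh"],
        "received": pd.to_datetime(["2008-01-01 08:00", "2008-01-01 09:00",
                                    "2008-01-01 08:30"]),
        "reported": pd.to_datetime(["2008-01-01 09:30", "2008-01-01 10:10",
                                    "2008-01-02 08:30"]),
    })
    df["tat_h"] = (df["reported"] - df["received"]).dt.total_seconds() / 3600

    # Ready-to-use indicator: median and 90th-percentile turn-around time per test.
    indicators = df.groupby("test")["tat_h"].agg(
        median="median", p90=lambda s: s.quantile(0.9))
    print(indicators)

In a dashboard setting such summaries would be recomputed by background processes, so the user always reads precalculated values instead of waiting for a live query.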
Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring
ERIC Educational Resources Information Center
Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri
2017-01-01
Analysis of the principles of knowledge representation in information systems points to the need for improved knowledge structuring, driven by developments in software components and the new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…
ERIC Educational Resources Information Center
Iivari, Juhani; Hirschheim, Rudy
1996-01-01
Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…
Medical Professionals and Parents: A Linguistic Analysis of Communication Across Contexts.
ERIC Educational Resources Information Center
Tannen, Deborah; Wallat, Cynthia
1986-01-01
This study analyzes videotaped conversations between various family members of a child with cerebral palsy and medical professionals. The conversations are examined for information elaboration and condensation, information negotiation, as well as methodological benefits. The medical interviews elicited new information in various contexts and…
Krauter, Paula; Edwards, Donna; Yang, Lynn; Tucker, Mark
2011-09-01
Decontamination and recovery of a facility or outdoor area after a wide-area biological incident involving a highly persistent agent (e.g., Bacillus anthracis spores) is a complex process that requires extensive information and significant resources, which are likely to be limited, particularly if multiple facilities or areas are affected. This article proposes a systematic methodology for evaluating information to select the decontamination or alternative treatments that optimize use of resources if decontamination is required for the facility or area. The methodology covers a wide range of approaches, including volumetric and surface decontamination, monitored natural attenuation, and seal-and-abandon strategies. A proposed trade-off analysis can help decision makers understand the relative appropriateness, efficacy, and labor, skill, and cost requirements of the various decontamination methods for the particular facility or area needing treatment, whether alone or as part of a larger decontamination effort. Because the state of decontamination knowledge and technology continues to evolve rapidly, the methodology presented here is designed to accommodate new strategies and materials and changing information.
NASA Technical Reports Server (NTRS)
1977-01-01
Information regarding the safety limits of hydrocarbons in liquid and gaseous oxygen, the steps taken for hydrocarbon removal from liquefied gases, and the analysis of the contaminants was compiled, and the results are presented. The safety of hydrocarbons in gaseous systems was studied, and the latest hydrocarbon test equipment and methodology are reviewed. A detailed sampling and analysis plan is proposed to evaluate high pressure GN2 and LOX systems.
systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in...this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each
Factors Influencing Teachers' Engagement in Informal Learning Activities
ERIC Educational Resources Information Center
Lohman, Margaret C.
2006-01-01
Purpose: The purpose of this study is to examine factors influencing the engagement of public school teachers in informal learning activities. Design/methodology/approach: This study used a survey research design. Findings: Analysis of the data found that teachers rely to a greater degree on interactive than on independent informal learning…
Wet weather highway accident analysis and skid resistance data management system (volume I).
DOT National Transportation Integrated Search
1992-06-01
The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...
Pre-Clinical Testing of New Hydroxybutyrate Analogues
2012-07-01
representative of any past or future Batches/Lots. The methodology and/or techniques of analysis used to obtain these results may or may not be...purpose. For additional information regarding this product and its analysis, please contact your Eastman representative. This material is NOT for human...COMPOSITION INFORMATION ON INGREDIENTS (Typical composition is given, and it may vary. A certificate of analysis can be provided, if available.) Weight
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.
NASA Technical Reports Server (NTRS)
Anusonti-Inthra, Phuriwat
2010-01-01
This paper presents validations of a novel rotorcraft analysis that coupled Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and Particle Vortex Transport Method (PVTM) methodologies. The CSD with associated vehicle trim analysis is used to calculate blade deformations and trim parameters. The near body CFD analysis is employed to provide detailed near body flow field information which is used to obtain high-fidelity blade aerodynamic loadings. The far field wake dominated region is simulated using the PVTM analysis which provides accurate prediction of the evolution of the rotor wake released from the near body CFD domains. A loose coupling methodology between the CSD and CFD/PVTM modules is used, with appropriate information exchange amongst the CSD/CFD/PVTM modules. The coupled CSD/CFD/PVTM methodology is used to simulate various rotorcraft flight conditions (i.e. hover, transition, and high speed flights), and the results are compared with several sets of experimental data. For the hover condition, the results are compared with hover data for the HART II rotor tested at DLR Institute of Flight Systems, Germany. For the forward flight conditions, the results are validated with the UH-60A flight test data.
The Methodological Socialization of Social Science Doctoral Students in China and the USA
ERIC Educational Resources Information Center
Rhoads, Robert A.; Zheng, Mi; Sun, Xiaoyang
2017-01-01
This qualitative study reports findings from a comparative analysis of the methodological socialization of doctoral students in the social sciences at two universities: one in China and one in the USA. Relying primarily on theories of organizational socialization, the study focuses on formal and informal processes students report as part of…
ERIC Educational Resources Information Center
Sochos, Antigonos
2014-01-01
The couple relationship is an essential source of support for individuals undergoing psychological treatment and the aim of this study was to apply a new methodology in assessing the quality of such support. A theoretically informed thematic analysis of interview transcripts was conducted, triangulated by quantitative data. Twenty-one brief…
77 FR 73911 - Flightcrew Member Duty and Rest Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-12
... discussion of the methodology and information sources used in the rulemaking analysis, corrects reporting and... Aviation Administration (FAA), DOT. ACTION: Availability of Initial Supplemental Regulatory Impact Analysis. SUMMARY: The FAA is issuing an Initial Supplemental Regulatory Impact Analysis of its final rule amending...
Location-only and use-availability data: analysis methods converge
McDonald, Lyman; Manly, Bryan; Huettmann, Falk; Thogmartin, Wayne
2013-01-01
by the availability of geographical information systems (GIS), global positioning system (GPS) radiotelemetry, and public online data access initiatives (e.g., the Global Biodiversity Information Facility). The papers in this Special Feature are intended to present the state of the methodological art in their subject area, with particular attention paid to contrasting the advantages and disadvantages of alternative methods of analysis for data.
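A common analysis for use-availability data of this kind is a resource selection function fitted by logistic regression, contrasting used locations against a sample of available ones. The sketch below uses simulated covariates (not data from the Special Feature); the fitted coefficients estimate selection strength up to a multiplicative constant.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    # Hypothetical habitat covariates (e.g., elevation, canopy) at GPS fixes.
    used = rng.normal([1.0, 0.5], 0.5, size=(200, 2))      # animal locations
    avail = rng.normal([0.0, 0.0], 1.0, size=(1000, 2))    # availability sample

    X = np.vstack([used, avail])
    y = np.r_[np.ones(len(used)), np.zeros(len(avail))]

    rsf = LogisticRegression().fit(X, y)
    print(rsf.coef_)   # positive coefficients indicate selection for the covariate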
DB4US: A Decision Support System for Laboratory Information Management
Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-01-01
Background: Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective: To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods: We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results: DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions: The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745
Khakzad, Nima; Khan, Faisal; Amyotte, Paul
2015-07-01
Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well-established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of major accidents' relevant data, since such accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures, based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.
Evolutionary Computing Methods for Spectral Retrieval
NASA Technical Reports Server (NTRS)
Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna
2009-01-01
A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
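A toy sketch of the retrieval loop using simulated annealing, one of the two ECMs named above: it minimizes a fitness function measuring the dissimilarity between an observed spectrum and a synthetic one generated by a forward model. The two-parameter absorption-line model and every numeric setting below are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(2)
    wav = np.linspace(1.0, 2.0, 200)          # wavelength grid (arbitrary units)

    def synthetic(depth, width):              # toy forward model: one absorption line
        return 1.0 - depth * np.exp(-((wav - 1.5) / width) ** 2)

    observed = synthetic(0.3, 0.05) + rng.normal(0, 0.01, wav.size)

    def fitness(p):                           # observed-vs-synthetic dissimilarity
        return np.sum((observed - synthetic(*p)) ** 2)

    p, T = np.array([0.1, 0.1]), 1.0
    for _ in range(5000):                     # annealing loop with geometric cooling
        cand = p + rng.normal(0, 0.01, 2)
        if cand[1] > 1e-3:                    # keep line width physical
            d = fitness(cand) - fitness(p)
            if d < 0 or rng.random() < np.exp(-d / T):
                p = cand
        T *= 0.999
    print(p)                                  # should approach (0.3, 0.05)

A genetic algorithm would replace the single candidate with a population and the acceptance rule with selection, crossover, and mutation, but the fitness function stays the same.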
William H. Cooke; Dennis M. Jacobs
2002-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
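A minimal sketch of the NDVI screening idea; the reflectance values and the 0.4 forest threshold are assumptions for illustration, not the Station's operational rule.

    # NDVI from near-infrared and red reflectance; flag plots whose NDVI
    # contradicts the recorded land use/land cover class.
    def ndvi(nir, red):
        return (nir - red) / (nir + red)

    plots = [("plot1", 0.45, 0.08, "forest"),
             ("plot2", 0.20, 0.18, "forest")]
    for pid, nir, red, cover in plots:
        v = ndvi(nir, red)
        if cover == "forest" and v < 0.4:     # weak vegetation signal
            print(pid, "flagged: NDVI", round(v, 2), "inconsistent with", cover)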
The problem of resonance in technology usage
NASA Technical Reports Server (NTRS)
Sayani, H. H.; Svoboda, C. P.
1981-01-01
Various information system tools and techniques are analyzed. A case study is presented which draws together the issues raised in three distinct cases. This case study shows a typical progression from the selection of an analysis methodology, to the adoption of an automated tool for specification and documentation, and the difficulty of fitting these into an existing life cycle development methodology.
Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.
Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe
2017-12-27
The present study aimed to evaluate the characteristics and quality of statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 by using PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of quality statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of cluster-specific approach in data analysis. Owing to these concerns, statistical methodology was judged as inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain appropriate assessment of the efficacy of desensitizing agents.
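One of the flagged concerns, the lack of P-value adjustment for multiple comparisons, is straightforward to remedy with a step-down procedure such as Holm's. A minimal sketch (the P-values are invented):

    # Holm step-down adjustment controlling the family-wise error rate.
    def holm(pvals, alpha=0.05):
        order = sorted(range(len(pvals)), key=lambda i: pvals[i])
        m, reject = len(pvals), [False] * len(pvals)
        for rank, i in enumerate(order):
            if pvals[i] <= alpha / (m - rank):
                reject[i] = True
            else:
                break                  # stop at the first non-significant test
        return reject

    print(holm([0.003, 0.020, 0.060]))  # [True, True, False] at alpha = 0.05

Cluster-specific analysis (multiple teeth per patient) needs more machinery, such as mixed-effects models, but the adjustment above addresses the simpler of the two reported omissions.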
Wennerberg, Mia M T; Lundgren, Solveig M; Danielson, Ella
2012-01-01
This article describes the theoretical foundation and methodology used in a study intended to increase knowledge concerning informal caregivers' resources to health (in salutogenesis; General Resistance Resources, GRRs). A detailed description of how the approach derived from salutogenic theory was used and how it permeated the entire study, from design to findings, is provided. How participation in the study was experienced is discussed and methodological improvements and implications suggested. Using an explorative, mixed method design, data was collected through salutogenically guided interviews with 32 Swedish caregivers to older adults. A constant comparative method of analysis was used to identify caregiver-GRRs, content analysis was further used to describe how participation was experienced. The methodology unraveled GRRs caregivers used to obtain positive experiences of caregiving, but also hindrances for such usage contributing to negative experiences. Mixed data made it possible to venture beyond actual findings to derive a synthesis describing the experienced, communal context of the population reliant on these GRRs; Caregivinghood. Participating in the salutogenic data-collection was found to be a reflective, mainly positive, empowering and enlightening experience. The methodology was advantageous, even if time-consuming, as it in one study unravelled caregiver-GRRs and hindrances for their usage on individual, communal and contextual levels. It is suggested that the ability to describe Caregivinghood may be essential when developing health-promoting strategies for caregivers at individual, municipal and national levels. The methodology makes such a description possible and suggested methodological improvements may enhance its usability and adaptability to other populations.
ERIC Educational Resources Information Center
Clemens, Rachael Annette
2017-01-01
This qualitative and interpretive inquiry explores the information behavior of birthmothers surrounding the processes of decision-making, coping, and living with the act of child relinquishment to adoption. An interpretative phenomenological analysis methodology is used to reveal the phenomenon as experienced by eight birthmothers, women who…
ERIC Educational Resources Information Center
Yang, Qinghua; Yang, Fan; Zhou, Chun
2015-01-01
Purpose: The purpose of this paper is to investigate how the information about haze, a term used in China to describe the air pollution problem, is portrayed on Chinese social media by different types of organizations using the theoretical framework of the health belief model (HBM). Design/methodology/approach: A content analysis was conducted…
Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee
2013-01-01
Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) analyzing and reporting each project separately; 2) combining data from all projects and performing an individual-level analysis; 3) pooling data from projects having similar study designs; 4) analyzing pooled data using a prospective meta-analytic technique; 5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their impact from differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587
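One of the approaches above, the prospective meta-analytic technique, pools project-level effect estimates rather than raw records. A minimal sketch of random-effects pooling with the DerSimonian-Laird estimator; the per-project effects and variances are invented for illustration.

    import numpy as np

    # Hypothetical per-project effect estimates and their variances.
    y = np.array([0.30, 0.10, 0.45, 0.20])
    v = np.array([0.02, 0.03, 0.05, 0.04])

    w = 1 / v                                  # fixed-effect weights
    ybar = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - ybar) ** 2)            # Cochran's heterogeneity statistic
    tau2 = max(0.0, (Q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

    wr = 1 / (v + tau2)                        # random-effects weights
    pooled = np.sum(wr * y) / np.sum(wr)
    se = np.sqrt(1 / np.sum(wr))
    print(pooled, pooled - 1.96 * se, pooled + 1.96 * se)

The between-project variance tau2 is what lets the pooled estimate respect design heterogeneity instead of treating all projects as replicates of one protocol.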
NASA Astrophysics Data System (ADS)
Soilán, M.; Riveiro, B.; Sánchez-Rodríguez, A.; González-deSantos, L. M.
2018-05-01
During the last few years, there has been a huge methodological development regarding the automatic processing of 3D point cloud data acquired by both terrestrial and aerial mobile mapping systems, motivated by the improvement of surveying technologies and hardware performance. This paper presents a methodology that first extracts geometric and semantic information regarding the road markings within the surveyed area from Mobile Laser Scanning (MLS) data, and then employs it to isolate street areas where pedestrian crossings are found and, therefore, pedestrians are more likely to cross the road. Different safety-related features can then be extracted in order to offer information about the adequacy of the pedestrian crossing regarding its safety, which can be displayed in a Geographical Information System (GIS) layer. These features are defined in four different processing modules: accessibility analysis, traffic lights classification, traffic signs classification, and visibility analysis. The validation of the proposed methodology has been carried out in two different cities in the northwest of Spain, obtaining both quantitative and qualitative results for pedestrian crossing classification and for each processing module of the safety assessment on pedestrian crossing environments.
Research in health sciences library and information science: a quantitative analysis.
Dimitroff, A
1992-01-01
A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504
Development of a Methodology for Assessing Aircrew Workloads.
1981-11-01
Contents: Workload Feasibility Study; Subjects; Equipment; Data Analysis. Keywords: analysis; simulation; standard time systems; switching; synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling...standard data systems, information content analysis, work sampling and job evaluation. Conventional methods were found to be deficient in accounting
Hernando, David; Hernando, Alberto; Casajús, Jose A; Laguna, Pablo; Garatachea, Nuria; Bailón, Raquel
2018-05-01
Standard methodologies for heart rate variability (HRV) analysis, and its physiological interpretation as a marker of autonomic nervous system condition, have been widely published for rest conditions, but much less so for exercise. A methodological framework for HRV analysis during exercise is proposed, which deals with the non-stationary nature of HRV during exercise, includes respiratory information, and identifies and corrects spectral components related to cardiolocomotor coupling (CC). This is applied to 23 male subjects who underwent different tests: maximal and submaximal, running and cycling; the ECG, respiratory frequency and oxygen consumption were simultaneously recorded. High-frequency (HF) power estimates differ markedly between the standard fixed band and the proposed methodology. For medium and high levels of exercise and recovery, HF power shows a 20 to 40% increase. When cycling, HF power increases around 40% with respect to running, while CC power is around 20% stronger in running.
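A minimal sketch of the respiration-centered HF band idea: the HF band is re-centered on the measured respiratory frequency before integrating the spectrum, instead of using the fixed 0.15-0.4 Hz band. The synthetic signal, sampling rate, and 0.125 Hz half-width are assumptions for illustration.

    import numpy as np
    from scipy.signal import welch

    fs = 4.0                                   # Hz, evenly resampled HR series
    t = np.arange(0, 300, 1 / fs)
    f_resp = 0.45                              # Hz, measured respiratory frequency
    rng = np.random.default_rng(3)
    hr = 150 + 2 * np.sin(2 * np.pi * f_resp * t) + rng.normal(0, 0.5, t.size)

    f, pxx = welch(hr - hr.mean(), fs=fs, nperseg=512)

    # HF band centered on respiration; note 0.45 Hz lies outside the fixed band.
    band = (f >= f_resp - 0.125) & (f <= f_resp + 0.125)
    hf_power = np.trapz(pxx[band], f[band])
    print(hf_power)

At exercise intensities where breathing exceeds 0.4 Hz, the fixed band misses the respiratory peak entirely, which is consistent with the 20 to 40% differences reported above.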
NASTRAN benefits analysis. Volume 2: Final technical report
NASA Technical Reports Server (NTRS)
1972-01-01
Baseline data are considered for comparisons of the costs and benefits of the NASA structural analysis program and to determine impacts and benefits to current users. To develop this information, questionnaires were mailed to users. Personal and telephone interviews were made to solicit further information. The questions in the questionnaire and in the interview were related to benefits derived from the programs, areas of needed improvement, and applicable usage comments. The collected information was compiled and analyzed. Methodology, analyses, and results are presented. The information is applicable to issues preceding NASTRAN Level 15.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-01-01
Objective: To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). Methods: PubMed, EMBASE, Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. Results: A total of 248 RCTs were eligible, of which 39 (15.7%) used a computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, for which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information to deal with the issue. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. Conclusion: The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. PMID:29511016
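One of the gaps noted above, missing sample size calculations, is simple to close at the design stage. A minimal sketch using statsmodels; the effect size, alpha, and power are illustrative choices, not values from the review.

    from statsmodels.stats.power import TTestIndPower

    # Per-arm sample size for a medium effect (d = 0.5), alpha 0.05, power 0.80.
    n = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
    print(round(n))    # about 64 participants per arm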
An Analysis of Category Management of Service Contracts
2017-12-01
management teams a way to make informed, data-driven decisions. Data-driven decisions derived from clustering not only align with Category...savings. Furthermore, this methodology provides a data-driven visualization to inform sound business decisions on potential Category Management...Category Management initiatives. The Maptitude software will allow future research to collect data and develop visualizations to inform Category
2012-03-01
Acronyms: Targeting Review Board; OPLAN, Operations Plan; OPORD, Operations Order; OPSIT, Operational Situation; OSINT, Open Source Intelligence. [Diagram residue: a multi-intelligence collection-management workflow covering analysis and evaluation of FLTREPs and MISREPs, unit asset assignment with feedback on asset shortfalls, and collection inputs from political and embassy information, law enforcement, HUMINT and OSINT.]
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included of which 50 % (n = 59) are methodological. A rapidly accumulating literature base on VOI from 1999 onwards for methodological papers and from 2005 onwards for applied papers is observed. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimations of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature bases, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
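The foundational VOI quantity, the expected value of perfect information (EVPI), can be estimated by plain Monte Carlo as the gap between deciding after uncertainty resolves and deciding now. A minimal sketch with invented parameters (EVSI additionally requires simulating prospective study data, which is where the computational burden discussed above arises):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 100_000
    # Hypothetical uncertain parameters for a new treatment vs. a comparator.
    effect = rng.normal(0.05, 0.04, n)      # incremental QALYs
    cost = rng.normal(1000, 200, n)         # incremental cost
    wtp = 30_000                            # willingness to pay per QALY

    nb_new = wtp * effect - cost            # incremental net benefit, new treatment
    nb_old = np.zeros(n)                    # comparator as reference

    # EVPI = E[max over actions] - max over actions of E[net benefit].
    evpi = np.mean(np.maximum(nb_new, nb_old)) - max(nb_new.mean(), nb_old.mean())
    print(evpi)                             # per-patient EVPI in monetary units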
The strategic planning of health management information systems.
Smith, J
1995-01-01
This paper discusses the roles and functions of strategic planning of information systems in health services. It selects four specialised methodologies of strategic planning for analysis with respect to their applicability in the health field. It then examines the utilisation of information planning in case studies of three health organisations (two State departments of health and community services and one acute care institution). Issues arising from the analysis concern the planning process, the use to which plans are put, and implications for management.
Varas, Lautaro R; Pontes, F C; Santos, A C F; Coutinho, L H; de Souza, G G B
2015-09-15
The ion-ion-coincidence mass spectroscopy technique brings useful information about the fragmentation dynamics of doubly and multiply charged ionic species. We advocate the use of a matrix-parameter methodology in order to represent and interpret the entire ion-ion spectra associated with the ionic dissociation of doubly charged molecules. This method makes it possible, among other things, to infer fragmentation processes and to extract information about overlapped ion-ion coincidences. This important piece of information is difficult to obtain from other previously described methodologies. A Wiley-McLaren time-of-flight mass spectrometer was used to discriminate the positively charged fragment ions resulting from the sample ionization by a pulsed 800 eV electron beam. We exemplify the application of this methodology by analyzing the fragmentation and ionic dissociation of the dimethyl disulfide (DMDS) molecule as induced by fast electrons. The doubly charged dissociation was analyzed using the Multivariate Normal Distribution. The ion-ion spectrum of the DMDS molecule was obtained at an incident electron energy of 800 eV and was matrix-represented using Multivariate Distribution theory. The proposed methodology allows us to distinguish information among [CHnSHn]+/[CH3]+ (n = 1-3) fragment ions in the ion-ion coincidence spectra using ion-ion coincidence data. Using the momenta-balance methodology for the inferred parameters, a secondary decay mechanism is proposed for the [CHS]+ ion formation. As an additional check on the methodology, previously published data on the SiF4 molecule were re-analyzed with the present methodology and the results were shown to be statistically equivalent. The use of a Multivariate Normal Distribution allows for the representation of the whole ion-ion mass spectrum of doubly or multiply ionized molecules as a combination of parameters and the extraction of information among overlapped data. We have successfully applied this methodology to the analysis of the fragmentation of the DMDS molecule. Copyright © 2015 John Wiley & Sons, Ltd.
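The paper's matrix-parameter method is its own construction, but the underlying idea of resolving overlapped coincidence islands with multivariate normal components can be sketched with a two-component Gaussian mixture fitted to simulated (t1, t2) flight-time pairs; every number below is invented.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(5)
    # Two overlapping coincidence islands in the (t1, t2) plane; the negative
    # covariance mimics the momentum correlation of back-to-back fragments.
    cov = [[4e-4, -3e-4], [-3e-4, 4e-4]]
    a = rng.multivariate_normal([2.10, 3.60], cov, 600)
    b = rng.multivariate_normal([2.16, 3.55], cov, 400)

    gmm = GaussianMixture(n_components=2, covariance_type="full")
    gmm.fit(np.vstack([a, b]))
    print(gmm.weights_)   # relative intensities of the two coincidence channels
    print(gmm.means_)     # island centroids, i.e., the fragment-pair flight times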
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
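The full PFA statistical structure is more elaborate, but its basic step of modifying an analytically derived failure-probability estimate with test or flight experience can be illustrated with a conjugate Beta-Binomial update; the prior parameters and flight counts below are invented.

    from scipy import stats

    # Prior on failure probability from analytical modeling: Beta(0.5, 99.5),
    # mean 0.005 (illustrative numbers, not the PFA procedure itself).
    a, b = 0.5, 99.5
    failures, successes = 0, 30             # hypothetical test/flight record

    posterior = stats.beta(a + failures, b + successes)
    print(posterior.mean(), posterior.ppf(0.95))  # updated risk, 95th percentile

Thirty failure-free flights pull the posterior mean below the prior mean, quantifying how operating experience tightens a model-based risk estimate.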
A multicriteria decision making model for assessment and selection of an ERP in a logistics context
NASA Astrophysics Data System (ADS)
Pereira, Teresa; Ferreira, Fernanda A.
2017-07-01
The aim of this work is to apply a decision-support methodology based on a multicriteria decision analysis (MCDA) model that allows the assessment and selection of an Enterprise Resource Planning (ERP) system in a Portuguese logistics company by a group of decision makers (GDM). A decision support system (DSS) that implements an MCDA approach, the Multicriteria Methodology for the Assessment and Selection of Information Systems/Information Technologies (MMASSI/IT), is used, chosen for its features and the ease of changing and adapting the model to a given scope. Using this DSS, the information system best suited to the decisional context was obtained, and this result was evaluated through a sensitivity and robustness analysis.
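A minimal sketch of the weighted-sum step of such an MCDA, followed by the kind of weight-perturbation check a sensitivity and robustness analysis performs; the scores and weights are invented, and MMASSI/IT's actual aggregation may differ.

    import numpy as np

    rng = np.random.default_rng(6)
    # Rows: candidate ERP systems; columns: criteria scores normalized to 0-1.
    S = np.array([[0.8, 0.6, 0.7],
                  [0.6, 0.9, 0.5],
                  [0.7, 0.7, 0.9]])
    w = np.array([0.5, 0.3, 0.2])            # weights agreed by the decision group

    base = (S @ w).argmax()                  # best candidate under agreed weights

    # Robustness: perturb weights and count how often the winner keeps rank 1.
    wins = 0
    for _ in range(10_000):
        wp = np.maximum(w + rng.normal(0, 0.05, 3), 0)
        wp /= wp.sum()
        wins += (S @ wp).argmax() == base
    print(base, wins / 10_000)               # stability of the recommendation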
Integration, warehousing, and analysis strategies of Omics data.
Gedela, Srinubabu
2011-01-01
"-Omics" is a current suffix for numerous types of large-scale biological data generation procedures, which naturally demand the development of novel algorithms for data storage and analysis. With next generation genome sequencing burgeoning, it is pivotal to decipher a coding site on the genome, a gene's function, and information on transcripts next to the pure availability of sequence information. To explore a genome and downstream molecular processes, we need umpteen results at the various levels of cellular organization by utilizing different experimental designs, data analysis strategies and methodologies. Here comes the need for controlled vocabularies and data integration to annotate, store, and update the flow of experimental data. This chapter explores key methodologies to merge Omics data by semantic data carriers, discusses controlled vocabularies as eXtensible Markup Languages (XML), and provides practical guidance, databases, and software links supporting the integration of Omics data.
Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen
2017-10-01
The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria, and quality was assessed using the critical appraisal tool AMSTAR. Regression analyses were performed to determine factors associated with a higher quality score. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality, while systematic reviews were found to be of less than fair quality. Factors associated with higher quality scores in the multivariable analysis were: including primary studies consisting of randomized controlled trials, performing a meta-analysis, and applying a recommended guideline related to establishing a systematic review protocol and/or reporting. Judged by AMSTAR, systematic reviews and meta-analyses may introduce a high risk of bias if applied to inform decision-making. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine, and that researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review. Copyright © 2017 Elsevier Ltd. All rights reserved.
[Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].
Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco
2014-01-01
Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or non-random lack of some information in a subgroup of the population. We provide an overview of IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and children's IQ at 7 years. This methodology allows one to correct the analysis by weighting the observations by the probability of being selected. IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking account of it, we can make inferences about the entire target population starting from the non-missing observations alone. The procedure for the calculation is the following: first, we consider the entire population at study and calculate the probability of non-missing information using a logistic regression model, where the response is the non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. Then the analysis is performed only on the non-missing observations using a weighted model. IPW is a technique that allows the selection process to be embedded in the analysis of the estimates, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO₂ on the verbal intelligence quotient of children is stronger than the effect shown by the analysis performed without regard to the selection processes.
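A minimal sketch of the two-step IPW procedure described above, on simulated data; variable names, the selection model, and all coefficients are illustrative. First model the probability of being observed from full-population predictors, then fit a weighted outcome model on the complete cases.

    import numpy as np
    from sklearn.linear_model import LogisticRegression, LinearRegression

    rng = np.random.default_rng(7)
    n = 2000
    ses = rng.normal(0, 1, n)                     # predictor of participation
    no2 = 40 - 5 * ses + rng.normal(0, 5, n)      # exposure, correlated with ses
    iq = 100 - 0.1 * no2 + 2 * ses + rng.normal(0, 5, n)

    # Non-random selection: follow-up depends on ses, known for everyone.
    observed = rng.random(n) < 1 / (1 + np.exp(-ses))

    # Step 1: predicted probability of being observed, for the whole cohort.
    sel = LogisticRegression().fit(ses.reshape(-1, 1), observed)
    p = sel.predict_proba(ses.reshape(-1, 1))[:, 1]

    # Step 2: weighted analysis of the complete cases, weight = 1 / probability.
    w = 1 / p[observed]
    fit = LinearRegression().fit(no2[observed].reshape(-1, 1),
                                 iq[observed], sample_weight=w)
    print(fit.coef_)   # recovers the full-cohort exposure-outcome association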
System analysis through bond graph modeling
NASA Astrophysics Data System (ADS)
McBride, Robert Thomas
2005-07-01
Modeling and simulation play an integral role in the engineering design process. An accurate mathematical description of a system gives the design engineer the flexibility to perform trade studies quickly and accurately, expediting the design process. Most often, the mathematical model of the system contains components from different engineering disciplines. A modeling methodology that can handle these types of systems can be used in an indirect fashion to extract added information from the model. This research examines the ability of a modeling methodology to provide added insight into system analysis and design. The modeling methodology used is bond graph modeling. An investigation into the creation of a bond graph model using the Lagrangian of the system is provided. Upon creation of the bond graph, system analysis is performed. To aid in the system analysis, an object-oriented approach to bond graph modeling is introduced. A framework is provided to simulate the bond graph directly. Through object-oriented simulation of a bond graph, the information contained within the bond graph can be exploited to create a measurement of system efficiency. A definition of system efficiency is given. This measurement of efficiency is used in the design of controllers of varying architectures. Optimal control of a missile autopilot is discussed within the framework of the calculated system efficiency.
William H. Cooke; Dennis M. Jacobs
2005-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
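NDVI itself is a standard index, computed as (NIR - Red)/(NIR + Red). A short sketch of how plots might be flagged from band reflectances; the band values and the 0.3 threshold are illustrative only, not the Southern Research Station's actual screening rule.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index, elementwise."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectance values sampled at FIA plot locations
nir = np.array([0.45, 0.50, 0.12, 0.48])
red = np.array([0.08, 0.07, 0.10, 0.09])
values = ndvi(nir, red)
# Flag plots whose NDVI is inconsistent with a forested land-cover label
problem_plots = values < 0.3  # threshold is illustrative
print(values.round(2), problem_plots)
```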
A Methodology to Develop Entrepreneurial Networks: The Tech Ecosystem of Six African Cities
2014-11-01
The methodology enables us to accurately measure social capital and circumvents the massive effort of mapping an individual's social network before locating the social resources in it. Subject terms: network analysis, economic networks, network topology, network classification.
ERIC Educational Resources Information Center
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-01-01
Purpose: Case-study methodology is often used to study change in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable…
An Analysis of Defense Information and Information Technology Articles: A Sixteen-Year Perspective
2009-03-01
Content analysis methodologies have been characterized as "exploratory" or "subjective" (Denzin & Lincoln, 2000). This research is based on content analysis methodologies utilized in prior work. Different analysts should code the same text in a similar manner (Weber, 1990), and researchers typically compute an intercoder agreement level; Krippendorff recommends an agreement level of at least .70 (Krippendorff, 2004), while some scholars use a cut-off rate of .80 (Denzin & Lincoln, 2000).
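A toy illustration of the agreement check mentioned above: simple percent agreement between two coders over the same units, compared against the .70 and .80 cut-offs. The codes are invented, and real studies would often prefer a chance-corrected statistic such as Krippendorff's alpha.

```python
# Percent agreement between two coders over the same set of text units.
codes_a = ["cost", "policy", "cost", "tech", "policy", "tech", "cost"]
codes_b = ["cost", "policy", "tech", "tech", "policy", "tech", "policy"]

agreement = sum(a == b for a, b in zip(codes_a, codes_b)) / len(codes_a)
print(f"agreement = {agreement:.2f}")           # 0.71 with these toy codes
print("meets .70 cut-off:", agreement >= 0.70)  # True
print("meets .80 cut-off:", agreement >= 0.80)  # False
```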
A dictionary based informational genome analysis
2012-01-01
Background In the post-genomic era several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years includes several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces local sequence analysis. A software prototype applying the methodology outlined here carried out computations on genomic data. We computed informational indexes and built the genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. Validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of keen interest in biology (for example, to identify over-represented functional sequences, such as promoters), is discussed, and a method to define synthetic genetic networks is suggested. Conclusions We introduced a methodology based on dictionaries, and an efficient motif-finding software application for comparative genomics. This approach could be extended along many lines of investigation, namely exported to other contexts of computational genomics, as a basis for discrimination of genomic pathologies. PMID:22985068
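A minimal sketch of the core objects described above: a genomic dictionary of k-mer counts and one simple informational index (empirical Shannon entropy of the k-mer distribution). The sequence is a toy string, and the paper's indexes are richer than this single example.

```python
from collections import Counter
from math import log2

def kmer_dictionary(genome: str, k: int) -> Counter:
    """Dictionary of k-mer counts over a sequence (a 'genomic dictionary')."""
    return Counter(genome[i:i + k] for i in range(len(genome) - k + 1))

def entropy_index(counts: Counter) -> float:
    """Empirical Shannon entropy of the k-mer frequency distribution."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

genome = "ACGTACGTGACGTTACG"       # toy sequence
d = kmer_dictionary(genome, k=3)
print(d.most_common(3))            # most frequent 3-mers, e.g. repeats
print(round(entropy_index(d), 3))  # one simple informational index
```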
Nguyen, Ha T.; Pearce, Joshua M.; Harrap, Rob; Barber, Gerald
2012-01-01
A methodology is provided for the application of Light Detection and Ranging (LiDAR) to automated solar photovoltaic (PV) deployment analysis on the regional scale. Challenges in urban information extraction and management for solar PV deployment assessment are determined and quantitative solutions are offered. This paper provides the following contributions: (i) a methodology that is consistent with recommendations from existing literature advocating the integration of cross-disciplinary competences in remote sensing (RS), GIS, computer vision and urban environmental studies; (ii) a robust methodology that can work with low-resolution, incomplete data and reconstruct vegetation and buildings separately, but concurrently; (iii) recommendations for future generations of software. A case study is presented as an example of the methodology. Lessons from the case study, such as the trade-off between time consumption and data quality, are discussed to highlight the need for connectivity between demographic information, electrical engineering schemes and GIS, and a typical fraction of roofs useful for solar extracted per method is reported. Finally, conclusions are developed to provide a final methodology to extract the most useful information from the lowest-resolution and least comprehensive data, providing solar electric assessments over large areas that can be adapted anywhere in the world. PMID:22666044
Methodology for CFD Design Analysis of National Launch System Nozzle Manifold
NASA Technical Reports Server (NTRS)
Haire, Scot L.
1993-01-01
The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.
pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.
Giannakopoulos, Theodoros
2015-01-01
Audio information plays an increasingly important role in today's growing digital content, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automation and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation, and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has already been used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation, and health applications (e.g. monitoring eating habits). The feedback provided by all these particular audio applications has led to practical enhancement of the library. PMID:26656189
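A short feature-extraction example in the style of the library's documented interface. Module and function names here follow recent releases and have changed across versions (older releases used audioFeatureExtraction.stFeatureExtraction), so treat them as an assumption; "sample.wav" is a placeholder path.

```python
# Short-term feature extraction with pyAudioAnalysis (names per recent releases).
from pyAudioAnalysis import audioBasicIO
from pyAudioAnalysis import ShortTermFeatures

fs, signal = audioBasicIO.read_audio_file("sample.wav")  # placeholder file
# 50 ms windows with a 25 ms step; returns a feature matrix and feature names
features, feature_names = ShortTermFeatures.feature_extraction(
    signal, fs, int(0.050 * fs), int(0.025 * fs))
print(features.shape, feature_names[:3])
```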
GuidosToolbox: universal digital image object analysis
Peter Vogt; Kurt Riitters
2017-01-01
The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in…
Rational Analyses of Information Foraging on the Web
ERIC Educational Resources Information Center
Pirolli, Peter
2005-01-01
This article describes rational analyses and cognitive models of Web users developed within information foraging theory. This is done by following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive…
The Dynamics of Information Search Services.
ERIC Educational Resources Information Center
Lindquist, Mats G.
Computer-based information search services (ISSs) of the type that provide online literature searches are analyzed from a systems viewpoint using a continuous simulation model. The methodology applied is "system dynamics," and the system language is DYNAMO. The analysis reveals that the observed growth and stagnation of a typical ISS can…
Using Cluster Analysis for Data Mining in Educational Technology Research
ERIC Educational Resources Information Center
Antonenko, Pavlo D.; Toy, Serkan; Niederhauser, Dale S.
2012-01-01
Cluster analysis is a group of statistical methods that has great potential for analyzing the vast amounts of web server-log data to understand student learning from hyperlinked information resources. In this methodological paper we provide an introduction to cluster analysis for educational technology researchers and illustrate its use through…
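A minimal k-means sketch on hypothetical per-student features mined from server logs, the kind of use the article describes; the feature set, values, and the choice of two clusters are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-student features from web server logs:
# [pages visited, mean seconds per page, number of revisits]
logs = np.array([
    [12, 45, 2], [30, 20, 8], [28, 25, 7],
    [10, 60, 1], [33, 18, 9], [11, 50, 2],
])
X = StandardScaler().fit_transform(logs)  # put features on a common scale
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)  # cluster memberships suggesting two navigation styles
```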
Haegele, Justin A; Hodge, Samuel Russell
2015-10-01
There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.
A New Methodology for Simultaneous Multi-layer Retrievals of Ice and Liquid Water Cloud Properties
NASA Astrophysics Data System (ADS)
Sourdeval, O.; Labonnote, L.; Baran, A. J.; Brogniez, G.
2014-12-01
It is widely recognized that the study of clouds has become one of the major concerns of the climate research community. Consequently, a multitude of retrieval methodologies have been developed during the last decades in order to obtain accurate retrievals of cloud properties that can be supplied to climate models. Most of the current methodologies have proven satisfactory for separately retrieving ice or liquid cloud properties, but very few of them have attempted simultaneous retrievals of these two cloud types. Recent studies nevertheless show that the omission of one of these layers can have strong consequences on the retrievals and their accuracy. In this study, a new methodology that simultaneously retrieves the properties of ice and liquid clouds is presented. The optical thickness and the effective radius of up to two liquid cloud layers and the ice water path of one ice cloud layer are simultaneously retrieved, along with an accurate estimation of their uncertainties. Radiometric measurements ranging from the visible to the thermal infrared are used to perform the retrievals. To quantify the capabilities and limitations of the methodology, the results of a theoretical information content analysis are first presented. This analysis provides an a priori understanding of how much information should be expected on each of the retrieval parameters in different atmospheric conditions, and which set of channels is likely to provide this information. After these theoretical considerations, global retrievals corresponding to several months of A-Train data are presented. Comparisons of our retrievals with operational products from active and passive instruments were carried out and show good global agreement. These comparisons are useful for validating our retrievals, but also for testing how operational products can be influenced by multi-layer configurations.
ERIC Educational Resources Information Center
Mittal, Surabhi; Mehar, Mamta
2016-01-01
Purpose: The paper analyzes factors that affect the likelihood of adoption of different agriculture-related information sources by farmers. Design/Methodology/Approach: The paper links the theoretical understanding of the existing multiple sources of information that farmers use, with the empirical model to analyze the factors that affect the…
ERIC Educational Resources Information Center
Ismail, Noor Azizi
2008-01-01
Purpose: The paper's purpose is to investigate the issues of IT governance, funding and structure of a public university in Malaysia. Design/methodology/approach: The study uses a case study approach, i.e. a series of interviews with users and information services provider of campus information system. Findings: The university lacks a common…
Kushniruk, A. W.; Patel, V. L.; Cimino, J. J.
1997-01-01
This paper describes an approach to the evaluation of health care information technologies based on usability engineering and a methodological framework from the study of medical cognition. The approach involves collection of a rich set of data, including video recordings of health care workers as they interact with systems such as computerized patient records and decision support tools. The methodology can be applied in the laboratory setting, typically involving subjects "thinking aloud" as they interact with a system. A similar approach to data collection and analysis can also be extended to the study of computer systems in the "live" environment of hospital clinics. Our approach is also influenced by work in the area of cognitive task analysis, which aims to characterize the decision making and reasoning of subjects of varied levels of expertise as they interact with information technology in carrying out representative tasks. The stages involved in conducting cognitively based usability analyses are detailed, and the application of such analysis in the iterative process of system and interface development is discussed. PMID:9357620
Washington, Simon; Oh, Jutaek
2006-03-01
Transportation professionals are sometimes required to make difficult transportation safety investment decisions in the face of uncertainty. In particular, an engineer may be expected to choose among an array of technologies and/or countermeasures to remediate perceived safety problems when: (1) little information is known about the countermeasure effects on safety; (2) information is known but from different regions, states, or countries where a direct generalization may not be appropriate; (3) where the technologies and/or countermeasures are relatively untested, or (4) where costs prohibit the full and careful testing of each of the candidate countermeasures via before-after studies. The importance of an informed and well-considered decision based on the best possible engineering knowledge and information is imperative due to the potential impact on the numbers of human injuries and deaths that may result from these investments. This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving safety of at grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to "stated preference" methods in travel survey research, the methodology applies random selection and laws of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain 'best' estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended based on the largest safety returns with minimum risk (uncertainty). To the author's knowledge the complete methodology is new and has not previously been applied or reported in the literature. The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 at grade railroad crossings considered in this analysis, it was found that the top three performing countermeasures for reducing crashes are in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
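A toy conjugate-normal version of the kind of update described above: an expert-elicited AMF density (the data likelihood) combined with a prior, here on the log scale so that normal forms are plausible. All numbers are invented; the paper's full Bayesian analysis is richer than this two-distribution combination.

```python
import numpy as np

# Work on the log-AMF scale so a normal prior/likelihood is reasonable.
mu0, sd0 = np.log(0.9), 0.30    # prior: modest crash reduction, uncertain
mu_e, sd_e = np.log(0.7), 0.15  # expert-elicited density for one countermeasure

# Precision-weighted (conjugate normal) combination
w0, we = 1 / sd0**2, 1 / sd_e**2
mu_post = (w0 * mu0 + we * mu_e) / (w0 + we)
sd_post = (w0 + we) ** -0.5

lo, hi = mu_post - 1.96 * sd_post, mu_post + 1.96 * sd_post
print(f"posterior AMF ~ {np.exp(mu_post):.2f}, "
      f"95% credible interval ({np.exp(lo):.2f}, {np.exp(hi):.2f})")
```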
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems, to assess flight readiness and identify risk control measures, is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
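The central idea of propagating parameter uncertainty through an analytical failure model can be illustrated with a Monte Carlo sketch on a toy stress-strength model. The distributions and parameters below are hypothetical stand-ins, not the documented fatigue models of the PFA methodology.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000  # Monte Carlo samples

# Toy stress-strength failure model with uncertain parameters (units: ksi)
strength = rng.lognormal(mean=np.log(100.0), sigma=0.08, size=n)
stress   = rng.lognormal(mean=np.log(70.0),  sigma=0.15, size=n)

# Failure occurs when applied stress meets or exceeds strength
p_fail = np.mean(stress >= strength)
print(f"estimated failure probability per mission: {p_fail:.2e}")
```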
Exploring Teacher-Student Interactions and Moral Reasoning Practices in Drama Classrooms
ERIC Educational Resources Information Center
Freebody, Kelly
2010-01-01
The research reported here brings together three settings of conceptual and methodological inquiry: the sociological setting of socio-economic theory; the curricular/pedagogic setting of educational drama; and the analytic setting of ethnomethodologically informed analyses of conversation analysis and membership categorisation analysis. Students…
Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong
2016-01-01
Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. A subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting a search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and enables authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a collusion attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider. PMID:27571421
The HIV Cure Research Agenda: The Role of Mathematical Modelling and Cost-Effectiveness Analysis.
Freedberg, Kenneth A; Possas, Cristina; Deeks, Steven; Ross, Anna Laura; Rosettie, Katherine L; Di Mascio, Michele; Collins, Chris; Walensky, Rochelle P; Yazdanpanah, Yazdan
The research agenda towards an HIV cure is building rapidly. In this article, we discuss the reasons for and methodological approach to using mathematical modeling and cost-effectiveness analysis in this agenda. We provide a brief description of the proof of concept for cure and the current directions of cure research. We then review the types of clinical economic evaluations, including cost analysis, cost-benefit analysis, and cost-effectiveness analysis. We describe the use of mathematical modeling and cost-effectiveness analysis early in the HIV epidemic as well as in the era of combination antiretroviral therapy. We then highlight the novel methodology of Value of Information analysis and its potential role in the planning of clinical trials. We close with recommendations for modeling and cost-effectiveness analysis in the HIV cure agenda.
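The basic comparison behind such cost-effectiveness analyses is the incremental cost-effectiveness ratio, ICER = (C1 - C0)/(E1 - E0). A toy computation with invented numbers; real analyses would use modeled lifetime costs and QALYs with uncertainty.

```python
# Incremental cost-effectiveness ratio for a hypothetical cure strategy
# versus lifelong antiretroviral therapy (all values illustrative).
cost_art,  qaly_art  = 400_000.0, 18.0   # lifetime cost (USD), QALYs
cost_cure, qaly_cure = 450_000.0, 19.5

icer = (cost_cure - cost_art) / (qaly_cure - qaly_art)
print(f"ICER = ${icer:,.0f} per QALY gained")  # $33,333 per QALY here
```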
Safety assessment on pedestrian crossing environments using MLS data.
Soilán, Mario; Riveiro, Belén; Sánchez-Rodríguez, Ana; Arias, Pedro
2018-02-01
In the framework of infrastructure analysis and maintenance in an urban environment, it is important to address the safety of every road user. This paper presents a methodology for the evaluation of several safety indicators on pedestrian crossing environments, using geometric and radiometric information extracted from 3D point clouds collected by a Mobile Mapping System (MMS). The methodology is divided into four main modules, which analyze the accessibility of the crossing area, the presence of traffic lights and traffic signs, and the visibility between a driver and a pedestrian in the proximity of a pedestrian crossing. The outputs of the analysis are exported to a Geographic Information System (GIS), where they are visualized and can be further processed in the context of city management. The methodology has been tested on approximately 30 pedestrian crossings in cluttered urban environments of two different cities. Results show that MMS are a valid means to assess the safety of a specific urban environment with regard to its geometric conditions. Remarkable results are presented on traffic light classification, with a global F-score close to 95%. Copyright © 2017 Elsevier Ltd. All rights reserved.
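The global F-score quoted for traffic light classification is the harmonic mean of precision and recall. A small helper with hypothetical detection counts chosen to land near the reported 95%.

```python
def f_score(tp: int, fp: int, fn: int) -> float:
    """F1 = harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for traffic lights detected in the point clouds
print(round(f_score(tp=90, fp=5, fn=5), 3))  # 0.947, i.e. close to 95%
```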
Which technology to investigate visual perception in sport: video vs. virtual reality.
Vignais, Nicolas; Kulpa, Richard; Brault, Sébastien; Presse, Damien; Bideau, Benoit
2015-02-01
Visual information uptake is a fundamental element of sports involving interceptive tasks. Several methodologies, like video and methods based on virtual environments, are currently employed to analyze visual perception during sport situations. Both techniques have advantages and drawbacks. The goal of this study is to determine which of these technologies may be preferentially used to analyze visual information uptake during a sport situation. To this aim, we compared a handball goalkeeper's performance using two standardized methodologies: video clip and virtual environment. We examined this performance for two response tasks: an uncoupled task (goalkeepers show where the ball ends) and a coupled task (goalkeepers try to intercept the virtual ball). Variables investigated in this study were percentage of correct zones, percentage of correct responses, radial error and response time. The results showed that handball goalkeepers were more effective, more accurate and started to intercept earlier when facing a virtual handball thrower than when facing the video clip. These findings suggested that the analysis of visual information uptake for handball goalkeepers was better performed by using a 'virtual reality'-based methodology. Technical and methodological aspects of these findings are discussed further. Copyright © 2014 Elsevier B.V. All rights reserved.
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible while still capturing those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has previously been applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analyses performed on lunar exploration architecture scenarios are also presented.
NASA Astrophysics Data System (ADS)
Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza; Morales-Giraldo, David; Rangel-Buitrago, Nelson
2014-05-01
Analysis of hazards and vulnerability associated with coastal erosion along coastlines is a key first step in establishing plans for adaptation to climate change in coastal areas. La Barra Town, Buenaventura (Colombian Pacific coast) and the Providence - Santa Catalina Islands (Colombian Caribbean) were selected to develop a detailed analysis of coastal erosion hazard and vulnerability from different perspectives: i) physical (hazard), ii) social, iii) conservation and iv) cultural heritage (Raizal). The analysis was made by a semi-quantitative approximation method, applying variables associated with the intrinsic coastal zone properties (i.e. type of beach, exposure of the coast to waves, etc.). Coastal erosion data and associated variables, as well as land use, conservation and heritage data, were used to carry out a further detailed analysis of the human-structural vulnerability and exposure to hazards. The data show erosion rates close to -17 m yr-1 in La Barra Town (highlighting its critical condition and the urgency of a relocation process), while in some sectors of Providence Island, such as Old Town, the erosion rate was -5 m yr-1. The observed erosion process directly affects land use and the local and regional economy. The differences between indexes and the structural and physical vulnerability, as well as the use of methodological variables, are presented in the context of each region. In this work, all the information was managed within a GIS environment, since this allows the information to be edited and updated continuously. The application of this methodology generates useful information that promotes risk management as well as prevention, mitigation and reduction plans. In both areas adaptation must be a priority strategy, including relocation alternatives and sustainable protection supported by studies of current uses and future outlooks for the coast. The methodology is framed in the use of GIS tools and highlights their benefits in the analysis of information.
America's Children & the Information Superhighway: A Briefing Book and National Action Agenda.
ERIC Educational Resources Information Center
Children's Partnership, Santa Monica, CA.
A study was conducted to determine how the information highway affects today's children and to develop a set of national children's goals and an action plan for achieving them. The study's methodology included a review of relevant child development and telecommunications literature, identification of model programs, analysis of experiences with…
Bossard, B.; Renard, J. M.; Capelle, P.; Paradis, P.; Beuscart, M. C.
2000-01-01
Investing in information technology has become a crucial process in hospital management today. Medical and administrative managers are faced with difficulties in measuring medical information technology costs and benefits due to the complexity of the domain. This paper proposes a pre-implementation methodology for evaluating and appraising material, process and human costs and benefits. Based on an analysis of users' needs and organizational processes, the methodology provides an evaluative set of financial and non-financial indicators which can be integrated into a decision-making and investment evaluation process. We describe the first results obtained after a few months of operation for the Computer-Based Patient Record (CPR) project. Its full acceptance, in spite of some difficulties, encourages us to diffuse the method to the entire project. PMID:11079851
I. DEVELOPMENTAL METHODOLOGY AS A CENTRAL SUBDISCIPLINE OF DEVELOPMENTAL SCIENCE.
Card, Noel A
2017-06-01
This first chapter introduces the main goals of the monograph and previews the remaining chapters. The goals of this monograph are to provide summaries of our current understanding of advanced developmental methodologies, provide information that can advance our understanding of human development, identify shortcomings in our understanding of developmental methodology, and serve as a flagpost for organizing developmental methodology as a subdiscipline within the broader field of developmental science. The remaining chapters in this monograph address issues in design (sampling and big data), longitudinal data analysis, and issues of replication and research accumulation. The final chapter describes the history of developmental methodology, considers how the previous chapters in this monograph fit within this subdiscipline, and offers recommendations for further advancement. © 2017 The Society for Research in Child Development, Inc.
Advanced GIS Exercise: Predicting Rainfall Erosivity Index Using Regression Analysis
ERIC Educational Resources Information Center
Post, Christopher J.; Goddard, Megan A.; Mikhailova, Elena A.; Hall, Steven T.
2006-01-01
Graduate students from a variety of agricultural and natural resource fields are incorporating geographic information systems (GIS) analysis into their graduate research, creating a need for teaching methodologies that help students understand advanced GIS topics for use in their own research. Graduate-level GIS exercises help students understand…
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).
Code of Federal Regulations, 2012 CFR
2012-04-01
... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...
22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).
Code of Federal Regulations, 2011 CFR
2011-04-01
... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...
22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).
Code of Federal Regulations, 2014 CFR
2014-04-01
... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...
22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).
Code of Federal Regulations, 2010 CFR
2010-04-01
... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...
22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).
Code of Federal Regulations, 2013 CFR
2013-04-01
... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...
INHALATION EXPOSURE-RESPONSE ASSESSMENTS FOR FIVE CHEMICALS
Inhalation exposure-response assessments for five chemicals (acrolein, ethylene oxide, hexachlorocyclopentadiene, hydrogen sulfide, and phosgene) for less-than-lifetime durations are being developed to inform the development of the Inhalation Exposure-Response Analysis Methodology…
A human factors methodology for real-time support applications
NASA Technical Reports Server (NTRS)
Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.
1983-01-01
A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.
Methodological issues in medical workforce analysis: implications for regional Australia.
Hays, R B; Veitch, P C; Franklin, L; Crossland, L
1998-02-01
Medical workforce data have a profound impact on health policy formulation, but derived doctor population ratios (DPR) are often more relevant to plotting national trends than providing a detailed regional or local workforce perspective. Regional workforce data may be more useful if national approaches are augmented by local information. In developing a detailed workforce analysis for one region of Australia, the authors encountered several challenging methodological issues, including the accuracy of medical workforce databases, clarity of definition of community boundaries, interpretation of workforce definitions and the difficulty accounting for local community needs. This paper discusses the implications for regional workforce research.
An Analysis Methodology for the Gamma-ray Large Area Space Telescope
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Cohen-Tanugi, Johann
2004-01-01
The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
Non-linear forecasting in high-frequency financial time series
NASA Astrophysics Data System (ADS)
Strozzi, F.; Zaldívar, J. M.
2005-08-01
A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying the past price series. In our (off-line) analysis a positive gain may be obtained in all those series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
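State space reconstruction is commonly done with time-delay embedding (in the style of Takens), where each state vector collects lagged values of the series. A minimal sketch with a synthetic series and a nearest-neighbor analogue one-step forecast; the embedding dimension, delay, and forecast rule are illustrative, not the paper's trading rule.

```python
import numpy as np

def delay_embed(s, m, tau):
    """Time-delay embedding: rows are (s[t], s[t-tau], ..., s[t-(m-1)tau])."""
    start = (m - 1) * tau
    return np.column_stack(
        [s[start - k * tau:len(s) - k * tau] for k in range(m)])

rng = np.random.default_rng(2)
s = np.sin(np.linspace(0, 60, 1200)) + 0.05 * rng.normal(size=1200)  # toy series

m, tau = 3, 5
X = delay_embed(s, m, tau)
train, query = X[:-1], X[-1]          # past states and the current state
# Nearest-neighbor analogue forecast: follow the closest past state one step
i = np.argmin(np.linalg.norm(train[:-1] - query, axis=1))
print("one-step analogue forecast:", round(train[i + 1][0], 3))
```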
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables according to its nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology uses the original site information as directly as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
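A small sketch of the 2-tuple linguistic representation (in the style of the Herrera and Martínez model) that maps a normalized numeric assessment onto a labelled term scale with a symbolic offset. The five-term scale and the input value are hypothetical; the paper's transformation functions cover richer input types.

```python
TERMS = ["very low", "low", "medium", "high", "very high"]  # hypothetical scale

def to_two_tuple(value: float, terms=TERMS):
    """Map a normalized value in [0, 1] to a 2-tuple (term, symbolic offset)."""
    g = len(terms) - 1
    beta = value * g               # position on the term-index scale [0, g]
    i = int(round(beta))           # closest linguistic term
    alpha = round(beta - i, 3)     # offset, roughly in [-0.5, 0.5)
    return terms[i], alpha

print(to_two_tuple(0.62))  # ('medium', 0.48): between 'medium' and 'high'
```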
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture. A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and for defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
2010-08-01
This study presents a methodology for computing stochastic sensitivities with respect to the design variables, which are random variables.
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
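A toy fold of a one-stage decision tree by expected value, the elementary computation that the sequential-decision reformulation builds on; the plans, probabilities, and the contamination penalty below are invented for illustration.

```python
# Expected-utility comparison of two mission plans (illustrative numbers).
# Each outcome is (probability, value); contamination carries a large penalty.
plans = {
    "sterilize_lander": [(0.999, 100.0), (0.001, -10_000.0)],
    "no_sterilization": [(0.990, 120.0), (0.010, -10_000.0)],
}

expected = {name: sum(p * v for p, v in outcomes)
            for name, outcomes in plans.items()}
best = max(expected, key=expected.get)
print(expected, "-> choose", best)  # sterilize_lander wins here
```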
2011-01-01
Background Current methodological guidelines provide advice about the assessment of sub-group analysis within RCTs, but do not specify explicit criteria for assessment. Our objective was to provide researchers with a set of criteria that will facilitate the grading of evidence for moderators in systematic reviews. Method We developed a set of criteria from methodological manuscripts (n = 18) using a snowballing technique and electronic database searches. Criteria were reviewed by an international Delphi panel (n = 21), comprising authors who have published methodological papers in this area and researchers who have been active in the study of sub-group analysis in RCTs. We used the Research ANd Development/University of California Los Angeles appropriateness method to assess consensus on the quantitative data. Free responses were coded for consensus and disagreement. In a subsequent round, additional criteria were extracted from the Cochrane Reviewers' Handbook, and the process was repeated. Results The recommendations are that meta-analysts report both confirmatory and exploratory findings for sub-group analyses. Confirmatory findings must only come from studies in which a specific theory/evidence-based a priori statement is made. Exploratory findings may be used to inform future/subsequent trials. However, for inclusion in the meta-analysis of moderators, the following additional criteria should be applied to each study: baseline factors should be measured prior to randomisation, measurement of baseline factors should be of adequate reliability and validity, and a specific test of the interaction between baseline factors and interventions must be presented. Conclusions There is consensus from a group of 21 international experts that methodological criteria to assess moderators within systematic reviews of RCTs are both timely and necessary. The consensus from the experts resulted in five criteria, divided into two groups for synthesising evidence: confirmatory findings to support hypotheses about moderators and exploratory findings to inform future research. These recommendations are discussed in reference to previous recommendations for evaluating and reporting moderator studies. PMID:21281501
Jia, Pengli; Tang, Li; Yu, Jiajie; Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-03-06
To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). PubMed, EMBASE, Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. A total of 248 RCTs were eligible, of which 39 (15.7%) used a computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, of which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis or an associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information on how the issue was dealt with. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari
2002-01-01
The target of our study is to establish a methodology for analyzing the level of security requirements, for finding suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expression is introduced wherever possible, to allow easy follow-up of security procedures and easy evaluation of security outcomes or results. System analysis by fault tree analysis (FTA) showed that subdividing system elements in detail contributes to much more accurate analysis. Such subdivided composition factors depended very much on the behavior of staff, interactive terminal devices, kinds of service, and routes of network. In conclusion, we found methods to analyze the levels of security requirements for each medical information system employing FTA, with basic events for each composition factor and combinations of basic events. Methods for finding suitable security measures were identified, namely risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements. A method to optimize the security measures for each medical information system was proposed, namely the optimum distribution of risk factors in terms of basic events, making comparison between medical information systems possible.
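A minimal evaluation of a small fault tree over independent basic events, the computation FTA ultimately reduces to: AND gates multiply probabilities, OR gates combine as 1 - Π(1 - p). The event names and probabilities below are hypothetical, not from the study.

```python
# Top event: confidentiality breach = (weak password AND no audit trail)
#                                     OR unencrypted network route.
p = {"weak_password": 0.05, "no_audit": 0.20, "unencrypted_route": 0.01}

def gate_and(*probs):
    """AND gate over independent basic events."""
    out = 1.0
    for q in probs:
        out *= q
    return out

def gate_or(*probs):
    """OR gate over independent basic events: 1 - product of (1 - p)."""
    out = 1.0
    for q in probs:
        out *= (1.0 - q)
    return 1.0 - out

p_top = gate_or(gate_and(p["weak_password"], p["no_audit"]),
                p["unencrypted_route"])
print(f"P(top event) = {p_top:.4f}")  # 0.0199 with these numbers
```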
Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D
NASA Astrophysics Data System (ADS)
Bales, Ben; Pollock, Tresa; Petzold, Linda
2017-06-01
Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and the subsequent analysis process. The downside is that computing micrograph segmentations with data from morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for the analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients (HOG) feature descriptor, a classic tool in image analysis. The benefits of this methodology for the analysis of microstructure in two and three dimensions are demonstrated.
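A minimal sketch of the core feature-extraction step, assuming scikit-image is available; the synthetic blob stands in for a two-phase micrograph, and the HOG parameters are illustrative choices, not the authors'.

```python
# HOG features computed directly on an image, with no segmentation step.
import numpy as np
from skimage.feature import hog
from skimage.draw import disk

image = np.zeros((128, 128))
rr, cc = disk((64, 64), 20)
image[rr, cc] = 1.0  # one "precipitate" on a matrix background

features = hog(image, orientations=8, pixels_per_cell=(16, 16),
               cells_per_block=(1, 1), feature_vector=True)
print(features.shape)  # orientation histograms describing local shape
```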
Clerici, Nicola; Bodini, Antonio; Ferrarini, Alessandro
2004-10-01
In order to achieve improved sustainability, local authorities need tools that adequately describe and synthesize environmental information. This article illustrates a methodological approach that organizes a wide suite of environmental indicators into a few aggregated indices, making use of correlation, principal component analysis, and fuzzy sets. Furthermore, a weighting system, which includes stakeholders' priorities and ambitions, is applied. As a case study, the described methodology is applied to the Reggio Emilia Province in Italy, considering environmental information from 45 municipalities. Principal component analysis is used to condense an initial set of 19 indicators into 6 fundamental dimensions that highlight patterns of environmental conditions at the provincial scale. These dimensions are further aggregated into two indices of environmental performance through fuzzy sets. The simple form of these indices makes them particularly suitable for public communication, as they condense a wide set of heterogeneous indicators. The main outcomes of the analysis and the potential applications of the method are discussed.
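A minimal sketch of the PCA condensation step, assuming a municipalities-by-indicators matrix; the random data stand in for the 19 indicators over 45 municipalities, and the standardization choice is an assumption.

```python
# Condense 19 indicators into 6 dimensions with PCA (illustrative data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(45, 19))              # 45 municipalities x 19 indicators
X_std = StandardScaler().fit_transform(X)  # put indicators on comparable scales

pca = PCA(n_components=6)
dimensions = pca.fit_transform(X_std)      # 6 fundamental dimensions
print(pca.explained_variance_ratio_.sum()) # variance retained by the 6 dimensions
```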
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of an investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions (flood protection structures) using risk analysis methods. Applying the methodology within the environmental impact assessment process establishes a basis for further improvement and for more effective implementation and performance of this process. The main objective of the paper, to improve the implementation of the environmental impact assessment process, has been achieved through the use of risk analysis methods within it. Highlights: This paper is informed by an effort to develop research with the aims of: • Improving existing qualitative and quantitative methods for assessing impacts • A better understanding of relations between probabilities and consequences • A methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to a hazard, e.g. seismicity, storm surge, or hurricane winds, but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Because these networks already exist, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue, and information from field inspections may have a large impact on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been developed analytically. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis, as they are relevant to specific structures. Furthermore, network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems; the method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size: special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may indicate the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is the most important objective of such network analysis; however, it may only indicate which bridges to inspect or repair earliest and little else. High correlations influence such component importance measures negatively, and a regional approach is not appropriately modelled. To support a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection and repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component-based and regional optimal decisions using information from both network function and the further effects of infrastructure deterioration.
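Crude Monte Carlo simulation (crude MCS), one of the network analyses named above, is easy to illustrate. The sketch below estimates source-sink connectivity reliability for a toy lifeline network with independent edge failures; the network and failure probabilities are invented.

```python
# Crude MCS estimate of source-sink network reliability (illustrative network).
import random
import networkx as nx

edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]  # toy lifeline network
p_fail = {e: 0.1 for e in edges}                   # per-component failure probability

def estimate_reliability(n_samples=10_000, source=0, sink=3):
    connected = 0
    for _ in range(n_samples):
        surviving = [e for e in edges if random.random() > p_fail[e]]
        g = nx.Graph(surviving)
        if source in g and sink in g and nx.has_path(g, source, sink):
            connected += 1
    return connected / n_samples

print(f"Estimated source-sink reliability: {estimate_reliability():.3f}")
```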
ERIC Educational Resources Information Center
Sugimoto, Cassidy R.
2010-01-01
This study evaluated the development of scholars within the field of information and library science (ILS) with an emphasis on mentoring, collaboration, and interdisciplinarity in the process of doctoral education. Using methodological triangulation of more than 200 questionnaires, 30 interviews, and the bibliometric analysis of 97 dissertation…
Library and Information Resources and Users of Digital Resources in the Humanities
ERIC Educational Resources Information Center
Warwick, Claire; Terras, Melissa; Galina, Isabel; Huntington, Paul; Pappa, Nikoleta
2008-01-01
Purpose: The purpose of this article is to discuss the results of the Log Analysis of Internet Resources in the Arts and Humanities (LAIRAH) study. It aims to concentrate upon the use and importance of information resources, physical research centres and digital finding aids in scholarly research. Design/methodology/approach: Results are presented…
ERIC Educational Resources Information Center
Tamosiunas, Teodoras
2006-01-01
Purpose: The purpose of the research is to investigate how particular information from the environment serves as didactic material for students of Siauliai University Faculty of Social Sciences in learning to carry out scientific analysis and theoretical generalization of data in their theses. Methodology: The main sources--Internet databases,…
Program: A Record of the First 40 Years of Electronic Library and Information Systems
ERIC Educational Resources Information Center
Tedd, Lucy A.
2006-01-01
Purpose: To provide a broad overview of the history of the journal Program: electronic library and information systems and its contents over its first 40 years. Design/methodology/approach: Analysis of content from the original published material, as well as from abstracting and indexing publications and from minutes of Editorial Board meetings.…
Loucka, Martin; Payne, Sheila; Brearley, Sarah
2014-01-01
A number of research projects have been conducted that aim to gather data on the international development of palliative care. These data are important for policy makers and palliative care advocates. The aim of this article was to provide a critical comparative analysis of methodological approaches used to assess the development and status of palliative care services and infrastructure at an international level. A selective literature review that focused on the methodological features of eight identified reports was undertaken. Reviewed reports were found to differ in adopted methodologies and provided uneven amounts of methodological information. Five major methodological limitations were identified (lack of theory, use of experts as source of information, grey literature, difficulties in ranking, and the problematic nature of data on service provision). A set of recommendations on how to deal with these issues in future research is provided. Measuring the international development of palliative care is a difficult and challenging task. The results of this study could be used to improve the validity of future research in this field. Copyright © 2014 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
Bendeck, Murielle; Serrano-Blanco, Antoni; García-Alonso, Carlos; Bonet, Pere; Jordà, Esther; Sabes-Figuera, Ramon; Salvador-Carulla, Luis
2013-04-01
Cost of illness (COI) studies are carried out under conditions of uncertainty and with incomplete information. There are concerns regarding their generalisability, accuracy and usability in evidence-informed care. A hybrid methodology is used to estimate the regional costs of depression in Catalonia (Spain) following an integrative approach. The cross-design synthesis included nominal groups and quantitative analysis of both top-down and bottom-up studies, and incorporated primary and secondary data from different sources of information in Catalonia. Sensitivity analysis used probabilistic Monte Carlo simulation modelling. A dissemination strategy was planned, including a standard form adapted from cost-effectiveness studies to summarise methods and results. The method used allows for a comprehensive estimate of the cost of depression in Catalonia. Health officers and decision-makers concluded that this methodology provided useful information and knowledge for evidence-informed planning in mental health. The mix of methods, combined with a simulation model, contributed to a reduction in data gaps and, in conditions of uncertainty, supplied more complete information on the costs of depression in Catalonia. This approach to COI should be differentiated from other COI designs to allow like-with-like comparisons. A consensus on COI typology, procedures and dissemination is needed.
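A minimal sketch of the probabilistic Monte Carlo step, assuming cost of illness is driven by an uncertain prevalence and an uncertain mean cost per case; the distributions, parameters and population figure are invented, not the Catalan estimates.

```python
# Probabilistic sensitivity analysis of a cost-of-illness estimate (invented inputs).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
prevalence = rng.beta(40, 960, n)                    # uncertain prevalence
population = 7_500_000                               # assumed fixed population
cost_per_case = rng.lognormal(np.log(1800), 0.3, n)  # uncertain EUR/year per case

total_cost = prevalence * population * cost_per_case
low, mid, high = np.percentile(total_cost, [2.5, 50, 97.5])
print(f"Median {mid/1e6:.0f} MEUR (95% interval {low/1e6:.0f}-{high/1e6:.0f} MEUR)")
```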
Information Management for a Large Multidisciplinary Project
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Randall, Donald P.; Cronin, Catherine K.
1992-01-01
In 1989, NASA's Langley Research Center (LaRC) initiated the High-Speed Airframe Integration Research (HiSAIR) Program to develop and demonstrate an integrated environment for high-speed aircraft design using advanced multidisciplinary analysis and optimization procedures. The major goals of this program were to evolve the interactions among disciplines and promote sharing of information, to provide a timely exchange of information among aeronautical disciplines, and to increase the awareness of the effects each discipline has upon other disciplines. LaRC historically has emphasized the advancement of analysis techniques. HiSAIR was founded to synthesize these advanced methods into a multidisciplinary design process emphasizing information feedback among disciplines and optimization. Crucial to the development of such an environment are the definition of the required data exchanges and the methodology for both recording the information and providing the exchanges in a timely manner. These requirements demand extensive use of data management techniques, graphic visualization, and interactive computing. HiSAIR represents the first attempt at LaRC to promote interdisciplinary information exchange on a large scale using advanced data management methodologies combined with state-of-the-art, scientific visualization techniques on graphics workstations in a distributed computing environment. The subject of this paper is the development of the data management system for HiSAIR.
Implications of Contingency Planning Support for Weather and Icing Information
NASA Technical Reports Server (NTRS)
Vigeant-Langlois, Laurence; Hansman, R. John, Jr.
2003-01-01
A human-centered systems analysis was applied to the adverse aircraft weather encounter problem in order to identify desirable functions of weather and icing information. The importance of contingency planning was identified as emerging from a system safety design methodology as well as from results of other aviation decision-making studies. The relationship between contingency planning support and information on regions clear of adverse weather was investigated in a scenario-based analysis. A rapid prototype example of the key elements in the depiction of icing conditions was developed in a case study, and the implications for the components of the icing information system were articulated.
Kohli, R; Tan, J K; Piontek, F A; Ziege, D E; Groot, H
1999-08-01
Changes in health care delivery, reimbursement schemes, and organizational structure have required health organizations to manage the costs of providing patient care while maintaining high levels of clinical and patient satisfaction outcomes. Today, cost information, clinical outcomes, and patient satisfaction results must become more fully integrated if strategic competitiveness and benefits are to be realized in health management decision making, especially in multi-entity organizational settings. Unfortunately, traditional administrative and financial systems are not well equipped to cater to such information needs. This article presents a framework for the acquisition, generation, analysis, and reporting of cost information with clinical outcomes and patient satisfaction in the context of evolving health management and decision-support system technology. More specifically, the article focuses on an enhanced costing methodology for determining and producing improved, integrated cost-outcomes information. Implementation issues and areas for future research in cost-information management and decision-support domains are also discussed.
Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat
2016-05-15
Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps: in step 1, the complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its frequency content. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and detect features to be used for MCI detection. In step 3, an analysis of variance (ANOVA) is used for feature selection. In step 4, the enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI patients and healthy controls. The usefulness and effectiveness of the proposed methodology are validated using MEG data obtained experimentally from 18 MCI and 19 control patients. Copyright © 2016 Elsevier B.V. All rights reserved.
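Step 2, permutation entropy, is self-contained enough to sketch. The implementation below follows the standard ordinal-pattern definition; the embedding order, delay and test signal are illustrative choices, not the paper's settings.

```python
# Normalised permutation entropy of a 1-D signal (standard ordinal-pattern form).
import numpy as np
from collections import Counter
from math import log, factorial

def permutation_entropy(x, order=3, delay=1):
    """Shannon entropy of ordinal patterns, normalised to [0, 1]."""
    patterns = Counter(
        tuple(np.argsort(x[i:i + order * delay:delay]))
        for i in range(len(x) - (order - 1) * delay)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * log(c / total) for c in patterns.values())
    return h / log(factorial(order))  # 0 = fully regular, 1 = fully random

signal = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * np.random.randn(1000)
print(permutation_entropy(signal, order=3))
```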
Evaluation of health information systems research in information systems research: A meta-analysis.
Haried, Peter; Claybaugh, Craig; Dai, Hua
2017-04-01
Given the importance of the health-care industry and the promise of health information systems, researchers are encouraged to build on the shoulders of giants as the saying goes. The health information systems field has a unique opportunity to learn from and extend the work that has already been done by the highly correlated information systems field. As a result, this research article presents a past, present and future meta-analysis of health information systems research in information systems journals over the 2000-2015 time period. Our analysis reviewed 126 articles on a variety of topics related to health information systems research published in the "Senior Scholars" list of the top eight ranked information systems academic journals. Across the selected information systems academic journals, our findings compare research methodologies applied, health information systems topic areas investigated and research trends. Interesting results emerge in the range and evolution of health information systems research and opportunities for health information systems researchers and practitioners to consider moving forward.
Roetzheim, Richard G; Freund, Karen M; Corle, Don K; Murray, David M; Snyder, Frederick R; Kronman, Andrea C; Jean-Pierre, Pascal; Raich, Peter C; Holden, Alan Ec; Darnell, Julie S; Warren-Mears, Victoria; Patierno, Steven
2012-04-01
The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, with similar clinical criteria but with different study designs. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from the members of the PNRP Design and Analysis Committee. To review possible methodologies for analyzing combined data arising from heterogeneous study designs. The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. The conclusions were based on simple consensus. The five approaches reviewed included the following: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and to accommodate differing project sample sizes. The conclusions reached were based on expert opinion and not derived from actual analyses performed. The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multisite community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs.
Analysis of the Impacts of an Early Start for Compliance with the Kyoto Protocol
1999-01-01
This report describes the Energy Information Administration's analysis of the impacts of an early start, using the same methodology as in Impacts of the Kyoto Protocol on U.S. Energy Markets and Economic Activity, with only those changes in assumptions caused by the early start date.
CODAP: Programmer Notes for the Subroutine Library on the Univac 1108.
ERIC Educational Resources Information Center
Weissmuller, Johnny J.; And Others
The Comprehensive Occupational Data Analysis Programs (CODAP) package is a highly interactive and efficient system of computer routines for analyzing, organizing, and reporting occupational information. Since its inception in 1960, CODAP has grown in tandem with advances in job analysis methodology and is now capable of answering most of the wide…
The US EPA’s ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki
2014-01-01
Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
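A minimal sketch of distance-based outlier detection in multi-dimensional space, assuming each case is a vector of financial indices; the Mahalanobis distance and the 99th-percentile threshold are concrete choices for illustration, not necessarily the paper's.

```python
# Flag cases whose distance from the bulk of cases is unusually large.
import numpy as np

rng = np.random.default_rng(7)
cases = rng.normal(size=(200, 4))  # 200 cases x 4 financial indices (invented)
cases[0] = [8, 8, 8, 8]            # one case with a deviant income structure

mean = cases.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(cases, rowvar=False))
diff = cases - mean
d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)  # squared Mahalanobis distances

threshold = np.percentile(d2, 99)
print(np.where(d2 > threshold)[0])  # indices of flagged outlier cases
```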
Formal verification of human-automation interaction
NASA Technical Reports Server (NTRS)
Degani, Asaf; Heymann, Michael
2002-01-01
This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
NASA Astrophysics Data System (ADS)
Banfi, F.
2017-08-01
The Architecture, Engineering and Construction (AEC) industry is facing a major re-engineering of its management procedures for new constructions, and recent studies show a significant increase in the benefits obtained through the use of Building Information Modelling (BIM) methodologies. This innovative approach requires new developments in information and communication technologies (ICT) in order to improve cooperation and interoperability among different actors and scientific disciplines. Accordingly, BIM can be described as a new tool capable of collecting and analysing a great quantity of information (big data) and improving the management of a building during its life cycle (LC). The main aim of this research, in addition to reducing production times and physical and financial resources (economic impact), is to demonstrate how technological development can support a complex generative process with new digital tools (modelling impact). This paper reviews recent BIMs of different historical Italian buildings, such as the Basilica of Collemaggio in L'Aquila, Masegra Castle in Sondrio, the Basilica of Saint Ambrose in Milan and the Visconti Bridge in Lecco, and carries out a methodological analysis to optimize output information and results by combining different data and modelling techniques into a single hub (cloud service) through the use of a new Grade of Generation (GoG) and Grade of Information (GoI) (management impact). Finally, this study shows the need to orient GoG and GoI towards different types of analysis, which require a high Grade of Accuracy (GoA) and an Automatic Verification System (AVS) at the same time.
NASA Astrophysics Data System (ADS)
Bell, A.; Tang, G.; Yang, P.; Wu, D.
2017-12-01
Due to their high spatial and temporal coverage, cirrus clouds have a profound role in regulating the Earth's energy budget. Variability of their radiative, geometric, and microphysical properties can pose significant uncertainties in global climate model simulations if not adequately constrained. Thus, the development of retrieval methodologies able to accurately retrieve ice cloud properties and report the associated uncertainties is essential. The effectiveness of cirrus cloud retrievals relies on accurate a priori understanding of ice radiative properties, as well as of the current state of the atmosphere. Recent studies have applied information content analyses prior to retrievals to quantify the amount of information that should be expected on the parameters to be retrieved, as well as the relative contribution of information provided by particular measurement channels. Through this analysis, retrieval algorithms can be designed to maximize the information in the measurements, and therefore to ensure that enough information is present to retrieve ice cloud properties. In this study, we present such an information content analysis to quantify the amount of information to be expected in retrievals of cirrus ice water path and particle effective diameter using sub-millimeter and thermal infrared radiometry. Preliminary results show these bands to be sensitive to changes in ice water path and effective diameter, and thus lend confidence in their ability to simultaneously retrieve these parameters. Further quantification of the sensitivity and the information provided by these bands can then be used to design an optimal retrieval scheme. While this information content analysis is applied to a theoretical retrieval combining simulated radiance measurements, the methodology is in general applicable to any instrument or retrieval approach.
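Assuming the analysis follows the usual optimal-estimation (Rodgers-style) formalism, which the abstract does not confirm, the information content calculation reduces to a few matrix operations. The Jacobian and covariance values below are invented stand-ins for a two-channel, two-parameter retrieval.

```python
# Degrees of freedom for signal and Shannon information content
# for a linear retrieval (illustrative matrices, Rodgers-style formalism).
import numpy as np

K = np.array([[1.5, 0.2],   # sensitivity of 2 channels (sub-mm, IR)
              [0.3, 1.1]])  # to ice water path and effective diameter
Sa = np.diag([1.0, 1.0])    # prior covariance of the two parameters
Se = np.diag([0.1, 0.1])    # measurement-error covariance

S_hat = np.linalg.inv(K.T @ np.linalg.inv(Se) @ K + np.linalg.inv(Sa))
A = S_hat @ K.T @ np.linalg.inv(Se) @ K   # averaging kernel

dofs = np.trace(A)                        # degrees of freedom for signal
H = 0.5 * np.log(np.linalg.det(Sa) / np.linalg.det(S_hat))  # in nats
print(f"DOFS = {dofs:.2f}, H = {H:.2f} nats")
```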
Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia
2015-11-01
The purpose of this work was to evaluate the contrast-detail performance of full-field digital mammography (FFDM) systems using ideal (Hotelling) observer signal-to-noise ratio (SNR) methodology and to ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signal originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter, as in CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values than the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of an FFDM system's contrast-detail characteristics and could be considered a surrogate for the conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
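The Hotelling observer SNR for a known signal in correlated noise is SNR² = sᵀC⁻¹s. The sketch below evaluates it on a 1-D detector profile with an invented disc-like signal and noise covariance; none of the values are modelled on the five FFDM units.

```python
# Ideal (Hotelling) observer SNR for a known signal in correlated noise.
import numpy as np

n = 64                                  # 1-D detector profile for simplicity
x = np.arange(n)
signal = 0.01 * np.exp(-0.5 * ((x - 32) / 2.0) ** 2)  # small disc-like contrast

# stationary correlated noise with an exponential covariance (invented)
C = 1e-4 * np.exp(-np.abs(x[:, None] - x[None, :]) / 3.0)

snr = np.sqrt(signal @ np.linalg.solve(C, signal))    # SNR^2 = s^T C^-1 s
print(f"Hotelling SNR = {snr:.2f}")
```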
Assessing the Need for Semi-Dependent Housing for the Elderly
ERIC Educational Resources Information Center
Newcomer, Robert J.; And Others
1976-01-01
The need for quantitative information on the depth of the semi-dependent housing problems of the elderly is significant. This paper reports the findings and methodology of a 21-state market feasibility analysis. (Author)
Contaminated site cleanups involving complex activities may benefit from a detailed environmental footprint analysis to inform decision-making about application of suitable best management practices for greener cleanups.
Konstantinidis, Georgios; Anastassopoulos, George C; Karakos, Alexandros S; Anagnostou, Emmanouil; Danielides, Vasileios
2012-04-01
The aim of this study is to present our perspectives on healthcare analysis and design and the lessons learned from our experience with the development of a distributed, object-oriented Clinical Information System (CIS). In order to overcome known issues regarding the development, implementation and, finally, acceptance of a CIS by physicians, we decided to develop a novel object-oriented methodology by integrating usability principles and techniques into a simplified version of a well-established software engineering process (SEP), the Unified Process (UP). A multilayer architecture has been defined and implemented with the use of a vendor application framework. Our first experiences from a pilot implementation of our CIS are positive. This approach allowed us to gain a socio-technical understanding of the domain and enabled us to identify all the important factors that define both the structure and the behavior of a Health Information System.
Applying thematic analysis theory to practice: a researcher's experience.
Tuckett, Anthony G
2005-01-01
This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.
NASA Astrophysics Data System (ADS)
McKinney, D. C.; Cuellar, A. D.
2015-12-01
Climate change has accelerated glacial retreat in high-altitude glaciated regions of Nepal, leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOFs) are sudden events triggered by an earthquake, moraine failure or other shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty of predicting them, and the enormous quantity of water and debris rapidly flooding downstream areas. Imja Lake in the Himalaya of Nepal has experienced accelerated growth since it first appeared in the 1960s. Communities threatened by a flood from Imja Lake have advocated for projects to adapt to the increasing threat of a GLOF. Nonetheless, discussions surrounding projects for Imja have not included a rigorous analysis of the potential consequences of a flood, the probability of an event, or the costs of mitigation projects, in part because this information is unknown or uncertain. This work demonstrates a decision-making methodology developed to rationally analyze the risks posed by Imja Lake and the various adaptation projects proposed, using available information. The authors use decision analysis, data envelopment analysis (DEA), and sensitivity analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding, and estimate fatalities using an empirical method developed for dam failures. The DEA methodology allows us to estimate the value of a statistical life implied by each project, given the cost of the project and the number of lives saved, in order to determine which project is the most efficient. In contrast, the decision analysis methodology requires fatalities to be assigned a cost but allows the inclusion of uncertainty in the decision-making process. We compare the output of these two methodologies and determine the sensitivity of the conclusions to changes in uncertain input parameters, including project cost, value of a statistical life, and time to a GLOF event.
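The implied-value-of-a-statistical-life comparison reduces, in its simplest form, to a cost-per-life-saved ratio for each project. The sketch below ranks invented projects on that ratio; the names, costs and fatality estimates are hypothetical and the full DEA machinery is deliberately omitted.

```python
# Cost per statistical life saved, the simplest form of the DEA-style comparison.
projects = {
    "lower lake level":      {"cost": 3.0e6, "lives_saved": 12},
    "early warning system":  {"cost": 0.8e6, "lives_saved": 7},
    "reinforce moraine":     {"cost": 5.5e6, "lives_saved": 14},
}

ranked = sorted(projects.items(),
                key=lambda kv: kv[1]["cost"] / kv[1]["lives_saved"])
for name, p in ranked:
    ratio = p["cost"] / p["lives_saved"]
    print(f"{name:22s} implied cost per life saved: {ratio:.2e} USD")
```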
A methodological systematic review of what's wrong with meta-ethnography reporting.
France, Emma F; Ring, Nicola; Thomas, Rebecca; Noyes, Jane; Maxwell, Margaret; Jepson, Ruth
2014-11-19
Syntheses of qualitative studies can inform health policy, services and our understanding of patient experience. Meta-ethnography is a systematic seven-phase interpretive qualitative synthesis approach well-suited to producing new theories and conceptual models. However, there are concerns about the quality of meta-ethnography reporting, particularly the analysis and synthesis processes. Our aim was to investigate the application and reporting of methods in recent meta-ethnography journal papers, focusing on the analysis and synthesis process and output. Methodological systematic review of health-related meta-ethnography journal papers published from 2012-2013. We searched six electronic databases, Google Scholar and Zetoc for papers using key terms including 'meta-ethnography.' Two authors independently screened papers by title and abstract with 100% agreement. We identified 32 relevant papers. Three authors independently extracted data and all authors analysed the application and reporting of methods using content analysis. Meta-ethnography was applied in diverse ways, sometimes inappropriately. In 13% of papers the approach did not suit the research aim. In 66% of papers reviewers did not follow the principles of meta-ethnography. The analytical and synthesis processes were poorly reported overall. In only 31% of papers reviewers clearly described how they analysed conceptual data from primary studies (phase 5, 'translation' of studies) and in only one paper (3%) reviewers explicitly described how they conducted the analytic synthesis process (phase 6). In 38% of papers we could not ascertain if reviewers had achieved any new interpretation of primary studies. In over 30% of papers seminal methodological texts which could have informed methods were not cited. We believe this is the first in-depth methodological systematic review of meta-ethnography conduct and reporting. Meta-ethnography is an evolving approach. Current reporting of methods, analysis and synthesis lacks clarity and comprehensiveness. This is a major barrier to use of meta-ethnography findings that could contribute significantly to the evidence base because it makes judging their rigour and credibility difficult. To realise the high potential value of meta-ethnography for enhancing health care and understanding patient experience requires reporting that clearly conveys the methodology, analysis and findings. Tailored meta-ethnography reporting guidelines, developed through expert consensus, could improve reporting.
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities related to the definition and creation of telerobotic systems are described. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
Perception: a concept analysis.
McDonald, Susan M
2012-02-01
Concept analysis methodology by Walker and Avant (2005) was used to define, describe, and delimit the concept of perception. Nursing literature in the Medline database was searched for definitions of "perception." Definitions, uses, and defining attributes of perception were identified; model and contrary cases were developed; and antecedents, consequences, and empirical referents were determined. An operational definition for the concept was developed. Nurses need to be cognizant of how perceptual differences impact the delivery of nursing care. In research, a mixed methodology approach may yield a richer description of the phenomenon and provide useful information for clinical practice. © 2011, The Author. International Journal of Nursing Knowledge © 2011, NANDA International.
NASA Astrophysics Data System (ADS)
Vazquez Rascon, Maria de Lourdes
This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods, multicriteria decision aid (MCDA) and participatory geographic information systems (GIS), making it possible to represent these value systems by criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of the environmental, economic and socio-cultural aspects of wind energy projects and, thus, a vision of sustainable wind project development. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were defined to structure the work and to ensure the development of a successful tool: designing a methodology coupling MCDA and participatory GIS, testing the developed methodology in a case study, conducting a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective produced a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of co-constructed decisions, with all stakeholders at the centre of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows the analysis of different wind turbine implementation scenarios in order to choose the best one through a participatory and transparent decision-making process that takes stakeholders' concerns into account. The second objective enabled the testing of TIMED in an ex-post study of a wind farm in operation since 2006. In this test, 11 people participated, representing four stakeholder categories: the private sector, the public sector, experts and civil society. The test allowed us to analyze the current conditions under which wind projects are developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through the third objective, which allowed simulations taking into account strategic-level assumptions, such as the communication tools used to approach the host community and the type of ownership of the wind farm. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
The contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes socio-cultural, environmental and economic variables into account; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking the physical, intellectual, emotional and procedural spaces into account to articulate a participatory decision; using the proposed methodology for renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy project development, renewable energy, social participation, robustness analysis, SWOT analysis.
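A minimal sketch of the kind of MCDA scoring such a tool might perform, assuming criteria scores are already normalised to [0, 1]; the scenarios, criteria and weights are invented, and a weighted sum is only one of many MCDA aggregation rules.

```python
# Weighted-sum ranking of wind farm layout scenarios (illustrative inputs).
import numpy as np

criteria = ["environmental", "economic", "socio-cultural"]
weights = np.array([0.4, 0.3, 0.3])   # stakeholder-elicited priorities (invented)

scenarios = {                         # normalised performance per criterion
    "layout A": np.array([0.7, 0.8, 0.5]),
    "layout B": np.array([0.9, 0.5, 0.6]),
    "layout C": np.array([0.6, 0.7, 0.8]),
}

for name, scores in sorted(scenarios.items(), key=lambda kv: -(kv[1] @ weights)):
    print(f"{name}: {scores @ weights:.2f}")
```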
Evaluation methodologies for an advanced information processing system
NASA Technical Reports Server (NTRS)
Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.
1984-01-01
The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
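Markov reliability modelling of the kind mentioned above can be sketched with a small continuous-time chain. The duplex-processor structure, failure rate and repair rate below are illustrative assumptions, not AIPS figures.

```python
# Reliability of a duplex system from a 3-state continuous-time Markov chain.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-1  # per-hour failure and repair rates (invented)
# states: 0 = both units up, 1 = one failed, 2 = system failed (absorbing)
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [mu, -(mu + lam), lam],
              [0.0, 0.0, 0.0]])

t = 1000.0                  # mission time in hours
P = expm(Q * t)             # transition probabilities over the mission
print(f"Reliability at {t:.0f} h: {1 - P[0, 2]:.6f}")
```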
BIM Methodology Approach to Infrastructure Design: Case Study of Paniga Tunnel
NASA Astrophysics Data System (ADS)
Osello, Anna; Rapetti, Niccolò; Semeraro, Francesco
2017-10-01
Nowadays, the implementation of Building Information Modelling (BIM) in civil design represents a new challenge for the AECO (Architecture, Engineering, Construction, Owner and Operator) world, one that will attract the interest of many researchers in the coming years. This is driven by the incentives of public administrations and European directives that aim to improve efficiency and to enable better management of the complexity of infrastructure projects. For these reasons, the goal of this research is to propose a methodology for the use of BIM in a tunnel project, analysing the definition of a correct level of detail (LOD) and the possibility of sharing information via interoperability for FEM analysis.
Fontecha, Jesús; Hervás, Ramón; Mondéjar, Tania; González, Iván; Bravo, José
2015-10-01
One of the main challenges in Ambient Assisted Living (AAL) is to reach an appropriate level of acceptance of assistive systems, as well as to analyze and monitor end-user tasks in a feasible and efficient way. The development and evaluation of AAL solutions from a user-centered perspective help to achieve these goals. In this work, we have designed a methodology for developing and integrating analytic, user-centered tools into assistive systems. An analysis software tool gathers information on end users from adapted psychological questionnaires and from naturalistic observation of their own context. The aim is to enable an in-depth analysis focused on improving the quality of life of elderly people and their caregivers.
Building toy models of proteins using coevolutionary information
NASA Astrophysics Data System (ADS)
Cheng, Ryan; Raghunathan, Mohit; Onuchic, Jose
2015-03-01
Recent developments in global statistical methodologies have advanced the analysis of large collections of protein sequences for coevolutionary information. Coevolution between amino acids in a protein arises from compensatory mutations that are needed to maintain the stability or function of a protein over the course of evolution. This gives rise to quantifiable correlations between amino acid positions within the multiple sequence alignment of a protein family. Here, we use Direct Coupling Analysis (DCA) to infer a Potts model Hamiltonian governing the correlated mutations in a protein family to obtain the sequence-dependent interaction energies of a toy protein model. We demonstrate that this methodology predicts residue-residue interaction energies that are consistent with experimental mutational changes in protein stabilities as well as other computational methodologies. Furthermore, we demonstrate with several examples that DCA could be used to construct a structure-based model that quantitatively agrees with experimental data on folding mechanisms. This work serves as a potential framework for generating models of proteins that are enriched by evolutionary data that can potentially be used to engineer key functional motions and interactions in protein systems. This research has been supported by the NSF INSPIRE award MCB-1241332 and by the CTBP sponsored by the NSF (Grant PHY-1427654).
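Full DCA fits a global Potts model, which is beyond a short sketch; as a simpler stand-in, the example below scores covariation between alignment columns with mutual information, which captures the same raw correlations that DCA disentangles. The toy alignment is invented.

```python
# Mutual information between columns of a toy multiple sequence alignment.
from collections import Counter
from math import log

msa = ["ACDEA", "ACDEA", "GCDFG", "GCDFG", "ACDEA"]  # 5 sequences x 5 positions

def mutual_information(i, j):
    n = len(msa)
    pi = Counter(s[i] for s in msa)
    pj = Counter(s[j] for s in msa)
    pij = Counter((s[i], s[j]) for s in msa)
    return sum((c / n) * log((c / n) / ((pi[a] / n) * (pj[b] / n)))
               for (a, b), c in pij.items())

print(f"MI(0,4) = {mutual_information(0, 4):.3f}")  # perfectly coupled columns
print(f"MI(0,2) = {mutual_information(0, 2):.3f}")  # conserved column, MI = 0
```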
Makar, Susan; Malanowski, Amanda; Rapp, Katie
2016-01-01
The Information Services Office (ISO) of the National Institute of Standards and Technology (NIST) proactively sought out an opportunity to present the findings of a study that showed the impact of NIST’s forensic research output to its internal customers and outside researchers. ISO analyzed the impact of NIST’s contributions to the peer-reviewed forensic journal literature through citation analysis and network visualizations. The findings of this study were compiled into a poster that was presented during the Forensics@NIST Symposium in December 2014. ISO’s study informed the forensic research community where NIST has had some of the greatest scholarly impact. This paper describes the methodology used to assess the impact of NIST’s forensic publications and shares the results, outcomes, and impacts of ISO’s study and poster presentation. This methodology is adaptable and applicable to other research fields and to other libraries. It has improved the recognition of ISO’s capabilities within NIST and resulted in application of the methodology to additional scientific disciplines. PMID:27956754
NASA Technical Reports Server (NTRS)
Olivas, J. D.; Melroy, P.; McDanels, S.; Wallace, T.; Zapata, M. C.
2006-01-01
In connection with the accident investigation of the space shuttle Columbia, an analysis methodology utilizing well-established microscopic and spectroscopic techniques was implemented for evaluating the environment to which the exterior fused silica glass was exposed. Through the implementation of optical microscopy, scanning electron microscopy, energy dispersive spectroscopy, transmission electron microscopy, and electron diffraction, details emerged regarding the manner in which a charred metallic deposited layer formed on top of the exposed glass. Due to the nature of the substrate and the materials deposited, the methodology allowed a more detailed analysis of the vehicle breakup. By contrast, similar analytical methodologies on metallic substrates have proven challenging due to the strong potential for error resulting from substrate contamination. This information proved valuable not only to those involved in investigating the breakup of Columbia, but also provides a potential guide for investigating future high-altitude, high-energy accidents.
A methodology for the assessment of flood hazards at the regional scale
NASA Astrophysics Data System (ADS)
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Marcomini, Antonio
2013-04-01
In recent years, the frequency of water-related disasters has increased, and recent flood events in Europe (e.g. 2002 in Central Europe, 2007 in the UK, 2010 in Italy) caused physical-environmental and socio-economic damage. Specifically, floods are the most threatening water-related disaster affecting humans, their lives and their properties. Within the KULTURisk project (FP7), a Regional Risk Assessment (RRA) methodology is proposed to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The method is based on the KULTURisk framework and allows the identification and prioritization of targets (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritage) and areas at risk from floods in the considered region by comparing the baseline scenario (i.e. the current state) with alternative scenarios (i.e. where different structural and/or non-structural measures are planned). The RRA methodology is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). The final aim of RRA is to help decision-makers examine the possible environmental risks associated with uncertain future flood hazards and identify which prevention scenario could be the most suitable. The RRA methodology employs Multi-Criteria Decision Analysis (MCDA) functions in order to integrate stakeholder preferences and expert judgments into the analysis. Moreover, Geographic Information Systems (GIS) are used to manage, process, analyze, and map data to facilitate the analysis and information sharing with different experts and stakeholders. In order to characterize flood risks, the proposed methodology integrates the output of hydrodynamic models with the analysis of site-specific bio-geophysical and socio-economic indicators (e.g. slope of the territory, land cover, population density, economic activities) for several case studies, in order to develop risk maps that identify and prioritize relative hot-spot areas and targets at risk at the regional scale. The main outputs of the RRA are receptor-based risk maps useful for communicating the potential implications of floods in non-monetary terms to stakeholders and administrations. These maps can be a basis for the management of flood risks, as they can provide information about the indicative number of inhabitants, the types of economic activities, and the natural systems and cultural heritage potentially affected by flooding. Moreover, they can provide suitable information about flood risk in the considered area in order to define priorities for prevention measures and for land use planning and management. Finally, the outputs of the RRA methodology can be used as input data in the Socio-Economic Regional Risk Assessment methodology for the economic evaluation of different damages (e.g. tangible costs, intangible costs) and for the social assessment considering the benefits of the human dimension of vulnerability (i.e. adaptive and coping capacity). Within the KULTURisk project, the methodology has been applied and validated in several European case studies. Moreover, its generalization to address other types of natural hazards (e.g. earthquakes, forest fires) will be evaluated. The preliminary results of the RRA application in the KULTURisk project are presented and discussed here.
ERIC Educational Resources Information Center
Hsieh, Chueh-An; Maier, Kimberly S.
2009-01-01
The capacity of Bayesian methods in estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information to empirical data analysis, model averaging and model selection.…
General RMP Guidance - Chapter 4: Offsite Consequence Analysis
This chapter provides basic compliance information, not modeling methodologies, for people who plan to do their own air dispersion modeling. OCA is a required part of the risk management program, and involves worst-case and alternative release scenarios.
DOT National Transportation Integrated Search
2005-01-01
In 2003, an Internet-based Geotechnical Database Management System (GDBMS) was developed for the Virginia Department of Transportation (VDOT) using distributed Geographic Information System (GIS) methodology for data management, archival, retrieval, ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, S.M.
1997-04-30
This chapter provides information on the physical, chemical, and biological characteristics of the waste stored at the 616 NRDWSF. A waste analysis plan is included that describes the methodology used for determining waste types.
Moving beyond the Bar Plot and the Line Graph to Create Informative and Attractive Graphics
ERIC Educational Resources Information Center
Larson-Hall, Jenifer
2017-01-01
Graphics are often mistaken for a mere frill in the methodological arsenal of data analysis when in fact they can be one of the simplest and at the same time most powerful methods of communicating statistical information (Tufte, 2001). The first section of the article argues for the statistical necessity of graphs, echoing and amplifying similar…
ERIC Educational Resources Information Center
Shubik, Martin
The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…
Bayesian data analysis for newcomers.
Kruschke, John K; Liddell, Torrin M
2018-02-01
This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
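As a minimal illustration of the kind of directly interpretable result the article describes, the sketch below runs a conjugate Beta-Binomial analysis of a proportion; the prior and the data are invented.

```python
# Beta-Binomial posterior for a proportion: prior Beta(2,2), 14 successes, 6 failures.
from scipy.stats import beta

a_prior, b_prior = 2, 2        # mildly informative prior on the rate
successes, failures = 14, 6    # observed data (invented)

posterior = beta(a_prior + successes, b_prior + failures)
lo, hi = posterior.interval(0.95)
print(f"Posterior mean {posterior.mean():.2f}, "
      f"95% credible interval ({lo:.2f}, {hi:.2f})")
```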
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormone analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ferrer, Laetitia; Curt, Corinne; Tacnet, Jean-Marc
2018-04-01
Major hazard prevention is a main challenge, given that it relies specifically on information communicated to the public. In France, preventive information is notably provided through local regulatory documents. Unfortunately, the law specifies little about their content, so one can question how the way a document is actually created affects the general population. The purpose of our work is therefore to propose an analytical methodology for evaluating the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied in this paper to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). DICRIMs must be produced by mayors and addressed to the public to provide information on the major hazards affecting their municipalities. Compliance with the law is analyzed by identifying regulatory detection elements, which are applied to a database of 30 DICRIMs. This analysis leads to a discussion of points such as the usefulness of the missing elements. External and internal function analysis identifies the form and content requirements as well as the service and technical functions of the document and its components (here, its sections). These results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define failures and identify detection elements, and thus to evaluate the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. These results will be used to build, in future work, a decision-support model for the municipalities (or specialized consulting firms) in charge of drawing up the documents.
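The paper's FMEA tables are not reproduced here; the sketch below only illustrates the classic FMEA scoring arithmetic (risk priority number = severity x occurrence x detection) on invented failure modes for document sections.

```python
# Classic FMEA scoring: each failure mode gets severity (S), occurrence (O)
# and detection (D) ratings on a 1..10 scale; RPN = S * O * D.
# The failure modes below are invented for illustration only.
failure_modes = [
    ("flood map section missing",        9, 4, 3),
    ("alert signal description unclear", 6, 5, 4),
    ("contact information outdated",     4, 7, 2),
]

# Rank by RPN so the riskiest failure modes are treated first.
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={s * o * d:4d}  {name}")
```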
Calhoun, Aaron W; Rider, Elizabeth A; Peterson, Eleanor; Meyer, Elaine C
2010-09-01
Multi-rater assessment with gap analysis is a powerful method for assessing communication skills and self-insight and for enhancing self-reflection. We demonstrate the use of this methodology. The Program for the Approach to Complex Encounters (PACE) is an interdisciplinary simulation-based communication skills program. Encounters are assessed using an expanded Kalamazoo Consensus Statement Essential Elements Checklist adapted for multi-rater feedback and gap analysis. Data from a representative conversation were analyzed. Likert and forced-choice data with gap analysis are used to assess performance. Participants were strong in Demonstrating Empathy and Providing Closure, and needed to improve Relationship Building, Gathering Information, and understanding the Patient's/Family's Perspective. Participants under-appraised their abilities in Relationship Building, Providing Closure, and Demonstrating Empathy, as well as their overall performance. The conversion of these results into verbal feedback is discussed. We describe an evaluation methodology using multi-rater assessment with gap analysis to assess communication skills and self-insight. This methodology enables faculty to identify undervalued skills and perceptual blind spots, provide comprehensive, data-driven feedback, and encourage reflection. Implementation of graphical feedback forms coupled with one-on-one discussion using the above methodology has the potential to enhance trainee self-awareness and reflection, improving the impact of educational programs. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
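A minimal sketch of the gap-analysis arithmetic, with invented ratings: the gap is the self-rating minus the mean of the other raters' ratings, so negative gaps flag under-appraised (undervalued) skills.

```python
# Gap analysis: compare a participant's self-rating with the mean rating
# given by other raters for each skill; skills and ratings are invented.
skills = ["relationship building", "gathering information", "empathy"]
self_rating = {"relationship building": 2.0,
               "gathering information": 3.0,
               "empathy": 3.0}
other_ratings = {
    "relationship building": [4.0, 3.5, 4.5],
    "gathering information": [3.0, 2.5, 3.5],
    "empathy": [4.5, 4.0, 4.0],
}

for skill in skills:
    others = sum(other_ratings[skill]) / len(other_ratings[skill])
    gap = self_rating[skill] - others
    label = ("under-appraised" if gap < 0
             else "over-appraised" if gap > 0 else "aligned")
    print(f"{skill:24s} self={self_rating[skill]:.1f} "
          f"others={others:.2f} gap={gap:+.2f} ({label})")
```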
Unsupervised user similarity mining in GSM sensor networks.
Shad, Shafqat Ali; Chen, Enhong
2013-01-01
Mobility data has attracted researchers over the past few years because of its rich context and spatiotemporal nature; this information can be used for potential applications such as early warning systems, route prediction, traffic management, advertisement, social networking, and community finding. All the mentioned applications are based on mobility profile building and user trend analysis, where mobility profile building is done through significant-place extraction, prediction of users' actual movements, and context awareness. However, significant-place extraction and prediction of users' actual movements for mobility profile building are nontrivial tasks. In this paper, we present a user similarity mining methodology based on user mobility profile building, using the semantic tagging information provided by users and basic GSM network architecture properties, with an unsupervised clustering approach. As the mobility information is in low-level raw form, our proposed methodology successfully converts it to high-level meaningful information by using cell-ID location information, rather than previously used location-capturing methods such as GPS, infrared, and Wi-Fi, for profile mining and user similarity mining.
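The paper's clustering pipeline is more elaborate; as a toy illustration of the significant-places idea, the sketch below simply ranks cell IDs by how often they appear in an invented (timestamp, cell-ID) trace.

```python
from collections import Counter

# Significant-place extraction in its simplest form: rank cell IDs by
# dwell counts in a sequence of (timestamp, cell_id) observations.
# The trace below is invented; real traces would also use the semantic
# tags mentioned in the paper.
trace = [(0, "cell_12"), (5, "cell_12"), (10, "cell_47"), (15, "cell_12"),
         (20, "cell_93"), (25, "cell_12"), (30, "cell_47")]

counts = Counter(cell for _, cell in trace)
significant = [cell for cell, n in counts.most_common() if n >= 2]
print(significant)   # ['cell_12', 'cell_47']
```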
The Tacitness of Tacitus. A Methodological Approach to European Thought. No. 46.
ERIC Educational Resources Information Center
Bierschenk, Bernhard
This study examined the analysis of verbal flows by means of volume-elasticity measures, and the analysis of information-flow structures and their representation in the form of a metaphysical cube. A special-purpose system of computer programs (PERTEX) was used to establish the language space in which the textual flow patterns occurred containing…
Ideologies of English in a Chinese High School EFL Textbook: A Critical Discourse Analysis
ERIC Educational Resources Information Center
Xiong, Tao; Qian, Yamin
2012-01-01
In this article we examine ideologies of English in present-day China with a special focus on textbook discourse. The research framework is informed by critical theories on language and education. Critical discourse analysis is applied as a methodological approach characterized by a socially committed attitude in the explanation and interpretation…
Global, Local, and Graphical Person-Fit Analysis Using Person-Response Functions
ERIC Educational Resources Information Center
Emons, Wilco H. M.; Sijtsma, Klaas; Meijer, Rob R.
2005-01-01
Person-fit statistics test whether the likelihood of a respondent's complete vector of item scores on a test is low given the hypothesized item response theory model. This binary information may be insufficient for diagnosing the cause of a misfitting item-score vector. The authors propose a comprehensive methodology for person-fit analysis in the…
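The authors' person-response-function methodology is richer than any single statistic; as a minimal illustration of the binary person-fit information the abstract mentions, this sketch counts Guttman errors in an item-score vector (items ordered from easiest to hardest).

```python
# Count Guttman errors for an item-score vector: with items ordered from
# easiest to hardest, an error is any pair where a harder item is answered
# correctly while an easier one is missed. A toy stand-in for the more
# refined person-fit diagnostics discussed in the article.
def guttman_errors(scores):
    """scores: 0/1 item scores ordered easiest -> hardest."""
    return sum(
        1
        for i in range(len(scores))
        for j in range(i + 1, len(scores))
        if scores[i] == 0 and scores[j] == 1
    )

print(guttman_errors([1, 1, 1, 0, 0]))  # perfect Guttman pattern -> 0
print(guttman_errors([0, 0, 1, 1, 1]))  # strongly misfitting -> 6
```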
Policy Implications Analysis: A Methodological Advancement for Policy Research and Evaluation.
ERIC Educational Resources Information Center
Madey, Doren L.; Stenner, A. Jackson
Policy Implications Analysis (PIA) is a tool designed to maximize the likelihood that an evaluation report will have an impact on decision-making. PIA was designed to help people planning and conducting evaluations tailor their information so that it has optimal potential for being used and acted upon. This paper describes the development and…
ERIC Educational Resources Information Center
Torrens, Paul M.; Griffin, William A.
2013-01-01
The authors describe an observational and analytic methodology for recording and interpreting dynamic microprocesses that occur during social interaction, making use of space--time data collection techniques, spatial-statistical analysis, and visualization. The scheme has three investigative foci: Structure, Activity Composition, and Clustering.…
ERIC Educational Resources Information Center
Fidalgo, Reyes
The focus of this highly quantitative investigation is the analysis of the differences observed, both in methodological approach and in language choice in the classroom, during five years of classroom visitations. The study reflects information collected from two very different institutions with radically distinct populations: a rural setting with…
2012-06-01
Visa and MasterCard are investigating whether a data security breach occurred at one of the main payment processing companies. Sources cited: "MasterCard and Visa Investigate Data Breach," New York Times, March 30, 2012; Stamatis, D. (2003), Failure Mode Effect Analysis: FMEA from Theory to Execution.
Methodological Choices in the Content Analysis of Textbooks for Measuring Alignment with Standards
ERIC Educational Resources Information Center
Polikoff, Morgan S.; Zhou, Nan; Campbell, Shauna E.
2015-01-01
With the recent adoption of the Common Core standards in many states, there is a need for quality information about textbook alignment to standards. While there are many existing content analysis procedures, these generally have little, if any, validity or reliability evidence. One exception is the Surveys of Enacted Curriculum (SEC), which has…
Simulation of Attacks for Security in Wireless Sensor Network.
Diaz, Alvaro; Sanchez, Pablo
2016-11-18
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
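The virtual platform itself is not described in enough detail here to reproduce; the toy model below only illustrates the kind of question it answers, namely how an attacker type changes a node's energy budget (all names and parameters are invented).

```python
# Toy node energy model under different attacker types: a jamming-style
# attacker forces retransmissions, raising radio-on time and energy use.
# All constants are illustrative, not measurements from the paper.
TX_COST_MJ = 0.05    # millijoules per transmitted frame (invented)
RETRY_FACTOR = {"none": 1.0, "eavesdropper": 1.0, "jammer": 3.5}

def energy_mj(frames: int, attacker: str) -> float:
    """Estimated transmit energy given frame count and attacker type."""
    return frames * TX_COST_MJ * RETRY_FACTOR[attacker]

for attacker in RETRY_FACTOR:
    print(f"{attacker:12s} {energy_mj(10_000, attacker):8.1f} mJ")
```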
Vallet, Guy; Perrin, Anne; Keller, Christiane; Fieschi, Marius
2006-03-01
For the past eight years, the Ministry of Health has released information about the services and quality of care in public hospitals, in response to increasing concern about hospital performance expressed by patient associations. The press publishes hospital ratings based on this information. This survey asked hospital administrators for their views on communication about this topic. Conducted from 7 October through 20 November 2004, it sent a two-page open questionnaire to a variety of hospital executive personnel (medical directors, chief administrators, medical school deans, and public information officers) to determine their views on the subject. The response rate was 34%. Without contesting either the legitimacy of the expectation for information or the transparency owed to patients, health professionals expressed the need to know in advance the "rules of the game" and the methodology of the rating techniques to be used. Most reported few changes in their professional behavior due to these publications, whose methodology and criteria they contested. They suggested changes, including different criteria and indicators for the ratings, the ability to contest conclusions drawn from the PMSI data, and preliminary work to define criteria by working groups composed of physicians, other professionals, and even people outside the health field. On the other hand, only half were willing to participate in such a working group. These hospital managers see a need for specialists in the analysis of hospital data who can clarify the meaning of the statistics and improve the public's understanding of them, which is now shaped by the mass media's failure to provide meaningful analysis.
Information technology security system engineering methodology
NASA Technical Reports Server (NTRS)
Childs, D.
2003-01-01
A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.
NASA Astrophysics Data System (ADS)
Jamal, Wasifa; Das, Saptarshi; Maharatna, Koushik; Pan, Indranil; Kuyucu, Doga
2015-09-01
Degree of phase synchronization between different electroencephalogram (EEG) channels is known to be a manifestation of the underlying mechanism of information coupling between different brain regions. In this paper, we apply a continuous wavelet transform (CWT) based analysis technique to EEG data, captured during face perception tasks, to explore the temporal evolution of phase synchronization from the onset of a stimulus. Our explorations show that there exists a small set (typically 3-5) of unique synchronized patterns, or synchrostates, each of which is stable on the order of milliseconds. In particular, in the beta (β) band, which has been reported to be associated with visual processing tasks, the number of such stable states has consistently been found to be three. During processing of the stimulus, switching between these states occurs abruptly, but the switching characteristic follows a well-behaved and repeatable sequence. This is observed in single-subject analysis as well as in a multiple-subject group analysis of adults during face perception. We also show that although these patterns remain topographically similar for the general category of face perception tasks, the sequence of their occurrence and their temporal stability vary markedly between different face perception scenarios (stimuli), indicating different, stimulus-specific dynamical characteristics of information processing. Subsequently, we translate these stable states into brain complex networks and derive informative network measures characterizing the degree of segregated processing and information integration in those synchrostates, leading to a new methodology for characterizing information processing in the human brain. The proposed methodology of modeling functional brain connectivity through synchrostates may be viewed as a new way of quantitatively characterizing the cognitive ability of the subject, the stimuli, and the information integration/segregation capability.
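The paper uses CWT-based phase estimates; as a simplified stand-in, this sketch band-passes two synthetic channels to the beta band, extracts instantaneous phases with a Hilbert transform, and computes their phase-locking value (all signal parameters invented).

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(0)
fs = 250                                   # sampling rate (Hz), illustrative
t = np.arange(0, 2.0, 1 / fs)
# Two synthetic "EEG channels" sharing a beta-band (~20 Hz) component.
ch1 = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)
ch2 = np.sin(2 * np.pi * 20 * t + 0.3) + 0.5 * rng.standard_normal(t.size)

# Band-pass to the beta band, then extract instantaneous phase.
b, a = butter(4, [13, 30], btype="band", fs=fs)
phase1 = np.angle(hilbert(filtfilt(b, a, ch1)))
phase2 = np.angle(hilbert(filtfilt(b, a, ch2)))

# Phase-locking value: 1 = perfectly locked, near 0 = no consistent relation.
plv = np.abs(np.mean(np.exp(1j * (phase1 - phase2))))
print(f"beta-band PLV: {plv:.3f}")
```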
Sequence information gain based motif analysis.
Maynou, Joan; Pairó, Erola; Marco, Santiago; Perera, Alexandre
2015-11-09
The detection of regulatory regions in candidate sequences is essential for understanding the regulation of a particular gene and the mechanisms involved. This paper proposes a novel methodology based on information-theoretic metrics for finding regulatory sequences in promoter regions. This methodology (SIGMA) has been tested on genomic sequence data for Homo sapiens and Mus musculus. SIGMA has been compared with different publicly available alternatives for motif detection, such as MEME/MAST, Biostrings (Bioconductor package) and MotifRegressor, and with previous work such as Qresiduals projections and information-theoretic detectors. Comparative results, in the form of Receiver Operating Characteristic curves, show that in 70% of the studied transcription factor binding sites, the SIGMA detector performs better and behaves more robustly than the methods compared, while having a similar computational time. The performance of SIGMA can be explained by its parametric simplicity in modelling the non-linear co-variability in the binding motif positions. Sequence Information Gain based Motif Analysis is a generalisation of a non-linear model of cis-regulatory sequence detection based on information theory. This generalisation allows transcription factor binding sites to be detected with maximum performance irrespective of the covariability observed in the positions of the training set of sequences. SIGMA is freely available to the public at http://b2slab.upc.edu.
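SIGMA's exact information-theoretic detector is not reproduced here; the sketch below shows the standard per-position building block that such detectors generalize: the information content of each column of aligned sites, computed as 2 minus the column entropy (toy data).

```python
import math

# Per-position information content of aligned binding sites:
# IC(pos) = 2 + sum_b p_b * log2(p_b), i.e. 2 bits minus the column
# entropy. This is the classical, position-independent building block
# of information-theoretic motif detectors. The sites are toy data.
sites = ["TATAAT", "TATGAT", "TACAAT", "TATATT"]

for pos in range(len(sites[0])):
    column = [s[pos] for s in sites]
    ic = 2.0
    for base in "ACGT":
        p = column.count(base) / len(column)
        if p > 0:
            ic += p * math.log2(p)
    print(f"position {pos}: IC = {ic:.2f} bits")
```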
NASA Astrophysics Data System (ADS)
Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha
2016-04-01
This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping follows multi-parametric causative factors: sea-level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The highly vulnerable areas are situated in the east, at Monika and Sablette beaches. This technical approach relies on the efficiency of GIS tools combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
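FAHP fuzzifies the pairwise judgments, but the core weight-derivation step it extends is classical AHP; the sketch below derives criteria weights from an invented 3x3 comparison matrix via the principal eigenvector and checks Saaty's consistency ratio.

```python
import numpy as np

# Classical AHP step that FAHP extends: derive criteria weights from a
# pairwise-comparison matrix via its principal eigenvector. The matrix
# (sea-level rise vs. wave height vs. coastal erosion) is invented.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                         # normalized criteria weights

# Consistency ratio: CI / RI, with RI = 0.58 for n = 3 (Saaty's table).
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))
```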
Feature Mining and Health Assessment for Gearboxes Using Run-Up/Coast-Down Signals
Zhao, Ming; Lin, Jing; Miao, Yonghao; Xu, Xiaoqiang
2016-01-01
Vibration signals measured in run-up/coast-down (R/C) processes usually carry rich information about the health status of machinery. However, a major challenge in R/C signal analysis lies in how to exploit more diagnostic information and how this information can be properly integrated to achieve a more reliable maintenance decision. Aiming at this problem, a framework of R/C signal analysis is presented for the health assessment of gearboxes. In the proposed methodology, we first investigate the data preprocessing and feature selection issues for R/C signals. Based on that, a sparsity-guided feature enhancement scheme is proposed to extract the weak phase jitter associated with a gear defect. To enable effective feature mining and integration under R/C conditions, a generalized phase demodulation technique is further established to reveal the evolution of the modulation feature with operating speed and rotation angle. The experimental results indicate that the proposed methodology can not only detect the presence of gear damage but also offer novel insight into the dynamic behavior of the gearbox. PMID:27827831
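Generalized phase demodulation under varying speed is beyond a few lines; the constant-speed core of the idea is sketched below: band-pass around the gear-mesh tone, take the analytic signal, and read off the phase deviation that would carry the defect-related jitter (all signal parameters invented).

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 10_000                       # sampling rate (Hz), illustrative
t = np.arange(0, 1.0, 1 / fs)
fmesh = 400.0                     # gear-mesh frequency at constant speed
# Synthetic vibration: mesh tone plus a small periodic phase jitter
# (a crude stand-in for a local gear defect) and measurement noise.
jitter = 0.2 * np.sin(2 * np.pi * 10 * t)
noise = 0.1 * np.random.default_rng(1).standard_normal(t.size)
x = np.sin(2 * np.pi * fmesh * t + jitter) + noise

# Band-pass around the mesh tone, then demodulate its phase.
b, a = butter(4, [fmesh - 50, fmesh + 50], btype="band", fs=fs)
analytic = hilbert(filtfilt(b, a, x))
phase = np.unwrap(np.angle(analytic)) - 2 * np.pi * fmesh * t  # carrier removed

print("peak-to-peak phase deviation (rad):", round(np.ptp(phase), 3))
```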
Algebra for Enterprise Ontology: towards analysis and synthesis of enterprise models
NASA Astrophysics Data System (ADS)
Suga, Tetsuya; Iijima, Junichi
2018-03-01
Enterprise modeling methodologies have made enterprises more likely to be the object of systems engineering rather than craftsmanship. However, the current state of research in enterprise modeling methodologies lacks investigation of the mathematical background embedded in these methodologies. Abstract algebra, a broad subfield of mathematics concerned with algebraic structures, may provide interesting implications in both theory and practice. This research therefore takes up the challenge of establishing an algebraic structure for one aspect model proposed in the Design & Engineering Methodology for Organizations (DEMO), a major enterprise modeling methodology in the spotlight as a modeling principle for capturing the skeleton of enterprises when developing enterprise information systems. The results show that the aspect model behaves well under the algebraic operations and indeed constitutes a Boolean algebra. This article also discusses comparisons with other modeling languages and suggests future work.
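For reference, these are the standard laws a structure (M, ∨, ∧, ¬, 0, 1) must satisfy to constitute a Boolean algebra, which is the property the article claims for the DEMO aspect model (the axioms are textbook material, not taken from the paper):

```latex
\begin{align*}
a \vee b &= b \vee a, & a \wedge b &= b \wedge a && \text{(commutativity)}\\
a \vee (b \wedge c) &= (a \vee b) \wedge (a \vee c), &
a \wedge (b \vee c) &= (a \wedge b) \vee (a \wedge c) && \text{(distributivity)}\\
a \vee 0 &= a, & a \wedge 1 &= a && \text{(identities)}\\
a \vee \neg a &= 1, & a \wedge \neg a &= 0 && \text{(complements)}
\end{align*}
```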
1992-12-01
EDDA is an attempt to add mathematical formalism to SADT. Because it is based on SADT, it cannot easily represent any other design methodology. EDDA has two forms: G-EDDA, the standard graphical version of SADT, and S-EDDA, a textual language that partially represents the… Among the limitations noted: EDDA only supports the SADT methodology and is too limited in scope to be useful in our research; SAMM lacks the semantic richness of…
Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology
NASA Technical Reports Server (NTRS)
Woods, Stephen
2009-01-01
This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. The presentation covers the reasons for performing hazard assessments, the types of hazard assessment that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations and associated factors to facilitate decision making and achieve best practice.
ERIC Educational Resources Information Center
Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.
2000-01-01
These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)
Lorence, Daniel; Abraham, Joanna
2006-01-01
Medical and health-related searches pose a special case of risk when using the web as an information resource. Uninsured consumers, lacking access to a trained provider, will often rely on information from the internet for self-diagnosis and treatment. In areas where treatments are uncertain or controversial, most consumers lack the knowledge to make an informed decision. This exploratory technology assessment examines the use of Keyword Effectiveness Indexing (KEI) analysis as a potential tool for profiling information search and keyword retrieval patterns. Results demonstrate that the KEI methodology can be useful in identifying e-health search patterns, but is limited by semantic or text-based web environments.
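KEI is most commonly cited as the square of search volume divided by the number of competing pages; the paper's exact variant is not given in this abstract, so the sketch below uses that common form with invented health-related keywords and counts.

```python
# Keyword Effectiveness Index in its commonly cited form:
# KEI = (monthly searches)^2 / number of competing pages.
# The keywords and counts below are invented for illustration.
keywords = [
    ("vitamin d deficiency symptoms", 40_000, 2_500_000),
    ("self diagnose thyroid",          5_000,   120_000),
    ("home remedy chest pain",        12_000, 9_000_000),
]

# Higher KEI suggests high demand relative to competition.
for term, searches, competing in sorted(
        keywords, key=lambda k: k[1] ** 2 / k[2], reverse=True):
    print(f"KEI={searches ** 2 / competing:10.1f}  {term}")
```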
Lichacz, Frederick M J
2008-10-01
The present study represents a preliminary examination of the relationship between situation awareness (SA) and confidence within a distributed information-sharing environment using the calibration methodology. The calibration methodology uses the indices of calibration, resolution and over/under-confidence to examine the relationship between the accuracy of the responses and the degree of confidence that one has in these responses, which leads to a measure of an operator's meta-SA. The results of this study revealed that, although the participants were slightly overconfident in their responses, overall they demonstrated good meta-SA. That is, the participants' subjective probability judgements corresponded to their pattern of SA response accuracy. It is concluded that the use of calibration analysis represents a better methodology for expanding our understanding of the relationship between SA and confidence and ultimately how this relationship can impact decision-making and performance in applied settings than can be achieved by examining SA measures alone.
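A minimal sketch of the over/under-confidence index from the calibration paradigm, with toy data: mean subjective probability minus the proportion of correct SA probes, where positive values indicate overconfidence.

```python
# Over/under-confidence in the calibration paradigm: mean subjective
# probability minus mean accuracy. Positive = overconfident. Toy data.
confidence = [0.9, 0.8, 0.7, 0.95, 0.6]   # probability judgments per probe
correct    = [1,   1,   0,   1,    0]     # SA probe accuracy (1 = correct)

mean_conf = sum(confidence) / len(confidence)
accuracy = sum(correct) / len(correct)
print(f"accuracy={accuracy:.2f}  mean confidence={mean_conf:.2f}  "
      f"over/under-confidence={mean_conf - accuracy:+.2f}")
```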
The Reference Encounter Model.
ERIC Educational Resources Information Center
White, Marilyn Domas
1983-01-01
Develops model of the reference interview which explicitly incorporates human information processing, particularly schema ideas presented by Marvin Minsky and other theorists in cognitive processing and artificial intelligence. Questions are raised concerning use of content analysis of transcribed verbal protocols as methodology for studying…
The role of modelling in prioritising and planning clinical trials.
Chilcott, J; Brennan, A; Booth, A; Karnon, J; Tappenden, P
2003-01-01
To identify the role of modelling in planning and prioritising trials. The review focuses on modelling methods used in the construction of disease models and on methods for their analysis and interpretation. Searches were initially developed in MEDLINE and then translated into other databases. Systematic reviews of the methodological and case study literature were undertaken. Search strategies focused on the intersection between three domains: modelling, health technology assessment and prioritisation. The review found that modelling can extend the validity of trials by: generalising from trial populations to specific target groups; generalising to other settings and countries; extrapolating trial outcomes to the longer term; linking intermediate outcome measures to final outcomes; extending analysis to the relevant comparators; adjusting for prognostic factors in trials; and synthesising research results. The review suggested that modelling may offer greatest benefits where the impact of a technology occurs over a long duration, where disease/technology characteristics are not observable, where there are long lead times in research, or for rapidly changing technologies. It was also found that modelling can inform the key parameters for research: sample size, trial duration and population characteristics. One-way, multi-way and threshold sensitivity analysis have been used in informing these aspects but are flawed. The payback approach has been piloted and while there have been weaknesses in its implementation, the approach does have potential. Expected value of information analysis is the only existing methodology that has been applied in practice and can address all these issues. The potential benefit of this methodology is that the value of research is directly related to its impact on technology commissioning decisions, and is demonstrated in real and absolute rather than relative terms; it assesses the technical efficiency of different types of research. Modelling is not a substitute for data collection. However, modelling can identify trial designs of low priority in informing health technology commissioning decisions. Good practice in undertaking and reporting economic modelling studies requires further dissemination and support, specifically in sensitivity analyses, model validation and the reporting of assumptions. Case studies of the payback approach using stochastic sensitivity analyses should be developed. Use of overall expected value of perfect information should be encouraged in modelling studies seeking to inform prioritisation and planning of health technology assessments. Research is required to assess if the potential benefits of value of information analysis can be realised in practice; on the definition of an adequate objective function; on methods for analysing computationally expensive models; and on methods for updating prior probability distributions.
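Expected value of perfect information has a compact Monte Carlo form: average the per-simulation best net benefit, then subtract the best average net benefit. The sketch below applies it to an invented two-option net-benefit model; all parameters are illustrative.

```python
import numpy as np

# Monte Carlo EVPI: E_theta[max_d NB(d, theta)] - max_d E_theta[NB(d, theta)].
# The two-treatment net-benefit model below is purely illustrative.
rng = np.random.default_rng(42)
n = 100_000
effect_new = rng.normal(0.55, 0.15, n)   # uncertain effectiveness, new therapy
effect_old = rng.normal(0.50, 0.05, n)   # better-known comparator
wtp, cost_new, cost_old = 20_000, 3_000, 1_000

nb = np.column_stack([
    wtp * effect_old - cost_old,         # decision 0: keep comparator
    wtp * effect_new - cost_new,         # decision 1: adopt new therapy
])

evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"EVPI per patient: {evpi:,.0f} (same monetary units as WTP)")
```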
Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hugo, Jacques
2015-05-01
This report describes a methodology for conducting a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.
Assessing the impact of healthcare research: A systematic review of methodological frameworks.
Cruz Rivera, Samantha; Kyte, Derek G; Aiyegbusi, Olalekan Lee; Keeley, Thomas J; Calvert, Melanie J
2017-08-01
Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) 'primary research-related impact', (2) 'influence on policy making', (3) 'health and health systems impact', (4) 'health-related and societal impact', and (5) 'broader economic impact'. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research.
Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...
2017-08-23
A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do and do not lead to the need for an update, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1 x 10^-4, 4 x 10^-5 and 1 x 10^-5.
A methodology for event reconstruction from trace images.
Milliet, Quentin; Delémont, Olivier; Sapin, Eric; Margot, Pierre
2015-03-01
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step build on the previous one; however, the methodology is not linear but a cyclic, iterative progression toward knowledge about an event. The preliminary analysis is a pre-evaluation phase wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending the use of images as evidence and, more generally, as clues in investigation and crime reconstruction processes. Copyright © 2015 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wolfslehner, Bernhard; Seidl, Rupert
2010-12-01
The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.
Semantic integration of gene expression analysis tools and data sources using software connectors.
Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G
2013-10-25
The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
Initialization and Setup of the Coastal Model Test Bed: STWAVE
2017-01-01
…the Coastal and Hydraulics Laboratory (CHL) Field Research Facility (FRF) in Duck, NC. The improved evaluation methodology will promote rapid enhancement of model capability and focus… (Blanton 2008) study. This regional digital elevation model (DEM), with a cell size of 10 m, was generated from numerous datasets collected at different…
Preferences of Young Adults Regarding Dissemination of Online Vitamin D Information.
Goodman, Samantha Erin; Sheeshka, Judy
2016-12-01
To explore strategies for disseminating online information about vitamin D to young adults. Participants were 50 males and females aged 18-25 years, living in Ontario, Canada. Eight focus groups (4 male; 4 female) were conducted; participants also completed a socio-demographic questionnaire. Audio files were transcribed verbatim; thematic analysis was used to identify key themes. Thematic analysis revealed that an effective educational intervention geared towards this population should be simple, brief, interesting, personally relevant, credible, and include incentives. Feedback regarding intervention methodology could be used to inform interventions aiming to increase intake of vitamin D or other nutrients among young adults.
Object-oriented analysis and design of an ECG storage and retrieval system integrated with an HIS.
Wang, C; Ohe, K; Sakurai, T; Nagase, T; Kaihara, S
1996-03-01
For a hospital information system, object-oriented methodology plays an increasingly important role, especially for the management of digitized data (e.g., electrocardiograms, electroencephalograms, electromyograms, spirograms, X-ray, CT and histopathological images), which are not yet computerized in most hospitals. As a first step in an object-oriented approach to hospital information management and toward storing medical data in an object-oriented database, we connected electrocardiographs to a hospital network and integrated ECG storage and retrieval systems with a hospital information system. In this paper, the object-oriented analysis and design of the ECG storage and retrieval systems are reported.
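No code accompanies the abstract; purely as an illustration of the object-oriented modeling style it describes, here is a minimal, hypothetical ECG record class (names and attributes are invented).

```python
from dataclasses import dataclass, field
from datetime import datetime

# A minimal object-oriented model of a stored ECG, in the spirit of the
# analysis described; class and attribute names are illustrative only.
@dataclass
class ECGRecord:
    patient_id: str
    recorded_at: datetime
    sampling_rate_hz: int
    leads: dict[str, list[float]] = field(default_factory=dict)  # lead -> samples

    def duration_s(self) -> float:
        """Recording duration derived from the longest stored lead."""
        n = max((len(v) for v in self.leads.values()), default=0)
        return n / self.sampling_rate_hz

rec = ECGRecord("P-001", datetime(1996, 3, 1, 9, 30), 500,
                {"I": [0.0] * 5000, "II": [0.0] * 5000})
print(rec.duration_s())   # 10.0 seconds
```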
Initial Analysis of and Predictive Model Development for Weather Reroute Advisory Use
NASA Technical Reports Server (NTRS)
Arneson, Heather M.
2016-01-01
In response to severe weather conditions, traffic management coordinators specify reroutes to route air traffic around affected regions of airspace. Providing analysis and recommendations of available reroute options would assist traffic management coordinators in making more efficient rerouting decisions. These recommendations can be developed by examining historical data to determine which previous reroute options were used in similar weather and traffic conditions; in essence, previous information is used to inform future decisions. This paper describes the initial steps and methodology used toward this goal. A method to extract relevant features from the large volume of weather data, quantifying the convective weather scenario during a particular time range, is presented. Similar routes are clustered, and an algorithm is described that identifies which clusters of reroute advisories were actually followed by pilots. Models built for fifteen of the top twenty most frequently used reroute clusters correctly predict the use of the cluster for over 60% of the test examples. Results are preliminary but indicate that the methodology is worth pursuing, with modifications based on insight gained from this analysis.
NASA Technical Reports Server (NTRS)
Sperry, S. L.
1982-01-01
The planning process for a statewide reclamation plan for Ohio's abandoned minelands, in response to the Federal Surface Mining Control and Reclamation Act of 1977, included: (1) the development of a screening and ranking methodology; (2) the establishment of a statewide review of major watersheds affected by mining; (3) the development of an immediate-action process; and (4) a prototypical study of a priority watershed demonstrating the data collection, analysis, display and evaluation to be used for the remaining state watersheds. Historical methods for map information analysis and evaluation, as well as the current methodologies in use, are discussed. Various computer mapping and analysis programs were examined for their usability in evaluating the priority reclamation sites. Hand methods were chosen over automated procedures; intuitive evaluation was the primary reason.
Near ground level sensing for spatial analysis of vegetation
NASA Technical Reports Server (NTRS)
Sauer, Tom; Rasure, John; Gage, Charlie
1991-01-01
Measured changes in vegetation indicate the dynamics of ecological processes and can identify the impacts from disturbances. Traditional methods of vegetation analysis tend to be slow because they are labor intensive; as a result, these methods are often confined to small local area measurements. Scientists need new algorithms and instruments that will allow them to efficiently study environmental dynamics across a range of different spatial scales. A new methodology that addresses this problem is presented. This methodology includes the acquisition, processing, and presentation of near ground level image data and its corresponding spatial characteristics. The systematic approach taken encompasses a feature extraction process, a supervised and unsupervised classification process, and a region labeling process yielding spatial information.
Getting the big picture in community science: methods that capture context.
Luke, Douglas A
2005-06-01
Community science has a rich tradition of using theories and research designs that are consistent with its core value of contextualism. However, a survey of empirical articles published in the American Journal of Community Psychology shows that community scientists utilize a narrow range of statistical tools that are not well suited to assess contextual data. Multilevel modeling, geographic information systems (GIS), social network analysis, and cluster analysis are recommended as useful tools to address contextual questions in community science. An argument for increased methodological consilience is presented, where community scientists are encouraged to adopt statistical methodology that is capable of modeling a greater proportion of the data than is typical with traditional methods.
Methodological update in Medicina Intensiva.
García Garmendia, J L
2018-04-01
Research in the critically ill is complex owing to the heterogeneity of patients, the difficulty of achieving representative sample sizes, and the number of variables simultaneously involved. However, the quantity and quality of records are high, as is the relevance of the variables used, such as survival. Methodological tools have evolved, offering new perspectives and analysis models that allow relevant information to be extracted from the data that accompany the critically ill patient. The need for training in methodology and the interpretation of results is an important challenge for intensivists who wish to stay updated on research developments and clinical advances in Intensive Medicine. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
Proposed Collection; Comment Request: Generic Clearance to Conduct Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition… This notice invites comment on a proposed information collection that will conduct research by methodological testing…
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms, which are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity, in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated source term estimation methodology that can provide predictions with adequate accuracy. This work proposes a complete methodology for estimating dust source terms, starting from broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools, using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
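The screening and sensitivity toolchain is not detailed in the abstract; one common first pass is to rank parameters by standardized regression coefficients over a Monte Carlo sample, as in the sketch below (the dust "model" and all parameters are invented).

```python
import numpy as np

# Screening candidate parameters with standardized regression coefficients
# (SRC) from a Monte Carlo sample: a simple first-pass sensitivity ranking.
# The dust source-term "model" and parameter ranges below are invented.
rng = np.random.default_rng(7)
n = 50_000
temp = rng.uniform(400, 1200, n)       # surface temperature
flow = rng.uniform(0.1, 5.0, n)        # coolant leak flow rate
grain = rng.uniform(0.5, 50.0, n)      # dust grain size

# Stand-in response: mobilized dust mass with additive noise.
y = 0.02 * temp + 8.0 * flow + 0.1 * grain + rng.normal(0, 2, n)

# Standardize inputs and output, then fit a linear model by least squares;
# the coefficients rank each parameter's influence on the output variance.
X = np.column_stack([temp, flow, grain])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, s in zip(["temperature", "flow rate", "grain size"], src):
    print(f"{name:12s} SRC = {s:+.3f}")
```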
22 CFR 124.2 - Exemptions for training and military service.
Code of Federal Regulations, 2012 CFR
2012-04-01
..., software source code, design methodology, engineering analysis or manufacturing know-how such as that... underlying engineering methods and design philosophy utilized (i.e., the “why” or information that explains the rationale for particular design decision, engineering feature, or performance requirement...
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, together with advances in computing, geographical information systems, and statistical methodology, has opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...
NASA Astrophysics Data System (ADS)
Brook, A.; Cristofani, E.; Vandewal, M.; Matheis, C.; Jonuscheit, J.; Beigang, R.
2012-05-01
The present study proposes a fully integrated, semi-automatic image processing methodology, operating in near real-time mode, developed for Frequency-Modulated Continuous-Wave (FMCW) THz images with center frequencies around 100 GHz and 300 GHz. The main focus of this work is the quality control of aeronautic composite multi-layered materials and structures using Non-Destructive Testing. Image processing is applied to the 3-D images to extract useful information. The data are processed by extracting areas of interest, and the detected areas are then subjected to further image analysis managed by a spatial model. Finally, the post-processing stage examines and evaluates the spatial accuracy of the extracted information.
Combined Volatolomics for Monitoring of Human Body Chemistry
Broza, Yoav Y.; Zuri, Liat; Haick, Hossam
2014-01-01
Analysis of volatile organic compounds (VOCs) is a promising approach for non-invasive, fast and potentially inexpensive diagnostics. Here, we present a new methodology for profiling the body chemistry by using the volatile fraction of molecules in various body fluids. Using mass spectrometry and a cross-reactive nanomaterial-based sensor array, we demonstrate that simultaneous VOC detection from breath and skin provides complementary, non-correlated information on the body's volatile metabolite profile. Eventually, with further wide-population validation studies, such a methodology could provide more accurate monitoring of pathological changes than the information provided by a single body fluid. The qualitative and quantitative methods presented here offer a variety of options for novel mapping of the metabolic properties of complex organisms, including humans. PMID:24714440
ERIC Educational Resources Information Center
Camparo, James; Camparo, Lorinda B.
2013-01-01
Though ubiquitous, Likert scaling's traditional mode of analysis is often unable to uncover all of the valid information in a data set. Here, the authors discuss a solution to this problem based on methodology developed by quantum physicists: the state multipole method. The authors demonstrate the relative ease and value of this method by…
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for formalising a multi-hazard methodology and identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Sabharwal, Sanjeeve; Carter, Alexander; Darzi, Lord Ara; Reilly, Peter; Gupte, Chinmay M
2015-06-01
Approximately 76,000 people a year sustain a hip fracture in the UK, and the estimated cost to the NHS is £1.4 billion a year. Health economic evaluations (HEEs) are one of the methods employed by decision makers to deliver healthcare policy supported by clinical and economic evidence. The objective of this study was to (1) identify and characterize HEEs for the management of patients with hip fractures, and (2) examine their methodological quality. A literature search was performed in MEDLINE, EMBASE and the NHS Economic Evaluation Database. Studies that met the specified definition of a HEE and evaluated hip fracture management were included. Methodological quality was assessed using the Consensus on Health Economic Criteria (CHEC). Twenty-seven publications met the inclusion criteria and were included in our descriptive and methodological analysis. Domains of methodology that performed poorly included use of an appropriate time horizon (66.7% of studies), incremental analysis of costs and outcomes (63%), future discounting (44.4%), sensitivity analysis (40.7%), declaration of conflicts of interest (37%) and discussion of ethical considerations (29.6%). The number of published HEEs for patients with hip fractures has increased in recent years. Most of these studies fail to adopt a societal perspective, and key aspects of their methodology are poor. The development of future HEEs in this field must adhere to established principles of methodology, so that better quality research can be used to inform health policy on the management of patients with a hip fracture. Copyright © 2014 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pellejero-Ibanez, Marcos; Chuang, Chia-Hsun; Rubiño-Martín, J. A.; Cuesta, Antonio J.; Wang, Yuting; Zhao, Gongbo; Ross, Ashley J.; Rodríguez-Torres, Sergio; Prada, Francisco; Slosar, Anže; Vazquez, Jose A.; Alam, Shadab; Beutler, Florian; Eisenstein, Daniel J.; Gil-Marín, Héctor; Grieb, Jan Niklas; Ho, Shirley; Kitaura, Francisco-Shu; Percival, Will J.; Rossi, Graziano; Salazar-Albornoz, Salvador; Samushia, Lado; Sánchez, Ariel G.; Satpathy, Siddharth; Seo, Hee-Jong; Tinker, Jeremy L.; Tojeiro, Rita; Vargas-Magaña, Mariana; Brownstein, Joel R.; Nichol, Robert C.; Olmstead, Matthew D.
2017-07-01
We develop a new computationally efficient methodology called double-probe analysis, with the aim of minimizing informative priors (those coming from extra probes) in the estimation of cosmological parameters. Using our new methodology, we extract dark energy model-independent cosmological constraints from the joint data sets of the Baryon Oscillation Spectroscopic Survey (BOSS) galaxy sample and Planck cosmic microwave background (CMB) measurements. We measure the mean values and covariance matrix of {R, la, Ωbh², ns, log(As), Ωk, H(z), DA(z), f(z)σ8(z)}, which give an efficient summary of the Planck data and the two-point statistics from the BOSS galaxy sample. The CMB shift parameters are R = √(Ωm H0²) r(z*) and la = π r(z*)/rs(z*), where z* is the redshift at the last scattering surface, and r(z*) and rs(z*) denote the comoving distance to z* and the sound horizon at z*, respectively; Ωb is the baryon fraction at z = 0. This approximate methodology guarantees that we do not need to put informative priors on the cosmological parameters that galaxy clustering is unable to constrain, i.e. Ωbh² and ns. The main advantage is that the computational time required for extracting these parameters is decreased by a factor of 60 with respect to exact full-likelihood analyses. The results obtained show no tension with the flat Λ cold dark matter (ΛCDM) cosmological paradigm. By comparing with the full-likelihood exact analysis with fixed dark energy models, on one hand we demonstrate that the double-probe method provides robust cosmological parameter constraints that can be conveniently used to study dark energy models, and on the other hand we provide a reliable set of measurements assuming dark energy models to be used, for example, in distance estimations. We extend our study to measure the sum of the neutrino masses using different methodologies, including double-probe analysis (introduced in this study), full-likelihood analysis and single-probe analysis. From full-likelihood analysis, we obtain Σmν < 0.12 eV (68 per cent) assuming ΛCDM and Σmν < 0.20 eV (68 per cent) assuming owCDM. We also find that there is degeneracy between observational systematics and neutrino masses, which suggests that one should take great care when estimating these parameters in the case of not having control over the systematics of a given sample.
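A minimal numerical sketch of the two shift parameters as defined above, for an assumed flat ΛCDM background; the parameter values, decoupling redshift, and fixed sound horizon are illustrative inputs, not the paper's measurements.

```python
import numpy as np

c = 299792.458                 # speed of light [km/s]
H0, Om = 67.0, 0.31            # assumed Hubble constant [km/s/Mpc] and matter fraction
z_star, r_s = 1089.0, 145.0    # assumed decoupling redshift and sound horizon [Mpc]

def H(z):
    # flat LCDM expansion rate; radiation is neglected for brevity,
    # which shifts the result at the percent level
    return H0 * np.sqrt(Om * (1.0 + z) ** 3 + (1.0 - Om))

z = np.linspace(0.0, z_star, 200001)
integrand = c / H(z)
dz = z[1] - z[0]
r = dz * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))  # trapezoid rule

R = np.sqrt(Om) * H0 * r / c   # dimensionless shift parameter
l_a = np.pi * r / r_s          # acoustic scale
print(f"r(z*) = {r:.0f} Mpc   R = {R:.3f}   l_a = {l_a:.1f}")
```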
NASA Astrophysics Data System (ADS)
Huda, J.; Kauneckis, D. L.
2013-12-01
Climate change adaptation presents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts on a wide scope of inter-related natural systems, their interaction with social and economic systems, and the uncertainty arising from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches, as well as a reexamination of traditional methods, for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved, and while debate continues regarding the level of scientific certainty required to make a decision, incremental change in climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches, involving both top-down and bottom-up policy processes, that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as the sources of policy choice information, the sectors to which the methodology has been applied, the sources from which climate projections were derived, the quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities for which they are best suited, and highlight areas for future research and development.
[In-depth interviews and the Kano model to determine user requirements in a burns unit].
González-Revaldería, J; Holguín-Holgado, P; Lumbreras-Marín, E; Núñez-López, G
To determine the healthcare requirements of patients in a Burns Unit using qualitative techniques, such as in-depth personal interviews and Kano's methodology. Qualitative methodology using in-depth personal interviews (12 patients), Kano's conceptual model, and the SERVQHOS questionnaire (24 patients). All patients had been hospitalised in the Burns Unit within the last 12 months. Using Kano's methodology, service attributes were grouped by affinity diagrams and classified as follows: must-be, attractive (unexpected, great satisfaction), and one-dimensional (linked to the degree of functionality of the service). The outcomes were compared with those obtained with the SERVQHOS questionnaire. From the analysis of the in-depth interviews, 11 requirements were obtained, referring to hotel-type amenities, information, the need for a closer staff relationship, and organisational aspects. The attributes classified as must-be were free television and automatic TV disconnection at midnight. Those classified as attractive were: an individual room for more privacy, information about dressing change times in order to avoid anxiety, and additional staff for in-patients. The results were complementary to those obtained with the SERVQHOS questionnaire. In-depth personal interviews provide extra knowledge about patient requirements, complementing the information obtained with questionnaires. With this methodology, more active patient participation is achieved and the companion's opinion is also taken into account. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.
Nuclear Proliferation Technology Trends Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zentner, Michael D.; Coles, Garill A.; Talbert, Robert J.
2005-10-04
A process is underway to develop mature, integrated methodologies to address nonproliferation issues. A variety of methodologies (both qualitative and quantitative) are being considered. All have one thing in common: a need for a consistent set of proliferation-related data that can be used as a basis for application. One approach to providing a basis for predicting and evaluating future proliferation events is to understand past proliferation events, that is, the different paths that have actually been taken to acquire or attempt to acquire special nuclear material. To provide this information, this report describing previous material acquisition activities (compiled from open source material) has been prepared. The report describes how, based on an evaluation of historical trends in nuclear technology development, conclusions can be reached concerning: (1) the length of time it takes to acquire a technology; (2) the length of time it takes for production of special nuclear material to begin; and (3) the type of approaches taken for acquiring the technology. In addition to examining time constants, the report is intended to provide information that could be used to support the different non-proliferation analysis methodologies. Accordingly, each section includes: (1) technology description; (2) technology origin; (3) basic theory; (4) important components/materials; (5) technology development; (6) technological difficulties involved in use; (7) changes/improvements in technology; (8) countries that have used/attempted to use the technology; (9) technology information; (10) acquisition approaches; (11) time constants for technology development; and (12) required concurrent technologies.
Stream habitat analysis using the instream flow incremental methodology
Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim
1998-01-01
This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It is intended to serve as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete description of IFIM in existence today. It should also serve as an official published guide to IFIM, counteracting the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, as it describes IFIM as envisioned by its developers. The document is aimed both at decision makers responsible for the management and allocation of natural resources, for whom it provides an overview, and at those who design and implement studies to inform those decision makers. There is enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Individual chapters cover the basic organization of IFIM and the procedural sequence of applying it, starting with problem identification and continuing through study planning and implementation to problem resolution.
Assessing Similarity Among Individual Tumor Size Lesion Dynamics: The CICIL Methodology
Girard, Pascal; Ioannou, Konstantinos; Klinkhardt, Ute; Munafo, Alain
2018-01-01
Mathematical models of tumor dynamics generally omit information on individual target lesions (iTLs), and consider the most important variable to be the sum of tumor sizes (TS). However, differences in lesion dynamics might be predictive of tumor progression. To exploit this information, we have developed a novel and flexible approach for the non‐parametric analysis of iTLs, which integrates knowledge from signal processing and machine learning. We called this new methodology ClassIfication Clustering of Individual Lesions (CICIL). We used CICIL to assess similarities among the TS dynamics of 3,223 iTLs measured in 1,056 patients with metastatic colorectal cancer treated with cetuximab combined with irinotecan, in two phase II studies. We mainly observed similar dynamics among lesions within the same tumor site classification. In contrast, lesions in anatomic locations with different features showed different dynamics in about 35% of patients. The CICIL methodology has also been implemented in a user‐friendly and efficient Java‐based framework. PMID:29388396
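A minimal sketch of the core CICIL idea on simulated trajectories: measure similarity among individual lesion size dynamics and cluster them non-parametrically. The correlation distance, average-linkage clustering, and mock shrinking/growing lesions are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 12)                     # 12 assessment time points

# simulated lesion size trajectories: four shrinking, three growing
shrinking = [20 * np.exp(-3 * t) + rng.normal(0, 0.5, t.size) for _ in range(4)]
growing = [5 + 15 * t + rng.normal(0, 0.5, t.size) for _ in range(3)]
lesions = np.vstack(shrinking + growing)          # 7 lesions x 12 time points

# distance between dynamics = 1 - Pearson correlation (shape, not magnitude)
dist = pdist(lesions, metric="correlation")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=0.5, criterion="distance")
print(clusters)   # shrinking and growing lesions fall into separate clusters
```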
Introduction to SIMRAND: Simulation of research and development project
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1982-01-01
SIMRAND (SIMulation of Research ANd Development Projects) is a methodology developed to aid the engineering and management decision process in selecting the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration whose total cost exceeds the allocated budget. Other factors, such as personnel and facilities, may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis of management science, and computer simulation in selecting this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thereby providing a basis for management direction of the engineering work, and it ranks the alternatives according to the preferences of the decision makers.
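A minimal sketch of the SIMRAND selection problem under invented inputs: enumerate the task subsets that satisfy the budget constraint, simulate each task's uncertain payoff, and rank the feasible subsets. Task names, costs, distributions, and the mean-value utility are hypothetical simplifications of the methodology.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
# task: (cost, mean payoff, payoff standard deviation) -- invented values
tasks = {"A": (3.0, 10, 2), "B": (4.0, 14, 5), "C": (2.0, 6, 1), "D": (5.0, 18, 8)}
budget, n_sim = 9.0, 20000

best = None
for k in range(1, len(tasks) + 1):
    for subset in itertools.combinations(tasks, k):
        cost = sum(tasks[t][0] for t in subset)
        if cost > budget:
            continue                       # violates the project constraint
        # simulate total value: each task's payoff is normally distributed
        total = sum(rng.normal(tasks[t][1], tasks[t][2], n_sim) for t in subset)
        score = total.mean()               # decision makers could use any utility
        if best is None or score > best[1]:
            best = (subset, score, cost)

print(f"fund {best[0]}: expected value {best[1]:.1f} at cost {best[2]}")
```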
ADVISORY ON UPDATED METHODOLOGY FOR ...
The National Academy of Sciences (NAS) published the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in 2006. The Committee analyzed the most recent epidemiology from the important exposed cohorts and factored in changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee also considered relevant radiobiological data, including that from the Department of Energy's low dose effects research program. Based on the review of this information, the Committee proposed a set of models for estimating risks from low-dose ionizing radiation. ORIA then prepared a white paper revising the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This is the first product to be developed as a result of the BEIR VII report. We requested that the SAB conduct an advisory during the development of this methodology. The second product to be prepared will be a revised version of the document,
Time-Varying, Multi-Scale Adaptive System Reliability Analysis of Lifeline Infrastructure Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Kurtz, Nolan Scot
2014-09-01
The majority of current societal and economic needs world-wide are met by the existing networked civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology has been developed that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures, in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.
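A minimal sketch of the Importance Sampling ingredient mentioned above: estimate a small failure probability P(g(U) < 0) by sampling from a proposal shifted toward the failure region and reweighting. The linear limit-state function and the shift point are textbook illustrations, not the report's bridge or pipeline models.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(4)
beta = 3.5                        # assumed reliability index

def g(u):                         # failure when g(u) < 0
    return beta - u[:, 0]         # linear limit state in standard normal space

n = 20000
shift = np.array([beta, 0.0])     # center the proposal near the design point
u = rng.standard_normal((n, 2)) + shift

# importance weights: standard-normal target density over the shifted proposal
log_w = -0.5 * (u ** 2).sum(axis=1) + 0.5 * ((u - shift) ** 2).sum(axis=1)
pf = np.mean((g(u) < 0) * np.exp(log_w))

print(f"IS estimate {pf:.2e} vs exact {0.5 * erfc(beta / sqrt(2)):.2e}")
```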
Imprecise (fuzzy) information in geostatistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardossy, A.; Bogardi, I.; Kelly, W.E.
1988-05-01
A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing broader use of geostatistics has been an insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram, which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis; an additional 20 soft data points made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.
Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn
2006-09-01
Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.
Evaluating the risk of industrial espionage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bott, T.F.
1998-12-31
A methodology for estimating the relative probabilities of different compromise paths for protected information by insider and visitor intelligence collectors has been developed, based on an event-tree analysis of the intelligence collection operation. The analyst identifies target information and the ultimate users who might attempt to gain that information. The analyst then uses an event tree to develop a set of compromise paths. Probability models are developed for each of the compromise paths that use parameters based on expert judgment or historical data on security violations. The resulting probability estimates indicate the relative likelihood of different compromise paths and provide an input for security resource allocation. Application of the methodology is demonstrated using a national security example. A set of compromise paths and probability models specifically addressing this example espionage problem are developed. The probability models for hard-copy information compromise paths are quantified, as an illustration of the results, using parametric values representative of historical data available in secure facilities, supplemented where necessary by expert judgment.
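A minimal sketch of the event-tree calculation: each compromise path is a sequence of branch events whose probabilities multiply, and the paths are then ranked to guide resource allocation. The branch names and probabilities below are invented placeholders, not the report's historical data.

```python
import math

# branch probabilities (from expert judgment or security-violation statistics;
# the values here are invented)
p = {
    "insider_access": 0.10, "copies_hardcopy": 0.30, "evades_search": 0.20,
    "visitor_access": 0.05, "memorizes": 0.50, "recalls_usefully": 0.40,
}

# each compromise path is the sequence of branches along one event-tree limb
paths = {
    "insider hard-copy removal": ["insider_access", "copies_hardcopy", "evades_search"],
    "visitor memorization": ["visitor_access", "memorizes", "recalls_usefully"],
}

scores = {name: math.prod(p[e] for e in events) for name, events in paths.items()}
for name, prob in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} relative likelihood {prob:.4f}")
```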
Simulation of Attacks for Security in Wireless Sensor Network
Diaz, Alvaro; Sanchez, Pablo
2016-01-01
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node’s software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work. PMID:27869710
Analysis of atomic force microscopy data for surface characterization using fuzzy logic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.
2011-07-15
In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures, and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses it to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research Highlights: → A fuzzy logic analysis technique capable of characterizing AFM images of thin films. → The technique is applicable to different surfaces regardless of their densities. → The fuzzy logic technique does not require manual adjustment of the algorithm parameters. → The technique can quantitatively capture differences between surfaces. → This technique yields more realistic structure boundaries compared to other methods.
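A minimal sketch of the fuzzy classification step on a synthetic height map: each point receives membership grades in bottom, uphill/downhill, and top classes from its normalized height and local slope, and takes the class with the highest grade. The triangular membership shapes are assumptions, not the paper's actual inference-engine rules.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

rng = np.random.default_rng(5)
z = rng.random((32, 32))                       # stand-in AFM height map, 0..1
gy, gx = np.gradient(z)
slope = np.hypot(gx, gy)
slope = slope / slope.max()                    # normalized local slope

mu_bottom = tri(z, -0.01, 0.0, 0.35) * (1 - slope)   # low and flat
mu_top = tri(z, 0.65, 1.0, 1.01) * (1 - slope)       # high and flat
mu_hill = slope                                      # steep: uphill/downhill

labels = np.argmax(np.stack([mu_bottom, mu_hill, mu_top]), axis=0)
for i, name in enumerate(["bottom", "uphill/downhill", "top"]):
    print(name, int((labels == i).sum()), "points")
```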
75 FR 70966 - Transit Asset Management (TAM) Pilot Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-19
... interoperability between diverse types of information technology systems through use of open data formats and... to develop asset management plans, technical assistance, data collection and a pilot program as... condition assessment methodologies, as well as new data collection and analysis activities. $3 million has...
Space station needs, attributes and architectural options: Mission requirements
NASA Technical Reports Server (NTRS)
1983-01-01
Various mission requirements for the proposed space station are examined. Subjects include modelling methodology, science applications, commercial opportunities, operations analysis, integrated mission requirements, and the role of man in space station functions and activities. The information is presented through the use of graphs.
Meta-Analysis: A Systematic Method for Synthesizing Counseling Research
ERIC Educational Resources Information Center
Whiston, Susan C.; Li, Peiwei
2011-01-01
The authors provide a template for counseling researchers who are interested in quantitatively aggregating research findings. Meta-analytic studies can provide relevant information to the counseling field by systematically synthesizing studies performed by researchers from diverse fields. Methodologically sound meta-analyses require careful…
DOT National Transportation Integrated Search
2012-05-05
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the ICM AMS methodology successfully and effectively. It provides a step-by-step approach to ...
Pattern recognition of satellite cloud imagery for improved weather prediction
NASA Technical Reports Server (NTRS)
Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.
1986-01-01
The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.
NASA Astrophysics Data System (ADS)
Li, Y. Y.; Zhang, H.; Duan, Z.; Lian, M.; Zhao, G. Y.; Sun, X. H.; Hu, J. D.; Gao, L. N.; Feng, H. Q.; Svanberg, S.
2016-08-01
Identification of agricultural pest insects is an important task in insect research and agricultural monitoring. We have performed a methodological study of how spectroscopic techniques and wing-beat frequency analysis might provide relevant information. An optical system based on the combination of close-range remote sensing and reflectance spectroscopy was developed to study the optical characteristics of different flying insects collected in Southern China. The results demonstrate that the combination of wing-beat frequency assessment and reflectance spectral analysis has the potential to successfully differentiate between insect species. Further, studies of the spectroscopic characteristics of fixed insect specimens, also from Central China, showed the possibility of refined agricultural pest identification. Here, laser-induced fluorescence spectra were investigated in addition to reflectance recordings for all the species of insects under study, and were found to provide complementary information for optically distinguishing insects. To prove the practicality of the techniques explored, fieldwork aimed at elucidating the variability of the parameters, even within species, clearly must be performed.
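A minimal sketch of the wing-beat frequency part of the analysis: the backscattered intensity from a flying insect is modulated at the wing-beat rate, so the dominant peak of the signal's FFT identifies it. The 120 Hz beat and sampling rate below are synthetic assumptions, not field data.

```python
import numpy as np

fs = 2000.0                                   # detector sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(6)
signal = 1.0 + 0.4 * np.sin(2 * np.pi * 120.0 * t)   # assumed 120 Hz wing beat
signal += 0.05 * rng.standard_normal(t.size)         # detector noise

# remove the DC level, then find the strongest spectral component
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"dominant modulation: {freqs[spectrum.argmax()]:.0f} Hz")
```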
Deep Borehole Emplacement Mode Hazard Analysis Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David
This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]
Towards a Methodology for the Design of Multimedia Public Access Interfaces.
ERIC Educational Resources Information Center
Rowley, Jennifer
1998-01-01
Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) within a qualitative research methodology to better understand people's perceptions of burn injuries. This study applied Haddon's matrix as a framework and analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussions. Haddon's matrix was used to develop the interview guide and was also applied through the analysis phase. The main analysis clusters were the pre-event level/human (including risky behaviors, beliefs and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment, and the event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are potentially useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases. The study using Haddon's matrix within a qualitative research methodology yielded substantially rich information regarding burn injuries that may be useful for prevention efforts or future quantitative research.
Frempong, Samuel N; Sutton, Andrew J; Davenport, Clare; Barton, Pelham
2018-02-01
There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas Covered: A systematic review to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) to provide information on the value of future research is often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis, and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances but at the early development stage (not the concept stage). Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e. VOI analysis, headroom analysis) should be better utilized. There is the need for concerted efforts to develop rigorous methodology in this growing field to maximize the value and quality of such analysis.
NASA Technical Reports Server (NTRS)
Oishi, Meeko; Tomlin, Claire; Degani, Asaf
2003-01-01
Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
[A functional analysis of healthcare auditors' skills in Venezuela, 2008].
Chirinos-Muñoz, Mónica S
2010-10-01
To use functional analysis to identify the basic, working, specific and generic skills and values which a health service auditor must have. The functional analysis technique was implemented with 10 experts, identifying the specific, basic and generic skills and values by means of deductive logic. A functional map was obtained which started by establishing a key purpose, based on improving healthcare and service quality, from which three key functions emerged. The main functions and skill units were then broken down into the competency elements defining what a health service auditor must be able to do. This functional map (following functional analysis methodology) shows in detail the simple and complex tasks which a healthcare auditor should perform in the workplace, adopting a forward-looking management approach to improving healthcare and health service quality. Through logical-deductive consensus-building, this methodology provides expert-validated information for each element of the overall skill set.
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied to consensus finding in several studies of medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of the participating centers' treatment algorithms, which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio
2016-01-01
To describe the conceptual framework and methods for the implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and the characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. Sampling is representative at the national and State levels. Simple and composite indicators (an index of satisfaction and a rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlations between indicators, and associations with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process, both to comply with regulations and to identify strategies for improvement. The survey described here presents a rigorous design and analysis focused on its utility for driving improvement.
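A minimal sketch, on mock responses, of the two composite indicators and the Pareto step described above; the item count, rating scale, problem categories, and the choice of denominator for the problem rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
ratings = rng.integers(1, 6, size=(500, 4))        # 500 users, 4 items, scale 1-5
problems = {"waiting time": 130, "information": 90, "facilities": 40, "courtesy": 15}

# satisfaction index: mean rating rescaled to 0-100
satisfaction_index = 100 * (ratings.mean() - 1) / 4
# reported-problem rate: problems per item-response (an assumed denominator)
problem_rate = 100 * sum(problems.values()) / ratings.size
print(f"satisfaction index {satisfaction_index:.1f}/100, "
      f"reported-problem rate {problem_rate:.1f}%")

# Pareto ordering: the cumulative share identifies the "vital few" categories
total = sum(problems.values())
cum = 0
for cat, n in sorted(problems.items(), key=lambda kv: -kv[1]):
    cum += n
    print(f"{cat:15s} {n:4d}  cumulative {100 * cum / total:.0f}%")
```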
NASA Technical Reports Server (NTRS)
1974-01-01
The purpose of the BRAVO User's Manual is to describe the BRAVO methodology in terms of step-by-step procedures. The BRAVO methodology then becomes a tool which a team of analysts can use to perform cost-effectiveness analyses of potential future space applications with a relatively general set of input information and a relatively small expenditure of resources. An overview of the BRAVO procedure is given by describing the complete procedure in a general form.
Research fronts analysis: A bibliometric method to identify emerging fields of research
NASA Astrophysics Data System (ADS)
Miwa, Sayaka; Ando, Satoko
Research fronts analysis identifies emerging areas of research by observing co-citation clustering among highly cited papers. This article introduces the concept of research fronts analysis, explains its methodology, and provides case examples. It also illustrates developing research fronts in Japan by looking at past winners of the Thomson Reuters Research Fronts Awards. Research fronts analysis is currently being used by the Japanese government to identify new trends in science and technology. Information professionals can also use this bibliometric method as a research evaluation tool.
DuVernet, Amy M; Dierdorff, Erich C; Wilson, Mark A
2015-09-01
Work analysis is fundamental to designing effective human resource systems. The current investigation extends previous research by identifying the differential effects of common design decisions, purposes, and organizational contexts on the data generated by work analyses. The effects of 19 distinct factors that span choices of descriptor, collection method, rating scale, and data source, as well as project purpose and organizational features, are explored. Meta-analytic results cumulated from 205 articles indicate that many of these variables hold significant consequences for work analysis data. Factors pertaining to descriptor choice, collection method, rating scale, and the purpose for conducting the work analysis each showed strong associations with work analysis data. The source of the work analysis information and the organizational context in which it was conducted displayed fewer relationships. Findings can be used to inform the choices work analysts make about methodology and postcollection evaluations of work analysis information. (c) 2015 APA, all rights reserved.
2008-09-01
...the use of Compendium software to facilitate targeting problem understanding, and the network analysis tool Palantir as an efficient and tailored semi-automated means to...
Quantitation and detection of vanadium in biologic and pollution materials
NASA Technical Reports Server (NTRS)
Gordon, W. A.
1974-01-01
A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.
Diagnostic Application of Absolute Neutron Activation Analysis in Hematology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zamboni, C.B.; Oliveira, L.C.; Dalaqua, L. Jr.
2004-10-03
The Absolute Neutron Activation Analysis (ANAA) technique was used to determine the concentrations of Cl and Na in the blood of a healthy group (male and female blood donors) selected from blood banks in São Paulo city, to provide information which can help in the diagnosis of patients. This study also permitted a discussion of the advantages and limitations of using this nuclear methodology in hematological examinations.
ERIC Educational Resources Information Center
Bashaw, W. L., Ed.; Findley, Warren G., Ed.
This volume contains the five major addresses and subsequent discussion from the Symposium on the General Linear Models Approach to the Analysis of Experimental Data in Educational Research, which was held in 1967 in Athens, Georgia. The symposium was designed to produce systematic information, including new methodology, for dissemination to the…
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao
Business activities in the enterprise are so closely tied to the information system that they are difficult to carry out without it. A system design technique that properly accounts for the business process and enables quick system development is therefore needed. In addition, cost pressures on development are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology for modeling business activities as business processes and visualizing them to improve business efficiency. However, no general methodology exists for developing an information system from the analysis results of BPM, and only a few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline that supports consistency and efficiency of development, and a framework that enables the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.
Architectural approaches for HL7-based health information systems implementation.
López, D M; Blobel, B
2010-01-01
Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology approaching the different aspects of systems architecture, such as the business, information, computational, engineering and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server, and the mediator models. The point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model, offered by HIS-DF and supported by HL7 v3 artifacts, is the most promising of the three, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented and standard-based health information systems.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need to update an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in the inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard- and risk-focused information for an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update, or could determine that such an update is not necessary.
Unsupervised User Similarity Mining in GSM Sensor Networks
Shad, Shafqat Ali; Chen, Enhong
2013-01-01
Mobility data has attracted researchers in the past few years because of its rich context and spatiotemporal nature. This information can be used for potential applications such as early warning systems, route prediction, traffic management, advertisement, social networking, and community finding. All the mentioned applications are based on mobility profile building and user trend analysis, where mobility profile building is done through significant-places extraction, prediction of users' actual movements, and context awareness. However, significant-places extraction and prediction of users' actual movements for mobility profile building are non-trivial tasks. In this paper, we present a user similarity mining methodology based on unsupervised clustering, which builds user mobility profiles from the semantic tagging information provided by users and basic GSM network architecture properties. As the mobility information is in low-level raw form, our proposed methodology successfully converts it into high-level meaningful information by using cell-Id location information, rather than previously used location-capturing methods such as GPS, infrared, or Wi-Fi, for profile mining and user similarity mining. PMID:23576905
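A minimal sketch of the profile-based similarity idea: reduce each user's raw cell-Id stream to a normalized time-per-cell vector and compare users with cosine similarity. The cell ids and visit durations are mock values, and the paper's actual pipeline additionally clusters cells into significant places.

```python
import numpy as np

def profile(visits, n_cells):
    """Normalized time-per-cell vector built from (cell_id, minutes) records."""
    v = np.zeros(n_cells)
    for cell, minutes in visits:
        v[cell] += minutes
    return v / v.sum()

n_cells = 6
alice = profile([(0, 400), (1, 120), (3, 60)], n_cells)   # e.g. home, office, gym
bob = profile([(0, 350), (1, 200), (4, 30)], n_cells)
carol = profile([(2, 300), (4, 250), (5, 90)], n_cells)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(f"alice-bob   {cosine(alice, bob):.2f}")    # overlapping daily patterns
print(f"alice-carol {cosine(alice, carol):.2f}")  # disjoint places -> near zero
```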
Methodological challenges to human medical study.
Zhong, Yixin; Liu, Baoyan; Qu, Hua; Xie, Qi
2014-09-01
With the transformation of the modern pattern of medicine, medical studies are confronted with methodological challenges. By analyzing the two methodologies used in the study of physical matter systems and information systems, the article points out that traditional Chinese medicine (TCM), especially treatment based on syndrome differentiation, embodies an information-oriented methodological position, while Western medicine represents a matter-oriented methodological position. It proposes a new way of thinking about the combination of TCM and Western medicine based on combining the two kinds of methodology.
"Testimonios" Informing a Human Rights and Social Justice Education Framework
ERIC Educational Resources Information Center
Prieto, Linda
2016-01-01
The recalling and documenting of "testimonio" "as a conceptual and methodological tool that transforms cultural and personal narratives into critical social analysis" (Fuentes & Pérez, 2016) is not an easy process. Often tears, "coraje" (both courage and rage) and laughter accompany this process--a transformative…
Mapping Sustainability Efforts at the Claremont Colleges
ERIC Educational Resources Information Center
Srebotnjak, Tanja; Norgaard, Lee Michelle
2017-01-01
Purpose: The purpose of this study is to map and analyze sustainability activities and relationships at the seven Claremont Colleges and graduate institutions using social network analysis (SNA) to inform sustainability planning and programming. Design/methodology/approach: Online surveys and interviews were conducted among faculty, staff and…
Motivation and Satisfaction in Internet-Supported Learning Environments: A Review
ERIC Educational Resources Information Center
Bekele, Teklu Abate
2010-01-01
Previous studies examined student motivation and satisfaction in Internet-Supported Learning Environments (ISLE) in higher education but none provided a comprehensive analysis of significant methodological and theoretical issues. To contribute toward filling this knowledge gap and then to better inform instructional systems development, practice,…
This article presents a general and versatile methodology for assessing sustainability with Fisher Information as a function of dynamic changes in urban systems. Using robust statistical methods, six Metropolitan Statistical Areas (MSAs) in Ohio were evaluated to comparatively as...
Cognitive approaches for patterns analysis and security applications
NASA Astrophysics Data System (ADS)
Ogiela, Marek R.; Ogiela, Lidia
2017-08-01
This paper presents new opportunities for developing innovative solutions for semantic pattern classification and visual cryptography, based on cognitive and bio-inspired approaches. Such techniques can be used to evaluate the meaning of analyzed patterns or encrypted information, and allow that meaning to be incorporated into the classification task or encryption process. They also allow the use of crypto-biometric solutions to extend personalized cryptography methodologies based on visual pattern analysis. In particular, the application of cognitive information systems for the semantic analysis of different patterns is presented, and a novel application of such systems for visual secret sharing is described. Visual shares for divided information can be created with a threshold procedure, which may depend on personal abilities to recognize image details visible on the divided images.
A Study of Technical Engineering Peer Reviews at NASA
NASA Technical Reports Server (NTRS)
Chao, Lawrence P.; Tumer, Irem Y.; Bell, David G.
2003-01-01
This report describes the state of practice of design reviews at NASA and research into what can be done to improve peer review practices. There are many types of reviews at NASA: required and not, formalized and informal, programmatic and technical. Standing formal project reviews such as the Preliminary Design Review and Critical Design Review are a required part of every project and mission development. However, the technical, engineering peer reviews that support teams' work on such projects are informal, sometimes ad hoc, and inconsistent across the organization. The goal of this work is to identify best practices and lessons learned from NASA's experience, supported by academic research and methodologies, to ultimately improve the process. This research has determined that the organization, composition, scope, and approach of the reviews impact their success. Failure Modes and Effects Analysis (FMEA) can identify key areas of concern before or in the reviews. Product definition tools like the Project Priority Matrix, engineering-focused Customer Value Chain Analysis (CVCA), and project- or system-based Quality Function Deployment (QFD) help prioritize resources in reviews. The use of information technology and structured design methodologies can strengthen the engineering peer review process to help NASA work towards error-proofing the design process.
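Where FMEA is used to flag areas of concern ahead of a review, the prioritization step reduces to computing a Risk Priority Number per failure mode. A minimal sketch follows; the failure modes and ratings are hypothetical, not drawn from the report.

```python
# Minimal FMEA prioritization sketch: Risk Priority Number is
# severity x occurrence x detection, each rated 1-10. Entries are invented.
failure_modes = [
    {"mode": "thermal margin underestimated", "S": 9, "O": 3, "D": 5},
    {"mode": "connector pinout mismatch",      "S": 6, "O": 4, "D": 2},
    {"mode": "software watchdog too slow",     "S": 7, "O": 2, "D": 7},
]
for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Review the highest-RPN items first.
for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
    print(f'{fm["RPN"]:4d}  {fm["mode"]}')
```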
Yager, Douglas B.; Hofstra, Albert H.; Granitto, Matthew
2012-01-01
This report emphasizes geographic information system analysis and the display of data stored in the legacy U.S. Geological Survey National Geochemical Database for use in mineral resource investigations. Geochemical analyses of soils, stream sediments, and rocks that are archived in the National Geochemical Database provide an extensive data source for investigating geochemical anomalies. A study area in the Egan Range of east-central Nevada was used to develop a geographic information system analysis methodology for two different geochemical datasets involving detailed (Bureau of Land Management Wilderness) and reconnaissance-scale (National Uranium Resource Evaluation) investigations. ArcGIS was used to analyze and thematically map geochemical information at point locations. Watershed-boundary datasets served as a geographic reference to relate potentially anomalous sample sites with hydrologic unit codes at varying scales. The National Hydrography Dataset was analyzed with Hydrography Event Management and ArcGIS Utility Network Analyst tools to delineate potential sediment-sample provenance along a stream network. These tools can be used to track potential upstream-sediment-contributing areas to a sample site. This methodology identifies geochemically anomalous sample sites, watersheds, and streams that could help focus mineral resource investigations in the field.
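The upstream-tracing step can be illustrated with a small graph sketch: on a downstream-directed stream network, the potential sediment-contributing reaches for a sample site are its graph ancestors. The network below is a hypothetical stand-in, not the National Hydrography Dataset or the ArcGIS tooling the report uses.

```python
# Sketch of upstream provenance tracing on a stream network; edges point
# downstream, so contributors to a site are its ancestors in the graph.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("headwater_A", "confluence_1"),
    ("headwater_B", "confluence_1"),
    ("confluence_1", "sample_site"),
    ("sample_site", "outlet"),
])

contributing = nx.ancestors(G, "sample_site")
print(sorted(contributing))  # ['confluence_1', 'headwater_A', 'headwater_B']
```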
AERIS: An Integrated Domain Information System for Aerospace Science and Technology
ERIC Educational Resources Information Center
Hatua, Sudip Ranjan; Madalli, Devika P.
2011-01-01
Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…
2016-12-22
…assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool…
An emerging discourse: toward epistemic diversity in nursing.
Georges, Jane M
2003-01-01
Grounded in a postmodern feminist methodology, this article undertakes an initial analysis of a newly emerging discourse in contemporary nursing academia in the United States. Two currently prominent discourses in nursing, a dominant discourse informed by the processes and values of "science" in the Enlightenment sense and a concurrent marginalized discourse informed by postmodernism, are described as a context for the emerging discourse. A genealogy informed by the work of Foucault is presented as a basis for an analysis of the power effects resulting from the conflict between these 2 discourses. Finally, 3 recent texts in nursing are analyzed and common themes identified as indicative of a new intertextual discourse, termed "epistemic diversity," emerging from this discursive conflict.
NASA Technical Reports Server (NTRS)
Boton, Matthew L.; Bass, Ellen J.; Comstock, James R., Jr.
2006-01-01
The evaluation of human-centered systems can be performed using a variety of different methodologies. This paper describes a human-centered systems evaluation methodology in which participants watch 5-second non-interactive videos of a system in operation before supplying judgments and subjective measures based on the information conveyed in the videos. This methodology was used to evaluate the ability of different textures and fields of view to convey spatial awareness in synthetic vision systems (SVS) displays. It produced significant results for both judgment-based and subjective measures. This method is compared to other methods commonly used to evaluate SVS displays based on cost, the amount of experimental time required, experimental flexibility, and the type of data provided.
Understanding palliative care on the heart failure care team: an innovative research methodology.
Lingard, Lorelei A; McDougall, Allan; Schulz, Valerie; Shadd, Joshua; Marshall, Denise; Strachan, Patricia H; Tait, Glendon R; Arnold, J Malcolm; Kimel, Gil
2013-05-01
There is a growing call to integrate palliative care for patients with advanced heart failure (HF). However, the knowledge to inform integration efforts comes largely from interview and survey research with individual patients and providers. This work has been critically important in raising awareness of the need for integration, but it is insufficient to inform solutions that must be enacted not by isolated individuals but by complex care teams. Research methods are therefore urgently required to support systematic exploration of the experiences of patients with HF, family caregivers, and health care providers as they interact as a care team. This article describes in detail a methodology designed to support such exploration, which we have piloted and are currently using in a multisite study of HF care teams. We describe three aspects of the methodology: the theoretical framework, an innovative sampling strategy, and an iterative system of data collection and analysis that incorporates four data sources and four analytical steps. We anticipate that this innovative methodology will support groundbreaking research in both HF care and other team settings in which palliative integration efforts are emerging for patients with advanced nonmalignant disease.
A hierarchical modeling methodology for the definition and selection of requirements
NASA Astrophysics Data System (ADS)
Dufresne, Stephane
This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems, and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the epistemic uncertainty. The proposed methodology is applied to the design of a hurricane tracker Unmanned Aerial Vehicle to demonstrate the origin and impact of requirements on the concept of operations and systems alternatives. This research demonstrates that the hierarchical modeling methodology provides a traceable flow-down of the requirements from the problem definition to the systems alternatives phases of conceptual design.
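The Monte Carlo evaluation of the hierarchical model can be sketched as follows: jitter the criterion weights and count how often each alternative ranks first, which exposes how sensitive the down-selection is to weight uncertainty. This is an illustrative stand-in under invented weights and scores, not the dissertation's ANP implementation.

```python
# Sketch of the Monte Carlo step: perturb hypothetical priority weights of
# a small requirements hierarchy and track ranking stability.
import numpy as np

rng = np.random.default_rng(0)
# Nominal criterion weights (e.g., endurance, payload, cost) and the
# scores of three alternatives against each criterion -- all illustrative.
weights = np.array([0.5, 0.3, 0.2])
scores = np.array([[0.9, 0.4, 0.6],
                   [0.6, 0.8, 0.7],
                   [0.5, 0.6, 0.9]])

wins = np.zeros(3)
for _ in range(10_000):
    w = rng.dirichlet(weights * 50)     # weights jittered around nominal
    wins[np.argmax(scores @ w)] += 1
print("P(alternative is top-ranked):", wins / wins.sum())
```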
Trial-based economic evaluations in occupational health: principles, methods, and recommendations.
van Dongen, Johanna M; van Wier, Marieke F; Tompa, Emile; Bongers, Paulien M; van der Beek, Allard J; van Tulder, Maurits W; Bosmans, Judith E
2014-06-01
To allocate available resources as efficiently as possible, decision makers need information on the relative economic merits of occupational health and safety (OHS) interventions. Economic evaluations can provide this information by comparing the costs and consequences of alternatives. Nevertheless, only a few of the studies that consider the effectiveness of OHS interventions take the extra step of considering their resource implications. Moreover, the methodological quality of those that do is generally poor. Therefore, this study aims to help occupational health researchers conduct high-quality trial-based economic evaluations by discussing the theory and methodology that underlie them, and by providing recommendations for good practice regarding their design, analysis, and reporting. This study also helps consumers of this literature with understanding and critically appraising trial-based economic evaluations of OHS interventions.
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-06-01
Case-study methodology is often used to study change in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet, given the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification to the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. The application of this approach to assess change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, that is, the lack of analysis methods for noncontinuous data (such as counts, rates, and proportions of events) that may be used in case-study designs.
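For orientation, here is a minimal sketch of the classical Reliable Change Index on which the modified analysis builds; the scores, standard deviation, and reliability below are invented for illustration.

```python
# Sketch of the classical Reliable Change Index (Jacobson & Truax, 1991).
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    sem = sd_pre * math.sqrt(1 - reliability)   # standard error of measurement
    se_diff = math.sqrt(2) * sem                # standard error of the difference
    return (post - pre) / se_diff

rci = reliable_change_index(pre=14.0, post=6.0, sd_pre=4.0, reliability=0.9)
print(f"RCI = {rci:.2f}; reliable change if |RCI| > 1.96")
```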
[The workplace-based learning: a main paradigm of an effective continuing medical education].
Lelli, Maria Barbara
2010-01-01
Drawing on an analysis of the literature and the experience of the Emilia-Romagna Region, we suggest a reflection on workplace-based learning that goes beyond the analysis of the effectiveness of specific didactic methodologies and aspects related to Continuing Medical Education. The issue of health education and training is viewed from a wider perspective that integrates the three learning dimensions (formal, non-formal, and informal). In such a perspective, workplace-based learning becomes an essential paradigm to reshape the explicit knowledge conveyed in formal contexts and to emphasize the informal contexts where innovation is generated.
NASA Astrophysics Data System (ADS)
Mora, R.; Barahona, A.; Aguilar, H.
2015-04-01
This paper presents a method for using high-detail volumetric information, captured with a land-based photogrammetric survey, to obtain information from individual trees. Applying LIDAR analysis techniques, it is possible to measure diameter at breast height, height at first branch (commercial height), basal area, and volume of an individual tree. Given this information, it is possible to calculate how much of that tree can be exploited as wood. The main objective is to develop a methodology for successfully surveying an individual tree, capturing every side of the stem using a high-resolution digital camera and reference marks with GPS coordinates. The process is executed for several individuals of two species present in the metropolitan area of San Jose, Costa Rica, Delonix regia (Bojer) Raf. and Tabebuia rosea (Bertol.) DC., each one with different height, stem shape, and crown area. Using a photogrammetry suite, all the pictures are aligned and geo-referenced, and a dense point cloud is generated with enough detail to perform the required measurements, as well as a solid three-dimensional model for volume measurement. This research opens the way to developing a capture methodology with an airborne camera using close-range UAVs. An airborne platform will make it possible to capture every individual in a forest plantation; furthermore, if the analysis techniques applied in this research are automated, it will be possible to calculate with high precision the exploitation potential of a forest plantation and improve its management.
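One way the stem-diameter measurement could be approximated from such a point cloud is to take a thin slice at breast height and fit a circle by least squares. The sketch below uses the algebraic (Kasa) circle fit on synthetic slice points; it is an assumption-laden stand-in, not the authors' processing chain.

```python
# Sketch: estimate diameter at breast height from a thin horizontal slice
# of a stem point cloud via an algebraic least-squares (Kasa) circle fit.
import numpy as np

def fit_circle(xy):
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return (cx, cy), r

theta = np.linspace(0, 2 * np.pi, 200)
slice_pts = np.column_stack([0.15 * np.cos(theta), 0.15 * np.sin(theta)])
slice_pts += np.random.default_rng(1).normal(0, 0.004, slice_pts.shape)

center, radius = fit_circle(slice_pts)
print(f"estimated DBH ~ {2 * radius:.3f} m")
```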
ERIC Educational Resources Information Center
Borthwick, J.; Knight, B.; Bender, A.; Loveder, P.
These two volumes provide information on the scope of adult and community education (ACE) in Australia and implications for improved data collection and reporting. Volume 1 begins with a glossary. Chapter 1 addresses project objectives, processes, and methodology. Chapter 2 analyzes the scope and diversity of ACE in terms of what is currently…
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
…order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; and (ii) studied the application of these decision-making models and methodologies to practical problems, such as those…
Leveraging Our Expertise To Inform International RE Roadmaps
…energy targets to support Mexico's renewable energy goal. NREL and its Mexico partners developed a roadmap of the steps institutions need to take to determine how the electricity infrastructure and systems must change to accommodate high levels of renewables. The roadmap focuses on analysis methodologies, including grid expansion…
Resource Characterization | Water Power | NREL
…resource characterization and assessment, NREL has extended its capabilities to the field of water power. NREL's team has expertise in modeling, data analysis, and Geographic Information Systems. Many years of experience in wind assessment have enabled NREL to develop the skills and methodologies to evaluate the development potential of many…
Rapid economic analysis of northern hardwood stand improvement options
William B. Leak
1980-01-01
Data and methodology are provided for projecting basal area, diameter, volumes, and values by product for northern hardwood stands, and for determining the rate of return on stand improvement investments. The method is rapid, requires a minimum amount of information, and should prove useful for on-the-ground economic analyses.
In Vitro Evaluation of a Program for Machine-Aided Indexing.
ERIC Educational Resources Information Center
Jacquemin, Christian; Daille, Beatrice; Royaute, Jean; Polanco, Xavier
2002-01-01
Presents the human evaluation of ILIAD, a program for machine-aided indexing that was designed to assist expert librarians in computer-aided indexing and document analysis. Topics include controlled indexing and free indexing; natural language and concept-based information retrieval; evaluation methodology; syntactic variations; and a comparison…
Training and Learning in the Knowledge and Service Economy
ERIC Educational Resources Information Center
Sloman, Martyn; Philpott, John
2006-01-01
Purpose: The purpose of this paper is to consider whether the shift from training to learning is related to employment categories using a categorisation popularised by Robert Reich. Design/methodology/approach: Collation and analysis of existing CIPD research information and assessment of labour statistics. Findings: An examination of the national…
Projecting Timber Inventory at the Product Level
Lawrence Teeter; Xiaoping Zhou
1999-01-01
Current timber inventory projections generally lack information on inventory by product classes. Most models available for inventory projection and linked to supply analyses are limited to projecting aggregate softwood and hardwood. The research presented describes a methodology for distributing the volume on each FIA (USDA Forest Service Forest Inventory and Analysis...
James, Abi; Draffan, E A; Wald, Mike
2017-01-01
This paper presents a gap analysis between crowdsourced functional accessibility evaluations of ebooks conducted by non-experts and the technical accessibility standards employed by developers. It also illustrates how combining these approaches can provide more appropriate information for a wider group of users with print impairments.
ICT-Based, Cross-Cultural Communication: A Methodological Perspective
ERIC Educational Resources Information Center
Larsen, Niels; Bruselius-Jensen, Maria; Danielsen, Dina; Nyamai, Rachael; Otiende, James; Aagaard-Hansen, Jens
2014-01-01
The article discusses how cross-cultural communication based on information and communication technologies (ICT) may be used in participatory health promotion as well as in education in general. The analysis draws on experiences from a health education research project with grade 6 (approx. 12 years) pupils in Nairobi (Kenya) and Copenhagen…
Challenges Facing Principals in the First Year at Their Schools
ERIC Educational Resources Information Center
Bayar, Adem
2016-01-01
The aim of this study is to identify key challenges of practice that principals face. In line with this purpose, the researcher has employed a qualitative research methodology, interviewing principals working in Amasya district, over and above doing document analysis to collect detailed information concerning leadership and administration…
Quality Assurance Strategies in Higher Education: The Case of Ghanaian Polytechnics
ERIC Educational Resources Information Center
Swanzy, Patrick; Potts, Anthony
2017-01-01
This study investigated the internal quality assurance strategies Ghanaian polytechnics enacted in order to improve graduate outcomes. A qualitative methodological approach via in-depth interviewing of twenty key informants and document analysis was used to establish how polytechnics in Ghana addressed problematic situations in quality assurance.…
Analysis of Senate Amendment 2028, the Climate Stewardship Act of 2003
2004-01-01
On May 11, 2004, Senator Landrieu asked the Energy Information Administration (EIA) to evaluate SA 2028. This paper responds to that request, relying on the modeling methodology, data sources, and assumptions used to analyze the original bill, as extensively documented in EIA's June 2003 report.
High School Students Presenting Science: An Interactional Sociolinguistic Analysis.
ERIC Educational Resources Information Center
Bleicher, Robert
1994-01-01
This study examines strategies employed by students as they make science presentations; assesses student's conceptual understandings of particular science topics; and investigates gender differences. Focuses on the methodology employed and how it helps inform these goals. Advice for teaching students to present science, implications for use of…
ERIC Educational Resources Information Center
Enyi, Amaechi Uneke
2015-01-01
The study entitled "Language and Interactional Discourse: Deconstructing the Talk-Generating Machinery in Natural Conversation" is an analysis of spontaneous and informal conversation. The study, carried out in the theoretical and methodological tradition of Ethnomethodology, was aimed at explicating how ordinary talk is organized and…
Analysis of Trends from DoD Level Peer Reviewed Contracts
2014-06-01
NASA Astrophysics Data System (ADS)
Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.
1990-11-01
Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key: DATA ANALYSIS
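A minimal sketch of one such information-theoretic homogeneity check: compare the Shannon entropy of a binned sample against the maximum entropy attained by a perfectly uniform sample. The binning and data are illustrative assumptions, not the authors' procedure.

```python
# Normalized Shannon entropy of a binned sample; 1.0 means maximally
# homogeneous (uniform across bins), lower values mean more clustering.
import numpy as np

def normalized_entropy(sample, bins=10):
    counts, _ = np.histogram(sample, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum() / np.log2(bins)

rng = np.random.default_rng(2)
print("uniform   :", round(normalized_entropy(rng.uniform(size=5000)), 3))  # ~1.0
print("clustered :", round(normalized_entropy(rng.normal(size=5000)), 3))   # < 1.0
```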
The use of geospatial web services for exchanging utilities data
NASA Astrophysics Data System (ADS)
Kuczyńska, Joanna
2013-04-01
Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important to many institutions. It requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administrate transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology that can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in the universal modeling language UML. A combined model that defines a common data structure was also built. This model was transformed into the GML standard, which was developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in a system designed for data exchange based on open-source tools. The methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on the server. Data access was provided by geospatial network services: data searching by Catalogue Service for the Web (CSW) and data collection by Web Feature Service (WFS). WFS also provides operations for data modification, for example updates by the utility administrator. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
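Accessing such a WFS endpoint can be sketched with the OWSLib library; the service URL and feature type name below are placeholders, not a real GESUT service.

```python
# Sketch of retrieving utility features from a (hypothetical) WFS endpoint.
from owslib.wfs import WebFeatureService

wfs = WebFeatureService(url="https://example.org/geoserver/wfs",
                        version="1.1.0")
print(list(wfs.contents))                 # advertised feature types

response = wfs.getfeature(typename=["gesut:utility_lines"],
                          bbox=(20.9, 52.1, 21.1, 52.3),
                          outputFormat="GML2")
data = response.read()                    # bytes or str depending on version
with open("utility_lines.gml", "wb") as f:
    f.write(data if isinstance(data, bytes) else data.encode())
```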
Probabilistic Flood Maps to support decision-making: Mapping the Value of Information
NASA Astrophysics Data System (ADS)
Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.
2016-02-01
Floods are one of the most frequent and disruptive natural hazards that affect man. Annually, significant flood damage is documented worldwide. Flood mapping is a common preimpact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises; how can probabilistic flood hazard information be incorporated into spatial planning? Thus, a consistent framework to incorporate PFMs into the decision-making is required. In this paper, a novel methodology based on Decision-Making under Uncertainty theories, in particular Value of Information (VOI) is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
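The core VOI computation can be sketched per floodplain cell: compare the expected cost of the best action under the prior flood probability with the expected cost under perfect information. The action costs and probabilities below are toy values, not the paper's case-study data.

```python
# Toy per-cell Value of (perfect) Information for a develop/restrict choice.
import numpy as np

p_flood = np.array([0.05, 0.3, 0.6])   # prior flood probability per cell (from a PFM)
damage_if_develop = 100.0               # loss if the cell is developed and floods
cost_restrict = 20.0                    # forgone value if development is restricted

# Best expected cost acting on the prior alone.
ec_prior = np.minimum(damage_if_develop * p_flood, cost_restrict)

# With perfect information the best action is chosen in each state:
# restrict (cost 20) if it will flood, develop (cost 0) otherwise.
ec_perfect = p_flood * min(damage_if_develop, cost_restrict)

voi = ec_prior - ec_perfect             # expected value of perfect information
print("VOI per cell:", voi)             # peaks where the decision is most uncertain
```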
Law, Andrew J.; Sharma, Gaurav; Schieber, Marc H.
2014-01-01
We present a methodology for detecting effective connections between simultaneously recorded neurons using an information transmission measure to identify the presence and direction of information flow from one neuron to another. Using simulated and experimentally-measured data, we evaluate the performance of our proposed method and compare it to the traditional transfer entropy approach. In simulations, our measure of information transmission outperforms transfer entropy in identifying the effective connectivity structure of a neuron ensemble. For experimentally recorded data, where ground truth is unavailable, the proposed method also yields a more plausible connectivity structure than transfer entropy. PMID:21096617
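For reference, a plug-in estimate of the baseline transfer entropy measure for binary spike trains with one-step histories can be sketched as follows; the spike trains are random surrogates, not the recorded data, and the paper's information transmission measure goes beyond this baseline.

```python
# Plug-in transfer entropy TE(X->Y) for binary series with unit history:
# TE = sum p(y_t, y_{t-1}, x_{t-1}) log2[ p(y_t|y_{t-1},x_{t-1}) / p(y_t|y_{t-1}) ]
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (yt, ym, xm), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(ym, xm)]
        p_cond_self = pairs_yy[(yt, ym)] / singles[ym]
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(3)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)               # y copies x with a one-step lag
print("TE(x->y) =", round(transfer_entropy(x, y), 3))  # ~1 bit
print("TE(y->x) =", round(transfer_entropy(y, x), 3))  # ~0 bits
```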
NASA Astrophysics Data System (ADS)
Miola, Apollonia; Ciuffo, Biagio
2011-04-01
Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the ship emission modelling approaches and data sources available, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of results from the different methods. This is mainly due to the level of uncertainty connected with the sources of information that are used as inputs to the different studies. This paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information. This paper presents an alternative methodology based on this approach. As a final remark, it can be expected that new approaches to the problem together with more reliable data sources over the coming years could give more impetus to the debate on the global impact of maritime traffic on the environment that, currently, has only reached agreement via the "consensus" estimates provided by IMO (2009).
European Healthy Cities evaluation: conceptual framework and methodology.
de Leeuw, Evelyne; Green, Geoff; Dyakova, Mariana; Spanswick, Lucy; Palmer, Nicola
2015-06-01
This paper presents the methodology, programme logic and conceptual framework that drove the evaluation of the Fifth Phase of the WHO European Healthy Cities Network. By the end of the phase, 99 cities had been designated progressively through its life (2009-14). The paper establishes the values, systems and aspirations that these cities sign up for, as foundations for the selection of methodology. We assert that a realist synthesis methodology, driven by a wide range of qualitative and quantitative methods, is the most appropriate perspective to address the wide geopolitical, demographic, population and health diversities of these cities. The paper outlines the rationale for a structured multiple case study approach, the deployment of a comprehensive questionnaire, data mining through existing databases including Eurostat and analysis of management information generation tools used throughout the period. Response rates were considered extremely high for this type of research. Non-response analyses are described, which show that data are representative for cities across the spectrum of diversity. This paper provides a foundation for further analysis on specific areas of interest presented in this supplement.
NASA Astrophysics Data System (ADS)
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
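The least-squares core of Procrustes alignment is available in SciPy; the sketch below aligns the boundary coordinates of one synthetic precipitation object from two hypothetical ensemble members. The paper's full Bayesian treatment goes well beyond this single alignment step.

```python
# Ordinary Procrustes alignment of two object boundaries (scipy).
import numpy as np
from scipy.spatial import procrustes

theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
member_a = np.column_stack([np.cos(theta), np.sin(theta)])      # reference object
member_b = np.column_stack([1.3 * np.cos(theta) + 2.0,
                            1.3 * np.sin(theta) - 1.0])         # shifted, scaled copy

mtx1, mtx2, disparity = procrustes(member_a, member_b)
print(f"shape disparity after alignment: {disparity:.4f}")      # ~0: same shape
```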
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinney, Adriana L.; Varga, Tamas
Branching structures such as lungs, blood vessels and plant roots play a critical role in life. Growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a costless tool that approximates the surface and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen were used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface. This methodology of processing 3D data is applicable to other branching structures even when the structure of interest is of similar x-ray attenuation to its environment and difficulties arise with sample segmentation.
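The isosurface-based measurement step can be sketched with scikit-image on a synthetic segmented volume, a sphere standing in for the root; voxel units and the marching-cubes level are assumptions of the sketch, not the authors' pipeline.

```python
# Extract an isosurface from a segmented volume and estimate surface area
# and volume; the "root" is a synthetic sphere in a 3-D array.
import numpy as np
from skimage import measure

z, y, x = np.mgrid[:64, :64, :64]
volume = ((x - 32)**2 + (y - 32)**2 + (z - 32)**2 < 20**2).astype(np.uint8)

verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
area = measure.mesh_surface_area(verts, faces)
voxel_volume = volume.sum()          # voxel count; multiply by voxel size^3
print(f"surface ~ {area:.0f} voxel^2, volume ~ {voxel_volume} voxel^3")
```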
Mertz, Marcel; Strech, Daniel; Kahrass, Hannes
2017-12-19
(Semi-)systematic approaches to finding, analysing, and synthesising ethics literature on medical topics are still in their infancy. However, our recent systematic review showed that the rate of publication of such (semi-)systematic reviews has increased in the last two decades. This is not only true for reviews of empirical ethics literature, but also for reviews of normative ethics literature. In the latter case, there is currently little in the way of standards and guidance available. Therefore, the methods and reporting strategies of such reviews vary greatly. The purpose of the follow-up study we present was to obtain deeper methodological insight into the ways reviews of normative literature are actually conducted and to analyse the methods used. Our search in the PubMed, PhilPapers, and Google Scholar databases led to the identification of 183 reviews of ethics literature published between 1997 and 2015, of which 84 were identified as reviews of normative and mixed literature. Qualitative content analysis was used to extract and synthesise descriptions of search, selection, quality appraisal, analysis, and synthesis methods. We further assessed quantitatively how often certain methods (e.g. search strategies, data analysis procedures) were used by the reviews. The overall reporting quality varies among the analysed reviews and was generally poor even for major criteria regarding the search and selection of literature. For example, only 24 (29%) used a PRISMA flowchart. Also, only 55 (66%) reviews mentioned the information unit they sought to extract, and 12 (14%) stated an ethical approach as the theoretical basis for the analysis. Interpretable information on the synthesis method was given by 47 (60%); the most common methods applied were qualitative methods commonly used in social science research (83%). Reviews which fail to provide sufficient relevant information to readers have reduced methodological transparency regardless of actual methodological quality. In order to increase the internal validity (i.e. reproducibility) as well as the external validity (i.e. utility for the intended audience) of future reviews of normative literature, we suggest more accurate reporting regarding the goal of the review, the definition of the information unit, the ethical approach used, and technical aspects.
Introducing a new bond reactivity index: Philicities for natural bond orbitals.
Sánchez-Márquez, Jesús; Zorrilla, David; García, Víctor; Fernández, Manuel
2017-12-22
In the present work, a new methodology for obtaining reactivity indices (philicities) is proposed. It is based on reactivity functions such as the Fukui function or the dual descriptor, and makes it possible to project the information from reactivity functions onto molecular orbitals, instead of onto the atoms of the molecule (atomic reactivity indices). The methodology focuses on the molecules' natural bond orbitals (bond reactivity indices) because these orbitals have the advantage of being localized, allowing the reaction site of an electrophile or nucleophile to be determined within a very precise molecular region. This methodology provides a "philicity" index for every NBO, and a representative set of molecules has been used to test the new definition. A new methodology has also been developed to compare the "finite difference" and "frontier molecular orbital" approximations. To facilitate their use, the proposed methodology, as well as the possibility of calculating the new indices, has been implemented in a new version of the UCA-FUKUI software. In addition, condensation schemes based on atomic populations of the "atoms in molecules" theory, the Hirshfeld population analysis, the Mulliken approximation (with a minimal basis set), and electrostatic potential-derived charges have also been implemented, including the calculation of "bond reactivity indices" defined in previous studies.
Interdisciplinary barriers: An impediment to the effective application of systems engineering
NASA Technical Reports Server (NTRS)
Harrison, E., Jr.
1971-01-01
Interdisciplinary transfer of information and technology does not occur very readily, even for system planners, because of the existence of some very real barriers. These barriers to the flow of information and technology between disciplines represent one of the important difficulties associated with the application of systems analysis to many problems. The nature and characteristics of some of these barriers are enumerated and discussed in detail. A number of methodologies and techniques that have been specifically developed to aid in the transfer of technology and information across these interdisciplinary barriers are examined.
2017-09-01
Robert E. Dumais Jr., Computational and Information Sciences Directorate, ARL; Yuanfu Xie, National Oceanic and Atmospheric Administration, Boulder, CO
Assessing the impact of healthcare research: A systematic review of methodological frameworks
Keeley, Thomas J.; Calvert, Melanie J.
2017-01-01
Background Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Methods and findings Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) ‘primary research-related impact’, (2) ‘influence on policy making’, (3) ‘health and health systems impact’, (4) ‘health-related and societal impact’, and (5) ‘broader economic impact’. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. Conclusions The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform study design decisions aimed at maximising the short-, medium-, and long-term impact of their research. PMID:28792957
Methodological Quality of Consensus Guidelines in Implant Dentistry
Faggion, Clovis Mariano; Apaza, Karol; Ariza-Fritas, Tania; Málaga, Lilian; Giannakopoulos, Nikolaos Nikitas; Alarcón, Marco Antonio
2017-01-01
Background Consensus guidelines are useful to improve clinical decision making. Therefore, the methodological evaluation of these guidelines is of paramount importance. Low quality information may guide to inadequate or harmful clinical decisions. Objective To evaluate the methodological quality of consensus guidelines published in implant dentistry using a validated methodological instrument. Methods The six implant dentistry journals with impact factors were scrutinised for consensus guidelines related to implant dentistry. Two assessors independently selected consensus guidelines, and four assessors independently evaluated their methodological quality using the Appraisal of Guidelines for Research & Evaluation (AGREE) II instrument. Disagreements in the selection and evaluation of guidelines were resolved by consensus. First, the consensus guidelines were analysed alone. Then, systematic reviews conducted to support the guidelines were included in the analysis. Non-parametric statistics for dependent variables (Wilcoxon signed rank test) was used to compare both groups. Results Of 258 initially retrieved articles, 27 consensus guidelines were selected. Median scores in four domains (applicability, rigour of development, stakeholder involvement, and editorial independence), expressed as percentages of maximum possible domain scores, were below 50% (median, 26%, 30.70%, 41.70%, and 41.70%, respectively). The consensus guidelines and consensus guidelines + systematic reviews data sets could be compared for 19 guidelines, and the results showed significant improvements in all domain scores (p < 0.05). Conclusions Methodological improvement of consensus guidelines published in major implant dentistry journals is needed. The findings of the present study may help researchers to better develop consensus guidelines in implant dentistry, which will improve the quality and trust of information needed to make proper clinical decisions. PMID:28107405
NASA Astrophysics Data System (ADS)
Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat @
2014-02-01
Deterministic Safety Analysis (DSA) is one of the mandatory requirements in the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements are applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analyses under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED) developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a prerequisite reference document for developing input nodalization was proposed. This paper describes the application of the RDS with the purpose of assessing its effectiveness. Two RDS documents were developed for the LOBI-MOD2 Integral Test Facility and the associated Test A1-83. Data and information from various reports and drawings were consulted in preparing the RDS. The results showed that developing an RDS makes it possible to consolidate all relevant information in a single document. This is beneficial as it enables the preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions, and assists in developing thermal-hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information, and limitation of resources. Some possible improvements are suggested to overcome these challenges.
From data to evidence: evaluative methods in evidence-based medicine.
Landry, M D; Sibbald, W J
2001-11-01
The amount of published information is increasing exponentially, and recent technologic advances have created systems whereby mass distribution of this information can occur at an infinite rate. This is particularly true in the broad field of medicine, as the absolute volume of data available to the practicing clinician is creating new challenges in the management of relevant information flow. Evidence-based medicine (EBM) is an information management and learning strategy that seeks to integrate clinical expertise with the best evidence available in order to make effective clinical decisions that will ultimately improve patient care. The systematic approach underlying EBM encourages the clinician to formulate specific and relevant questions, which are answered in an iterative manner through accessing the best available published evidence. The arguments against EBM stem from the idea that there are inherent weaknesses in research methodologies and that emphasis placed on published research may ignore clinical skills and individual patient needs. Despite these arguments, EBM is gaining momentum and is consistently used as a method of learning and improving health care delivery. However, if EBM is to be effective, the clinician needs to have a critical understanding of research methodology in order to judge the value and level of a particular data source. Without critical analysis of research methodology, there is an inherent risk of drawing incorrect conclusions that may affect clinical decision-making. Currently, there is a trend toward using secondary pre-appraised data rather than primary sources as best evidence. We review the qualitative and quantitative methodology commonly used in EBM and argue that it is necessary for the clinician to preferentially use primary rather than secondary sources in making clinically relevant decisions.
Costing the satellite power system
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1978-01-01
The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, costing estimating relationships, grass roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
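A minimal sketch of the Monte Carlo risk-analysis idea follows: propagate uncertain cost components into a distribution of total cost. The triangular distributions and ranges are invented for illustration, not actual satellite power system estimates.

```python
# Monte Carlo propagation of uncertain cost components (triangular
# distributions) to a distribution of total cost.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
components = {                      # (low, mode, high) in arbitrary cost units
    "solar array":  (3.0, 4.0, 7.0),
    "transmission": (1.0, 1.5, 3.0),
    "launch":       (2.0, 3.5, 8.0),
}
total = sum(rng.triangular(lo, mode, hi, n)
            for lo, mode, hi in components.values())

print(f"median total: {np.median(total):.2f}")
print(f"80% interval: {np.percentile(total, 10):.2f} - {np.percentile(total, 90):.2f}")
```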
Zanetti-Polzi, Laura; Corni, Stefano; Daidone, Isabella; Amadei, Andrea
2016-07-21
Here, a methodology is proposed to investigate the collective fluctuation modes of an arbitrary set of observables, maximally contributing to the fluctuation of another functionally relevant observable. The methodology, based on the analysis of fully classical molecular dynamics (MD) simulations, exploits the essential dynamics (ED) method, originally developed to analyse the collective motions in proteins. We apply this methodology to identify the residues that are more relevant for determining the reduction potential (E(0)) of a redox-active protein. To this aim, the fluctuation modes of the single-residue electrostatic potentials mostly contributing to the fluctuations of the total electrostatic potential (the main determinant of E(0)) are investigated for wild-type azurin and two of its mutants with a higher E(0). By comparing the results here obtained with a previous study on the same systems [Zanetti-Polzi et al., Org. Biomol. Chem., 2015, 13, 11003] we show that the proposed methodology is able to identify the key sites that determine E(0). This information can be used for a general deeper understanding of the molecular mechanisms on the basis of the redox properties of the proteins under investigation, as well as for the rational design of mutants with a higher or lower E(0). From the results of the present analysis we propose a new azurin mutant that, according to our calculations, shows a further increase of E(0).
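The covariance-diagonalization step underlying essential dynamics can be sketched as a PCA over per-residue observables; the data below are random stand-ins for MD output, not the azurin trajectories analyzed in the paper.

```python
# ED-style analysis: diagonalize the covariance of per-residue electrostatic
# potentials and read off each residue's weight in the dominant mode.
import numpy as np

rng = np.random.default_rng(5)
n_frames, n_residues = 2000, 30
potentials = rng.normal(size=(n_frames, n_residues))
potentials[:, 7] += 2.0 * rng.normal(size=n_frames)   # residue 7 dominates

centered = potentials - potentials.mean(axis=0)
cov = centered.T @ centered / (n_frames - 1)
eigvals, eigvecs = np.linalg.eigh(cov)                # ascending eigenvalues

mode = eigvecs[:, -1]                                 # largest-variance mode
print("residue with largest weight in mode 1:", np.argmax(np.abs(mode)))
```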
Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier
2012-01-01
Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects in human and natural activities. Maintaining an updated spatial database with the occurred changes allows a better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed, then different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories, finally the indices are integrated into a change detection multisource fusion process, which allows generating a single CD result from several combination of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. Then, the obtained results are evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also been proved efficiently for identifying the change detection index with the higher contribution. PMID:22737023
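One common choice for the change/no_change thresholding step is Otsu's method; here is a sketch on a synthetic change-index image, not the paper's datasets or its specific thresholding algorithms.

```python
# Otsu thresholding of a change-index image into change/no_change.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(6)
change_index = np.concatenate([rng.normal(0.1, 0.05, 9000),    # no change
                               rng.normal(0.7, 0.10, 1000)])   # change

t = threshold_otsu(change_index)
change_mask = change_index > t
print(f"threshold = {t:.3f}, flagged pixels = {change_mask.sum()}")
```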
Nowak, Peter
2011-03-01
There is a broad range of qualitative linguistic research (sequential analysis) on doctor-patient interaction that has had only a marginal impact on clinical research and practice. At least in part this is due to the lack of qualitative research synthesis in the field. Available research summaries are not systematic in their methodology. This paper proposes a synthesis methodology for qualitative, sequential analytic research on doctor-patient interaction. The presented methodology is not new but specifies standard methodology of qualitative research synthesis for sequential analytic research. This pilot review synthesizes twelve studies on German-speaking doctor-patient interactions, identifies 45 verbal actions of doctors and structures them into a systematics of eight interaction components. Three interaction components ("Listening", "Asking for information", and "Giving information") seem to be central and cover two thirds of the identified action types. This pilot review demonstrates that sequential analytic research can be synthesized in a consistent and meaningful way, thus providing a more comprehensive and unbiased integration of research. Future synthesis of qualitative research in the area of health communication research is very much needed. Qualitative research synthesis can support the development of quantitative research and of educational materials in medical training and patient training. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vaynshtok, Natalia
2017-10-01
The article presents the results of developing a methodology for construction compliance monitoring in the crediting of investment projects for road construction. A work-scope analysis of construction audit was conducted, and an algorithm for financial audit in crediting investment projects was developed. Furthermore, possible pitfalls and abuses by counterparties were investigated, and recommendations were given that allow the bank to receive objective and independent information on the progress of the project in real time. This mechanism is useful to the bank for insuring against possible risks and for ensuring the targeted and rational use of credit funds.
A systems-based food safety evaluation: an experimental approach.
Higgins, Charles L; Hartfield, Barry S
2004-11-01
Food establishments are complex systems with inputs, subsystems, underlying forces that affect the system, outputs, and feedback. Building on past exploration of the hazard analysis critical control point concept and Ludwig von Bertalanffy's General Systems Theory, the National Park Service (NPS) is attempting to translate these ideas into a realistic field assessment of food service establishments and to use information gathered by these methods in efforts to improve food safety. Over the course of the last two years, an experimental systems-based methodology has been drafted, developed, and tested by the NPS Public Health Program. This methodology is described in this paper.
Two-stage commercial evaluation of engineering systems production projects for high-rise buildings
NASA Astrophysics Data System (ADS)
Bril, Aleksander; Kalinina, Olga; Levina, Anastasia
2018-03-01
The paper addresses the current and much-debated problem of how to select effective innovative enterprises for venture financing. A two-stage system of commercial innovation evaluation based on the UNIDO methodology is proposed. Engineering systems account for 25 to 40% of the cost of high-rise residential buildings, and this proportion increases with the use of new construction technologies. Analysis of the construction market in Russia showed that the production of internal engineering systems elements based on innovative technologies has a growth trend. The production of simple elements is organized in small enterprises on the basis of new technologies. The most attractive development option is venture financing of small innovative businesses. To improve the efficiency of these operations, the paper proposes a methodology for a two-stage evaluation of small business development projects. A two-stage system of commercial evaluation of innovative projects allows creating an information base for informed and coordinated decision-making on venture financing of enterprises that produce engineering systems elements for the construction business.
Application of atomic force microscopy as a nanotechnology tool in food science.
Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng
2007-05-01
Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including principle, manipulation, and analysis. Applications of AFM in food science and technology research are then reported, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interaction, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can bring insightful knowledge of food properties, and that AFM analysis can be used to illustrate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology and the development of corresponding methodology for complicated food systems would lead to a more in-depth understanding of food properties at the macromolecular level and broaden AFM's applications. Such results could greatly improve food processing and storage technologies.
A systematic review of health economic evaluations of vaccines in Brazil.
Sartori, Ana Marli Christovam; Rozman, Luciana Martins; Decimoni, Tassia Cristina; Leandro, Roseli; Novaes, Hillegonda Maria Dutilh; de Soárez, Patrícia Coelho
2017-06-03
In Brazil, since 2005, the Ministry of Health requires Health Economic Evaluation (HEE) of vaccines for introduction into the National Immunization Program. To describe and analyze the full HEE on vaccines conducted in Brazil from 1980 to 2013. Systematic review of the literature. We searched multiple databases. Two researchers independently selected the studies and extracted the data. The methodological quality of individual studies was evaluated using CHEERS items. Twenty studies were reviewed. The most evaluated vaccines were pneumococcal (25%) and HPV (15%). The most used types of HEE were cost-effectiveness analysis (45%) and cost-utility analysis (20%). The research question and compared strategies were stated in all 20 studies and the target population was clear in 95%. Nevertheless, many studies did not inform the perspective of analysis or data sources. HEE of vaccines in Brazil has increased since 2008. However, the studies still have methodological deficiencies.
MS-based analytical methodologies to characterize genetically modified crops.
García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro
2011-01-01
The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMO analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview of genetically modified crop development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed; these include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.
Júnez-Ferreira, H E; Herrera, G S
2013-04-01
This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method is used that, at each step, selects the space-time point minimizing a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer, with the objective of selecting from a set of monitoring positions and times those that minimize spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that of the existing monitoring program, which consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
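A sketch of the sequential, variance-minimizing selection idea: given a prior space-time covariance matrix, greedily pick the measurement point whose scalar Kalman update most reduces total variance, condition the covariance, and repeat. The exponential covariance kernel and noise level below are illustrative assumptions, not the paper's geostatistical model.

```python
import numpy as np

def greedy_network_design(P, noise_var, n_select):
    """Sequentially select measurement points minimizing total variance.

    P: prior covariance among candidate space-time points (n x n).
    Each selection conditions P on a noisy measurement at that point
    (a scalar Kalman-filter update), as in sequential network design.
    """
    P = P.copy()
    chosen = []
    for _ in range(n_select):
        # Variance reduction from measuring point j: ||P[:, j]||^2 / (P_jj + r)
        gains = (P ** 2).sum(axis=0) / (np.diag(P) + noise_var)
        j = int(np.argmax(gains))
        chosen.append(j)
        k = P[:, j] / (P[j, j] + noise_var)    # Kalman gain column
        P = P - np.outer(k, P[j, :])           # conditioned covariance
    return chosen, P

# Illustrative covariance from an exponential kernel over 1-D "positions".
x = np.linspace(0, 10, 60)
P0 = np.exp(-np.abs(x[:, None] - x[None, :]))
sites, P_post = greedy_network_design(P0, noise_var=0.1, n_select=5)
print("Selected points:", sites)
print("Total variance: %.2f -> %.2f" % (np.trace(P0), np.trace(P_post)))
```

Points whose removal barely changes the posterior variance are the redundant ones; the paper's method applies the same logic in a full space-time geostatistical setting.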
The dynamics of information-driven coordination phenomena: A transfer entropy analysis
Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro
2016-01-01
Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875
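A minimal sketch of symbolic transfer entropy, the core quantity in the analysis above: each series is mapped to ordinal patterns and the transfer T(Y->X) is estimated from symbol counts. The coupled test series and embedding dimension are illustrative assumptions; the paper's geolocalized microblogging time series would replace them.

```python
import numpy as np
from collections import Counter
from math import log2

def symbolize(series, m=3):
    """Map a series to ordinal-pattern symbols of embedding dimension m."""
    return [tuple(np.argsort(series[i:i + m])) for i in range(len(series) - m + 1)]

def transfer_entropy(x, y, m=3):
    """Symbolic transfer entropy T(Y->X): information Y's past adds
    about X's next symbol beyond X's own past (in bits)."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    n = len(sx) - 1
    joint = Counter((sx[t + 1], sx[t], sy[t]) for t in range(n))
    pair_xx = Counter((sx[t + 1], sx[t]) for t in range(n))
    pair_xy = Counter((sx[t], sy[t]) for t in range(n))
    single = Counter(sx[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), c in joint.items():
        te += (c / n) * log2((c * single[x0]) / (pair_xx[(x1, x0)] * pair_xy[(x0, y0)]))
    return te

# Illustrative coupled series: y drives x with a one-step lag.
rng = np.random.default_rng(2)
y = rng.normal(size=5000)
x = np.concatenate([[0.0], 0.8 * y[:-1]]) + 0.2 * rng.normal(size=5000)
print("T(Y->X) = %.3f bits, T(X->Y) = %.3f bits"
      % (transfer_entropy(x, y), transfer_entropy(y, x)))
```

Computing this quantity between every pair of subunits and keeping significant directed links is how an influence network like the one described above could be assembled.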
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kammerer, Annie
Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities, which is a function not only of the level of the seismic hazard but also of the capacity of the facility to withstand vibratory ground motions, the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order. While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the "seismic hazard periodic review methodology," or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB No. 1121-NEW] Agency Information Collection Activities: Proposed Collection; Comments Requested Methodological Research To Support the National Crime... related to the National Crime Victimization Survey Redesign Research (NCVS-RR) program: Methodological...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwin A. Harvego; James E. O'Brien; Michael G. McKellar
2012-11-01
Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information, to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
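A minimal levelized-cost sketch in the spirit of an H2A-style lifecycle calculation: discount a uniform cost and production stream at the target rate of return and solve for the break-even price per kilogram. All input figures below are placeholders, not values from the study, which uses the full H2A spreadsheet model.

```python
# All inputs are illustrative placeholders, not figures from the study.

def levelized_cost_per_kg(capex, annual_opex, kg_per_day,
                          irr=0.10, lifetime_years=20, capacity_factor=0.9):
    """Price per kg at which discounted revenue covers discounted costs."""
    annual_kg = kg_per_day * 365 * capacity_factor
    # Present-value factor for a uniform annual series at the target IRR.
    pv_factor = sum(1.0 / (1.0 + irr) ** t for t in range(1, lifetime_years + 1))
    pv_costs = capex + annual_opex * pv_factor
    pv_kg = annual_kg * pv_factor
    return pv_costs / pv_kg

cost = levelized_cost_per_kg(capex=250e6, annual_opex=25e6, kg_per_day=50_000)
print(f"Levelized hydrogen cost: ${cost:.2f}/kg")
```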
Discovering objects in a blood recipient information system.
Qiu, D; Junghans, G; Marquardt, K; Kroll, H; Mueller-Eckhardt, C; Dudeck, J
1995-01-01
Application of object-oriented (OO) methodologies has been generally considered a solution to the problem of improving the software development process and managing the so-called software crisis. Among them, object-oriented analysis (OOA) is the most essential and is a vital prerequisite for the successful use of other OO methodologies. Though a good deal of OOA methods have already been published, the most important aspect common to all of them, discovering the object classes truly relevant to the given problem domain, has remained a subject of intensive research. In this paper, using the successful development of a blood recipient information system as an example, we present our approach, which is based on the conceptual framework of responsibility-driven OOA. In the discussion, we also suggest that it may be inadequate to simply attribute the software crisis to the waterfall model of the software development life-cycle. We are convinced that the real causes of the failure of some software and information systems should be sought in the methodologies used in some crucial phases of the software development process. Furthermore, a software system can also fail if object classes essential to the problem domain are not discovered, implemented and visualized, so that the real-world situation cannot be faithfully traced by it.
Numerical characteristics of quantum computer simulation
NASA Astrophysics Data System (ADS)
Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.
2016-12-01
The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the usage of modern high-performance parallel computations is relevant. As is well known, arbitrary quantum computation in the circuit model can be performed using only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum nature lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good base for the research and testing of development methods for data-intensive parallel software, and that the considered analysis methodology can be successfully used for the improvement of algorithms in quantum information science.
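To make the quantum-parallelism point concrete, here is a minimal state-vector sketch of applying a single-qubit gate: reshaping the 2^n-amplitude vector exposes the target qubit as a tensor axis, so one contraction updates all amplitude pairs at once. This is a generic textbook pattern, not the authors' AlgoWiki implementation.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector.

    Reshaping exposes the target qubit as its own tensor axis, so the
    gate acts on every amplitude pair in parallel -- the source of the
    high parallelism (and the memory-locality issues) noted above.
    """
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))  # contract target axis
    psi = np.moveaxis(psi, 0, target)                    # restore axis order
    return psi.reshape(-1)

n = 10
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                  # |00...0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = apply_single_qubit_gate(state, H, target=0, n_qubits=n)
print("Nonzero amplitudes:", np.count_nonzero(state))   # 2: superposition on qubit 0
```

For distributed simulation, the choice of target qubit determines whether the paired amplitudes sit in local memory or on another node, which is exactly the entanglement-driven locality problem the abstract describes.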
Ethics and Epistemology in Big Data Research.
Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A
2017-12-01
Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.
Estimating magnitude and frequency of floods using the PeakFQ 7.0 program
Veilleux, Andrea G.; Cohn, Timothy A.; Flynn, Kathleen M.; Mason, Jr., Robert R.; Hummel, Paul R.
2014-01-01
Flood-frequency analysis provides information about the magnitude and frequency of flood discharges based on records of annual maximum instantaneous peak discharges collected at streamgages. The information is essential for defining flood-hazard areas, for managing floodplains, and for designing bridges, culverts, dams, levees, and other flood-control structures. Bulletin 17B (B17B) of the Interagency Advisory Committee on Water Data (IACWD, 1982) codifies the standard methodology for conducting flood-frequency studies in the United States. B17B specifies that annual peak-flow data are to be fit to a log-Pearson Type III distribution. Specific methods are also prescribed for improving skew estimates using regional skew information, tests for high and low outliers, adjustments for low outliers and zero flows, and procedures for incorporating historical flood information. The authors of B17B identified various needs for methodological improvement and recommended additional study. In response to these needs, the Advisory Committee on Water Information (ACWI, successor to IACWD; http://acwi.gov/), through its Subcommittee on Hydrology (SOH) Hydrologic Frequency Analysis Work Group (HFAWG), has recommended modest changes to B17B. These changes include adoption of a generalized method-of-moments estimator denoted the Expected Moments Algorithm (EMA) (Cohn and others, 1997) and a generalized version of the Grubbs-Beck test for low outliers (Cohn and others, 2013). The SOH requested that the USGS implement these changes in a user-friendly, publicly accessible program.
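A bare-bones illustration of the B17B core step: fit a log-Pearson Type III distribution by the method of moments and evaluate a quantile via the Wilson-Hilferty frequency-factor approximation. It omits everything PeakFQ adds (regional skew weighting, outlier tests, EMA), and the synthetic record is an assumption.

```python
import numpy as np
from scipy.stats import norm

def lp3_quantile(peaks, aep):
    """Log-Pearson Type III quantile by the method of moments (B17B-style).

    peaks: annual maximum discharges; aep: annual exceedance probability
    (e.g. 0.01 for the 100-year flood). A bare-bones sketch: no regional
    skew weighting, outlier tests, or EMA, which the full procedure adds.
    """
    logs = np.log10(np.asarray(peaks, dtype=float))
    mean, std = logs.mean(), logs.std(ddof=1)
    n = len(logs)
    skew = (n / ((n - 1) * (n - 2))) * np.sum(((logs - mean) / std) ** 3)
    z = norm.ppf(1.0 - aep)
    # Wilson-Hilferty frequency factor for the Pearson III distribution.
    k = (2.0 / skew) * (((1.0 + skew * z / 6.0 - skew ** 2 / 36.0) ** 3) - 1.0)
    return 10 ** (mean + k * std)

rng = np.random.default_rng(3)
peaks = 10 ** rng.normal(3.0, 0.25, size=60)   # synthetic 60-year record
print("100-year flood estimate: %.0f cfs" % lp3_quantile(peaks, aep=0.01))
```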
A Systematic Method for Reviewing and Analyzing Health Information on Consumer-Oriented Websites.
Rew, Lynn; Saenz, Ashley; Walker, Lorraine O
2018-05-29
A discussion of a proposed method for analyzing the quality of consumer-oriented websites that provide health-related information. The quality of health information available to consumers online varies widely. In an effort to improve the quality of online information, experts have undertaken systematic reviews on selected health topics; however, no standardized comprehensive methodology currently exists for such reviews. An eight-step method is recommended, comprising the following steps: (1) select topic; (2) determine the purpose of the analysis; (3) select search terms and engines; (4) develop and apply website inclusion and exclusion criteria; (5) develop processes and tools to manage search results; (6) specify measures of quality; (7) compute readability; (8) evaluate websites. Each of these steps is illustrated in relation to the health topic of gynecomastia, a physical and mental health challenge for many adolescent males and young men. Although most extant analyses of consumer-oriented websites have focused on disease conditions and their treatment, website-analysis methodology would encourage analyses that fall into the nursing care domain. The method outlined in this paper is intended to provide nurses and others who work with specific patient populations with the tools needed for website-analytic studies. Such studies provide a foundation for making recommendations about quality websites, as well as identifying gaps in online information for health consumers. This article is protected by copyright. All rights reserved.
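Step 7 (compute readability) is the most mechanical of the eight and is easy to sketch: the following computes a Flesch-Kincaid grade level with a naive vowel-group syllable counter. The formula is standard; the syllable heuristic and the sample text are simplifications for illustration.

```python
import re

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level with a naive vowel-group syllable count,
    one way to operationalize step 7 (compute readability)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        # Count runs of vowels as syllables; a rough but common heuristic.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    n = max(1, len(words))
    total_syll = sum(syllables(w) for w in words)
    return 0.39 * (n / sentences) + 11.8 * (total_syll / n) - 15.59

sample = ("Gynecomastia is a benign enlargement of male breast tissue. "
          "It is common during adolescence and usually resolves on its own.")
print("FK grade level: %.1f" % flesch_kincaid_grade(sample))
```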
Sizing the science data processing requirements for EOS
NASA Technical Reports Server (NTRS)
Wharton, Stephen W.; Chang, Hyo D.; Krupp, Brian; Lu, Yun-Chi
1991-01-01
The methodology used in the compilation and synthesis of baseline science requirements associated with the 30+ EOS (Earth Observing System) instruments and over 2,400 EOS data products (both output and required input) proposed by EOS investigators is discussed. A brief background on EOS and the EOS Data and Information System (EOSDIS) is presented, and the approach is outlined in terms of a multilayer model. The methodology used to compile, synthesize, and tabulate requirements within the model is described. The principal benefit of this approach is the reduction of effort needed to update the analysis and maintain the accuracy of the science data processing requirements in response to changes in EOS platforms, instruments, data products, processing center allocations, or other model input parameters. The spreadsheets used in the model provide a compact representation, thereby facilitating review and presentation of the information content.
Use of information sources by family physicians: a literature survey.
Verhoeven, A A; Boerma, E J; Meyboom-de Jong, B
1995-01-01
Analysis of the use of information sources by family physicians is important for both practical and theoretical reasons. First, analysis of the ways in which family physicians handle information may point to opportunities for improvement. Second, such efforts may lead to improvements in the methodology of literature research in general. This article reports on a survey of the literature on information use by family physicians. Eleven relevant research publications could be found. The data showed that family physicians used colleagues most often as information sources, followed by journals and books. This outcome corresponded with results in other professions. Several factors influenced the use of information sources by family physicians, including the physical, functional, and intellectual accessibility of the source; the physician's age; participation by the physician in research or education; the social context of the physician; practice characteristics; and the stage of the information-gathering process. The publications studied suggested ways to improve information gathering in the areas of computerization, education, library organization, and journal articles. PMID:7703946
Ghasemi, Jahan B; Safavi-Sohi, Reihaneh; Barbosa, Euzébio G
2012-02-01
A quasi 4D-QSAR study has been carried out on a series of potent Gram-negative LpxC inhibitors. This approach makes use of the molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package. This new methodology is based on the generation of a conformational ensemble profile (CEP) for each compound instead of only one conformation, followed by the calculation of intermolecular interaction energies at each grid point, considering probes and all aligned conformations resulting from the MD simulations. These interaction energies are the independent variables employed in the QSAR analysis. A comparison of the proposed methodology to the comparative molecular field analysis (CoMFA) formalism was performed. This methodology jointly explores the main features of CoMFA and 4D-QSAR models. Step-wise multiple linear regression was used for the selection of the most informative variables. After variable selection, multiple linear regression (MLR) and partial least squares (PLS) methods were used for building the regression models. Leave-N-out cross-validation (LNO) and Y-randomization were performed in order to confirm the robustness of the model, in addition to analysis of the independent test set. The best models provided the following statistics: [Formula in text] (PLS) and [Formula in text] (MLR). A docking study was performed to investigate the major interactions in the protein-ligand complex with the CDOCKER algorithm. Visualization of the descriptors of the best model helps us to interpret the model from the chemical point of view, supporting the applicability of this new approach in rational drug design.
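The regression-and-validation stage described above can be sketched generically: fit a PLS model to grid-energy descriptors, cross-validate, and run a Y-randomization check. The synthetic descriptor matrix stands in for the CEP grid energies, and scikit-learn is used here as an assumed convenience, not the authors' toolchain.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Illustrative stand-in: grid-point interaction energies (descriptors)
# vs. activity. Real descriptors would come from the CEP/grid step.
rng = np.random.default_rng(4)
X = rng.normal(size=(40, 300))                 # 40 compounds, 300 grid energies
true_w = np.zeros(300)
true_w[:10] = 1.0                              # only a few grid points matter
y = X @ true_w + 0.5 * rng.normal(size=40)

pls = PLSRegression(n_components=3)
q2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()  # LNO-style CV
print("cross-validated r2 (q2-like): %.2f" % q2)

# Y-randomization: models fit to shuffled activities should do poorly,
# confirming the real model is not a chance correlation.
q2_rand = cross_val_score(pls, X, rng.permutation(y), cv=5, scoring="r2").mean()
print("y-randomized r2: %.2f" % q2_rand)
```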
Evaluating multiple determinants of the structure of plant-animal mutualistic networks.
Vázquez, Diego P; Chacoff, Natacha P; Cagnolo, Luciano
2009-08-01
The structure of mutualistic networks is likely to result from the simultaneous influence of neutrality and the constraints imposed by complementarity in species phenotypes, phenologies, spatial distributions, phylogenetic relationships, and sampling artifacts. We develop a conceptual and methodological framework to evaluate the relative contributions of these potential determinants. Applying this approach to the analysis of a plant-pollinator network, we show that information on relative abundance and phenology suffices to predict several aggregate network properties (connectance, nestedness, interaction evenness, and interaction asymmetry). However, such information falls short of predicting the detailed network structure (the frequency of pairwise interactions), leaving a large amount of variation unexplained. Taken together, our results suggest that both relative species abundance and complementarity in spatiotemporal distribution contribute substantially to generating observed network patterns, but that this information is by no means sufficient to predict the occurrence and frequency of pairwise interactions. Future studies could use our methodological framework to evaluate the generality of our findings in a representative sample of study systems with contrasting ecological conditions.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-16
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research to Support the National...: Methodological Research to Support the National Crime Victimization Survey: Self-Report Data on Rape and Sexual...
Effective Information Systems: What's the Secret?
ERIC Educational Resources Information Center
Kirkham, Sandi
1994-01-01
Argues that false assumptions about user needs implicit in methodologies for building information systems have resulted in inadequate and inflexible systems. Checkland's Soft Systems Methodology is examined as a useful alternative. Its fundamental features are described, and examples of models demonstrate how the methodology can facilitate…
Gu, Jing; Wang, Qi; Wang, Xiaogang; Li, Hailong; Gu, Mei; Ming, Haixia; Dong, Xiaoli; Yang, Kehu; Wu, Hongyan
2014-01-01
Background. This review provides the first methodological assessment of the protocols of acupuncture RCTs registered in the WHO International Clinical Trials Registry Platform (ICTRP). Methods. All records of acupuncture RCTs registered in the ICTRP were collected. The methodological design assessment examined whether the randomization methods, allocation concealment, and blinding were adequate, based on the information in the registration records (protocols of acupuncture RCTs). Results. A total of 453 records, found in 11 registries, were examined. Methodological details were insufficient in the registration records; 76.4%, 89.0%, and 21.4% of records did not provide information on randomization methods, allocation concealment, and blinding, respectively. The proportions with adequate randomization methods, allocation concealment, and blinding were only 107 (23.6%), 48 (10.6%), and 210 (46.4%), respectively. The methodological design improved year by year, especially after 2007. Additionally, the methodology of RCTs with ethics approval was clearly superior to that of those without ethics approval, and differed among registries. Conclusions. The overall methodological design based on registration records of acupuncture RCTs is not very good but has improved year by year. The insufficient information on randomization methods, allocation concealment, and blinding may be because such descriptions are not taken seriously in acupuncture RCT registration. PMID:24688591
Citizenship, Beneficence, and Informed Consent: The Ethics of Working in Mixed-Status Families
ERIC Educational Resources Information Center
Mangual Figueroa, Ariana
2016-01-01
This article draws from a 23-month ethnographic study conducted in mixed-status Mexican homes to detail the particular methodological concerns that arise when conducting research within these legally complex and vulnerable families. Specifically, the analysis illustrates when and why undocumented parents in one focal family asked the ethnographer…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-27
... has obtained an OMB generic clearance to conduct survey and instrument design and administration... conduct the detailed preparation needed for a study of this size and complexity, the NCS was designed to... methodological studies conducted during the Vanguard phase will inform the implementation and analysis plan for...
ERIC Educational Resources Information Center
Errecart, Michael T.
The Response to Educational Needs Project (RENP) focuses on training teachers as a vehicle for promoting student achievement in a compensatory education program. This document supplements a report on RENP replication and provides information on cost analysis, methodology, and sample and data collection. In Appendix A the following questions are…
Towards the Measurement of EFL Listening Beliefs with Item Response Theory Methods
ERIC Educational Resources Information Center
Nix, John-Michael L.; Tseng, Wen-Ta
2014-01-01
The present research aims to identify the underlying English listening belief structure of English-as-a-foreign-language (EFL) learners, thereby informing methodologies for subsequent analysis of beliefs with respect to listening achievement. Development of a measurement model of English listening learning beliefs entailed the creation of an…
Using classification tree analysis to predict oak wilt distribution in Minnesota and Texas
Marla c. Downing; Vernon L. Thomas; Jennifer Juzwik; David N. Appel; Robin M. Reich; Kim Camilli
2008-01-01
We developed a methodology and compared results for predicting the potential distribution of Ceratocystis fagacearum (causal agent of oak wilt), in both Anoka County, MN, and Fort Hood, TX. The Potential Distribution of Oak Wilt (PDOW) utilizes a binary classification tree statistical technique that incorporates: geographical information systems (GIS...
Inside the Black Box: Revealing the Process in Applying a Grounded Theory Analysis
ERIC Educational Resources Information Center
Rich, Peter
2012-01-01
Qualitative research methods have long set an example of rich description, in which data and researchers' hermeneutics work together to inform readers of findings in specific contexts. Among published works, insight into the analytical process is most often represented in the form of methodological propositions or research results. This paper…
ERIC Educational Resources Information Center
Adolwa, Ivan S.; Okoth, Peter F.; Mulwa, Richard M.; Esilaba, Anthony O.; Mairura, Franklin S.; Nambiro, Elizabeth
2012-01-01
Purpose: The following study was carried out to evaluate the socio-economic factors influencing access to Integrated Soil Fertility Management (ISFM) information and knowledge among farmers in western Kenya, and subsequent ISFM uptake with a view to assessing communication gaps. Design/Methodology/Approach: Structured questionnaires were…
Video Analysis in Cross-Cultural Environments and Methodological Issues
ERIC Educational Resources Information Center
Montandon, Christiane
2015-01-01
This paper addresses the use of videography combined with group interviews, as a way to better understand the informal learnings of 11-12 year old children in cross-cultural encounters during French-German school exchanges. The complete, consistent video data required the researchers to choose the most significant sequences to highlight the…
USDA-ARS?s Scientific Manuscript database
Local adaptation research in plants: limitations to synthetic understanding
Local adaptation is used as a criterion to select plant materials that will display high fitness in new environments. A large body of research has explored local adaptation in plants, however, to what extent findings can inf...
Assessment of Alternative Student Aid Delivery Systems: Assessment of the Current Delivery System.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The effects of the current system for delivering federal financial assistance to students under the Pell Grant, Guaranteed Student Loan (GSL), and campus-based programs are analyzed. Information is included on the use of the assessment model, which combines program evaluation, systems research, and policy analysis methodologies.…
Eye Movement Correlates of Acquired Central Dyslexia
ERIC Educational Resources Information Center
Schattka, Kerstin I.; Radach, Ralph; Huber, Walter
2010-01-01
Based on recent progress in theory and measurement techniques, the analysis of eye movements has become one of the major methodological tools in experimental reading research. Our work uses this approach to advance the understanding of impaired information processing in acquired central dyslexia of stroke patients with aphasia. Up to now there has…
Measuring the Impact of Technology on Nurse Workflow: A Mixed Methods Approach
ERIC Educational Resources Information Center
Cady, Rhonda Guse
2012-01-01
Background. Investment in health information technology (HIT) is rapidly accelerating. The absence of contextual or situational analysis of the environment in which HIT is incorporated makes it difficult to measure success or failure. The methodology introduced in this paper combines observational research with time-motion study to measure the…
ERIC Educational Resources Information Center
Lopes Cardozo, Mieke T. A.; Shah, Ritesh
2016-01-01
A critical and more nuanced understanding of the multifaceted relationship between projects of peacebuilding and educational provision is starting to develop. Drawing on an epistemological and ontological anchor of critical realism, and a methodology informed by the application of cultural political economy analysis and the strategic relational…
Ohio Geographers: Recent Research Themes. Volume Number 1: 1973.
ERIC Educational Resources Information Center
Pyle, Gerald F., Ed.
Eight professional geography research papers presented at the Geography Section of the 1973 annual meeting of the Ohio Academy of Science are provided. The specific topics of concern range from methodological essays to the quantitative analysis of historical geographic information. The titles of the papers are (1) An Attempt at Reform in Regional…
Biologically-inspired data decorrelation for hyper-spectral imaging
NASA Astrophysics Data System (ADS)
Picon, Artzai; Ghita, Ovidiu; Rodriguez-Vaamonde, Sergio; Iriondo, Pedro Ma; Whelan, Paul F.
2011-12-01
Hyper-spectral data allows the construction of more robust statistical models to sample the material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of the hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods require complex and subjective training procedures, and in addition the compressed spectral information is not directly related to the physical (spectral) characteristics associated with the analyzed materials. The major objective of this article is to introduce and evaluate a new data decorrelation methodology using an approach that closely emulates human vision. The proposed data decorrelation scheme has been employed to optimally minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
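For contrast with the proposed biologically-inspired scheme, the following sketches the PCA baseline it is compared against: flatten the cube, project pixels onto the leading principal components, and keep a compact set of decorrelated bands. The cube dimensions and the synthetic mixing model are illustrative assumptions.

```python
import numpy as np

def pca_decorrelate(cube, n_components=8):
    """Project a hyper-spectral cube (rows x cols x bands) onto its
    leading principal components to obtain compact, decorrelated bands."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # PCA via SVD
    scores = Xc @ Vt[:n_components].T
    return scores.reshape(h, w, n_components)

# Illustrative cube: 64x64 pixels, 120 highly correlated spectral bands
# generated from a few latent "material" signatures plus noise.
rng = np.random.default_rng(5)
base = rng.normal(size=(64, 64, 4))
mixing = rng.normal(size=(4, 120))
cube = base @ mixing + 0.05 * rng.normal(size=(64, 64, 120))
compact = pca_decorrelate(cube, n_components=8)
print("Reduced from", cube.shape[2], "bands to", compact.shape[2])
```

As the abstract notes, the compressed components produced this way are statistical constructs with no direct physical (spectral) meaning, which is the shortcoming the paper's method targets.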
A data protection scheme for a remote vital signs monitoring healthcare service.
Gritzalis, D; Lambrinoudakis, C
2000-01-01
Personal and medical data processed by Healthcare Information Systems must be protected against unauthorized access, modification and withholding. Security measures should be selected to provide the required level of protection in a cost-efficient manner. This is only feasible if the specific characteristics of the information system are examined on the basis of a risk analysis methodology. This paper presents the results of a risk analysis, based on the CRAMM methodology, for a healthcare organization offering a patient home-monitoring service through the transmission of vital signs, focusing on the identified security needs and the proposed countermeasures. The architectural and functional models of this service were utilized for identifying and valuating the system assets and the associated threats and vulnerabilities, as well as for assessing the impact on the patients and on the service provider should the security of any of these assets be affected. A set of adequate organizational, administrative and technical countermeasures is described for the remote vital signs monitoring service, thus providing the healthcare organization with a data protection framework that can be utilized for the development of its own security plan.
High school students presenting science: An interactional sociolinguistic analysis
NASA Astrophysics Data System (ADS)
Bleicher, Robert
Presenting science is an authentic activity of practicing scientists. Thus, effective communication of science is an important skill to nurture in high school students who are learning science. This study examines strategies employed by high school students as they make science presentations; it assesses students' conceptual understandings of particular science topics through their presentations and investigates gender differences. Data are derived from science presentations given by eight high school students, three females and five males, who attended a summer science program. Data sources included videotaped presentations, ethnographic fieldnotes, interviews with presenters and members of the audience, and presenter notes and overheads. Presentations were transcribed and submitted to discourse analysis from an interactional sociolinguistic perspective. This article focuses on the methodology employed and how it helps inform the above research questions. The author argues that use of this methodology leads to findings that inform important social-communicative issues in the learning of science. Practical advice for teaching students to present science, implications for use of presentations to assess conceptual learning, and indications of some possible gender differences are discussed.
ERIC Educational Resources Information Center
Baker-Doyle, Kira J.; Yoon, Susan A.
2011-01-01
This paper presents the first in a series of studies on the informal advice networks of a community of teachers in an in-service professional development program. The aim of the research was to use Social Network Analysis as a methodological tool to reveal the social networks developed by the teachers, and to examine whether these networks…
A Graphics Editor for Structured Analysis with a Data Dictionary.
1987-12-01
This thesis describes a graphics editor for Structured Analysis with a data dictionary, addressing human/computer interface considerations such as screen layout, the menu system, and voice feedback, with graphics information stored on a central computer system. The project is a direct follow-on to the 1986 thesis by James W. Urscheler, who created an initial version of the tool. Background on SADT: SADT is the name of SofTech's methodology for doing requirements analysis and system design. It was first published...
Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy
NASA Astrophysics Data System (ADS)
Limandri, S.; Robledo, J.; Tirao, G.
2018-06-01
High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of certain spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as of the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
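The statistical-parameter route mentioned above can be sketched directly: treat a background-subtracted emission line as a probability distribution and compute its centroid, width, and skewness, which shift with oxidation state. The Gaussian line shapes and energies below are rough stand-ins for real Mn Kβ spectra, not data from the study.

```python
import numpy as np

def spectrum_moments(energy, intensity):
    """Treat a background-subtracted spectrum as a probability
    distribution and extract summary statistics sensitive to
    chemical-state shifts (centroid, width, skewness)."""
    p = np.clip(intensity, 0, None)
    p = p / p.sum()
    mean = np.sum(p * energy)
    var = np.sum(p * (energy - mean) ** 2)
    skew = np.sum(p * (energy - mean) ** 3) / var ** 1.5
    return mean, np.sqrt(var), skew

# Illustrative Kb-like emission lines with a small chemical shift.
e = np.linspace(6470, 6510, 800)
line = lambda c: np.exp(-0.5 * ((e - c) / 2.5) ** 2)
for label, center in [("Mn2+-like", 6490.2), ("Mn4+-like", 6489.4)]:
    m, s, g = spectrum_moments(e, line(center))
    print(f"{label}: centroid={m:.2f} eV width={s:.2f} eV skew={g:+.3f}")
```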
Bağcıoğlu, Murat; Zimmermann, Boris; Kohler, Achim
2015-01-01
Background Analysis of pollen grains reveals valuable information on biology, ecology, forensics, climate change, insect migration, food sources and aeroallergens. Vibrational (infrared and Raman) spectroscopies offer chemical characterization of pollen via identifiable spectral features without any sample pretreatment. We have compared the level of chemical information that can be obtained by different multiscale vibrational spectroscopic techniques. Methodology Pollen from 15 different species of Pinales (conifers) were measured by seven infrared and Raman methodologies. In order to obtain infrared spectra, both reflectance and transmission measurements were performed on ground and intact pollen grains (bulk measurements), in addition, infrared spectra were obtained by microspectroscopy of multigrain and single pollen grain measurements. For Raman microspectroscopy measurements, spectra were obtained from the same pollen grains by focusing two different substructures of pollen grain. The spectral data from the seven methodologies were integrated into one data model by the Consensus Principal Component Analysis, in order to obtain the relations between the molecular signatures traced by different techniques. Results The vibrational spectroscopy enabled biochemical characterization of pollen and detection of phylogenetic variation. The spectral differences were clearly connected to specific chemical constituents, such as lipids, carbohydrates, carotenoids and sporopollenins. The extensive differences between pollen of Cedrus and the rest of Pinaceae family were unambiguously connected with molecular composition of sporopollenins in pollen grain wall, while pollen of Picea has apparently higher concentration of carotenoids than the rest of the family. It is shown that vibrational methodologies have great potential for systematic collection of data on ecosystems and that the obtained phylogenetic variation can be well explained by the biochemical composition of pollen. Out of the seven tested methodologies, the best taxonomical differentiation of pollen was obtained by infrared measurements on bulk samples, as well as by Raman microspectroscopy measurements of the corpus region of the pollen grain. Raman microspectroscopy measurements indicate that measurement area, as well as the depth of focus, can have crucial influence on the obtained data. PMID:26376486
Spatial pattern recognition of seismic events in South West Colombia
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber
2013-09-01
Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data of the South-West Colombian seismic database. In this research, a clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of the centroids, we propose a methodology for initializing centroids that generates partitions that are stable with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South-West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretations of seismic activity in South-West Colombia.
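A minimal sketch of the clustering step (not the authors' code; the epicentre data and the seeding rule are placeholders) showing standard fuzzy c-means and why the initial centroids are passed in explicitly, since the final partition depends on them:

```python
import numpy as np

def fuzzy_cmeans(X, centroids, m=2.0, n_iter=100):
    """Basic fuzzy c-means; `centroids` fixes the initialization explicitly,
    since the final partition is sensitive to the starting positions."""
    V = centroids.copy()
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)          # membership of each event in each cluster
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]   # update centroids
    return U, V

# Hypothetical epicentres (lon, lat); any deterministic seeding scheme can be
# plugged in here to obtain partitions that are reproducible across runs.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
init = X[rng.choice(len(X), 4, replace=False)]
U, V = fuzzy_cmeans(X, init)
```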
A content analysis of analyst research: health care through the eyes of analysts.
Nielsen, Christian
2008-01-01
This article contributes to the understanding of how health care companies may communicate their business models by studying financial analysts' reports. The study examines the differences between the information conveyed in recurrent and fundamental analyst reports, as well as whether the characteristics of the analysts and their environment affect their business model analyses. A medium-sized health care company in the medical-technology sector, internationally renowned for its state-of-the-art business reporting, was chosen as the basis for the study. An analysis of 111 fundamental and recurrent analyst reports on this company, by each investment bank actively following it, was conducted using a content analysis methodology. The study reveals that the recurrent analyses are concerned with evaluating the information disclosed by the health care company itself and not so much with digging up new information. It also indicates that while recurrent, maintenance-type work might be focused on evaluating specific details, fundamental research is more concerned with extending the understanding of the general picture, i.e., the sustainability and performance of the overall business model. The amount of financial information disclosed in either type of report is not correlated with the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put considerably less weight on social and sustainability, intellectual capital and corporate governance information, and they disclose much less comparable non-financial information. The suggestion made is that looking at the types of information financial analysts consider important and convey to their "customers," the investors and fund managers, constitutes a valuable indication to health care companies regarding the needs of the financial-market users of their reports and other communications. There are some limitations to the possibility of applying statistical tests to the data set, as well as methodological limitations in relation to the exclusion of tables and graphs.
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system in order to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Source apportionment and sensitivity analysis: two methodologies with two different purposes
NASA Astrophysics Data System (ADS)
Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe
2017-11-01
This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentrations and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used interchangeably for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
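A toy numerical illustration of the central point, with made-up linear and product-type concentration-emission relationships: brute-force "impacts" (zeroing out one source at a time) sum to the total concentration only in the linear case.

```python
def conc_linear(e1, e2):        # C depends linearly on both emission sources
    return 2.0 * e1 + 3.0 * e2

def conc_nonlinear(e1, e2):     # e.g. product-type chemistry coupling the sources
    return 2.0 * e1 * e2

for conc in (conc_linear, conc_nonlinear):
    total = conc(1.0, 1.0)
    impact1 = total - conc(0.0, 1.0)    # brute-force impact: zero out source 1
    impact2 = total - conc(1.0, 0.0)    # brute-force impact: zero out source 2
    print(conc.__name__, total, impact1 + impact2)
# Linear case: the impacts sum exactly to the total, so they double as contributions.
# Nonlinear case: impact1 + impact2 = 2 * total, so reading impacts as
# contributions double-counts the sources.
```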
EPRI/NRC-RES fire human reliability analysis guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan
2010-03-01
During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a fire risk assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program, and by several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the NRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state of the art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities', which addressed fire risk for at-power operations. NUREG/CR-6850 developed high-level guidance on the process for identification and inclusion of human failure events (HFEs) in the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire-generated conditions, building upon existing human reliability analysis (HRA) methods. This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. This document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification. Scoping is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.
Gargon, Elizabeth; Williamson, Paula R; Young, Bridget
2017-06-01
The objective of the study was to explore core outcome set (COS) developers' experiences of their work, to inform methodological guidance on COS development and identify areas for future methodological research. Semistructured, audio-recorded interviews were conducted with a purposive sample of 32 COS developers. Analysis of the transcribed interviews was informed by the constant comparative method and framework analysis. Developers found COS development to be challenging, particularly in relation to patient participation and accessing funding. Their accounts raised fundamental questions about the status of COS development and whether it is consultation or research. Developers emphasized how the absence of guidance had affected their work and identified areas where guidance or evidence about COS development would be useful, including patient participation, ethics, international development, and implementation. They particularly wanted guidance on systematic reviews, Delphi, and consensus meetings. The findings raise important questions about the funding, status, and process of COS development and indicate ways that it could be strengthened. Guidance could help developers to strengthen their work, but over-specification could threaten quality in COS development. Guidance should therefore highlight common issues to consider and encourage tailoring of COS development to the context and circumstances of particular COS. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Fortibuoni, Tomaso; Libralato, Simone; Raicevich, Saša; Giovanardi, Otello; Solidoro, Cosimo
2010-01-01
Understanding changes in fish communities over the past centuries has important implications for conservation policy and marine resource management. However, reconstructing these changes is difficult because information on marine communities before the second half of the 20th century is, in most cases, anecdotal and merely qualitative. Historical qualitative records and modern quantitative data are therefore not directly comparable, and their integration for long-term analyses is not straightforward. We developed a methodology that allows the coding of qualitative information provided by early naturalists into semi-quantitative information through an intercalibration with landing proportions. This approach allowed us to reconstruct and quantitatively analyze a 200-year-long time series of fish community structure indicators in the Northern Adriatic Sea (Mediterranean Sea). Our analysis provides evidence of long-term changes in fish community structure, including the decline of Chondrichthyes and of large-sized and late-maturing species. This work highlights the importance of broadening the time frame through which we look at marine ecosystem changes and provides a methodology for exploiting historical qualitative sources in a quantitative framework. For this purpose, naturalists' eyewitness accounts proved useful for extending the analysis of fish communities back in time, well before the onset of field-based monitoring programs. PMID:21103349
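As an illustration only (the abundance terms, ranks, proportions, and the log-linear anchoring are invented here, not taken from the paper), an intercalibration of this kind could look like:

```python
import numpy as np

# Hypothetical coding: naturalists' abundance terms become ordinal ranks, which
# are anchored to landing proportions from periods where both record types overlap.
qual_rank = {"absent": 0, "rare": 1, "common": 2, "abundant": 3, "very abundant": 4}
calib_ranks = np.array([0, 1, 2, 3, 4])
calib_props = np.array([0.00, 0.01, 0.05, 0.15, 0.40])   # illustrative landing proportions

coef = np.polyfit(calib_ranks, np.log1p(calib_props), deg=1)

def semi_quantitative(term):
    """Map a qualitative abundance term to a landing-proportion-scale value."""
    return float(np.expm1(np.polyval(coef, qual_rank[term])))
```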
ERIC Educational Resources Information Center
Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter
2010-01-01
Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge of finding pertinent information among volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…
Craig, Hugh; Berretta, Regina; Moscato, Pablo
2016-01-01
In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into a community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in information theory. Moreover, we use graph-theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains the frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high-quality clusters that reflect the commonalities in the literary style of the plays. PMID:27571416
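The core pipeline can be sketched in a few lines; here greedy modularity maximization stands in for the paper's memetic iMA-Net optimizer, and the corpus, threshold and edge weighting are placeholders:

```python
import numpy as np
import networkx as nx
from scipy.spatial.distance import jensenshannon

# Hypothetical corpus: rows are plays, columns are word frequencies.
rng = np.random.default_rng(0)
freqs = rng.random((12, 500))
probs = freqs / freqs.sum(axis=1, keepdims=True)   # each play as a word distribution

# Proximity graph: connect plays whose Jensen-Shannon distance is small.
G = nx.Graph()
G.add_nodes_from(range(len(probs)))
for i in range(len(probs)):
    for j in range(i + 1, len(probs)):
        d = jensenshannon(probs[i], probs[j])      # sqrt of the JS divergence
        if d < 0.55:                               # hypothetical proximity threshold
            G.add_edge(i, j, weight=1.0 - d)

# Modularity-based community detection over the proximity graph.
clusters = nx.algorithms.community.greedy_modularity_communities(G, weight="weight")
```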
Sangnawakij, Patarawan; Böhning, Dankmar; Adams, Stephen; Stanton, Michael; Holling, Heinz
2017-04-30
Statistical inference for analyzing the results from several independent studies on the same quantity of interest has been investigated frequently in recent decades. Typically, any meta-analytic inference requires that the quantity of interest is available from each study together with an estimate of its variability. The current work is motivated by a meta-analysis on comparing two treatments (thoracoscopic and open) of congenital lung malformations in young children. Quantities of interest include continuous end-points such as length of operation or number of chest tube days. As studies only report mean values (and no standard errors or confidence intervals), the question arises how meta-analytic inference can be developed. We suggest two methods to estimate study-specific variances in such a meta-analysis, where only sample means and sample sizes are available in the treatment arms. A general likelihood ratio test is derived for testing equality of variances in two groups. By means of simulation studies, the bias and estimated standard error of the overall mean difference from both methodologies are evaluated and compared with two existing approaches: complete study analysis only and partial variance information. The performance of the test is evaluated in terms of type I error. Additionally, we illustrate these methods in the meta-analysis on comparing thoracoscopic and open surgery for congenital lung malformations and in a meta-analysis on the change in renal function after kidney donation. Copyright © 2017 John Wiley & Sons, Ltd.
van Dam, Jesse C J; Schaap, Peter J; Martins dos Santos, Vitor A P; Suárez-Diez, María
2014-09-26
Different methods have been developed to infer regulatory networks from heterogeneous omics datasets and to construct co-expression networks. Each algorithm produces different networks, and efforts have been devoted to automatically integrating them into consensus sets. However, each separate set has an intrinsic value that is diluted and partly lost when building a consensus network. Here we present a methodology to generate co-expression networks and, instead of a consensus network, we propose an integration framework where the different networks are kept and analysed with additional tools to efficiently combine the information extracted from each network. We developed a workflow to efficiently analyse information generated by different inference and prediction methods. Our methodology relies on providing the user the means to simultaneously visualise and analyse the coexisting networks generated by different algorithms, heterogeneous datasets, and a suite of analysis tools. As a showcase, we have analysed the gene co-expression networks of Mycobacterium tuberculosis generated using over 600 expression experiments. Regarding DNA damage repair, we identified SigC as a key control element, 12 new targets for LexA, an updated LexA binding motif, and a potential mismatch repair system. We expanded the DevR regulon with 27 genes while identifying 9 targets wrongly assigned to this regulon. We discovered 10 new genes linked to zinc uptake and a new regulatory mechanism for ZuR. The use of co-expression networks to perform system-level analysis allows the development of custom-made methodologies. As showcases, we implemented a pipeline to integrate ChIP-seq data and another method to uncover multiple regulatory layers. Our workflow is based on representing the multiple types of information as network representations and presenting these networks in a synchronous framework that allows their simultaneous visualization while keeping the specific associations from the different networks. By simultaneously exploring these networks and metadata, we gained insights into regulatory mechanisms in M. tuberculosis that could not be obtained through the separate analysis of each data type.
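The integration idea, keeping the networks side by side rather than merging them, reduces to simple set operations on edges; a minimal sketch with invented edges (gene names borrowed from the abstract for flavour):

```python
import networkx as nx

# Hypothetical: the same genes analysed by two inference algorithms; the networks
# are kept side by side instead of being collapsed into a consensus.
g_a = nx.Graph([("sigC", "lexA"), ("lexA", "recA"), ("devR", "rv3134c")])
g_b = nx.Graph([("lexA", "recA"), ("zur", "rv0106")])

# frozenset normalises undirected edge orientation before comparing.
edges_a = set(map(frozenset, g_a.edges()))
edges_b = set(map(frozenset, g_b.edges()))

shared = edges_a & edges_b    # high-confidence core supported by both methods
only_a = edges_a - shared     # method-specific signal retained, not discarded
```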
The methodology of database design in organization management systems
NASA Astrophysics Data System (ADS)
Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.
2017-01-01
The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to design, the conceptual information model and the main principles of developing relational databases are provided, and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of formalizing the results of the analysis of users' information needs and the rationale for the use of classifiers.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-24
... Information Collection: The purpose of the proposed methodological study is to evaluate the feasibility... the NCS, the multiple methodological studies conducted during the Vanguard phase will inform the... methodological study is identification of recruitment strategies and components of recruitment strategies that...
77 FR 15092 - U.S. Energy Information Administration; Proposed Agency Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
... conducted under this clearance will generally be methodological studies of 500 cases or less. The samples... conducted under this clearance will generally be methodological studies of 500 cases or less, but will... the methodological design, sampling procedures (where possible) and questionnaires of the full scale...
Dusseldorp, Elise; van Genugten, Lenneke; van Buuren, Stef; Verheijden, Marieke W; van Empelen, Pepijn
2014-12-01
Many health-promoting interventions combine multiple behavior change techniques (BCTs) to maximize effectiveness. Although, in theory, BCTs can amplify each other, the available meta-analyses have not been able to identify specific combinations of techniques that provide synergistic effects. This study overcomes some of the shortcomings in the current methodology by applying classification and regression trees (CART) to meta-analytic data in a special way, referred to as Meta-CART. The aim was to identify particular combinations of BCTs that explain intervention success. A reanalysis of data from Michie, Abraham, Whittington, McAteer, and Gupta (2009) was performed. These data included effect sizes from 122 interventions targeted at physical activity and healthy eating, and the coding of the interventions into 26 BCTs. A CART analysis was performed using the BCTs as predictors and treatment success (i.e., effect size) as outcome. A subgroup meta-analysis using a mixed effects model was performed to compare the treatment effect in the subgroups found by CART. Meta-CART identified the following most effective combinations: Provide information about behavior-health link with Prompt intention formation (mean effect size ḡ = 0.46), and Provide information about behavior-health link with Provide information on consequences and Use of follow-up prompts (ḡ = 0.44). Least effective interventions were those using Provide feedback on performance without using Provide instruction (ḡ = 0.05). Specific combinations of BCTs increase the likelihood of achieving change in health behavior, whereas other combinations decrease this likelihood. Meta-CART successfully identified these combinations and thus provides a viable methodology in the context of meta-analysis.
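A minimal sketch of the Meta-CART idea using synthetic data (the published method additionally weights studies by the precision of their effect sizes, which this sketch omits):

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical data shaped like the reanalysis: 122 interventions, 26 binary
# BCT indicators, and an observed effect size g per intervention.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.integers(0, 2, (122, 26)),
                 columns=[f"BCT_{i + 1}" for i in range(26)])
g = 0.4 * X["BCT_1"] * X["BCT_2"] + rng.normal(0, 0.1, 122)  # synthetic synergy

# CART with BCTs as predictors and effect size as outcome; the resulting leaves
# define subgroups that would then be compared in a mixed-effects subgroup
# meta-analysis, as the abstract describes.
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10).fit(X, g)
print(export_text(tree, feature_names=list(X.columns)))
```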
NASA Astrophysics Data System (ADS)
Brennan-Tonetta, Margaret
This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reduction. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals for renewable energy utilization. (3) Development of policy recommendations, based on the results of the decision support tool, that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first-ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives, such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that, with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that would enable the analysis of broader energy policy issues such as those mentioned above is recommended for future research efforts.
2014-01-01
Background: Digital image analysis has the potential to address issues surrounding traditional histological techniques, including a lack of objectivity and high variability, through the application of quantitative analysis. A key initial step in image analysis is the identification of regions of interest. A widely applied methodology is that of segmentation. This paper proposes the application of image analysis techniques to segment skin tissue with varying degrees of histopathological damage. The segmentation of human tissue is challenging as a consequence of the complexity of the tissue structures and inconsistencies in tissue preparation, hence there is a need for a new robust method with the capability to handle the additional challenges materialising from histopathological damage. Methods: A new algorithm has been developed which combines enhanced colour information, created following a transformation to the L*a*b* colourspace, with general image intensity information. A colour normalisation step is included to enhance the algorithm's robustness to variations in the lighting and staining of the input images. The resulting optimised image is subjected to thresholding and the segmentation is fine-tuned using a combination of morphological processing and object classification rules. The segmentation algorithm was tested on 40 digital images of haematoxylin & eosin (H&E) stained skin biopsies. Accuracy, sensitivity and specificity of the algorithmic procedure were assessed through the comparison of the proposed methodology against manual methods. Results: Experimental results show the proposed fully automated methodology segments the epidermis with a mean specificity of 97.7%, a mean sensitivity of 89.4% and a mean accuracy of 96.5%. When a simple user interaction step is included, the specificity increases to 98.0%, the sensitivity to 91.0% and the accuracy to 96.8%. The algorithm segments effectively for different severities of tissue damage. Conclusions: Epidermal segmentation is a crucial first step in a range of applications including melanoma detection and the assessment of histopathological damage in skin. The proposed methodology is able to segment the epidermis with different levels of histological damage. The basic method framework could be applied to segmentation of other epithelial tissues. PMID:24521154
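A compressed sketch of the pipeline's flavour using scikit-image (the file name, colour-normalisation step, score weighting and morphological parameters are all placeholders, not the paper's values):

```python
from skimage import color, filters, morphology
from skimage.io import imread

img = imread("biopsy.png")[..., :3]            # hypothetical H&E image
lab = color.rgb2lab(img)                       # move to the L*a*b* colourspace

# Crude colour normalisation, then combine enhanced colour (a*) with intensity (L*).
a = (lab[..., 1] - lab[..., 1].mean()) / (lab[..., 1].std() + 1e-9)
score = 0.5 * a + 0.5 * (1.0 - lab[..., 0] / 100.0)

mask = score > filters.threshold_otsu(score)                 # thresholding
mask = morphology.remove_small_objects(mask, min_size=500)   # object-rule stand-in
mask = morphology.binary_closing(mask, morphology.disk(5))   # morphological fine-tuning
```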
2011-04-30
Are internal constructs useful for management, through lexical link analysis (LLA)? The LLA methodology can help! … categories of interest in various spreadsheets). This year, we started to develop LLA from a demonstration to an operational capability and facilitate a
DLA Systems Modernization Methodology: Logical Analysis and Design Procedures
1990-07-01
Information Requirement would have little meaning and thus would lose its value. … at the same time, the attribute is said to be multi-valued. For example, an E-R model may contain information on the languages an employee speaks… The Entity-Relationship model is examined in detail to ensure that each data group contains attributes whose values are absolutely determined by their respective
Information Systems: Opportunities Exist to Strengthen SEC’s Oversight of Capacity and Security
2001-07-01
…internal or external organizations to conduct the independent reviews. These internal audits are performed cyclically, based on an annual risk analysis. SEC…
A methodology for hard/soft information fusion in the condition monitoring of aircraft
NASA Astrophysics Data System (ADS)
Bernardo, Joseph T.
2013-05-01
Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with data from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles. This study performs a critical assessment of the concept, which will support the development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.
An integrated framework for the geographic surveillance of chronic disease
2009-01-01
Background: Geographic public health surveillance is concerned with describing and disseminating geographic information about disease and other measures of health to policy makers and the public. While methodological developments in the geographical analysis of disease are numerous, few have been integrated into a framework that also considers the effects of case ascertainment bias on the effectiveness of chronic disease surveillance. Results: We present a framework for the geographic surveillance of chronic disease that integrates methodological developments in spatial statistical analysis and case ascertainment. The framework uses a hierarchical approach to organize and model health information derived from an administrative health data system and, importantly, supports the detection and analysis of case ascertainment bias in geographic data. We test the framework on asthma data from Alberta, Canada. We observe high prevalence in south-western Alberta, particularly among Aboriginal females. We also observe that persons likely mistaken for asthmatics tend to be distributed in a pattern similar to asthmatics, suggesting that there may be an underlying social vulnerability to a variety of respiratory illnesses, or the presence of a diagnostic practice style effect. Finally, we note that clustering of asthmatics tends to occur at small geographic scales, while clustering of persons mistaken for asthmatics tends to occur at larger geographic scales. Conclusion: Routine and ongoing geographic surveillance of chronic diseases is critical to developing an understanding of the underlying epidemiology and to informing policy makers and the public about the health of the population. PMID:19948046
PepsNMR for 1H NMR metabolomic data pre-processing.
Martin, Manon; Legat, Benoît; Leenders, Justine; Vanwinsberghe, Julien; Rousseau, Réjane; Boulanger, Bruno; Eilers, Paul H C; De Tullio, Pascal; Govaerts, Bernadette
2018-08-17
In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned 1H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of the results: the data pre-processing/pre-treatment, to which not enough attention is usually devoted, in particular in metabolomic studies. The usual approach is to use the proprietary software provided by the analytical instruments' manufacturers to conduct the entire pre-processing strategy. This widespread practice has a number of advantages, such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependency on subjective human choices, only standard processing possibilities, and an absence of objective quality criteria to evaluate pre-processing quality. This paper introduces PepsNMR to meet these needs, an R package dedicated to the whole processing chain prior to multivariate data analysis, including, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold-standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines. Copyright © 2018 Elsevier B.V. All rights reserved.
Ancient DNA studies: new perspectives on old samples
2012-01-01
In spite of past controversies, the field of ancient DNA is now a reliable research area, thanks to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples for studying the processes of evolution and for testing models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example, human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611
Information-theoretic approach to lead-lag effect on financial markets
NASA Astrophysics Data System (ADS)
Fiedor, Paweł
2014-08-01
Recently the interest of researchers has shifted from the analysis of synchronous relationships of financial instruments to the analysis of more meaningful asynchronous relationships. Both types of analysis are concentrated mostly on Pearson's correlation coefficient, and consequently intraday lead-lag relationships (where one of the variables in a pair is time-lagged) are also usually studied with it. Under the Efficient-Market Hypothesis such relationships are not possible, as all information is embedded in the prices, but in real markets we do find such dependencies. In this paper we analyse lead-lag relationships of financial instruments and extend the known methodology by using mutual information instead of Pearson's correlation coefficient. Mutual information is not only a more general measure, sensitive to non-linear dependencies, but it can also lead to a simpler procedure for the statistical validation of links between financial instruments. We analyse lagged relationships using New York Stock Exchange 100 data not only at the intraday level, but also for daily stock returns, which have usually been ignored.
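A minimal sketch of a lagged-mutual-information estimator (the equal-width binning and bin count are arbitrary choices here; the paper's estimator may differ):

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def lagged_mutual_information(x, y, lag, bins=8):
    """MI between x(t) and y(t + lag); a peak at lag > 0 suggests x leads y."""
    a, b = x[:-lag], y[lag:]
    ca = np.digitize(a, np.histogram_bin_edges(a, bins))
    cb = np.digitize(b, np.histogram_bin_edges(b, bins))
    return mutual_info_score(ca, cb)

# Synthetic example: y is a noisy copy of x delayed by 3 steps.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = np.roll(x, 3) + 0.5 * rng.normal(size=1000)
print(max(range(1, 10), key=lambda k: lagged_mutual_information(x, y, k)))  # -> 3
```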
Api, A M; Belsito, D; Bruze, M; Cadby, P; Calow, P; Dagli, M L; Dekant, W; Ellis, G; Fryer, A D; Fukayama, M; Griem, P; Hickey, C; Kromidas, L; Lalko, J F; Liebler, D C; Miyachi, Y; Politano, V T; Renskers, K; Ritacco, G; Salvito, D; Schultz, T W; Sipes, I G; Smith, B; Vitale, D; Wilcox, D K
2015-08-01
The Research Institute for Fragrance Materials, Inc. (RIFM) has been engaged in the generation and evaluation of safety data for fragrance materials since its inception over 45 years ago. Over time, RIFM's approach to gathering data, estimating exposure and assessing safety has evolved as the tools for risk assessment evolved. This publication is designed to update the RIFM safety assessment process, which follows a series of decision trees, reflecting advances in approaches in risk assessment and new and classical toxicological methodologies employed by RIFM over the past ten years. These changes include incorporating 1) new scientific information including a framework for choosing structural analogs, 2) consideration of the Threshold of Toxicological Concern (TTC), 3) the Quantitative Risk Assessment (QRA) for dermal sensitization, 4) the respiratory route of exposure, 5) aggregate exposure assessment methodology, 6) the latest methodology and approaches to risk assessments, 7) the latest alternatives to animal testing methodology and 8) environmental risk assessment. The assessment begins with a thorough analysis of existing data followed by in silico analysis, identification of 'read across' analogs, generation of additional data through in vitro testing as well as consideration of the TTC approach. If necessary, risk management may be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
The Escherichia coli Proteome: Past, Present, and Future Prospects†
Han, Mee-Jung; Lee, Sang Yup
2006-01-01
Proteomics has emerged as an indispensable methodology for large-scale protein analysis in functional genomics. The Escherichia coli proteome has been extensively studied and is well defined in terms of biochemical, biological, and biotechnological data. Even before the entire E. coli proteome was fully elucidated, the largest available data set had been integrated to decipher regulatory circuits and metabolic pathways, providing valuable insights into global cellular physiology and the development of metabolic and cellular engineering strategies. With the recent advent of advanced proteomic technologies, the E. coli proteome has been used for the validation of new technologies and methodologies such as sample prefractionation, protein enrichment, two-dimensional gel electrophoresis, protein detection, mass spectrometry (MS), combinatorial assays with n-dimensional chromatographies and MS, and image analysis software. These important technologies will not only provide a great amount of additional information on the E. coli proteome but also synergistically contribute to other proteomic studies. Here, we review the past development and current status of E. coli proteome research in terms of its biological, biotechnological, and methodological significance and suggest future prospects. PMID:16760308
Ergonomic assessment methodologies in manual handling of loads--opportunities in organizations.
Pires, Claudia
2012-01-01
The present study was developed based on the analysis of workplaces in the engineering industry, particularly in automotive companies. The main objectives of the study were to evaluate workplace activities involving manual handling, using the assessment methodologies of the NIOSH Ergonomic Equation [1] and Manual Material Handling [2], present in ISO 11228 [3-4], and to assess the risk of developing musculoskeletal injuries associated with these activities, an issue of great concern in all industrial sectors. The suitability of each method to the task concerned was also examined. The study was conducted in three steps. The first step was to collect images and information about the target tasks. In the second step we proceeded to the analysis, determining the method to use and evaluating the activities. Finally, we reviewed the results obtained and acted accordingly. The study identified situations requiring urgent action according to the methodologies used, and solutions were developed to solve the problems identified, eliminating and/or minimizing situations that are awkward and harmful for employees.
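For reference, the revised NIOSH lifting equation combines a load constant with six multipliers, RWL = LC·HM·VM·DM·AM·FM·CM; a sketch in metric units (the frequency and coupling multipliers FM and CM come from the published look-up tables and are treated as plain inputs here):

```python
def niosh_rwl(H, V, D, A, FM, CM, LC=23.0):
    """Recommended Weight Limit (kg). H, V, D in cm; A (asymmetry) in degrees;
    FM and CM are read from the published frequency and coupling tables."""
    HM = 25.0 / max(H, 25.0)            # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75.0)    # vertical multiplier
    DM = 0.82 + 4.5 / max(D, 25.0)      # distance multiplier
    AM = 1.0 - 0.0032 * A               # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

load = 15.0                                             # kg actually lifted
rwl = niosh_rwl(H=40, V=60, D=50, A=30, FM=0.88, CM=0.95)
lifting_index = load / rwl   # LI > 1 flags an elevated risk of injury
```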
Supporting Air and Space Expeditionary Forces: Analysis of Combat Support Basing Options
2004-01-01
Brooke et al., 2003. For more information on Set Covering models, see Daskin, 1995. Analysis Methodology: Transportation Model. A detailed…
Full-Envelope Launch Abort System Performance Analysis Methodology
NASA Technical Reports Server (NTRS)
Aubuchon, Vanessa V.
2014-01-01
The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of the initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
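A schematic of the difference (illustrative numbers only; the real simulation disperses many more parameters and feeds each draw into a full trajectory run):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Standard approach: abort initiation fixed at a discrete condition (e.g. max-q)
# while the other LAS parameters are dispersed. Full-envelope approach: the
# abort initiation time is itself drawn over the whole ascent.
ascent_duration = 160.0                              # hypothetical seconds to staging
abort_time = rng.uniform(0.0, ascent_duration, N)    # full-envelope dispersion
motor_thrust = rng.normal(1.0, 0.05, N)              # hypothetical dispersed parameter
winds = rng.normal(0.0, 1.0, N)                      # hypothetical dispersed parameter
# Each (abort_time, motor_thrust, winds, ...) tuple seeds one Monte Carlo trajectory,
# so performance is sampled continuously along the ascent instead of at a few points.
```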
Using Public Data for Comparative Proteome Analysis in Precision Medicine Programs.
Hughes, Christopher S; Morin, Gregg B
2018-03-01
Maximizing the clinical utility of information obtained in longitudinal precision medicine programs would benefit from robust comparative analyses against known information to assess biological features of patient material, toward identifying the underlying features driving their disease phenotype. Herein, the potential for utilizing publicly deposited mass-spectrometry-based proteomics data to perform inter-study comparisons of cell-line or tumor-tissue materials is investigated. To investigate the robustness of comparisons between MS-based proteomics studies carried out with different methodologies, deposited data representative of label-free (MS1) and isobaric-tagging (MS2 and MS3) quantification are utilized. In-depth quantitative proteomics data acquired from the analysis of ovarian cancer cell lines revealed the robust recapitulation of observable gene expression dynamics between individual studies carried out using significantly different methodologies. The observed signatures enable robust inter-study clustering of cell-line samples. In addition, the ability to classify and cluster tumor samples based on observed gene expression trends when using a single patient sample is established. With this analysis, relevant gene expression dynamics are obtained from a single patient tumor, in the context of a precision medicine analysis, by leveraging a large cohort of repository data as a comparator. Together, these data establish the potential for state-of-the-art MS-based proteomics data to serve as resources for robust comparative analyses in precision medicine applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
... comment the submission of additional information concerning the methodological changes for the digital... additional information concerning the methodological changes suggested in the comments by Mr. Shumate for the...-loss. The Commission is requesting a detailed description of the methodological changes that would be...
Managing In-House Development of a Campus-Wide Information System
ERIC Educational Resources Information Center
Shurville, Simon; Williams, John
2005-01-01
Purpose: To show how a combination of hard and soft project and change management methodologies guided successful in-house development of a campus-wide information system. Design/methodology/approach: A case study of the methodologies and management structures that guided the development is presented. Findings: Applying a combination of the…
Zhu; Dale
2000-10-01
Regional resource use planning relies on key regional stakeholder groups using, and having equitable access to, appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use, so that areas where DSSs could provide effective support can be identified and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.
Counter unmanned aerial system testing and evaluation methodology
NASA Astrophysics Data System (ADS)
Kouhestani, C.; Woo, B.; Birch, G.
2017-05-01
Unmanned aerial systems (UAS) are increasing in flight time, ease of use, and payload size. Detection, classification, tracking, and neutralization of UAS is a necessary capability for infrastructure and facility protection. We discuss a test and evaluation methodology developed at Sandia National Laboratories to establish a consistent, defensible, and unbiased means for evaluating counter unmanned aerial system (CUAS) technologies. The test approach described identifies test strategies, performance metrics, UAS types tested, key variables, and the necessary data analysis to accurately quantify the capabilities of CUAS technologies. The tests conducted, as defined by this approach, will allow for the determination of quantifiable limitations, strengths, and weaknesses in terms of detection, tracking, classification, and neutralization. Communicating the results of this testing in such a manner informs decisions by government sponsors and stakeholders that can be used to guide future investments and inform the procurement, deployment, and advancement of such systems into their specific venues.
ERIC Educational Resources Information Center
Poole, Wendy; Fallon, Gerald
2015-01-01
This paper examines increasing privatisation of education in the province of British Columbia, Canada. Conceptually, the paper is informed by theories of privatisation and social justice; and methodologically, it uses policy analysis to examine documents and financial records obtained from government departments. The paper critically analyses…
ERIC Educational Resources Information Center
Helliwell, Carolyn B.; Jung, Steven M.
Summarized are the findings of an American Institutes for Research (AIR) project to field test a data capture and dissemination system that would provide information for improving consumer protection in postsecondary education. Presented is a discussion of the methodology used, examples of consumer abuses cited in the literature, an analysis of…
ERIC Educational Resources Information Center
Gweon, Gahgene; Jain, Mahaveer; McDonough, John; Raj, Bhiksha; Rose, Carolyn P.
2013-01-01
This paper contributes to a theory-grounded methodological foundation for automatic collaborative learning process analysis. It does this by illustrating how insights from the social psychology and sociolinguistics of speech style provide a theoretical framework to inform the design of a computational model. The purpose of that model is to detect…
Children's Perceptions and Learning about Tropical Rainforests: An Analysis of Their Drawings
ERIC Educational Resources Information Center
Bowker, Rob
2007-01-01
This study analysed 9 to 11 year old children's drawings of tropical rainforests immediately before and after a visit to the Humid Tropics Biome at the Eden Project, Cornwall, UK. A theoretical framework derived from considerations of informal learning and constructivism was used as a basis to develop a methodology to interpret the children's…
Hypermedia Management in Television through Text Processing
ERIC Educational Resources Information Center
Caldera-Serrano, Jorge
2008-01-01
Purpose: This paper aims to describe a working routine for the analysis of audio-visual documents for serving the needs of television journalists. Design/methodology/approach: A description in the form of a synthesis is given of the process by which television information must be put at the user's disposal with a response that is both fast and…
ERIC Educational Resources Information Center
Navarro, Manuel
2014-01-01
This paper presents a model of how children generate concrete concepts from perception through processes of differentiation and integration. The model informs the design of a novel methodology ("evolutionary maps" or "emaps"), whose implementation on certain domains unfolds the web of itineraries that children may follow in the…
Anomalies in Economics Enrollment: 1991-1992 to 1995-1996
ERIC Educational Resources Information Center
Lombardi, Waldo; Ramrattan, Lall B.; Szenberg, Michael
2004-01-01
This paper presents data and empirical models to explain the causes of the decline in the enrollment of economics majors during the 1991-1992 to 1995-1996 academic years. It first discusses the theoretical bases for a qualitative analysis of this type. It then discusses a sample survey methodology used to obtain cross-sectional information from…
a Method for the Measurements of Children's Feet
NASA Astrophysics Data System (ADS)
Bernard, M.; Buffevant, B.; Querio, R.; Rigal, R.
1980-07-01
The Centre Technique du Cuir (Leather Technical Center) has been entrusted with the task of measuring children's feet. A new piece of equipment has been devised which ensures precise measurements and provides information quickly. The paper presents: (1) the existing techniques, (2) the research and analysis methodology, and (3) the CTC apparatus currently used in schools.
ERIC Educational Resources Information Center
Anderson, Melissa L.; Wolf Craig, Kelly S.; Ziedonis, Douglas M.
2017-01-01
Deaf individuals experience significant obstacles to participating in behavioral health research when careful consideration is not given to accessibility during the design of study methodology. To inform such considerations, we conducted an exploratory secondary analysis of a mixed-methods study that originally explored 16 Deaf trauma survivors'…
The Rescue911 Emergency Response Information System (ERIS): A Systems Development Project Case
ERIC Educational Resources Information Center
Cohen, Jason F.; Thiel, Franz H.
2010-01-01
This teaching case presents a systems development project useful for courses in object-oriented analysis and design. The case has a strong focus on the business, methodology, modeling and implementation aspects of systems development. The case is centered on a fictitious ambulance and emergency services company (Rescue911). The case describes that…
An Analysis of Career Tracks in the Design of IS Curricula in the U.S.
ERIC Educational Resources Information Center
Hwang, Drew; Soe, Louise L.
2010-01-01
Studies of undergraduate curricula in the field of Information Systems (IS) over the past two decades demonstrate a continual process of development and change. Many factors influence curriculum design, including new technologies and methodologies, and emerging subfields and subject areas. However, one deficit in the literature about IS curriculum…
ERIC Educational Resources Information Center
Tao, Fumiyo; And Others
This volume contains technical and supporting materials that supplement Volume I, which describes upward mobility programs for disadvantaged and dislocated workers in the service sector. Appendix A is a detailed description of the project methodology, including data collection methods and information on data compilation, processing, and analysis.…
ERIC Educational Resources Information Center
Santoro, Ninetta
2014-01-01
In this article I describe how a multiple perspectives framework drawn from the field of social work informed my analysis of interview data obtained from Australian preservice teachers who had gone on an international study trip. One incident recounted differently by three separate interviewees meant that the sometimes-similar and…
Aircrew Discourse: Exploring Strategies of Information and Action Management
NASA Technical Reports Server (NTRS)
Irwin, Cheryl M.; Veinott, Elizabeth S.; Shafto, Michael G. (Technical Monitor)
1995-01-01
This paper explores methodology issues encountered in the analysis of flightcrew communications in aviation simulation research. Examples are provided by two recent studies which are compared on three issues: level of analysis, data definition, and interpretation of the results. The data discussed were collected in a study comparing two levels of aircraft automation. The first example is an investigation of how pilots' information transfer strategies differed as a function of automation during low and high-workload flight phases. The second study focuses on how crews managed actions in the two aircraft during a ten minute, high-workload flight segment. Results indicated that crews in the two aircraft differed in their strategies of information and action management. The differences are discussed in terms of their operational and research significance.
Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.
2013-01-01
Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945
Protein Aggregation Measurement through Electrical Impedance Spectroscopy
NASA Astrophysics Data System (ADS)
Affanni, A.; Corazza, A.; Esposito, G.; Fogolari, F.; Polano, M.
2013-09-01
The paper presents a novel methodology to measure fibril formation in protein solutions. We designed a bench consisting of a sensor with interdigitated electrodes, a hermetic PDMS reservoir, and a computer-controlled impedance meter. The impedance data are fitted with a lumped-element model, and their change over time provides information on the aggregation process. Encouraging results have been obtained by testing the methodology on K-casein, a milk protein, with and without the addition of a drug that inhibits aggregation. The amount of sample needed to perform this measurement is far lower than that needed for fluorescence analysis.
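A sketch of the fitting step under an assumed lumped-element form, a series resistance with one parallel R-C branch; the paper's actual network may differ, and the data here are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def z_mag(f, Rs, Rct, C):
    """|Z| of a series resistor plus a parallel R-C element (assumed lumped model)."""
    w = 2.0 * np.pi * f
    return np.abs(Rs + Rct / (1.0 + 1j * w * Rct * C))

f = np.logspace(2, 6, 50)                      # hypothetical frequency sweep, Hz
rng = np.random.default_rng(0)
z_meas = z_mag(f, 120.0, 800.0, 1e-8) * (1 + 0.01 * rng.normal(size=f.size))

popt, _ = curve_fit(z_mag, f, z_meas, p0=[100.0, 500.0, 1e-9])
# Tracking the fitted parameters over repeated sweeps exposes the slow
# impedance changes that accompany aggregation.
```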
Considerations for the design, analysis and presentation of in vivo studies.
Ranstam, J; Cook, J A
2017-03-01
To describe, explain and give practical suggestions regarding important principles and key methodological challenges in the study design, statistical analysis, and reporting of results from in vivo studies. Pre-specifying endpoints and analyses, recognizing the common underlying assumption of statistically independent observations, performing sample size calculations, and addressing multiplicity issues are important parts of an in vivo study. Clear reporting of results and informative graphical presentation of data are equally important. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
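As a concrete instance of the sample size calculations recommended above, the hedged sketch below computes a per-group animal count for a two-group comparison using statsmodels; the effect size, alpha and power are placeholder values, not recommendations from the paper.

```python
# Illustrative sample size calculation for a two-group in vivo comparison
# (two-sided independent-samples t-test). All inputs are placeholders.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=1.0,   # expected Cohen's d between treatment and control
    alpha=0.05,        # two-sided significance level
    power=0.8,         # desired probability of detecting the effect
)
print(f"animals needed per group: {n_per_group:.1f}")
```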
Esquivel, Rodolfo O; Molina-Espíritu, Moyocoyani; López-Rosa, Sheila; Soriano-Correa, Catalina; Barrientos-Salcedo, Carolina; Kohout, Miroslav; Dehesa, Jesús S
2015-08-24
In this work we undertake a pioneering information-theoretic analysis of 18 selected amino acids extracted from a natural protein, bacteriorhodopsin (1C3W). The conformational structures of each amino acid are analyzed using various quantum chemistry methodologies at high levels of theory: HF, M062X and CISD(Full). The Shannon entropy, Fisher information and disequilibrium are determined to capture the spatial spreading features of delocalizability, order and uniformity of the optimized structures. These three entropic measures uniquely characterize all amino acids through a predominant information-theoretic quality scheme (PIQS), which gathers all chemical families by means of three major spreading features: delocalization, narrowness and uniformity. This scheme recognizes four major chemical families: aliphatic (delocalized), aromatic (delocalized), electro-attractive (narrowed) and tiny (uniform). All chemical families recognized by the existing energy-based classifications are embraced by this entropic scheme. Finally, novel chemical patterns are shown in the information planes associated with the PIQS entropic measures. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
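To make the three measures concrete, the sketch below evaluates Shannon entropy, disequilibrium and Fisher information for a discretised density; the one-dimensional toy density is a simplifying assumption, since the paper works with three-dimensional molecular structures.

```python
# The three entropic measures on a discretised 1-D density: Shannon entropy
# (delocalisation), disequilibrium (uniformity) and Fisher information
# (gradient content / order). A unit Gaussian stands in for a real density.
import numpy as np

x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)     # toy normalised density

shannon = -(rho * np.log(rho)).sum() * dx        # S = -integral rho ln rho
diseq = (rho ** 2).sum() * dx                    # D = integral rho^2
grad = np.gradient(rho, dx)
fisher = (grad ** 2 / rho).sum() * dx            # I = integral rho'^2 / rho
print(f"S = {shannon:.3f}, D = {diseq:.3f}, I = {fisher:.3f}")
```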
NASA Astrophysics Data System (ADS)
Chernyavskiy, Andrey; Khamitov, Kamil; Teplov, Alexey; Voevodin, Vadim; Voevodin, Vladimir
2016-10-01
In recent years, quantum information technologies (QIT) have developed rapidly, although their implementation faces serious difficulties, some of which are challenging computational tasks. This work is devoted to a deep and broad analysis of the parallel algorithmic properties of such tasks. As an example we take one- and two-qubit transformations of a many-qubit quantum state, which are the most critical kernels of many important QIT applications. The analysis of the algorithms uses the methodology of the AlgoWiki project (algowiki-project.org) and consists of two parts: theoretical and experimental. The theoretical part covers features such as sequential and parallel complexity, macro structure, and the visual information graph. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia) and includes the analysis of locality and memory access, scalability, and a set of more specific dynamic characteristics of the implementation. This approach allowed us to identify bottlenecks and generate ideas for efficiency improvement.
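For orientation, the sketch below implements the kind of kernel analysed: applying a one-qubit gate to an n-qubit state vector. It is a serial NumPy illustration of the data-access pattern, not the parallel supercomputer code studied in the paper.

```python
# Applying a 2x2 one-qubit gate to an n-qubit state vector. Reshaping the
# 2^n vector exposes the target-qubit axis, so the gate becomes a batched
# 2x2 matrix product -- the memory-access pattern whose locality and
# scalability an AlgoWiki-style analysis characterises.
import numpy as np

def apply_one_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to qubit `target` (0 = most significant)."""
    psi = state.reshape([2] * n_qubits)
    psi = np.moveaxis(psi, target, 0)               # bring target axis front
    psi = np.tensordot(gate, psi, axes=([1], [0]))  # batched 2x2 multiply
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

n = 10
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = apply_one_qubit_gate(state, hadamard, target=3, n_qubits=n)
print(np.vdot(state, state).real)                   # norm preserved -> 1.0
```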
ERIC Educational Resources Information Center
Lundh, Anna
2010-01-01
Introduction: The concept of information needs is significant within the field of Information Needs Seeking and Use. "How" information needs can be studied empirically is however something that has been called into question. The main aim of this paper is to explore the methodological consequences of discursively oriented theories when…
Analysis of potential trade-offs in regulation of disinfection by-products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cromwell, J.E.; Zhang, X.; Regli, S.
1992-11-01
Executive Order 12291 requires the preparation of a Regulatory Impact Analysis (RIA) on all new major federal regulations. The goal of an RIA is to develop and organize information on benefits, costs, and economic impacts so as to clarify trade-offs among alternative regulatory options. This paper outlines an explicit methodology for assessing the technical potential for risk-risk trade-offs. The strategies used to cope with complexities and uncertainties in developing the Disinfection By-Products Regulatory Analysis Model are explained. Results are presented and discussed in light of uncertainties and of the analytical requirements for regulatory impact analysis.
Methodologies and Methods for User Behavioral Research.
ERIC Educational Resources Information Center
Wang, Peiling
1999-01-01
Discusses methodological issues in empirical studies of information-related behavior in six specific research areas: information needs and uses; information seeking; relevance judgment; online searching (including online public access catalog, online database, and the Web); human-system interactions; and reference transactions. (Contains 191…
Thinking about Museum Information.
ERIC Educational Resources Information Center
Reed, Patricia Ann; Sledge, Jane
1988-01-01
Describes work in progress at the Smithsonian Institution in developing a system to understand and articulate the information needed to support collection related functions. The discussion covers the data modeling methodology used and the advantages of this methodology in structuring museum collections information. (one reference) (CLB)
NASA Astrophysics Data System (ADS)
Sahoo, Madhumita; Sahoo, Satiprasad; Dhar, Anirban; Pradhan, Biswajeet
2016-10-01
Groundwater vulnerability assessment has become an accepted practice for identifying zones with relatively increased potential for groundwater contamination. DRASTIC is the most popular secondary-information-based vulnerability assessment approach. The original DRASTIC approach assigns the relative importance of features/sub-features through subjective weighting/rating values, so the variability of features at smaller scales is not reflected in the assessment. In contrast, objective weighting-based methods provide flexibility in weight assignment depending on the variation of the local system, but experts' opinion is not directly considered in them. The effectiveness of both subjective and objective weighting-based approaches therefore needs to be evaluated. In the present study, three methods - the entropy information method (E-DRASTIC), the fuzzy pattern recognition method (F-DRASTIC) and single-parameter sensitivity analysis (SA-DRASTIC) - were used to modify the weights of the original DRASTIC features to include local variability. Moreover, a grey incidence analysis was used to evaluate the relative performance of the subjective (DRASTIC and SA-DRASTIC) and objective (E-DRASTIC and F-DRASTIC) weighting-based methods. The performance of the developed methodology was tested in an urban area of Kanpur City, India. The relative performance of the subjective and objective methods varies with the choice of water quality parameters. The methodology can be applied, with or without suitable modification, and these evaluations establish its potential applicability for general vulnerability assessment in urban contexts.
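The entropy-weighting idea behind E-DRASTIC can be sketched as follows; the seven columns stand for the DRASTIC features and the ratings are hypothetical, so this is an illustration of objective weight assignment rather than the study's implementation.

```python
# Entropy weighting sketch: feature rasters with more spatial variability
# receive larger objective weights. Rows are grid cells, the 7 columns are
# hypothetical normalised DRASTIC feature ratings.
import numpy as np

rng = np.random.default_rng(1)
ratings = rng.uniform(1, 10, size=(500, 7))        # m cells x 7 features

p = ratings / ratings.sum(axis=0)                  # column-wise proportions
entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(ratings))
weights = (1 - entropy) / (1 - entropy).sum()      # objective feature weights
vulnerability = ratings @ weights                  # weighted DRASTIC index
print(weights.round(3))
```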
Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model
NASA Astrophysics Data System (ADS)
Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi
Scientific Concept Vocabulary (SCV) has been developed to actualize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information in physics and engineering. In the SCV model, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation, allowing it to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions; however, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV is able to describe more complex scientific perceptions.
Minimum Information about a Genotyping Experiment (MIGEN)
Huang, Jie; Mirel, Daniel; Pugh, Elizabeth; Xing, Chao; Robinson, Peter N.; Pertsemlidis, Alexander; Ding, LiangHao; Kozlitina, Julia; Maher, Joseph; Rios, Jonathan; Story, Michael; Marthandan, Nishanth; Scheuermann, Richard H.
2011-01-01
Genotyping experiments are widely used in clinical and basic research laboratories to identify associations between genetic variations and normal/abnormal phenotypes. Genotyping assay techniques vary from single genomic regions interrogated using PCR reactions to high-throughput assays examining genome-wide sequence and structural variation. The resulting genotype data may include millions of markers for thousands of individuals, requiring various statistical, modeling or other data analysis methodologies to interpret the results. To date, there are no standards for reporting genotyping experiments. Here we present the Minimum Information about a Genotyping Experiment (MIGen) standard, defining the minimum information required for reporting genotyping experiments. The MIGen standard covers experimental design, subject description, genotyping procedure, quality control and data analysis. MIGen is a registered project under MIBBI (Minimum Information for Biological and Biomedical Investigations) and is being developed by an interdisciplinary group of experts in basic biomedical science, clinical science, biostatistics and bioinformatics. To accommodate the wide variety of techniques and methodologies applied in current and future genotyping experiments, MIGen leverages foundational concepts from the Ontology for Biomedical Investigations (OBI) for the description of the various types of planned processes and implements a hierarchical document structure. The adoption of MIGen by the research community will facilitate consistent genotyping data interpretation and independent data validation. MIGen can also serve as a framework for the development of data models for capturing and storing genotyping results and experiment metadata in a structured way, to facilitate the exchange of metadata. PMID:22180825
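For flavour only, a MIGen-style record might be skeletonised as below; the five top-level sections follow the categories named in the abstract, while every field inside them is hypothetical rather than drawn from the actual standard.

```python
# Hypothetical skeleton of a MIGen-style report. Only the five section
# names come from the abstract; the inner fields are illustrative, not the
# standard's actual terms.
import json

migen_record = {
    "experimental_design": {"study_type": "case-control", "platform": "SNP array"},
    "subject_description": {"n_subjects": 2000, "consent": "IRB-approved"},
    "genotyping_procedure": {"assay": "genome-wide array", "call_software": "..."},
    "quality_control": {"call_rate_threshold": 0.95, "hwe_p_cutoff": 1e-6},
    "data_analysis": {"association_model": "logistic regression"},
}
print(json.dumps(migen_record, indent=2))
```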
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2012-06-01
The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain, and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design of a web-based information system (WbIS) for managing clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) into the principles of software engineering, particularly in the requirements engineering (RE) phase. The process followed a paradigm that combines grounded theory for data collection with an evolutionary design based on constant development and refinement of the generic domain model, using three well-known methodological approaches in a triangulated design: (a) object-oriented system analysis; (b) task analysis; and (c) prototyping. This approach seems to be a good solution for the requirements engineering process in this particular case in the health care domain, since the inherent weaknesses of the individual methods are reduced and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix makes it possible to look across the results of all the methods used and decide which requirements are critical for the system's success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.
ERIC Educational Resources Information Center
Bose, Anindya
The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…
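Since the methodology builds on PERT/CPM, a minimal forward-pass sketch of the Critical Path Method is given below; the task network and durations are hypothetical.

```python
# Generic CPM forward pass: earliest finish times over a task network.
# Task names, durations and dependencies are hypothetical.
from functools import lru_cache

durations = {"A": 3, "B": 2, "C": 4, "D": 1}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Earliest finish = max earliest finish of predecessors + own duration."""
    start = max((earliest_finish(p) for p in predecessors[task]), default=0)
    return start + durations[task]

project_length = max(earliest_finish(t) for t in durations)
print(f"critical-path length: {project_length}")   # A -> C -> D = 8
```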
ERIC Educational Resources Information Center
Afzal, Waseem
2017-01-01
Introduction: The purpose of this paper is to propose a methodology to conceptualize, operationalize, and empirically validate the concept of information need. Method: The proposed methodology makes use of both qualitative and quantitative perspectives, and includes a broad array of approaches such as literature reviews, expert opinions, focus…
Exploring Methodologies and Indicators for Cross-disciplinary Applications
NASA Astrophysics Data System (ADS)
Bernknopf, R.; Pearlman, J.
2015-12-01
Assessing the impact and benefit of geospatial information is a multidisciplinary task that draws on social, economic and environmental knowledge to formulate indicators and methods. Several use cases couple the social sciences, including economics, psychology and sociology, with geospatial information. Benefit-cost analysis is an empirical approach that uses money as the indicator for decision making; it is a traditional basis for a use case and has been applied to geospatial information among other areas. A newer use case that applies indicators is meta-regression analysis, which evaluates transfers of socioeconomic benefits from different geographic regions within a unifying statistical approach. In this technique, qualitative and quantitative variables serve as indicators, providing a weighted average of the value of a nonmarket good or resource over a large region; the expected willingness to pay for the nonmarket good can then be applied to a specific region. A third use case is the application of decision support systems and tools, which have been used for forecasting agricultural prices and for the analysis of hazard policies. However, new methods are needed for integrating these disciplines into use cases, as an avenue to guide the development of operational applications of geospatial information. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed and, especially, applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting the challenge. This presentation will look at the results of an investigation into directions in the broader application of use cases to teach the methodologies and the use of indicators across fields of interest.
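A toy version of the meta-regression benefit-transfer step might look like the following; the covariates, coefficients and data are synthetic, fabricated purely for illustration.

```python
# Toy meta-regression in the benefit-transfer spirit described above:
# regress reported willingness-to-pay (WTP) values from prior studies on
# study-site characteristics, then predict WTP for a new policy site.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
income = rng.uniform(30, 80, 40)        # site mean income (k$), synthetic
urban = rng.integers(0, 2, 40)          # 1 = urban study region
wtp = 5 + 0.3 * income + 4 * urban + rng.normal(0, 2, 40)

X = sm.add_constant(np.column_stack([income, urban]))
model = sm.OLS(wtp, X).fit()

new_site = np.array([[1.0, 55.0, 1.0]])  # constant, income = 55 k$, urban
print(f"transferred WTP estimate: {model.predict(new_site)[0]:.2f}")
```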
Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair
Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats
2011-01-01
Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. PMID:26069574
Cellular neural network-based hybrid approach toward automatic image registration
NASA Astrophysics Data System (ADS)
Arun, Pattathal VijayaKumar; Katiyar, Sunil Kumar
2013-01-01
Image registration is a key component of various image processing operations that involve the analysis of different image data sets. Automatic image registration domains have witnessed the application of many intelligent methodologies over the past decade; however, the inability to properly model object shape as well as contextual information has limited the attainable accuracy. A framework for accurate feature shape modeling and adaptive resampling using advanced techniques such as vector machines, cellular neural networks (CNN), the scale-invariant feature transform (SIFT), coresets, and cellular automata is proposed. CNN has been found effective in improving the feature matching and resampling stages of registration, and the complexity of the approach has been considerably reduced using coreset optimization. The salient features of this work are CNN-based SIFT feature point optimization, adaptive resampling, and intelligent object modeling. The developed methodology has been compared with contemporary methods using different statistical measures, and investigations on various satellite images revealed considerable success with the approach. The system dynamically uses spectral and spatial information to represent contextual knowledge through a CNN-Prolog approach. The methodology is also shown to be effective in providing intelligent interpretation and adaptive resampling.
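The SIFT matching stage that the framework optimises can be sketched with OpenCV as below; the file names are placeholders, and the CNN, coreset and cellular-automata components of the paper are not reproduced.

```python
# SIFT feature matching and homography estimation between a reference and
# a sensed image (OpenCV). Only the classical matching stage is shown.
import cv2
import numpy as np

ref = cv2.imread("reference.tif", cv2.IMREAD_GRAYSCALE)     # placeholder path
sensed = cv2.imread("sensed.tif", cv2.IMREAD_GRAYSCALE)     # placeholder path

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(ref, None)
kp2, des2 = sift.detectAndCompute(sensed, None)

matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe ratio

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(H)   # transform mapping the reference onto the sensed image
```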
NASA Astrophysics Data System (ADS)
Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro
2003-06-01
In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were performed and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimation (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness of fit (termed a likelihood measure) in evaluating the model. This allows more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures are tested in this study to determine which is in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best-fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented, and as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
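A minimal GLUE sketch, under stated assumptions: parameter sets are sampled, a likelihood measure is computed from the fit to observed heads, and only "behavioural" runs inform the prediction. The simulator, likelihood form and acceptance threshold below are placeholders, not the study's groundwater model.

```python
# GLUE sketch: Monte Carlo parameter sets weighted by a likelihood measure
# built from the fit to observed heads. run_flow_model() is a placeholder
# whose misfit happens to grow away from log10(K) = -6.5.
import numpy as np

rng = np.random.default_rng(3)
obs_heads = np.array([102.1, 99.8, 101.3])           # hypothetical observations

def run_flow_model(log10_k):
    """Placeholder for the groundwater model."""
    return obs_heads + rng.normal(0, abs(log10_k + 6.5) + 0.01, 3)

log10_k = rng.uniform(-8, -5, 5000)                  # sampled conductivities
sse = np.array([((run_flow_model(k) - obs_heads) ** 2).sum() for k in log10_k])

likelihood = np.exp(-sse / sse.mean())               # one possible measure
behavioural = likelihood > 0.5                       # GLUE acceptance cut
weights = likelihood[behavioural] / likelihood[behavioural].sum()
print(f"weighted mean log10(K): {(weights * log10_k[behavioural]).sum():.2f}")
```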
Udod, Sonia A; Racine, Louise
2017-12-01
Drawing on the findings of a grounded theory study that explored how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy, and it is a relevant qualitative approach for leadership research because it provides a fine-grained analysis of the processes underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory as developed by Corbin and Strauss is provided, as a method for analysing and solving problems in nurses' practice. This paper provides evidence supporting the empirical and pragmatic adequacy of grounded theory methodology. Although applying the ontological, epistemological and methodological assumptions of grounded theory is challenging, the methodology is useful for addressing real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or the lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research because it is anchored in the reality of practice; its strength is to provide results that can be readily applied to clinical practice and policy, as they arise from problems that affect practice and are meaningful to nurses. © 2017 John Wiley & Sons Ltd.
Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D
To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in the urology department was conducted in three phases: 1) team training and improvement of feedback among practitioners, 2) management by process and superspecialisation, and 3) improvement of indicators (continuous improvement). The indicators were obtained from the hospital's information systems; the main source was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in four years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Publicado por Elsevier España, S.L.U. All rights reserved.
Recent developments in imaging system assessment methodology, FROC analysis and the search model.
Chakraborty, Dev P
2011-08-21
A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search-model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.
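A simplified simulation in the spirit of the reviewed search model is sketched below; the Poisson generation of non-lesion candidates, the lesion-perception probability and the Gaussian ratings follow the model's general structure, but all parameter values are illustrative.

```python
# Simplified FROC search-model simulation: latent non-lesion candidates per
# case follow a Poisson law, each lesion is perceived with probability nu,
# and candidates are marked when their rating exceeds a cutoff zeta.
import numpy as np

rng = np.random.default_rng(4)
lam, nu, mu, zeta = 1.5, 0.8, 2.0, 1.0   # assumed search-model parameters
n_cases, lesions_per_case = 1000, 1

nl_marks = ll_marks = 0
for _ in range(n_cases):
    z_nl = rng.standard_normal(rng.poisson(lam))      # non-lesion candidates
    nl_marks += int((z_nl >= zeta).sum())
    found = rng.random(lesions_per_case) < nu         # lesions perceived?
    ll_marks += int((rng.normal(mu, 1, found.sum()) >= zeta).sum())

print(f"non-lesion marks per case: {nl_marks / n_cases:.2f}, "
      f"lesion localisation fraction: {ll_marks / (n_cases * lesions_per_case):.2f}")
```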
Investigating accident causation through information network modelling.
Griffin, T G C; Young, M S; Stanton, N A
2010-02-01
Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for finding agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system is dependent on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of an holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of findings are discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction.
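The network step of an EAST-style analysis can be sketched with networkx as follows; the agents and links are hypothetical stand-ins rather than the case-study data, and betweenness centrality is one plausible choice for flagging key agents.

```python
# Sketch: encode communications as a directed information-flow graph and
# rank agents by betweenness centrality to flag potentially key agents.
# Agents and links below are hypothetical.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("captain", "first_officer"), ("first_officer", "captain"),
    ("atc", "captain"), ("captain", "atc"),
    ("flight_engineer", "captain"), ("cabin_crew", "flight_engineer"),
])

centrality = nx.betweenness_centrality(g)
for agent, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{agent:16s} {score:.3f}")
```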
1981-01-01
A comparison of formal and informal design methodologies will show how we think they are converging. Lastly, I will describe our involvement with the DoD… computer security must begin with the design methodology, with the objective being provability. The idea of a formal evaluation and on-the-shelf… [Methodologies] Here we can compare the formal design methodologies with those used by informal practitioners like Control Data. Obviously, both processes…
A Methodology to Support Decision Making in Flood Plan Mitigation
NASA Astrophysics Data System (ADS)
Biscarini, C.; di Francesco, S.; Manciola, P.
2009-04-01
The focus of the present document is on specific decision-making aspects of flood risk analysis. A flood is the result of runoff from rainfall in quantities too great to be confined in the low-water channels of streams. Little can be done to prevent a major flood, but we may be able to minimize damage within the flood plain of the river. This broad definition encompasses many possible mitigation measures. Floodplain management takes an integrated view of all engineering, nonstructural, and administrative measures for managing (minimizing) losses due to flooding on a comprehensive scale. Structural measures are the flood-control facilities designed according to flood characteristics; they include reservoirs, diversions, levees or dikes, and channel modifications. Measures that modify the damage susceptibility of floodplains or the damage potential of permanent facilities are referred to as nonstructural; they may require minor engineering works and reduce potential damage during a flood event rather than the flood itself. Technical information is required to support the tasks of problem definition, plan formulation, and plan evaluation. The specific information needed, and the related level of detail, depend on the nature of the problem, the potential solutions, and the sensitivity of the findings to the basic information. Actions performed to set up and lay out the study are preliminary to the detailed analysis; they include defining the study scope and detail, field data collection, a review of previous studies and reports, and the assembly of needed maps and surveys. Risk analysis can be viewed as having several components: risk assessment, risk communication and risk management. Risk assessment comprises an analysis of the technical aspects of the problem, risk communication deals with conveying the information, and risk management involves the decision process. In the present paper we propose a novel methodology for supporting priority setting in the assessment of such issues, beyond the typical "expected value" approach. Scientific contributions and management aspects are merged to create a simplified method for basin plan implementation, based on risk and economic analyses. However, the economic evaluation is not the sole criterion for selecting a flood-damage reduction plan. Among the criteria relevant to the decision process, safety and quality of human life, economic damage, the expenses associated with the chosen measures, and environmental issues should play a fundamental role in the decisions made by the authorities. Numerical indices, taking into account administrative, technical, economic and risk aspects, are defined and combined in a mathematical formula that defines a Priority Index (PI). In particular, the Priority Index defines a ranking of priority interventions, thus allowing the formulation of the investment plan. The research is mainly focused on the technical factors of risk assessment, providing quantitative and qualitative estimates of possible alternatives, with measures of the risk associated with those alternatives. Moreover, the issues of risk management are analyzed, in particular with respect to the role of decision making in the presence of risk information. A great effort is devoted to making this index easy to formulate and effective in allowing a clear and transparent comparison between the alternatives.
Summarizing, this document describes the major steps for incorporating risk analysis into the decision-making process: framing the problem in terms of risk analysis, applying appropriate tools and techniques to obtain quantified results, and using the quantified results in the choice of structural and non-structural measures. In order to demonstrate the reliability of the proposed methodology and to show how risk-based information can be incorporated into a flood analysis process, its application to several river basins in central Italy is presented. The methodology assessment is performed by comparing different scenarios and showing that the optimal decision stems from a feasibility evaluation.
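The aggregation the Priority Index performs might be illustrated as below; the linear weighted form, the four index groups and all scores are assumptions, since the paper's exact formula is not reproduced here.

```python
# Illustrative Priority Index aggregation: each candidate intervention gets
# normalised administrative, technical, economic and risk scores, combined
# with hypothetical weights into a single ranking value.
import numpy as np

#                 admin  tech  econ  risk
scores = np.array([
    [0.6,  0.8,  0.4,  0.9],   # intervention A
    [0.9,  0.5,  0.7,  0.6],   # intervention B
    [0.3,  0.9,  0.8,  0.7],   # intervention C
])
weights = np.array([0.2, 0.2, 0.3, 0.3])      # hypothetical importance

pi = scores @ weights
for idx in np.argsort(-pi):                   # rank interventions by PI
    print(f"intervention {'ABC'[idx]}: PI = {pi[idx]:.2f}")
```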
Wang, Li; Carnegie, Graeme K.
2013-01-01
Among methods to study protein-protein interaction inside cells, Bimolecular Fluorescence Complementation (BiFC) is relatively simple and sensitive. BiFC is based on the production of fluorescence using two non-fluorescent fragments of a fluorescent protein (Venus, a Yellow Fluorescent Protein variant, is used here). Non-fluorescent Venus fragments (VN and VC) are fused to two interacting proteins (in this case, AKAP-Lbc and PDE4D3), yielding fluorescence due to VN-AKAP-Lbc-VC-PDE4D3 interaction and the formation of a functional fluorescent protein inside cells. BiFC provides information on the subcellular localization of protein complexes and the strength of protein interactions based on fluorescence intensity. However, BiFC analysis using microscopy to quantify the strength of protein-protein interaction is time-consuming and somewhat subjective due to heterogeneity in protein expression and interaction. By coupling flow cytometric analysis with BiFC methodology, the fluorescent BiFC protein-protein interaction signal can be accurately measured for a large quantity of cells in a short time. Here, we demonstrate an application of this methodology to map regions in PDE4D3 that are required for the interaction with AKAP-Lbc. This high throughput methodology can be applied to screening factors that regulate protein-protein interaction. PMID:23979513
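A minimal sketch of the flow-cytometry quantification step follows; the event data, constructs and gate threshold are hypothetical, standing in for exported FCS measurements.

```python
# Sketch of the flow-cytometry readout: gate transfected cells and compare
# median BiFC (Venus) fluorescence between two constructs. The arrays stand
# in for exported FCS event data; all values are hypothetical.
import numpy as np

rng = np.random.default_rng(5)
venus = {  # BiFC fluorescence per cell for two hypothetical constructs
    "wild_type": rng.lognormal(mean=4.0, sigma=0.6, size=10_000),
    "truncated": rng.lognormal(mean=2.5, sigma=0.6, size=10_000),
}
transfection_gate = 20.0            # minimum signal counted as transfected

for construct, signal in venus.items():
    gated = signal[signal > transfection_gate]
    print(f"{construct}: {len(gated)} gated cells, "
          f"median BiFC intensity = {np.median(gated):.1f}")
```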
Modeling and replicating statistical topology and evidence for CMB nonhomogeneity
Agami, Sarit
2017-01-01
Under the banner of “big data,” the detection and classification of structure in extremely large, high-dimensional, data sets are two of the central statistical challenges of our times. Among the most intriguing new approaches to this challenge is “TDA,” or “topological data analysis,” one of the primary aims of which is providing nonmetric, but topologically informative, preanalyses of data which make later, more quantitative, analyses feasible. While TDA rests on strong mathematical foundations from topology, in applications, it has faced challenges due to difficulties in handling issues of statistical reliability and robustness, often leading to an inability to make scientific claims with verifiable levels of statistical confidence. We propose a methodology for the parametric representation, estimation, and replication of persistence diagrams, the main diagnostic tool of TDA. The power of the methodology lies in the fact that even if only one persistence diagram is available for analysis—the typical case for big data applications—the replications permit conventional statistical hypothesis testing. The methodology is conceptually simple and computationally practical, and provides a broadly effective statistical framework for persistence diagram TDA analysis. We demonstrate the basic ideas on a toy example, and the power of the parametric approach to TDA modeling in an analysis of cosmic microwave background (CMB) nonhomogeneity. PMID:29078301
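As a sketch of the TDA pre-analysis the paper builds on, the code below computes a persistence diagram for a noisy circle with the ripser package; the parametric replication methodology itself is not reproduced.

```python
# Compute a persistence diagram from point-cloud data. A noisy circle is
# used because it carries exactly one prominent H1 (loop) feature.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(6)
theta = rng.uniform(0, 2 * np.pi, 200)
cloud = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (200, 2))

dgms = ripser(cloud)["dgms"]                    # H0 and H1 diagrams
births, deaths = dgms[1].T                      # H1 birth/death pairs
print(f"most persistent H1 feature lives for {np.max(deaths - births):.2f}")
```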
ERIC Educational Resources Information Center
Selig, Judith A.; And Others
This report, summarizing the activities of the Vision Information Center (VIC) in the field of computer-assisted instruction from December, 1966 to August, 1967, describes the methodology used to load a large body of information--a programed text on basic opthalmology--onto a computer for subsequent information retrieval and computer-assisted…
2016-11-01
…low-power RF transmissions used by the OBAN system. B. Threat Analysis Methodology: To analyze the risk presented by a particular threat we use a… power efficiency, and in the absolute worst case a compromise of the wireless channel could result in death. Fitness trackers, on the other hand, are… analysis is intended to inform the development of secure RT-PSM architectures. I. INTRODUCTION: The development of very low-power computing devices and…
2008-12-01
…Return on Investment (ROI) of the Zephyr system. This is achieved by (1) developing a model to carry out Business Case Analysis (BCA) of JCTDs, including…
DiNardo, Thomas P.; Jackson, R. Alan
1984-01-01
An analysis of land use change for an area in Boulder County, Colorado, was conducted using digital cartographic data. The authors selected data in the Geographic Information Retrieval and Analysis System (GIRAS) format which is digitized from the 1:250,000-scale land use and land cover map series. The Map Overlay and Statistical System (MOSS) was used as an analytical tool for the study. The authors describe the methodology used in converting the GIRAS file into a MOSS format and the activities associated with the conversion.
Results and Analysis from Space Suit Joint Torque Testing
NASA Technical Reports Server (NTRS)
Matty, Jennifer
2010-01-01
This joint mobility KC lecture included information from two papers, "A Method for and Issues Associated with the Determination of Space Suit Joint Requirements" and "Results and Analysis from Space Suit Joint Torque Testing," as presented for the International Conference on Environmental Systems in 2009 and 2010, respectively. The first paper discusses historical joint torque testing methodologies and approaches that were tested in 2008 and 2009. The second paper discusses the testing that was completed in 2009 and 2010.
2010-09-01
…approach” that provides analysis for “program managers to attain the right knowledge [e.g., portfolio values] at critical junctures so they can make… portfolio, it is critical to create a credible beta (β) for the presumed “market volatility” of the IT asset class. The KVA methodology may be used… for this critical task. Establishing a beta (β) for DoD-wide IT portfolio management uses the work conducted at the Naval Postgraduate School (NPS…
Analysis of central enterprise architecture elements in models of six eHealth projects.
Virkanen, Hannu; Mykkänen, Juha
2014-01-01
Large-scale initiatives for eHealth services have been established in many countries on regional or national level. The use of Enterprise Architecture has been suggested as a methodology to govern and support the initiation, specification and implementation of large-scale initiatives including the governance of business changes as well as information technology. This study reports an analysis of six health IT projects in relation to Enterprise Architecture elements, focusing on central EA elements and viewpoints in different projects.